Building OpenJDK with Custom Code Pages

I was recently poking around the Java Bug System (bugs.openjdk.org) looking for enhancements and found this interesting issue: JDK-8268719: Force execution (and source) code page used when compiling on Windows. By default, the OpenJDK code builds on my system without any changes. So what is my code page?

Checking Your Windows Code Page

See Code Pages – Win32 apps for an overview of why code pages exist (or start from Unicode and Character Sets – Win32 apps for the complete picture).

A Windows operating system always has one currently active Windows code page. All ANSI versions of API functions use the currently active code page.

Code Pages – Win32 apps | Microsoft Learn

To see your current ANSI code page, use the reg command (from How to see which ANSI code page is used in Windows? on Stack Overflow):

C:\> reg query "HKLM\SYSTEM\CurrentControlSet\Control\Nls\CodePage" -v ACP

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Nls\CodePage
    ACP    REG_SZ    1252

C:\> reg query "HKLM\SYSTEM\CurrentControlSet\Control\Nls\CodePage" | findstr /I "CP.*REG_SZ"
    ACP    REG_SZ    1252
    OEMCP    REG_SZ    437
    MACCP    REG_SZ    10000
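Since this is all in service of building the JDK, it is also interesting to see the code page's effect from Java itself. Here is a minimal sketch (the ShowEncoding class name is mine; note that sun.jnu.encoding is an internal, undocumented JDK property):

// ShowEncoding.java - print the JVM's view of the platform encoding.
// On a system with ANSI code page 1252, older JDKs report windows-1252
// as the default charset; JDK 18+ defaults file.encoding to UTF-8
// (JEP 400), but sun.jnu.encoding still tracks the active code page.
import java.nio.charset.Charset;

public class ShowEncoding {
    public static void main(String[] args) {
        System.out.println("defaultCharset:   " + Charset.defaultCharset());
        System.out.println("file.encoding:    " + System.getProperty("file.encoding"));
        System.out.println("sun.jnu.encoding: " + System.getProperty("sun.jnu.encoding"));
    }
}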

To change the active code page, go to Control Panel > Region. Click on the “Change system locale…” button in the Administrative tab.

The Region Dialog Box

The Region Settings dialog will pop up. Select a different locale, e.g. Japanese (Japan).

Reboot when prompted. You can verify (even before rebooting) that the active and OEM code pages have changed. Locales like Kiswahili (Kenya) and English (India) did not change the code page values (and therefore didn’t prompt to reboot).

C:\> reg query "HKLM\SYSTEM\CurrentControlSet\Control\Nls\CodePage" | findstr /I "CP.*REG_SZ"
    ACP    REG_SZ    932
    OEMCP    REG_SZ    932
    MACCP    REG_SZ    10001
Change System Locale Reboot Dialog

After rebooting, I delete the build directory, then configure and build OpenJDK again. This time the build fails with these errors:

ERROR: Build failed for target 'images' in configuration 'windows-x86_64-server-slowdebug' (exit code 2) 
Stopping javac server

=== Output from failing command(s) repeated here ===
* For target hotspot_variant-server_libjvm_gtest_objs_test_json.obj:
test_json.cpp
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2143: syntax error: missing ')' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(355): error C2660: 'JSON_GTest::test': function does not take 1 arguments
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(49): note: see declaration of 'JSON_GTest::test'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(355): note: while trying to match the argument list '(const char [171])'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2143: syntax error: missing ';' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2059: syntax error: ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2017: illegal escape sequence
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2059: syntax error: ')'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(363): error C2143: syntax error: missing ')' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(361): error C2660: 'JSON_GTest::test': function does not take 1 arguments
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(49): note: see declaration of 'JSON_GTest::test'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(361): note: while trying to match the argument list '(const char [174])'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(363): error C2143: syntax error: missing ';' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(363): error C2059: syntax error: ']'
   ... (rest of output omitted)

* All command lines available in /cygdrive/d/java/forks/jdk/build/windows-x86_64-server-slowdebug/make-support/failure-logs.
=== End of repeated output ===

No indication of failed target found.
HELP: Try searching the build log for '] Error'.
HELP: Run 'make doctor' to diagnose build problems.

To see the full command line, cat the .cmdline file in the failure-logs directory (hotspot_variant-server_libjvm_gtest_objs_test_json.obj.cmdline):

cat /d/java/forks/jdk/build/windows-x86_64-server-slowdebug/make-support/failure-logs/hotspot_variant-server_libjvm_gtest_objs_test_json.obj.cmdline

The Visual C++ compiler's behavior when reading source files depends on whether the source files have a byte-order mark (BOM).

By default, Visual Studio detects a byte-order mark to determine if the source file is in an encoded Unicode format, for example, UTF-16 or UTF-8. If no byte-order mark is found, it assumes that the source file is encoded in the current user code page, unless you’ve specified a code page by using /utf-8 or the /source-charset option.

/utf-8 (Set source and execution character sets to UTF-8)

This can easily be tested using hexdump in Cygwin. Create a test.txt file with the commands below, then open it in Notepad. The File > Save As dialog has an Encoding dropdown that writes a byte-order mark for any of the UTF options. Running hexdump again will display the byte-order mark.

echo abc123 > test.txt
hexdump -C test.txt
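For instance, re-saving test.txt from Notepad with the UTF-8 encoding prepends the UTF-8 byte-order mark ef bb bf (a UTF-16 LE file would start with ff fe instead). The exact bytes that follow depend on the shell that created the file and its line endings, but the output looks something like this:

$ hexdump -C test.txt
00000000  ef bb bf 61 62 63 31 32  33 0d 0a                 |...abc123..|
0000000b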

Inspecting the OpenJDK source file that fails to build confirms that there is no BOM in the file. (Can this be done on GitHub?) Without a BOM, the compiler reads the file using the active code page (932), so multi-byte UTF-8 sequences in the test strings are misinterpreted as Shift-JIS double-byte characters, corrupting adjacent quotes and backslashes and producing the syntax and illegal escape sequence errors above.

$ hexdump -C /cygdrive/d/java/forks/jdk/test/hotspot/gtest/utilities/test_json.cpp | head
00000000  2f 2a 0a 20 2a 20 43 6f  70 79 72 69 67 68 74 20  |/*. * Copyright |
...

Updating CFLAGS

Add the -utf-8 option to TOOLCHAIN_CFLAGS_JVM in flags-cflags.m4.

diff --git a/make/autoconf/flags-cflags.m4 b/make/autoconf/flags-cflags.m4
index c0c78ce95b6..bbb0426c368 100644
--- a/make/autoconf/flags-cflags.m4
+++ b/make/autoconf/flags-cflags.m4
@@ -560,7 +560,9 @@ AC_DEFUN([FLAGS_SETUP_CFLAGS_HELPER],
     TOOLCHAIN_CFLAGS_JVM="-qtbtable=full -qtune=balanced -fno-exceptions \
         -qalias=noansi -qstrict -qtls=default -qnortti -qnoeh -qignerrno -qstackprotect"
   elif test "x$TOOLCHAIN_TYPE" = xmicrosoft; then
-    TOOLCHAIN_CFLAGS_JVM="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -MP"
+    # The -utf-8 option sets source and execution character sets to UTF-8 to enable correct
+    # compilation of all source files regardless of the active code page on Windows.
+    TOOLCHAIN_CFLAGS_JVM="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -MP -utf-8"
     TOOLCHAIN_CFLAGS_JDK="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -Zc:wchar_t-"
   fi

The build still fails, but this time the errors come from the java.desktop tree.

ERROR: Build failed for target 'images' in configuration 'windows-x86_64-server-slowdebug' (exit code 2) 

=== Output from failing command(s) repeated here ===
* For target support_native_java.desktop_libfreetype_afblue.obj:
afblue.c
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afblue.c(1): error C2220: the following warning is treated as an error
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afblue.c(1): warning C4819: The file contains a character that cannot be represented in the current code page (932). Save the file in Unicode format to prevent data loss
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afscript.h(1): warning C4819: The file contains a character that cannot be represented in the current code page (932). Save the file in Unicode format to prevent data loss
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afblue.c(257): warning C4819: The file contains a character that cannot be represented in the current code page (932). Save the file in Unicode format to prevent data loss
   ... (rest of output omitted)
* For target support_native_java.desktop_libfreetype_afcjk.obj:
afcjk.c
...

To see the full command line, cat the .cmdline file in the failure-logs directory (support_native_java.desktop_libfreetype_afblue.obj.cmdline):

cat /d/java/forks/jdk/build/windows-x86_64-server-slowdebug/make-support/failure-logs/support_native_java.desktop_libfreetype_afblue.obj.cmdline

TOOLCHAIN_CFLAGS_JDK in flags-cflags.m4 needs the -utf-8 compiler flag as well.

diff --git a/make/autoconf/flags-cflags.m4 b/make/autoconf/flags-cflags.m4
index c0c78ce95b6..8655dfe41fb 100644
--- a/make/autoconf/flags-cflags.m4
+++ b/make/autoconf/flags-cflags.m4
@@ -560,8 +560,10 @@ AC_DEFUN([FLAGS_SETUP_CFLAGS_HELPER],
     TOOLCHAIN_CFLAGS_JVM="-qtbtable=full -qtune=balanced -fno-exceptions \
         -qalias=noansi -qstrict -qtls=default -qnortti -qnoeh -qignerrno -qstackprotect"
   elif test "x$TOOLCHAIN_TYPE" = xmicrosoft; then
-    TOOLCHAIN_CFLAGS_JVM="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -MP"
-    TOOLCHAIN_CFLAGS_JDK="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -Zc:wchar_t-"
+    # The -utf-8 option sets source and execution character sets to UTF-8 to enable correct
+    # compilation of all source files regardless of the active code page on Windows.
+    TOOLCHAIN_CFLAGS_JVM="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -utf-8 -MP"
+    TOOLCHAIN_CFLAGS_JDK="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -utf-8 -Zc:wchar_t-"
   fi

   # CFLAGS C language level for JDK sources (hotspot only uses C++)

These two changes enable the build to complete successfully. The upstream pull request is openjdk/jdk#15569: Force execution (and source) code page used when compiling on Windows.


Diagnosing Hadoop Native Library Load Failures

Running a Basic Hadoop Command

The instructions for running Hadoop haven't changed much since I last used it over 5 years ago (see Setting up Apache Hadoop). Download a recent stable release from one of the Apache Download Mirrors. I picked hadoop-3.3.5-aarch64.tar.gz from https://dlcdn.apache.org/hadoop/common/hadoop-3.3.5/.

mkdir -p ~/java/binaries/hadoop
cd ~/java/binaries/hadoop

curl -Lo hadoop-3.3.5-aarch64.tar.gz https://dlcdn.apache.org/hadoop/common/hadoop-3.3.5/hadoop-3.3.5-aarch64.tar.gz

tar xzf hadoop-3.3.5-aarch64.tar.gz

I used the instructions at Apache Hadoop 3.3.5 – Hadoop: Setting up a Single Node Cluster to test the build by running the grep example. See the Grep source code for the implementation details of the example (a condensed sketch follows the commands below).

export JAVA_HOME=~/java/binaries/jdk/x64/jdk-11.0.19+7/

mkdir testinput
cp etc/hadoop/*.xml testinput

bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.5.jar grep testinput testoutput 'dfs[a-z.]+'

cat testoutput/*
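For reference, here is a condensed, hypothetical sketch of the example's first MapReduce job. The real org.apache.hadoop.examples.Grep also runs a second job that sorts the matches by count; the GrepSketch class name and argument layout are mine, but the mapper and reducer are the stock Hadoop library types.

// GrepSketch.java - count occurrences of a regex in the input files.
// Usage: GrepSketch <input dir> <output dir> <regex>, mirroring the
// testinput/testoutput/'dfs[a-z.]+' arguments used above.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.map.RegexMapper;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.reduce.LongSumReducer;

public class GrepSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set(RegexMapper.PATTERN, args[2]); // regex to search for

        Job job = Job.getInstance(conf, "grep-search");
        job.setJarByClass(GrepSketch.class);
        job.setMapperClass(RegexMapper.class);      // emits (match, 1) pairs
        job.setCombinerClass(LongSumReducer.class); // sums counts map-side
        job.setReducerClass(LongSumReducer.class);  // sums counts per match
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}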

When running this test code, I noticed this warning (first message displayed):

2023-05-31 12:31:33,686 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Checking for Loadable Native Libraries

The Apache Hadoop 3.3.5 – Native Libraries Guide explains that there is a NativeLibraryChecker that can be run using the command bin/hadoop checknative -a to show which native libraries can/cannot be loaded.

saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ find . -name lib*.so
./lib/native/libhadoop.so
./lib/native/libhdfspp.so
./lib/native/libhdfs.so
./lib/native/libnativetask.so
saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ uname -a
Linux ubuntuvm 5.19.0-41-generic #42~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Tue Apr 18 17:40:00 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ bin/hadoop checknative -a
2023-05-31 13:36:04,467 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop:  false 
zlib:    false 
zstd  :  false 
bzip2:   false 
openssl: false 
ISA-L:   false 
PMDK:    false 
2023-05-31 13:36:04,711 INFO util.ExitUtil: Exiting with status 1: ExitException

Diagnosing Native Library Load Errors

Seeing that none of these native libraries could be loaded, I assumed I needed to install all of those dependencies. I started with lib64z.

saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ sudo apt install lib64z1
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following additional packages will be installed:
  gcc-12-base:i386 krb5-locales libc6:i386 libc6-amd64:i386 libcom-err2:i386 libcrypt1:i386
  libgcc-s1:i386 libgssapi-krb5-2 libgssapi-krb5-2:i386 libidn2-0:i386 libk5crypto3 libk5crypto3:i386
  libkeyutils1:i386 libkrb5-3 libkrb5-3:i386 libkrb5support0 libkrb5support0:i386 libnsl2:i386
  libnss-nis:i386 libnss-nisplus:i386 libssl3 libssl3:i386 libtirpc3:i386 libunistring2:i386
Suggested packages:
  glibc-doc:i386 locales:i386 krb5-doc krb5-user krb5-doc:i386 krb5-user:i386
The following NEW packages will be installed:
  gcc-12-base:i386 krb5-locales lib64z1:i386 libc6:i386 libc6-amd64:i386 libcom-err2:i386
  libcrypt1:i386 libgcc-s1:i386 libgssapi-krb5-2:i386 libidn2-0:i386 libk5crypto3:i386
  libkeyutils1:i386 libkrb5-3:i386 libkrb5support0:i386 libnsl2:i386 libnss-nis:i386
  libnss-nisplus:i386 libssl3:i386 libtirpc3:i386 libunistring2:i386
The following packages will be upgraded:
  libgssapi-krb5-2 libk5crypto3 libkrb5-3 libkrb5support0 libssl3
5 upgraded, 20 newly installed, 0 to remove and 85 not upgraded.
Need to get 10.3 MB/12.2 MB of archives.
After this operation, 38.1 MB of additional disk space will be used.
Do you want to continue? [Y/n] 

Interestingly, rerunning checknative still showed false for all the native libraries! The next step was to inspect how the checknative argument is handled. It invokes the NativeLibraryChecker class, which in turn calls NativeCodeLoader. One of the most important observations in the latter file is the additional debug logging available when the library fails to load!
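The gist of that file is a static initializer along the lines of the simplified sketch below (this is my approximation, not the actual Hadoop source; the LoadCheck class name is mine). The key point is that the reason for the failure is only reported at DEBUG level:

// LoadCheck.java - sketch of NativeCodeLoader's load-and-log pattern.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoadCheck {
    private static final Logger LOG = LoggerFactory.getLogger(LoadCheck.class);
    private static boolean nativeCodeLoaded = false;

    static {
        LOG.debug("Trying to load the custom-built native-hadoop library...");
        try {
            // Resolves libhadoop.so using java.library.path.
            System.loadLibrary("hadoop");
            nativeCodeLoaded = true;
        } catch (Throwable t) {
            // These details are invisible unless DEBUG logging is enabled.
            LOG.debug("Failed to load native-hadoop with error: " + t);
            LOG.debug("java.library.path=" + System.getProperty("java.library.path"));
        }
    }

    public static void main(String[] args) {
        System.out.println("native-hadoop loaded: " + nativeCodeLoaded);
    }
}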

Enabling Debug Logging

The logging code uses LoggerFactory, which is discussed in Introduction to SLF4J (Baeldung). My question now became how to change the SLF4J log level at runtime. A Google search for hadoop change log level led me to an SO post on setting the logging level in Hadoop to WARN, but the Hadoop commands guide (Apache Hadoop 2.7.0) proved more useful: just pass the --loglevel flag to hadoop.

bin/hadoop --loglevel DEBUG checknative -a

The debug output is now much more informative! Notice the warning about the possible platform mismatch of the native library!

saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ bin/hadoop --loglevel DEBUG checknative -a
2023-05-31 14:47:32,624 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2023-05-31 14:47:32,625 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /home/saint/java/binaries/hadoop/hadoop-3.3.5/lib/native/libhadoop.so.1.0.0: /home/saint/java/binaries/hadoop/hadoop-3.3.5/lib/native/libhadoop.so.1.0.0: cannot open shared object file: No such file or directory (Possible cause: can't load AARCH64-bit .so on a AMD 64-bit platform)
2023-05-31 14:47:32,625 DEBUG util.NativeCodeLoader: java.library.path=/home/saint/java/binaries/hadoop/hadoop-3.3.5/lib/native
2023-05-31 14:47:32,625 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-05-31 14:47:32,836 DEBUG util.Shell: setsid exited with exit code 0
Native library checking:
hadoop:  false 
zlib:    false 
zstd  :  false 
bzip2:   false 
openssl: false 
ISA-L:   false 
PMDK:    false 
2023-05-31 14:47:32,847 DEBUG util.ExitUtil: Exiting with status 1: ExitException
1: ExitException
	at org.apache.hadoop.util.ExitUtil.terminate(ExitUtil.java:381)
	at org.apache.hadoop.util.ExitUtil.terminate(ExitUtil.java:369)
	at org.apache.hadoop.util.NativeLibraryChecker.main(NativeLibraryChecker.java:154)
2023-05-31 14:47:32,856 INFO util.ExitUtil: Exiting with status 1: ExitException

To determine the architecture for which the shared library was compiled, I started with the objdump -f command suggested by a Stack Overflow post. However, it outputs architecture: UNKNOWN!, which isn't very useful. The file command from the same post proved to be exactly what I needed.

saint@ubuntuvm:~/java/binaries/hadoop/aarch64/hadoop-3.3.5$ objdump -f lib/native/libhadoop.so

lib/native/libhadoop.so:     file format elf64-little
architecture: UNKNOWN!, flags 0x00000150:
HAS_SYMS, DYNAMIC, D_PAGED
start address 0x0000000000005b80

saint@ubuntuvm:~/java/binaries/hadoop/aarch64/hadoop-3.3.5$ file lib/native/libhadoop.so
lib/native/libhadoop.so: symbolic link to libhadoop.so.1.0.0
saint@ubuntuvm:~/java/binaries/hadoop/aarch64/hadoop-3.3.5$ file lib/native/libhadoop.so.1.0.0
lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, ARM aarch64, version 1 (SYSV), dynamically linked, BuildID[sha1]=19fbe9b0a7449eb05b687721548251af752b869f, with debug_info, not stripped

It turns out I was using an x86-64 Ubuntu VM instead of the aarch64 Ubuntu VM I had created, so naturally, Hadoop couldn't load the aarch64 native library! For the VM I was actually using, I needed to get the x86-64 Hadoop build by running:

curl -Lo hadoop-3.3.5.tar.gz https://dlcdn.apache.org/hadoop/common/hadoop-3.3.5/hadoop-3.3.5.tar.gz

Checking the loading status of the native libraries now indicates that the hadoop native library can be successfully loaded:

saint@ubuntuvm:~/java/binaries/hadoop/x64/hadoop-3.3.5$ bin/hadoop checknative -a
2023-05-31 14:58:40,869 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
2023-05-31 14:58:40,877 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2023-05-31 14:58:40,887 WARN erasurecode.ErasureCodeNative: Loading ISA-L failed: Failed to load libisal.so.2 (libisal.so.2: cannot open shared object file: No such file or directory)
2023-05-31 14:58:40,887 WARN erasurecode.ErasureCodeNative: ISA-L support is not available in your platform... using builtin-java codec where applicable
2023-05-31 14:58:41,035 INFO nativeio.NativeIO: The native code was built without PMDK support.
Native library checking:
hadoop:  true /home/saint/java/binaries/hadoop/x64/hadoop-3.3.5/lib/native/libhadoop.so.1.0.0
zlib:    true /lib/x86_64-linux-gnu/libz.so.1
zstd  :  true /lib/x86_64-linux-gnu/libzstd.so.1
bzip2:   true /lib/x86_64-linux-gnu/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
ISA-L:   false Loading ISA-L failed: Failed to load libisal.so.2 (libisal.so.2: cannot open shared object file: No such file or directory)
PMDK:    false The native code was built without PMDK support.
2023-05-31 14:58:41,056 INFO util.ExitUtil: Exiting with status 1: ExitException

Switching to the aarch64 Ubuntu VM also showed the aarch64 Hadoop native library loading successfully on that platform. In hindsight, the i386 architecture references when I installed lib64z could have been a warning sign, had I not been blasting my way through these commands.


Building Racket in Linux

The Racket website has documentation on how to clone the PLT repository.

git clone git://git.racket-lang.org/plt.git

Next, the src/README file has all the gory details on how to build Racket. The procedure is rather straightforward for Linux (run from within the src directory):

mkdir build
cd build
../configure
make
make install

I have yet to figure out how to successfully build Racket with Visual C++ (2008 Professional), so that's the next item on my list.

Update (03/11/11): Building is actually rather straightforward with Visual C++ as well. Run vsvars32.bat to ensure that devenv and other commands are in the path:

cd plt\src\worksp\
"C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\Tools\vsvars32.bat"
build

I got the hint from this thread.


Building Firefox with Debug Symbols

Building Firefox with debug symbols has been a rather tricky endeavor for me for quite some time. The documentation on how to do this seems to indicate that adding these lines to your mozconfig file:

export MOZ_DEBUG_SYMBOLS=1
ac_add_options --enable-debugger-info-modules=yes

should do the trick. However, I have (for months now) been running into this make error whenever I try to build with debug symbols: no rule to make nspr4.pdb, needed by export. Bug 338224 has a pending patch for this issue. In the meantime, the following trick solves the problem for me: force the -Zi compiler option into the compiler flags variables. For the NSPR files, the PDB file will be generated as desired. For the other files, the -Zi option will be redundant, but this fix gets you up and running ASAP. Here's the associated patch file:

diff --git a/nsprpub/configure.in b/nsprpub/configure.in
--- a/nsprpub/configure.in
+++ b/nsprpub/configure.in
@@ -2827,18 +2827,18 @@ if test -n "$_SAVE_DEBUG_FLAGS"; then
 fi

 if test -n "$MOZ_OPTIMIZE"; then
     CFLAGS="$CFLAGS $_OPTIMIZE_FLAGS"
     CXXFLAGS="$CXXFLAGS $_OPTIMIZE_FLAGS"
 fi

 if test -n "$MOZ_DEBUG_SYMBOLS"; then
-    CFLAGS="$CFLAGS $_DEBUG_FLAGS"
-    CXXFLAGS="$CXXFLAGS $_DEBUG_FLAGS"
+    CFLAGS="$CFLAGS $_DEBUG_FLAGS -Zi"
+    CXXFLAGS="$CXXFLAGS $_DEBUG_FLAGS -Zi"
 fi

 if test -n "$MOZ_OPTIMIZE"; then
     OBJDIR_TAG=_OPT
 else
     OBJDIR_TAG=_DBG
 fi

Here is the mozconfig file I used with this patch:

. $topsrcdir/browser/config/mozconfig
mk_add_options MOZ_OBJDIR=@TOPSRCDIR@/../obj
ac_add_options --disable-xpconnect-idispatch
ac_add_options --disable-activex
ac_add_options --disable-activex-scripting
ac_add_options --disable-accessibility
ac_add_options --enable-optimize=no
ac_add_options --enable-debug-symbols=yes