I was recently trying to build the Prime95 Mersenne search software in Visual Studio 2022 when I got error messages about missing a gmp.h dependency.
1>C:\repos\gimps\p95v3019b13.source\common.h(23,10): error C1083: Cannot open include file: 'gmp.h': No such file or directory
...
This got me started trying to figure out how to build the GMP sources on Windows. It was easy to do in the MSYS MINGW64 shell. Use these steps:
cd /c/repos/gmp
curl -Lo gmp-6.3.0.tar.xz https://gmplib.org/download/gmp/gmp-6.3.0.tar.xz
unxz --keep gmp-6.3.0.tar.xz
tar xf gmp-6.3.0.tar
cd gmp-6.3.0
./configure
make
Here is the walkthrough in more detail. The GMP download page has a link to gmp-6.3.0.tar.xz:
cd /c/repos/gmp
curl -Lo gmp-6.3.0.tar.xz https://gmplib.org/download/gmp/gmp-6.3.0.tar.xz
file gmp-6.3.0.tar.xz
The file command reports gmp-6.3.0.tar.xz: XZ compressed data, checksum CRC64. This is XZ data compression, which is unfamiliar to me (I haven't run into it often). The unxz command decompresses the file; the --keep option prevents it from removing the source file.
unxz --keep gmp-6.3.0.tar.xz
tar xf gmp-6.3.0.tar
# Search for gmp.h
cd gmp-6.3.0
find . -name "gmp.h"
The GMP documentation notes that on an MS-DOS system DJGPP can be used to build GMP, and on an MS Windows system Cygwin, DJGPP and MINGW can be used. All three are excellent ports of GCC and the various GNU tools.
Try building it in the MSYS MINGW64 shell. The end of the ./configure output is shown below. The host type and install prefix differ from the Cygwin environment's.
config.status: linking mpn/x86_64/k8/gmp-mparam.h to gmp-mparam.h
config.status: executing libtool commands
configure: summary of build options:
Version: GNU MP 6.3.0
Host type: x86_64-w64-mingw32
ABI: 64
Install prefix: /mingw64
Compiler: gcc
Static libraries: yes
Shared libraries: no
The make command succeeds in the MSYS MINGW64 shell, taking about 4 minutes. I can ignore Cygwin for now. Let's try the Tutorial on GMP (colorado.edu). Copy the example into a file called mpz_simple1.c, then use the command from the tutorial to compile it. Interestingly, I don't need the -I and -L arguments from the tutorial; the GMP library must already be installed in the default search paths.
cd /c/repos/scratchpad/apps/gmp/tutorial
gcc -o mpz_simple1 mpz_simple1.c -lgmp
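For reference, this is roughly what the tutorial's example covers; the listing below is my own minimal sketch (not the tutorial's exact code) that multiplies two arbitrary-precision integers with the mpz functions:

#include <stdio.h>
#include <gmp.h>

int main(void)
{
    mpz_t a, b, product;

    /* Initialize a from a decimal string and b from an unsigned long. */
    mpz_init_set_str(a, "1234567890123456789012345678901234567890", 10);
    mpz_init_set_ui(b, 987654321UL);
    mpz_init(product);

    /* product = a * b; gmp_printf's %Zd conversion prints mpz_t values. */
    mpz_mul(product, a, b);
    gmp_printf("%Zd * %Zd = %Zd\n", a, b, product);

    mpz_clears(a, b, product, NULL);
    return 0;
}

Compiling it with the gcc command above and running the resulting mpz_simple1 binary prints the product.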
How are gmp.h and the libraries found? Poking around in File Explorer shows 2022-01-05 timestamps for gmp.h and libgmp.*, so it looks like these were indeed installed with MSYS. How do I automatically output the timestamps for each result of find? bash – How to loop through file names returned by find? – Stack Overflow suggests this command:
find . -name "*gmp*" | while IFS= read -r file; do ls -l "$file"; done
At this point, all we have seen is how to build GMP in the MSYS MINGW64 shell. We have also verified that we can build a sample GMP program from the Tutorial on GMP (colorado.edu). The Cygwin and Visual Studio environments can be investigated another time.
To change the active code page, go to Control Panel > Region. Click on the “Change system locale…” button in the Administrative tab.
The Region Settings dialog will pop up. Select a different locale, e.g. Japanese (Japan).
Reboot when prompted. You can verify (even before rebooting) that the active and OEM code pages have changed. Locales like Kiswahili (Kenya) and English (India) did not change the code page values (and therefore didn’t prompt to reboot).
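One way to verify the values programmatically (a quick sketch of my own, not necessarily how you need to check) is to call the Win32 GetACP and GetOEMCP functions:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    /* GetACP returns the active ANSI code page (e.g. 932 after switching the
       system locale to Japanese); GetOEMCP returns the OEM code page. */
    printf("Active code page (ACP): %u\n", GetACP());
    printf("OEM code page (OEMCP): %u\n", GetOEMCP());
    return 0;
}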
After rebooting, I delete the build directory then configure and build OpenJDK again. This time the build fails with these errors:
ERROR: Build failed for target 'images' in configuration 'windows-x86_64-server-slowdebug' (exit code 2)
Stopping javac server
=== Output from failing command(s) repeated here ===
* For target hotspot_variant-server_libjvm_gtest_objs_test_json.obj:
test_json.cpp
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2143: syntax error: missing ')' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(355): error C2660: 'JSON_GTest::test': function does not take 1 arguments
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(49): note: see declaration of 'JSON_GTest::test'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(355): note: while trying to match the argument list '(const char [171])'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2143: syntax error: missing ';' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2059: syntax error: ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2017: illegal escape sequence
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(357): error C2059: syntax error: ')'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(363): error C2143: syntax error: missing ')' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(361): error C2660: 'JSON_GTest::test': function does not take 1 arguments
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(49): note: see declaration of 'JSON_GTest::test'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(361): note: while trying to match the argument list '(const char [174])'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(363): error C2143: syntax error: missing ';' before ']'
d:\java\forks\jdk\test\hotspot\gtest\utilities\test_json.cpp(363): error C2059: syntax error: ']'
... (rest of output omitted)
* All command lines available in /cygdrive/d/java/forks/jdk/build/windows-x86_64-server-slowdebug/make-support/failure-logs.
=== End of repeated output ===
No indication of failed target found.
HELP: Try searching the build log for '] Error'.
HELP: Run 'make doctor' to diagnose build problems.
The Visual C++ compiler's behavior when reading source files depends on whether or not the source files have a byte-order mark. The Microsoft documentation explains:
By default, Visual Studio detects a byte-order mark to determine if the source file is in an encoded Unicode format, for example, UTF-16 or UTF-8. If no byte-order mark is found, it assumes that the source file is encoded in the current user code page, unless you’ve specified a code page by using /utf-8 or the /source-charset option.
This can easily be tested using hexdump in Cygwin. Create a test.txt file with the commands below, then open it in Notepad. The File > Save As dialog has an Encoding dropdown that writes a byte-order mark for any of the UTF options (the UTF-8 BOM is the byte sequence EF BB BF; UTF-16 LE starts with FF FE). Running hexdump displays the byte-order marks.
echo abc123 > test.txt
hexdump -C test.txt
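As an alternative to eyeballing the hexdump output, a few lines of C (my own throwaway helper, not part of any of these projects) can report whether a file starts with the UTF-8 BOM bytes EF BB BF:

#include <stdio.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 2;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) {
        perror("fopen");
        return 2;
    }

    /* A UTF-8 byte-order mark is the three-byte sequence EF BB BF. */
    unsigned char buf[3] = {0};
    size_t n = fread(buf, 1, sizeof buf, f);
    fclose(f);

    int has_bom = (n == 3 && buf[0] == 0xEF && buf[1] == 0xBB && buf[2] == 0xBF);
    printf("%s: %s\n", argv[1], has_bom ? "UTF-8 BOM present" : "no UTF-8 BOM");
    return has_bom ? 0 : 1;
}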
Inspecting the OpenJDK source file that fails to build confirms that there is no BOM in the file. (Can this be done on GitHub?) The fix is to tell the compiler to treat the sources as UTF-8 by updating flags-cflags.m4:
diff --git a/make/autoconf/flags-cflags.m4 b/make/autoconf/flags-cflags.m4
index c0c78ce95b6..bbb0426c368 100644
--- a/make/autoconf/flags-cflags.m4
+++ b/make/autoconf/flags-cflags.m4
@@ -560,7 +560,9 @@ AC_DEFUN([FLAGS_SETUP_CFLAGS_HELPER],
TOOLCHAIN_CFLAGS_JVM="-qtbtable=full -qtune=balanced -fno-exceptions \
-qalias=noansi -qstrict -qtls=default -qnortti -qnoeh -qignerrno -qstackprotect"
elif test "x$TOOLCHAIN_TYPE" = xmicrosoft; then
- TOOLCHAIN_CFLAGS_JVM="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -MP"
+ # The -utf-8 option sets source and execution character sets to UTF-8 to enable correct
+ # compilation of all source files regardless of the active code page on Windows.
+ TOOLCHAIN_CFLAGS_JVM="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -MP -utf-8"
TOOLCHAIN_CFLAGS_JDK="-nologo -MD -Zc:preprocessor -Zc:strictStrings -Zc:inline -Zc:wchar_t-"
fi
The build still fails but this time the error is from the java.desktop tree.
ERROR: Build failed for target 'images' in configuration 'windows-x86_64-server-slowdebug' (exit code 2)
=== Output from failing command(s) repeated here ===
* For target support_native_java.desktop_libfreetype_afblue.obj:
afblue.c
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afblue.c(1): error C2220: the following warning is treated as an error
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afblue.c(1): warning C4819: The file contains a character that cannot be represented in the current code page (932). Save the file in Unicode format to prevent data loss
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afscript.h(1): warning C4819: The file contains a character that cannot be represented in the current code page (932). Save the file in Unicode format to prevent data loss
d:\java\forks\jdk\src\java.desktop\share\native\libfreetype\src\autofit\afblue.c(257): warning C4819: The file contains a character that cannot be represented in the current code page (932). Save the file in Unicode format to prevent data loss
... (rest of output omitted)
* For target support_native_java.desktop_libfreetype_afcjk.obj:
afcjk.c
...
When running this test code, I noticed this warning (first message displayed):
2023-05-31 12:31:33,686 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Checking for Loadable Native Libraries
The Apache Hadoop 3.3.5 – Native Libraries Guide explains that there is a NativeLibraryChecker that can be run using the command bin/hadoop checknative -a to show which native libraries can/cannot be loaded.
saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ find . -name lib*.so
./lib/native/libhadoop.so
./lib/native/libhdfspp.so
./lib/native/libhdfs.so
./lib/native/libnativetask.so
saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ uname -a
Linux ubuntuvm 5.19.0-41-generic #42~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Tue Apr 18 17:40:00 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ bin/hadoop checknative -a
2023-05-31 13:36:04,467 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
zstd : false
bzip2: false
openssl: false
ISA-L: false
PMDK: false
2023-05-31 13:36:04,711 INFO util.ExitUtil: Exiting with status 1: ExitException
Diagnosing Native Library Load Errors
My assumption when seeing that none of these native libraries could be loaded was that I needed to install all those dependencies. I started with lib64z1.
saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ sudo apt install lib64z1
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following additional packages will be installed:
gcc-12-base:i386 krb5-locales libc6:i386 libc6-amd64:i386 libcom-err2:i386 libcrypt1:i386
libgcc-s1:i386 libgssapi-krb5-2 libgssapi-krb5-2:i386 libidn2-0:i386 libk5crypto3 libk5crypto3:i386
libkeyutils1:i386 libkrb5-3 libkrb5-3:i386 libkrb5support0 libkrb5support0:i386 libnsl2:i386
libnss-nis:i386 libnss-nisplus:i386 libssl3 libssl3:i386 libtirpc3:i386 libunistring2:i386
Suggested packages:
glibc-doc:i386 locales:i386 krb5-doc krb5-user krb5-doc:i386 krb5-user:i386
The following NEW packages will be installed:
gcc-12-base:i386 krb5-locales lib64z1:i386 libc6:i386 libc6-amd64:i386 libcom-err2:i386
libcrypt1:i386 libgcc-s1:i386 libgssapi-krb5-2:i386 libidn2-0:i386 libk5crypto3:i386
libkeyutils1:i386 libkrb5-3:i386 libkrb5support0:i386 libnsl2:i386 libnss-nis:i386
libnss-nisplus:i386 libssl3:i386 libtirpc3:i386 libunistring2:i386
The following packages will be upgraded:
libgssapi-krb5-2 libk5crypto3 libkrb5-3 libkrb5support0 libssl3
5 upgraded, 20 newly installed, 0 to remove and 85 not upgraded.
Need to get 10.3 MB/12.2 MB of archives.
After this operation, 38.1 MB of additional disk space will be used.
Do you want to continue? [Y/n]
Interestingly, rerunning checknative still showed false for all the native libraries! The next step was to inspect how the checknative argument is handled. It invokes the hadoop/NativeLibraryChecker.java class, which in turn calls hadoop/NativeCodeLoader.java. One of the most important observations in the latter file is the additional debug logging available when the library doesn't load!
The debug output is now much more informative! Notice the warning about the possible platform mismatch of the native library!
saint@ubuntuvm:~/java/binaries/hadoop/hadoop-3.3.5$ bin/hadoop --loglevel DEBUG checknative -a
2023-05-31 14:47:32,624 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2023-05-31 14:47:32,625 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /home/saint/java/binaries/hadoop/hadoop-3.3.5/lib/native/libhadoop.so.1.0.0: /home/saint/java/binaries/hadoop/hadoop-3.3.5/lib/native/libhadoop.so.1.0.0: cannot open shared object file: No such file or directory (Possible cause: can't load AARCH64-bit .so on a AMD 64-bit platform)
2023-05-31 14:47:32,625 DEBUG util.NativeCodeLoader: java.library.path=/home/saint/java/binaries/hadoop/hadoop-3.3.5/lib/native
2023-05-31 14:47:32,625 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-05-31 14:47:32,836 DEBUG util.Shell: setsid exited with exit code 0
Native library checking:
hadoop: false
zlib: false
zstd : false
bzip2: false
openssl: false
ISA-L: false
PMDK: false
2023-05-31 14:47:32,847 DEBUG util.ExitUtil: Exiting with status 1: ExitException
1: ExitException
at org.apache.hadoop.util.ExitUtil.terminate(ExitUtil.java:381)
at org.apache.hadoop.util.ExitUtil.terminate(ExitUtil.java:369)
at org.apache.hadoop.util.NativeLibraryChecker.main(NativeLibraryChecker.java:154)
2023-05-31 14:47:32,856 INFO util.ExitUtil: Exiting with status 1: ExitException
To determine the architecture for which the shared library was compiled, I started with the objdump -f command as suggested by a StackOverflow post. However, it outputs architecture: UNKNOWN!, which isn’t very useful. The file command from the same post proves to be exactly what I need.
saint@ubuntuvm:~/java/binaries/hadoop/aarch64/hadoop-3.3.5$ objdump -f lib/native/libhadoop.so
lib/native/libhadoop.so: file format elf64-little
architecture: UNKNOWN!, flags 0x00000150:
HAS_SYMS, DYNAMIC, D_PAGED
start address 0x0000000000005b80
saint@ubuntuvm:~/java/binaries/hadoop/aarch64/hadoop-3.3.5$ file lib/native/libhadoop.so
lib/native/libhadoop.so: symbolic link to libhadoop.so.1.0.0
saint@ubuntuvm:~/java/binaries/hadoop/aarch64/hadoop-3.3.5$ file lib/native/libhadoop.so.1.0.0
lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, ARM aarch64, version 1 (SYSV), dynamically linked, BuildID[sha1]=19fbe9b0a7449eb05b687721548251af752b869f, with debug_info, not stripped
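For the curious, the architecture can also be read straight out of the ELF header. This sketch (my own, with only the two machine values I care about hard-coded) prints the e_machine field that the file command uses to report the architecture:

#include <stdio.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <elf-file>\n", argv[0]);
        return 2;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) {
        perror("fopen");
        return 2;
    }

    /* The 16-bit e_machine field is at offset 18 of the ELF header
       (the same offset for 32-bit and 64-bit ELF files). */
    unsigned char hdr[20];
    size_t n = fread(hdr, 1, sizeof hdr, f);
    fclose(f);

    if (n != sizeof hdr || hdr[0] != 0x7F || hdr[1] != 'E' ||
        hdr[2] != 'L' || hdr[3] != 'F') {
        fprintf(stderr, "%s: not an ELF file\n", argv[1]);
        return 2;
    }

    /* hdr[5] is EI_DATA: 1 = little-endian, 2 = big-endian. */
    unsigned machine = (hdr[5] == 2) ? (unsigned)((hdr[18] << 8) | hdr[19])
                                     : (unsigned)((hdr[19] << 8) | hdr[18]);
    const char *name = (machine == 62) ? "x86-64"
                     : (machine == 183) ? "AArch64"
                     : "other";
    printf("%s: e_machine = %u (%s)\n", argv[1], machine, name);
    return 0;
}

For the aarch64 libhadoop.so.1.0.0 above, it should print e_machine = 183 (AArch64).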
Turns out I was using an x86-64 Ubuntu VM instead of the aarch64 Ubuntu VM I had created, so naturally, Hadoop couldn't load the aarch64 native library! For the VM I had been using, I needed the x86-64 Hadoop build instead.
Checking the loading status of the native libraries now indicates that the hadoop native library can be successfully loaded:
saint@ubuntuvm:~/java/binaries/hadoop/x64/hadoop-3.3.5$ bin/hadoop checknative -a
2023-05-31 14:58:40,869 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
2023-05-31 14:58:40,877 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2023-05-31 14:58:40,887 WARN erasurecode.ErasureCodeNative: Loading ISA-L failed: Failed to load libisal.so.2 (libisal.so.2: cannot open shared object file: No such file or directory)
2023-05-31 14:58:40,887 WARN erasurecode.ErasureCodeNative: ISA-L support is not available in your platform... using builtin-java codec where applicable
2023-05-31 14:58:41,035 INFO nativeio.NativeIO: The native code was built without PMDK support.
Native library checking:
hadoop: true /home/saint/java/binaries/hadoop/x64/hadoop-3.3.5/lib/native/libhadoop.so.1.0.0
zlib: true /lib/x86_64-linux-gnu/libz.so.1
zstd : true /lib/x86_64-linux-gnu/libzstd.so.1
bzip2: true /lib/x86_64-linux-gnu/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
ISA-L: false Loading ISA-L failed: Failed to load libisal.so.2 (libisal.so.2: cannot open shared object file: No such file or directory)
PMDK: false The native code was built without PMDK support.
2023-05-31 14:58:41,056 INFO util.ExitUtil: Exiting with status 1: ExitException
Switching to the aarch64 Ubuntu VM also showed the aarch64 Hadoop native library loading successfully on that platform. In hindsight, the i386 architecture references when I installed lib64z1 could have been a warning sign, had I not just been blasting my way through running these commands.
Building Firefox with debug symbols has been a rather tricky endeavor for me for quite some time. The documentation on how to do this seems to indicate that adding a couple of lines to your mozconfig file should do the trick. However, I have (for months now) been running into this make error whenever I try to build with debug symbols: no rule to make nspr4.pdb needed by export. Bug 338224 has a pending patch for this issue. In the meantime, the following trick seems to solve the problem for me: force the -Zi compiler option into the compiler flags variable. For the NSPR files, the PDB file will be generated as desired. For the other files, the -Zi option will be redundant, but this fix gets you up and running ASAP. Here's the associated patch file:
diff --git a/nsprpub/configure.in b/nsprpub/configure.in
--- a/nsprpub/configure.in
+++ b/nsprpub/configure.in
@@ -2827,18 +2827,18 @@ if test -n "$_SAVE_DEBUG_FLAGS"; then
fi
if test -n "$MOZ_OPTIMIZE"; then
CFLAGS="$CFLAGS $_OPTIMIZE_FLAGS"
CXXFLAGS="$CXXFLAGS $_OPTIMIZE_FLAGS"
fi
if test -n "$MOZ_DEBUG_SYMBOLS"; then
- CFLAGS="$CFLAGS $_DEBUG_FLAGS"
- CXXFLAGS="$CXXFLAGS $_DEBUG_FLAGS"
+ CFLAGS="$CFLAGS $_DEBUG_FLAGS -Zi"
+ CXXFLAGS="$CXXFLAGS $_DEBUG_FLAGS -Zi"
fi
if test -n "$MOZ_OPTIMIZE"; then
OBJDIR_TAG=_OPT
else
OBJDIR_TAG=_DBG
fi
Here is the mozconfig file I used with this patch: