Hi,
On Wed, 14 Jun 2023 at 18:00, Ludovic Courtès <ludo@gnu.org> wrote:
> Cc’in Efraim, Simon, and Nicolas who’ve looked into Julia packaging in
> the past. Hopefully we can get inspiration from Arch’s build recipe!
Hmm, the differences seem to be:
USE_SYSTEM_DSFMT=0
USE_SYSTEM_LIBUV=0
which are set to 1 in our Guix recipe; I guess the difference does not come
from that.  And we link against OpenBLAS while they link against some Netlib
BLAS; I guess it does not come from that either.  Well, the other difference
could be “make release”, which we do not run.  Maybe?
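(To check that hypothesis, one could presumably rebuild upstream’s sources
with Arch’s settings in a Make.user plus the “release” target, roughly:

$ cat > Make.user <<EOF
USE_SYSTEM_DSFMT=0
USE_SYSTEM_LIBUV=0
EOF
$ make -j$(nproc) release

and then compare the startup times.  Untested, just a sketch of where to
start.)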
Somehow, it seems to come from the ability to exploit multiple cores, IIUC.
Using the binaries generated by upstream:
$ ldd julia-1.9.1/bin/julia
linux-vdso.so.1 (0x00007fffd83f1000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f08fb274000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f08fb251000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f08fb05f000)
libjulia.so.1 => /tmp/julia-1.9.1/bin/../lib/libjulia.so.1 (0x00007f08fb03c000)
/lib64/ld-linux-x86-64.so.2 (0x00007f08fb28e000)
$ ldd julia-1.6.7/bin/julia
linux-vdso.so.1 (0x00007fffcdbd7000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f822423f000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f822421c000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f822402a000)
libjulia.so.1 => /tmp/julia-1.6.7/bin/../lib/libjulia.so.1 (0x00007f8223e04000)
/lib64/ld-linux-x86-64.so.2 (0x00007f8224259000)
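For comparison, the Guix-built binary can be inspected the same way;
something like:

$ ldd $(guix build julia)/bin/julia

should show which of these libraries come from separate store items rather
than being bundled next to the binary.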
And please note that it also bundles all these shared libraries:
libamd.so -> libamd.so.2.4.6
libamd.so.2 -> libamd.so.2.4.6
libamd.so.2.4.6
libatomic.so -> libatomic.so.1.2.0
libatomic.so.1 -> libatomic.so.1.2.0
libatomic.so.1.2.0
libblastrampoline.so -> libblastrampoline.so.5
libblastrampoline.so.5
libblastrampoline.so.5.4.0 -> libblastrampoline.so.5
[...]
libsuitesparseconfig.so -> libsuitesparseconfig.so.5.10.1
libsuitesparseconfig.so.5 -> libsuitesparseconfig.so.5.10.1
libsuitesparseconfig.so.5.10.1
libumfpack.so -> libumfpack.so.5.7.9
libumfpack.so.5 -> libumfpack.so.5.7.9
libumfpack.so.5.7.9
libunwind.so -> libunwind.so.8.0.1
libunwind.so.8 -> libunwind.so.8.0.1
libunwind.so.8.0.1
libuv.so -> libuv.so.2.0.0
libuv.so.2 -> libuv.so.2.0.0
libuv.so.2.0.0
libz.so -> libz.so.1
libz.so.1 -> libz.so.1.2.13
libz.so.1.2.13
sys.so
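(This listing presumably corresponds to the lib/julia/ directory of the
upstream tarball; something like

$ ls -l julia-1.9.1/lib/julia/

should reproduce it, symlinks included.)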
I get these times:
+ ~7ms for v1.9.1
+ ~18ms for v1.6.7
compared to ~500ms for the v1.8.3 provided by Guix.
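For reproduction, such startup times can be measured with something along
these lines, run a few times to warm the cache (the exact invocation is only
a sketch):

$ time ./julia-1.9.1/bin/julia --startup-file=no -e '1+1'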
I guess the issue is about “threading”: most of the time is spent in
‘futex’ in the Guix version.
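The summaries below are strace’s “-c” per-syscall accounting; a command along
these lines should reproduce them, though the exact invocation is a guess:

$ strace -f -c -o v1.9.1.txt ./julia-1.9.1/bin/julia -e '1+1'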
$ head v1.9.1.txt
% time     seconds  usecs/call     calls    errors syscall
------ ----------- ----------- --------- --------- ----------------
 83.48    5.682198           6    926675           sched_yield
 14.87    1.011882         987      1025           futex
  0.42    0.028269           2     12308           rt_sigprocmask
  0.35    0.023628           9      2592           madvise
  0.18    0.012532        6266         2           wait4
  0.18    0.012353          10      1227           epoll_wait
  0.17    0.011707           2      5015        13 read
  0.09    0.006235          13       448           brk
$ head v1.6.7.txt
245193 ????( <detached ...>
% time     seconds  usecs/call     calls    errors syscall
------ ----------- ----------- --------- --------- ----------------
 86.41    5.870043           5   1008236           sched_yield
 12.85    0.872865         467      1869           futex
  0.26    0.017538           3      5165           madvise
  0.09    0.006171           2      2173        12 read
  0.07    0.004486          13       321           brk
  0.06    0.004242           2      1772           rt_sigprocmask
  0.05    0.003197           2      1554       456 statx
$ head vguix.txt
% time     seconds  usecs/call     calls    errors syscall
------ ----------- ----------- --------- --------- ----------------
 88.96    7.843293        4621      1697         2 futex
  6.88    0.606245          31     19080           sched_yield
  1.74    0.153092      153092         1         1 rt_sigtimedwait
  0.48    0.041975           1     26317         1 read
  0.36    0.032148           1     27602           rt_sigprocmask
  0.34    0.030236       10078         3         1 wait4
  0.33    0.028833           1     20780           mincore
  0.33    0.028801           1     22424           write
Hmm, I do not know… It needs some investigation.
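One way to test the threading hypothesis could be to pin the thread counts
and see whether the ‘futex’/‘sched_yield’ time goes away, e.g.:

$ JULIA_NUM_THREADS=1 OPENBLAS_NUM_THREADS=1 julia -e 'println(Threads.nthreads())'

(just a guess at where to look, not a diagnosis).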
Thanks for the report.
Cheers,
simon