
{2023.06}[2023a,sapphire_rapids] Remaining apps from EB 4.9.1 2023a easystack #895

Merged

Conversation

bedroge
Collaborator

@bedroge bedroge commented Jan 25, 2025

No description provided.

@bedroge bedroge added 2023.06-software.eessi.io 2023.06 version of software.eessi.io sapphire_rapids labels Jan 25, 2025

eessi-bot bot commented Jan 25, 2025

Instance eessi-bot-mc-aws is configured to build for:

  • architectures: x86_64/generic, x86_64/intel/haswell, x86_64/intel/sapphire_rapids, x86_64/intel/skylake_avx512, x86_64/amd/zen2, x86_64/amd/zen3, aarch64/generic, aarch64/neoverse_n1, aarch64/neoverse_v1
  • repositories: eessi.io-2023.06-software, eessi.io-2023.06-compat


eessi-bot bot commented Jan 25, 2025

Instance eessi-bot-mc-azure is configured to build for:

  • architectures: x86_64/amd/zen4
  • repositories: eessi.io-2023.06-software, eessi.io-2023.06-compat

@bedroge
Collaborator Author

bedroge commented Jan 25, 2025

bot: build repo:eessi.io-2023.06-software arch:x86_64/intel/sapphire_rapids


eessi-bot bot commented Jan 25, 2025

Updates by the bot instance eessi-bot-mc-aws


eessi-bot bot commented Jan 25, 2025

Updates by the bot instance eessi-bot-mc-azure
  • received bot command build repo:eessi.io-2023.06-software arch:x86_64/intel/sapphire_rapids from bedroge
    • expanded format: build repository:eessi.io-2023.06-software architecture:x86_64/intel/sapphire_rapids
  • handling command build repository:eessi.io-2023.06-software architecture:x86_64/intel/sapphire_rapids resulted in:
    • no jobs were submitted


eessi-bot bot commented Jan 25, 2025

New job on instance eessi-bot-mc-aws for CPU micro-architecture x86_64-intel-sapphire_rapids for repository eessi.io-2023.06-software in job dir /project/def-users/SHARED/jobs/2025.01/pr_895/42607

date                     | job status | comment
Jan 25 22:50:12 UTC 2025 | submitted  | job id 42607 awaits release by job manager
Jan 25 22:50:26 UTC 2025 | released   | job awaits launch by Slurm scheduler
Jan 25 22:55:28 UTC 2025 | running    | job 42607 is running
Jan 26 13:28:26 UTC 2025 | finished   |
😁 SUCCESS
Details
✅ job output file slurm-42607.out
✅ no message matching FATAL:
✅ no message matching ERROR:
✅ no message matching FAILED:
✅ no message matching required modules missing:
✅ found message(s) matching No missing installations
✅ found message matching .tar.gz created!
Artefacts
eessi-2023.06-software-linux-x86_64-intel-sapphire_rapids-1737895229.tar.gz
size: 8827 MiB (9256482271 bytes)
entries: 50804
modules under 2023.06/software/linux/x86_64/intel/sapphire_rapids/modules/all
Arrow/14.0.1-gfbf-2023a.lua
arrow-R/14.0.1-foss-2023a-R-4.3.2.lua
Biopython/1.83-foss-2023a.lua
BLAST+/2.14.1-gompi-2023a.lua
cimfomfa/22.273-GCCcore-12.3.0.lua
cpio/2.15-GCCcore-12.3.0.lua
DIAMOND/2.1.8-GCC-12.3.0.lua
ESPResSo/4.2.2-foss-2023a.lua
FastME/2.1.6.3-GCC-12.3.0.lua
GATK/4.5.0.0-GCCcore-12.3.0-Java-17.lua
ipympl/0.9.3-gfbf-2023a.lua
ISA-L/2.30.0-GCCcore-12.3.0.lua
Java/17.0.6.lua
Java/.modulerc.lua
LMDB/0.9.31-GCCcore-12.3.0.lua
MCL/22.282-GCCcore-12.3.0.lua
MMseqs2/14-7e284-gompi-2023a.lua
ncdu/1.18-GCC-12.3.0.lua
OrthoFinder/2.5.5-foss-2023a.lua
pyfaidx/0.8.1.1-GCCcore-12.3.0.lua
Pysam/0.22.0-GCC-12.3.0.lua
python-isal/1.1.0-GCCcore-12.3.0.lua
RapidJSON/1.1.0-20230928-GCCcore-12.3.0.lua
R-bundle-Bioconductor/3.18-foss-2023a-R-4.3.2.lua
SAMtools/1.18-GCC-12.3.0.lua
utf8proc/2.8.0-GCCcore-12.3.0.lua
Valgrind/3.21.0-gompi-2023a.lua
WhatsHap/2.2-foss-2023a.lua
software under 2023.06/software/linux/x86_64/intel/sapphire_rapids/software
Arrow/14.0.1-gfbf-2023a
arrow-R/14.0.1-foss-2023a-R-4.3.2
Biopython/1.83-foss-2023a
BLAST+/2.14.1-gompi-2023a
cimfomfa/22.273-GCCcore-12.3.0
cpio/2.15-GCCcore-12.3.0
DIAMOND/2.1.8-GCC-12.3.0
ESPResSo/4.2.2-foss-2023a
FastME/2.1.6.3-GCC-12.3.0
GATK/4.5.0.0-GCCcore-12.3.0-Java-17
ipympl/0.9.3-gfbf-2023a
ISA-L/2.30.0-GCCcore-12.3.0
Java/17.0.6
LMDB/0.9.31-GCCcore-12.3.0
MCL/22.282-GCCcore-12.3.0
MMseqs2/14-7e284-gompi-2023a
ncdu/1.18-GCC-12.3.0
OrthoFinder/2.5.5-foss-2023a
pyfaidx/0.8.1.1-GCCcore-12.3.0
Pysam/0.22.0-GCC-12.3.0
python-isal/1.1.0-GCCcore-12.3.0
RapidJSON/1.1.0-20230928-GCCcore-12.3.0
R-bundle-Bioconductor/3.18-foss-2023a-R-4.3.2
SAMtools/1.18-GCC-12.3.0
utf8proc/2.8.0-GCCcore-12.3.0
Valgrind/3.21.0-gompi-2023a
WhatsHap/2.2-foss-2023a
other under 2023.06/software/linux/x86_64/intel/sapphire_rapids
no other files in tarball
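The artefact summary above (entry count plus module and software listings) can be reproduced locally with standard `tar` commands. A minimal sketch; since the real ~8.8 GiB tarball is not assumed to be at hand, a small dummy archive with the same directory layout is created on the fly for illustration:

```shell
# Sketch: inspect an EESSI software tarball the way the bot's artefact
# summary does. The dummy archive below stands in for the real tarball.
workdir=$(mktemp -d)
moddir="$workdir/2023.06/software/linux/x86_64/intel/sapphire_rapids/modules/all/SAMtools"
mkdir -p "$moddir"
touch "$moddir/1.18-GCC-12.3.0.lua"
tarball="$workdir/example.tar.gz"
tar -czf "$tarball" -C "$workdir" 2023.06

# entry count, as in the "entries:" line of the summary
listing=$(tar -tzf "$tarball")
echo "$listing" | wc -l
# module files under modules/all, as in the "modules under ..." listing
echo "$listing" | grep '\.lua$'
```

The same two commands, pointed at the real `eessi-2023.06-software-linux-x86_64-intel-sapphire_rapids-1737895229.tar.gz`, would reproduce the counts and module list shown above.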
Jan 26 13:28:26 UTC 2025 test result
😁 SUCCESS
ReFrame Summary
[ OK ] ( 1/15) EESSI_ESPRESSO_LJ_PARTICLES %module_name=ESPResSo/4.2.2-foss-2023b %device_type=cpu %scale=1_node /adc964f6 @BotBuildTests:x86-64-intel-srapids-node+default
P: perf: 0.001297 s/step (r:0, l:None, u:None)
[ OK ] ( 2/15) EESSI_ESPRESSO_LJ_PARTICLES %module_name=ESPResSo/4.2.2-foss-2023a %device_type=cpu %scale=1_node /3b8b8926 @BotBuildTests:x86-64-intel-srapids-node+default
P: perf: 0.001279 s/step (r:0, l:None, u:None)
[ OK ] ( 3/15) EESSI_ESPRESSO_LJ_PARTICLES %module_name=ESPResSo/4.2.1-foss-2023a %device_type=cpu %scale=1_node /fdd6aced @BotBuildTests:x86-64-intel-srapids-node+default
P: perf: 0.001286 s/step (r:0, l:None, u:None)
[ OK ] ( 4/15) EESSI_ESPRESSO_P3M_IONIC_CRYSTALS %module_name=ESPResSo/4.2.2-foss-2023b %device_type=cpu %scale=1_node /3b90ffd6 @BotBuildTests:x86-64-intel-srapids-node+default
P: perf: 0.2595 s/step (r:0, l:None, u:None)
[ OK ] ( 5/15) EESSI_ESPRESSO_P3M_IONIC_CRYSTALS %module_name=ESPResSo/4.2.2-foss-2023a %device_type=cpu %scale=1_node /f1621689 @BotBuildTests:x86-64-intel-srapids-node+default
P: perf: 0.2562 s/step (r:0, l:None, u:None)
[ OK ] ( 6/15) EESSI_ESPRESSO_P3M_IONIC_CRYSTALS %module_name=ESPResSo/4.2.1-foss-2023a %device_type=cpu %scale=1_node /4a9e965c @BotBuildTests:x86-64-intel-srapids-node+default
P: perf: 0.2518 s/step (r:0, l:None, u:None)
[ OK ] ( 7/15) EESSI_LAMMPS_lj %device_type=cpu %module_name=LAMMPS/2Aug2023_update2-foss-2023a-kokkos %scale=1_node /04ff9ece @BotBuildTests:x86-64-intel-srapids-node+default
P: perf: 662.055 timesteps/s (r:0, l:None, u:None)
[ OK ] ( 8/15) EESSI_OSU_coll %benchmark_info=mpi.collective.osu_allreduce %module_name=OSU-Micro-Benchmarks/7.2-gompi-2023b %scale=1_node %device_type=cpu /775175bf @BotBuildTests:x86-64-intel-srapids-node+default
P: latency: 1.88 us (r:0, l:None, u:None)
[ OK ] ( 9/15) EESSI_OSU_coll %benchmark_info=mpi.collective.osu_allreduce %module_name=OSU-Micro-Benchmarks/7.1-1-gompi-2023a %scale=1_node %device_type=cpu /52707c40 @BotBuildTests:x86-64-intel-srapids-node+default
P: latency: 1.65 us (r:0, l:None, u:None)
[ OK ] (10/15) EESSI_OSU_coll %benchmark_info=mpi.collective.osu_alltoall %module_name=OSU-Micro-Benchmarks/7.2-gompi-2023b %scale=1_node %device_type=cpu /b1aacda9 @BotBuildTests:x86-64-intel-srapids-node+default
P: latency: 3.7 us (r:0, l:None, u:None)
[ OK ] (11/15) EESSI_OSU_coll %benchmark_info=mpi.collective.osu_alltoall %module_name=OSU-Micro-Benchmarks/7.1-1-gompi-2023a %scale=1_node %device_type=cpu /c6bad193 @BotBuildTests:x86-64-intel-srapids-node+default
P: latency: 3.88 us (r:0, l:None, u:None)
[ OK ] (12/15) EESSI_OSU_pt2pt_CPU %benchmark_info=mpi.pt2pt.osu_latency %module_name=OSU-Micro-Benchmarks/7.2-gompi-2023b %scale=1_node /15cad6c4 @BotBuildTests:x86-64-intel-srapids-node+default
P: latency: 0.4 us (r:0, l:None, u:None)
[ OK ] (13/15) EESSI_OSU_pt2pt_CPU %benchmark_info=mpi.pt2pt.osu_latency %module_name=OSU-Micro-Benchmarks/7.1-1-gompi-2023a %scale=1_node /6672deda @BotBuildTests:x86-64-intel-srapids-node+default
P: latency: 0.37 us (r:0, l:None, u:None)
[ OK ] (14/15) EESSI_OSU_pt2pt_CPU %benchmark_info=mpi.pt2pt.osu_bw %module_name=OSU-Micro-Benchmarks/7.2-gompi-2023b %scale=1_node /2a9a47b1 @BotBuildTests:x86-64-intel-srapids-node+default
P: bandwidth: 13070.33 MB/s (r:0, l:None, u:None)
[ OK ] (15/15) EESSI_OSU_pt2pt_CPU %benchmark_info=mpi.pt2pt.osu_bw %module_name=OSU-Micro-Benchmarks/7.1-1-gompi-2023a %scale=1_node /1b24ab8e @BotBuildTests:x86-64-intel-srapids-node+default
P: bandwidth: 13513.97 MB/s (r:0, l:None, u:None)
[ PASSED ] Ran 15/15 test case(s) from 15 check(s) (0 failure(s), 0 skipped, 0 aborted)
Details
✅ job output file slurm-42607.out
✅ no message matching ERROR:
✅ no message matching [\s*FAILED\s*].*Ran .* test case
Jan 26 15:51:14 UTC 2025 uploaded transfer of eessi-2023.06-software-linux-x86_64-intel-sapphire_rapids-1737895229.tar.gz to S3 bucket succeeded
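The ✅ checks above amount to pattern scans over the job output file: failure patterns must be absent and success markers must be present. A minimal sketch of equivalent `grep` checks; the log content and file name here are fabricated for illustration, and the exact patterns the bot uses are taken from the check list above:

```shell
# Sketch: emulate the bot's success checks on a (fabricated) job log.
log=$(mktemp)
cat > "$log" <<'EOF'
== building SAMtools/1.18-GCC-12.3.0 ...
No missing installations
eessi-2023.06-software-example.tar.gz created!
EOF

status=PASS
# failure patterns: any match fails the job
for bad in 'FATAL:' 'ERROR:' 'FAILED:' 'required modules missing:'; do
    grep -q "$bad" "$log" && status=FAIL
done
# success markers: each must be found
for good in 'No missing installations' '\.tar\.gz created!'; do
    grep -q "$good" "$log" || status=FAIL
done
echo "$status"   # prints PASS for the log above
rm -f "$log"
```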

@bedroge bedroge changed the title {2023.06} Remaining apps from EB 4.9.1 2023a easystack {2023.06}[2023a,sapphire_rapids] Remaining apps from EB 4.9.1 2023a easystack Jan 25, 2025
@bedroge bedroge added the ready-to-deploy Mark a PR as ready to deploy label Jan 26, 2025
@boegel boegel added bot:deploy Ask bot to deploy missing software installations to EESSI and removed ready-to-deploy Mark a PR as ready to deploy labels Jan 26, 2025
@boegel
Contributor

boegel commented Jan 26, 2025

tarball deploy triggered, so merging this...

@boegel boegel merged commit ebe21c1 into EESSI:2023.06-software.eessi.io Jan 26, 2025
49 checks passed

eessi-bot bot commented Jan 26, 2025

PR merged! Moved ['/project/def-users/SHARED/jobs/2025.01/pr_895/42607'] to /project/def-users/SHARED/trash_bin/EESSI/software-layer/2025.01.26


eessi-bot bot commented Jan 26, 2025

PR merged! Moved [] to /project/def-users/SHARED/trash_bin/EESSI/software-layer/2025.01.26

@bedroge bedroge deleted the sapphire_rapids_eb491_2023a branch January 26, 2025 16:43