Nice to see that they are working on an alternative to #infiniband for #ai #networks: #ultraethernet https://ultraethernet.org/
Eleven years ago I volunteered to add native #Infiniband / #RDMA support to #ZeroMQ. At the time I was working on high-performance networking and I thought it was a nice challenge... but shortly afterwards I landed my job at @mozilla and never finished it.
Since then I've been contacted multiple times by people who wished to finish my work but none succeeded. Last time was yesterday. Maybe I should give it a spin again: https://zeromq-dev.zeromq.narkive.com/a3hbU2H1/contributing-native-infiniband-rdma-support-to-0mq
The dates for the 2023 MVAPICH User Group (#MUG23) have been set as August 21-23 and, as usual, it will be held near OSU in Columbus. Despite its name, MUG is a great snapshot of the state of the practice in #InfiniBand in #HPC and #AI overall.
An interview with Microsoft’s Nidhi Chappell discusses the infrastructure used to run large scale AI models in Azure: https://www.nextplatform.com/2023/03/21/inside-the-infrastructure-that-microsoft-builds-to-run-ai/
#microsoft #ai #openai #chatgpt #gpu #nvidia #grace #mpi #infiniband
I took up the two networking functions with the least dependencies to refactor, and they are #InfiniBand related
And… I have no access to InfiniBand hardware, so this is turning out to be very difficult.
if you have a #FreeBSD machine with InfiniBand NICs, or any spare machine with InfiniBand that you'd like to (temporarily) donate to my effort, please reach out!
Doing this while reading tool and driver code, and trying to reconcile that with three (conflicting…) pastes, is proving rather difficult…
HPC/Red Hat System Administrator for the National Radio Astronomy Observatory. I have been into computers since I can remember, and I have finally found a job that I love and that lets me work with my hobby in a laid-back, chill, and interesting environment.
Questions regarding #Infiniband, #Redhat, #Torque, #MOAB, or anything else #HPC related please feel free to ask! I know I'll be posting some here eventually! :redhatalt:
#introduction #infiniband #redhat #torque #moab #hpc
I really wish #AMD would spend some effort on #interconnects. It's hackish & scary & unideal but put #Thunderbolt onboard. Put 100gbit or #infiniband onboard. But mostly, please bring #GenZ or #CCIX or #OpenCAPI or any standard to market. This would be such a delight to see from AMD. My heart would leap with the beautiful capabilities & futures being opened.
#opencapi #ccix #genz #infiniband #thunderbolt #interconnects #amd
the hope is that in the future, all these dedicated processor interconnects can be outmoded via some gobs of #GenZ or #OpenCAPI or #CCIX (or #HyperTransport 4 #HTX - #InfinityFabric released - make my day AMD!) that allow for more ad-hoc disaggregation & configurations. i don't even really want coherency, just some vaguely #infiniband like #rdma across remote chips, spread out across a board or larger chassis.
#rdma #infiniband #infinityfabric #htx #hypertransport #ccix #opencapi #genz