Apr 24, 2024
The basis of the universe may not be energy or matter but information
Posted by Dan Breeden in categories: particle physics, supercomputing
In this radical view, the universe is a giant supercomputer processing particles as bits.
NVIDIA is all set to aid Japan in building the nation’s hybrid quantum supercomputer, fueled by the immense power of its HPC & AI GPUs.
Japan To Rapidly Progress In Quantum and AI Computing Segments Through Large-Scale Developments With The Help of NVIDIA’s AI & HPC Infrastructure
Nikkei Asia reports that Japan’s National Institute of Advanced Industrial Science and Technology (AIST) is building a quantum supercomputer to excel in this segment. The new project, called ABCI-Q, will be entirely powered by NVIDIA’s accelerated and quantum computing platforms, pointing toward high performance and efficiency. The Japanese supercomputer will be built in collaboration with Fujitsu as well.
Tesla’s Dojo supercomputer represents a significant investment and commitment to innovation in the field of AI computation, positioning Tesla as a key player in shaping the future of neural net hardware.
An AI-driven supercomputer dubbed Earth’s ‘digital twin’ could help us avoid the worst impacts of climate catastrophes headed our way.
Today is the ribbon-cutting ceremony for the “Venado” supercomputer. The system was first hinted at back in April 2021, when Nvidia announced plans for its first datacenter-class Arm server CPU, and was discussed in some detail (though not really enough to suit our taste for speeds and feeds) back in May 2022 by the folks at Los Alamos National Laboratory, where Venado is situated.
Now we can finally get more details on the Venado system and a little more insight into how Los Alamos will put it to work, and more specifically, why a better balance between memory bandwidth and the compute that depends upon it is perhaps more important to this lab than it is at other HPC centers of the world.
Los Alamos was founded back in 1943 as the home of the Manhattan Project that created the world’s first nuclear weapons. We did not have supercomputers back then, of course, but plenty of very complex calculations have always been done at Los Alamos; sometimes by hand, sometimes by tabulators from IBM that used punch cards to store and manipulate data – an early form of simulation. The first digital computer to do such calculations at Los Alamos was called MANIAC and was installed in 1952; it could perform 10,000 operations per second and ran Monte Carlo simulations, which use randomness to simulate what are actually deterministic processes.
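The Monte Carlo idea that MANIAC pioneered can be illustrated with a toy example: estimating π by throwing random points at a unit square and counting how many land inside the quarter circle. This is a modern sketch of the technique, not the 1952 code:

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling random points in the unit square and
    counting the fraction that fall inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi/4
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # converges toward 3.14159... as n grows
```

The estimate's error shrinks like 1/√n, which is why such methods thrive on machines that can grind through enormous sample counts.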
Microsoft is reportedly planning to build a $100 billion data center and supercomputer, called “Stargate,” for OpenAI.
A combination of advances in magnetic resonance imaging to help track the movement of fluids in the brain and supercomputer-powered simulations are modifying our understanding of cognitive decline.
The world’s first supercomputer capable of simulating networks at the scale of the human brain has been announced by researchers at the International Centre for Neuromorphic Systems (ICNS) at Western Sydney University.
DeepSouth uses a neuromorphic system that mimics biological processes, using hardware to efficiently emulate large networks of spiking neurons at 228 trillion synaptic operations per second, rivalling the estimated rate of operations in the human brain.
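A minimal sketch of the kind of spiking-neuron model such hardware emulates is the leaky integrate-and-fire neuron: membrane voltage leaks toward rest, integrates input, and emits a spike when it crosses a threshold. The parameters below are illustrative, not DeepSouth's:

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron. Returns the time steps at
    which the membrane voltage crossed threshold (spike times)."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Euler step of dv/dt = (v_rest - v)/tau + i_in
        v += dt * ((v_rest - v) / tau + i_in)
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset  # fire and reset
    return spikes

# A constant drive above threshold produces a regular spike train.
print(simulate_lif([0.08] * 100))
```

Neuromorphic chips implement dynamics like these directly in silicon, which is why they can be far more energy-efficient for spiking networks than conventional processors.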
Ching-Yao Tang and Ke-Jung Chen used a powerful supercomputer at Lawrence Berkeley National Laboratory to create the world’s first high-resolution 3D hydrodynamic simulations of turbulent star-forming clouds for the first stars. Their results indicate that supersonic turbulence effectively fragments the star-forming clouds into several clumps, each with dense cores ranging from 22 to 175 solar masses, destined to form first stars of about 8 to 58 solar masses, in good agreement with observations.
Furthermore, when the turbulence is weak or unresolved, the simulations reproduce results similar to previous work. This highlights, for the first time, the importance of turbulence in first-star formation and offers a promising pathway to lowering the theoretical mass scale of the first stars. It successfully reconciles the mass discrepancy between simulations and observations, providing a strong theoretical foundation for first-star formation.
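For a sense of the mass scale that fragmentation acts on, the classical Jeans mass (the largest cloud mass that thermal pressure can hold up against gravity) can be estimated from temperature and density alone. The sketch below uses illustrative primordial-cloud values (T ≈ 200 K, n ≈ 10⁴ cm⁻³), which are assumptions for demonstration, not figures from the study:

```python
import math

# Physical constants (SI)
K_B = 1.380649e-23   # Boltzmann constant, J/K
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_H = 1.6735e-27     # hydrogen atom mass, kg
M_SUN = 1.989e30     # solar mass, kg

def jeans_mass(T, n_cm3, mu=1.22):
    """Classical Jeans mass:
    M_J = (5 k_B T / (G mu m_H))^(3/2) * (3 / (4 pi rho))^(1/2)
    for gas at temperature T (K) and number density n_cm3 (cm^-3).
    mu=1.22 is the mean molecular weight of neutral primordial gas."""
    rho = mu * M_H * n_cm3 * 1e6  # cm^-3 -> m^-3, then to kg/m^3
    return ((5 * K_B * T / (G * mu * M_H)) ** 1.5
            * math.sqrt(3 / (4 * math.pi * rho)))

# Illustrative conditions give a Jeans mass of order 1000 solar masses;
# turbulent fragmentation breaks clouds into much smaller clumps.
print(jeans_mass(200, 1e4) / M_SUN)
```

The gap between this thermal-only scale and the 22–175 solar-mass cores reported above illustrates why resolving supersonic turbulence lowers the characteristic mass of the first stars.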