
Supercomputer uses machine learning to set new speed record

Give people a barrier, and at some point they are bound to smash through. Chuck Yeager broke the sound barrier in 1947. Yuri Gagarin burst into orbit for the first manned spaceflight in 1961. The Human Genome Project finished cracking the genetic code in 2003. And we can add one more barrier to humanity’s trophy case: the exascale barrier.

The exascale barrier represents the challenge of achieving exascale-level computing, long considered the benchmark for high-performance computing. To reach that level, a computer needs to perform a quintillion (10^18) calculations per second. You can think of a quintillion as a million trillion, a billion billion, or a million million millions. Whichever you choose, it’s an incomprehensibly large number of calculations.
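To get a feel for the scale, here is a back-of-the-envelope sketch in Python (the laptop figure is an assumption; sustained throughput varies widely by hardware):

```python
# Rough scale comparison: one exascale-second vs. an ordinary laptop.
# Assumes a laptop sustains ~100 gigaflops (1e11 flops) -- an assumption,
# not a measured figure.

EXAFLOP = 1e18        # a quintillion calculations per second
LAPTOP_FLOPS = 1e11   # assumed sustained laptop throughput

seconds = EXAFLOP / LAPTOP_FLOPS   # laptop time to match one exascale-second
days = seconds / 86_400

print(f"{seconds:,.0f} s = {days:,.0f} days")
# -> 10,000,000 s, about 116 days of laptop time
#    for one second of exascale work
```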

On May 27, 2022, Frontier, a supercomputer at the Department of Energy’s Oak Ridge National Laboratory, managed the feat. It performed 1.1 quintillion calculations per second to become the fastest computer in the world.

Google Scientists Discovered 380,000 New Materials Using Artificial Intelligence

New advancements in technology frequently necessitate the development of novel materials – and thanks to supercomputers and advanced simulations, researchers can bypass the time-consuming and often inefficient process of trial-and-error.

The Materials Project, an open-access database founded at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) in 2011, computes the properties of both known and predicted materials. Researchers can focus on promising materials for future technologies – think lighter alloys that improve fuel economy in cars, more efficient solar cells to boost renewable energy, or faster transistors for the next generation of computers.
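As a sketch of how a researcher might pull computed properties from the database (illustrative only: this assumes pymatgen's MPRester client and a free Materials Project API key, and method names can differ between client versions):

```python
# Hypothetical query against the Materials Project via pymatgen.
# Requires `pip install pymatgen` and an API key from materialsproject.org.
from pymatgen.ext.matproj import MPRester

with MPRester("YOUR_API_KEY") as mpr:
    # Fetch the computed crystal structure for silicon (mp-149).
    structure = mpr.get_structure_by_material_id("mp-149")
    print(structure.composition.reduced_formula)
    print(f"cell volume: {structure.volume:.1f} cubic angstroms")
```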

Artificial Intelligence Electricity Use Is In The Crosshairs

Artificial intelligence has progressed from sci-fi fantasy to mainstream reality. AI now powers online tools from search engines to voice assistants, and it is used in everything from medical imaging analysis to autonomous vehicles. But the advance of AI will soon collide with another pressing issue: energy consumption.

Much like cryptocurrencies today, AI risks becoming a target for criticism and regulation because of its large electricity appetite. Partisans are forming camps, with AI optimists extolling continued progress through more compute power and pessimists beginning to portray AI’s power usage as wasteful and even dangerous. The attacks echo those leveled at crypto mining in recent years. Undoubtedly, there will be further efforts to choke off AI innovation by cutting its energy supply.

The pessimists raise some valid points. Developing ever-more capable AI does require vast computing resources. For example, the compute used to train OpenAI’s GPT-3 reportedly equaled 800 petaflops of processing power—on par with the 20 most powerful supercomputers in the world combined. Similarly, ChatGPT receives somewhere on the order of hundreds of millions of queries each day. Estimates suggest that answering all of those queries requires around 1 GWh of electricity daily, roughly the daily energy consumption of about 33,000 U.S. households. Demand is expected to increase further.
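That household comparison holds up as simple arithmetic (a sanity check on the estimate, not an independent measurement):

```python
# Sanity check: 1 GWh per day spread over 33,000 households.
daily_energy_wh = 1e9    # 1 GWh in watt-hours
households = 33_000

kwh_per_household = daily_energy_wh / households / 1000
print(f"{kwh_per_household:.1f} kWh per household per day")
# -> ~30.3 kWh/day, close to the ~29 kWh average daily consumption
#    of a U.S. household
```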

Gravitas | Artificial Intelligence discovers material to cut Lithium use | WION

In a significant breakthrough, Microsoft and the Pacific Northwest National Laboratory have utilised artificial intelligence and supercomputing to discover a new material that could dramatically reduce lithium use in batteries by up to 70%. This discovery, potentially revolutionising the battery industry, was achieved by narrowing down from 32 million inorganic materials to 18 candidates in just a week, a process that could have taken over 20 years traditionally.
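The workflow behind numbers like that is a screening funnel: cheap machine-learned filters prune millions of candidates so that expensive physics simulations only run on the survivors. Here is a toy sketch of the pattern (every function below is a hypothetical stand-in, not Microsoft’s or PNNL’s actual pipeline):

```python
import random

# Hypothetical stand-ins for trained ML models and simulation codes;
# they return random scores here so the sketch runs end to end.
def predict_stability(m):    return random.random()
def predict_conductivity(m): return random.random()
def run_expensive_sim(m):    return random.random() > 0.5  # "is promising"

def screen(candidates):
    # Stage 1: fast ML stability filter (fractions of a second each).
    survivors = [m for m in candidates if predict_stability(m) > 0.999]
    # Stage 2: ML screen for a target property (e.g., ionic conductivity).
    survivors = [m for m in survivors if predict_conductivity(m) > 0.99]
    # Stage 3: costly first-principles checks on the few that remain.
    return [m for m in survivors if run_expensive_sim(m)]

pool = range(1_000_000)  # stand-in for the 32 million inorganic materials
print(len(screen(pool)), "candidates survive the funnel")
```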


Lawrence Berkeley Lab Researchers Optimize Higher Density Copper Doping to Make LK99 Variant into a Superconductor

Lawrence Berkeley National Lab researchers use computational methods to describe an approach for optimizing the LK99 material as a superconductor.

Some will say: why is Nextbigfuture still covering LK99? Didn’t some angry scientists declare that LK99 was not a superconductor? I have been covering science for over 20 years, and there are always angry scientists who insist that many things will not work. Scientists who go into experiments looking to debunk something will not be the ones who figure out how to make it work.

Lawrence Berkeley National Lab researchers spent time on supercomputers trying to figure out how to make LK99 work. Their computational work is showing promise.

Cyborg computer combining AI and human brain cells really works

A new biohybrid computer combining a “brain organoid” and a traditional AI was able to perform a speech recognition task with 78% accuracy — demonstrating the potential for human biology to one day boost our computing capabilities.

The background: The human brain is the most energy-efficient “computer” on Earth — while a supercomputer needs 20 megawatts of power to process more than a quintillion calculations per second, your brain can do the equivalent with just 20 watts (a megawatt is 1 million watts).

This has given researchers the idea to try boosting computers by combining them with a three-dimensional clump of lab-grown human brain cells, known as a brain organoid.
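In raw numbers, that is a million-fold efficiency gap:

```python
# The power gap implied by the figures above.
supercomputer_watts = 20e6  # 20 megawatts
brain_watts = 20

print(f"~{supercomputer_watts / brain_watts:,.0f}x")  # ~1,000,000x
# For (very roughly) comparable throughput, the brain uses
# a millionth of the power.
```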

Google Addresses the Mysteries of Its Hypercomputer

When Google launched its Hypercomputer earlier this month (December 2023), the first reaction was, “Say what?” It turns out that the Hypercomputer is Google’s take on a modular supercomputer with a healthy dose of its homegrown TPU v5p AI accelerators, which were also announced this month.

The modular design also allows workloads to be sliced up between TPUs and GPUs, with Google’s software tools doing the provisioning and orchestration in the background. Theoretically, if Google were to add a quantum computer to the Google Cloud, it could also be plugged into the Hypercomputer.

While the Hypercomputer was advertised as an AI supercomputer, the good news is that the system also runs scientific computing applications.

Neutron Stars’ Inner Mysteries: A Glimpse Into Quark-Matter Cores

New theoretical analysis places the likelihood of massive neutron stars hiding cores of deconfined quark matter between 80 and 90 percent. The result was reached through massive supercomputer runs utilizing Bayesian statistical inference.

Neutron star cores contain matter at the highest densities reached in our present-day Universe, with as much as two solar masses of matter compressed inside a sphere of 25 km in diameter. These astrophysical objects can indeed be thought of as giant atomic nuclei, with gravity compressing their cores to densities exceeding those of individual protons and neutrons manyfold.

These densities make neutron stars interesting astrophysical objects from the point of view of particle and nuclear physics. A longstanding open problem concerns whether the immense central pressure of neutron stars can compress protons and neutrons into a new phase of matter, known as cold quark matter. In this exotic state of matter, individual protons and neutrons no longer exist.
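As a toy illustration of the Bayesian model comparison behind a figure like “80 to 90 percent” (the real analysis marginalizes over large equation-of-state ensembles on supercomputers; the numbers below are invented), the machinery is just Bayes’ theorem over competing hypotheses:

```python
# Toy Bayesian comparison: quark-matter core vs. purely hadronic core.
# Priors and likelihoods are invented for illustration only.
prior = {"quark_core": 0.5, "hadronic": 0.5}

# Hypothetical likelihoods: how well each model explains the observed
# neutron-star data (masses, radii, tidal deformabilities, ...).
likelihood = {"quark_core": 0.85, "hadronic": 0.15}

evidence = sum(prior[m] * likelihood[m] for m in prior)
posterior = {m: prior[m] * likelihood[m] / evidence for m in prior}

print(posterior)  # -> {'quark_core': 0.85, 'hadronic': 0.15}
```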