Just because a mathematical formula works does not mean it reflects reality.
Graph representations can solve complex problems in natural science, as patterns of connectivity can give rise to a multitude of emergent phenomena. Graph-based approaches are particularly important for quantum communication and for quantum search algorithms in highly branched quantum networks. In a new report published in Science Advances, Max Ehrhardt and a team of scientists in physics, experimental physics and quantum science in Germany introduced a hitherto unexplored paradigm to directly realize the excitation dynamics associated with three-dimensional networks. To accomplish this, they exploited the hybrid action of space and polarization degrees of freedom of photon pairs inside complex waveguide circuits. The team experimentally studied multiparticle quantum walks on complex, highly connected graphs as testbeds, paving the way to explore potential applications of fermionic dynamics in integrated photonics.
Complex networks
Complex networks occur across diverse fields of science, ranging from biological signaling pathways and biochemical molecules that exhibit efficient energy transport, to neuromorphic circuits, to social interactions across the internet. Such structures are typically modeled as graphs whose complexity depends on the number of nodes and the linkage patterns between them. The physical representation of a graph is limited by the requirement that it be arranged in three-dimensional (3D) space. The human brain is a striking example of scaling behavior that is unfavorable for physical simulation: its staggering 80 billion neurons are dwarfed by the roughly 100 trillion synapses that allow signals to flow between them. Even with a comparatively minuscule number of nodes, discrete quantum systems face a number of challenges arising from complex network topologies, efficient multipartite quantum communication, and search algorithms.
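The excitation dynamics on such a graph can be illustrated with a continuous-time quantum walk, where the graph's adjacency matrix generates the evolution. The sketch below is a minimal, generic illustration (a hypothetical 4-node complete graph), not the waveguide implementation used in the paper:

```python
import numpy as np

# Adjacency matrix of a small, highly connected graph
# (an illustrative 4-node complete graph, not from the study).
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [1, 1, 1, 0]], dtype=float)

def quantum_walk(A, psi0, t):
    """Evolve |psi0> under U(t) = exp(-i A t), the continuous-time
    quantum walk, via eigendecomposition of the symmetric matrix A."""
    w, V = np.linalg.eigh(A)
    return V @ (np.exp(-1j * w * t) * (V.conj().T @ psi0))

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                        # excitation starts at node 0
psi_t = quantum_walk(A, psi0, t=1.0)
probs = np.abs(psi_t) ** 2           # occupation probabilities per node
print(probs, probs.sum())            # unitary evolution: sums to 1
```

Because the evolution is unitary, the total occupation probability stays 1, and the symmetry of the complete graph keeps nodes 1–3 equally populated.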
Managers of the ImageNet data set paved the way for advances in deep learning. Now they’ve taken a big step to protect people’s privacy.
When physicists need to understand the quantum mechanics that describe how atomic clocks work, how your magnet sticks to your refrigerator or how particles flow through a superconductor, they use quantum field theories.
When they work through problems in quantum field theories, they do so in “imaginary” time, then map those simulations into real quantities. But traditionally, these simulations nearly always include uncertainties or unknown factors that could cause equation results to be “off.” So, when physicists interpret their simulation results into real quantities, these uncertainties amplify exponentially, making it difficult to have confidence that their results are as accurate as necessary.
Now, a pair of University of Michigan physicists have discovered that a set of functions called the Nevanlinna functions can tighten the interpretation step, showing that physicists may be able to overcome one of the major limitations of modern quantum simulation. The work, published in Physical Review Letters, was led by U-M physics undergraduate student Jiani Fei.
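The amplification of uncertainty in the "interpretation step" can be seen in miniature by examining the kernel that maps real-frequency quantities to imaginary-time data: inverting it is notoriously ill-conditioned. The snippet below is an illustrative toy (the exponential kernel and the grids are generic choices, not the Nevanlinna construction from the paper):

```python
import numpy as np

# Imaginary-time-to-real-frequency kernel K(tau, omega) = exp(-tau * omega).
# Grids are arbitrary illustrative choices.
tau = np.linspace(0.0, 10.0, 40)      # imaginary-time points
omega = np.linspace(0.1, 10.0, 40)    # real-frequency points
K = np.exp(-np.outer(tau, omega))

# The condition number bounds how much input noise can be amplified
# when inverting K; here it is astronomically large.
cond = np.linalg.cond(K)
print(f"condition number: {cond:.3e}")
```

A condition number this large means that even tiny statistical errors in the imaginary-time data can translate into enormous errors on the real axis, which is exactly the limitation the Nevanlinna approach aims to tame.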
What do you do after solving the answer to life, the universe, and everything? If you’re mathematicians Drew Sutherland and Andy Booker, you go for the harder problem.
In 2019, Booker, at the University of Bristol, and Sutherland, principal research scientist at MIT, were the first to find the answer to 42. The number has pop culture significance as the fictional answer to “the ultimate question of life, the universe, and everything,” as Douglas Adams famously penned in his novel “The Hitchhiker’s Guide to the Galaxy.” The question that begets 42, at least in the novel, is frustratingly, hilariously unknown.
In mathematics, entirely by coincidence, there exists a polynomial equation for which the answer, 42, had similarly eluded mathematicians for decades. The equation x³ + y³ + z³ = k is known as the sum of cubes problem. While seemingly straightforward, the equation becomes exponentially difficult to solve when framed as a “Diophantine equation”—a problem that stipulates that, for any value of k, the values for x, y, and z must each be whole numbers.
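For small values of k, solutions can be found by brute force; the difficulty of the Diophantine version is that for hard cases like k = 42 the smallest known solution involves 17-digit integers, far beyond any naive search. A minimal sketch of the brute-force approach (search bound chosen arbitrarily for illustration):

```python
from itertools import product

def sum_of_cubes_solutions(k, bound=20):
    """Brute-force search for integer solutions of x^3 + y^3 + z^3 = k
    with |x|, |y|, |z| <= bound. Far too small a bound for hard cases
    like k = 42, whose known solution has 17-digit components."""
    sols = set()
    for x, y, z in product(range(-bound, bound + 1), repeat=3):
        if x**3 + y**3 + z**3 == k:
            sols.add(tuple(sorted((x, y, z))))
    return sorted(sols)

# Easy case: 1^3 + 1^3 + 3^3 = 29
print(sum_of_cubes_solutions(29))
```

This cubic-time search works for easy targets, which is precisely why values like 33 and 42, immune to it, required the massive distributed computations that Booker and Sutherland orchestrated.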
EA, Ubisoft, Warner Bros, and more explore how artificial intelligence innovations will lead to more believable open worlds and personal adventures within them.
Most NPCs simply patrol a specific area until the player interacts with them, at which point they try to become a more challenging target to hit. That’s fine in confined spaces, but in big worlds where NPCs have the freedom to roam, it just doesn’t scale. More advanced AI techniques such as machine learning – which uses algorithms to study incoming data, interpret it, and decide on a course of action in real-time – give AI agents much more flexibility and freedom. But developing them is time-consuming, computationally expensive, and a risk because it makes NPCs less predictable – hence the Assassin’s Creed Valhalla stalking situation.
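The hand-authored baseline described above, patrol until the player is near, then engage, is typically a small finite-state machine. The sketch below is a generic illustration in a one-dimensional world (all names and thresholds are made up, not from any studio's engine), showing how rigid such logic is compared with a learned policy:

```python
from dataclasses import dataclass, field

@dataclass
class NPC:
    """Minimal hand-authored NPC controller: patrol a waypoint route
    until the player comes within sight, then engage."""
    waypoints: list = field(default_factory=list)
    idx: int = 0
    state: str = "patrol"

    def tick(self, npc_pos, player_pos, sight_radius=5.0):
        dist = abs(npc_pos - player_pos)       # 1-D world for brevity
        # State transitions: fixed, hand-tuned rules.
        if self.state == "patrol" and dist <= sight_radius:
            self.state = "engage"
        elif self.state == "engage" and dist > sight_radius:
            self.state = "patrol"
        if self.state == "patrol":
            self.idx = (self.idx + 1) % len(self.waypoints)
            return self.waypoints[self.idx]    # next patrol target
        return player_pos                       # move toward the player

npc = NPC(waypoints=[0.0, 10.0, 20.0])
print(npc.tick(npc_pos=0.0, player_pos=100.0))  # far away: keeps patrolling
print(npc.tick(npc_pos=0.0, player_pos=3.0))    # close: switches to engage
```

Every behavior here had to be authored explicitly, which is exactly what fails to scale in large open worlds and what machine-learned agents are meant to replace.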
However, as open-world and narrative-based games become more complex, and as modern PCs and consoles display ever more authentic and detailed environments, the need for more advanced AI techniques is growing. It’s going to be weird and alienating to be thrust into an almost photorealistic world filled with intricate systems and narrative possibilities, only to discover that non-player characters still act like soulless robots.
This is something the developers pushing the boundaries of open-world game design understand. Ubisoft, for example, has dedicated AI research teams at its Chengdu, Mumbai, Pune, and Montpellier studios, as well as a Strategic Innovation Lab in Paris and the Montreal studio’s La Forge lab, and is working with tech firms and universities on academic AI research topics.
Researchers have developed a new data transfer system that is 20 times faster than USB 3.0.
The system combines high-frequency silicon chips with a polymer cable as thin as a strand of hair. It could boost energy efficiency in data centres and lighten the loads of electronics-rich spacecraft. Researchers presented their breakthrough at the recent IEEE International Solid-State Circuits Conference, held virtually.
“There’s an explosion in the amount of information being shared between computer chips – cloud computing, the Internet, big data. And a lot of this happens over conventional copper wire,” says Jack Holloway, who led the research. Holloway completed his PhD in MIT’s Department of Electrical Engineering and Computer Science last year and currently works for Raytheon.
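As a rough sanity check on the headline figure: USB 3.0 (now USB 3.2 Gen 1) signals at 5 Gb/s, so "20 times faster" implies on the order of 100 Gb/s over the hair-thin polymer cable:

```python
# Back-of-the-envelope check of the claimed speed-up.
usb3_gbps = 5              # USB 3.0 signaling rate, Gb/s
speedup = 20               # claimed factor from the article
link_gbps = speedup * usb3_gbps
print(f"{link_gbps} Gb/s")
```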
Scientists have taken a major step forward in harnessing machine learning to accelerate the design for better batteries: Instead of using it just to speed up scientific analysis by looking for patterns in data, as researchers generally do, they combined it with knowledge gained from experiments and equations guided by physics to discover and explain a process that shortens the lifetimes of fast-charging lithium-ion batteries.
It was the first time this approach, known as “scientific machine learning,” has been applied to battery cycling, said Will Chueh, an associate professor at Stanford University and investigator with the Department of Energy’s SLAC National Accelerator Laboratory who led the study. He said the results overturn long-held assumptions about how lithium-ion batteries charge and discharge and give researchers a new set of rules for engineering longer-lasting batteries.
The research, reported today in Nature Materials, is the latest result from a collaboration between Stanford, SLAC, the Massachusetts Institute of Technology and Toyota Research Institute (TRI). The goal is to bring together foundational research and industry know-how to develop a long-lived electric vehicle battery that can be charged in 10 minutes.
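The core idea of scientific machine learning, constraining a data-driven fit with a physics-motivated model form rather than a black box, can be sketched in miniature. The power-law fade model and synthetic data below are purely illustrative and are not the model from the Nature Materials study:

```python
import numpy as np

# Capacity fade in lithium-ion cells is often approximated by a
# physics-motivated power law C(n) = C0 - a * n**b over cycle number n.
# Synthetic "measurements" with the true parameters a=0.002, b=0.6:
rng = np.random.default_rng(0)
n = np.arange(1, 201)                               # cycle number
capacity = 1.0 - 0.002 * n**0.6 + rng.normal(0, 1e-4, n.size)

# Fitting the constrained form is linear least squares in log space:
# log(C0 - C) = log(a) + b * log(n)
y = np.log(1.0 - capacity)
b_fit, log_a_fit = np.polyfit(np.log(n), y, 1)
print(f"fitted fade law: a={np.exp(log_a_fit):.4f}, b={b_fit:.3f}")
```

Because the model form encodes physical knowledge, the two fitted parameters remain interpretable, the kind of explanatory power the collaboration combined with machine learning at much larger scale.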
Algorithms are meaningless without good data. The public can exploit that to demand change.
Four Emerging Technology Areas Impacting Industry 4.0: Advanced Computing, Artificial Intelligence, Big Data & Materials Science.