Archive for the ‘computing’ category: Page 112

Jan 31, 2024

A Laser Breakthrough Could Change Quantum Machines Forever

Posted by in categories: computing, quantum physics

Soon, photons may be rewriting the rules of computing as we know them.

Jan 31, 2024

Tencent sees HPC, quantum, cloud and edge converging

Posted by in categories: computing, quantum physics

Chinese tech giant Tencent has predicted that high-performance computing (HPC), quantum computing, cloud computing and edge computing will soon merge.


And it will all come together in one big, happy, hybrid innovation engine.

Jan 31, 2024

Quantum Computing Can Help Unlock Understanding of Aging And Disease

Posted by in categories: biotech/medical, computing, quantum physics

A team of researchers demonstrates how quantum computing can be integrated into the study of living organisms.

Jan 31, 2024

Mastering the quantum code: A primer on quantum software

Posted by in categories: computing, information science, quantum physics

In the world of quantum computing, the spotlight often lands on the hardware: qubits, superconducting circuits, and the like. But it’s time to shift our focus to the unsung hero of this tale – the quantum software, the silent maestro orchestrating the symphony of qubits. From turning abstract quantum algorithms into executable code to optimizing circuit designs, quantum software plays a pivotal role.

Here, we’ll explore the foundations of quantum programming, draw comparisons to classical computing, delve into the role of quantum languages, and forecast the transformational impact of this nascent technology. Welcome to a beginner’s guide to quantum software – a journey to the heart of quantum computing.

Quantum vs. Classical Programming: The Core Differences.
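To make that contrast concrete, here is a minimal sketch (not from the article, and deliberately framework-free, using plain NumPy rather than an SDK such as Qiskit) of what a tiny two-qubit quantum program reduces to once quantum software has done its job: unitary matrices applied to a vector of complex amplitudes, with classical bits appearing only at measurement. Gate definitions are standard; everything else is illustrative.

```python
import numpy as np

# A minimal, illustrative sketch of what quantum software ultimately compiles a
# small circuit down to: unitary matrices acting on a vector of complex amplitudes.

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
I2 = np.eye(2, dtype=complex)

# CNOT with qubit 0 as control and qubit 1 as target, basis ordered |q0 q1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>
state = np.zeros(4, dtype=complex)
state[0] = 1.0

# The "program": put qubit 0 into superposition, then entangle it with qubit 1
state = np.kron(H, I2) @ state   # H on qubit 0
state = CNOT @ state             # CNOT(0 -> 1) gives the Bell state (|00> + |11>)/sqrt(2)

# Measurement probabilities -- the only output a classical caller ever sees
probs = np.abs(state) ** 2
for idx, p in enumerate(probs):
    print(f"|{idx:02b}>: {p:.3f}")   # ~0.5 for |00> and |11>, 0 elsewhere
```

In practice, quantum languages and SDKs let programmers express the same circuit as a sequence of gate calls and handle compilation to hardware-specific operations, but this amplitude-vector picture is what separates the quantum programming model from classical bit manipulation.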

Jan 30, 2024

Forecasting Floods: Implications of Back-to-Back Atmospheric River Events

Posted by in categories: computing, economics, information science

How do back-to-back atmospheric rivers impact the economy? A recent study published in Science Advances addresses this question, with a team of researchers led by Stanford University investigating the economic toll of back-to-back atmospheric rivers compared to single events. The study could help scientists, the public, and city planners better prepare for atmospheric rivers, which can cause widespread flooding in short periods of time.

For the study, the researchers analyzed data from the Modern-Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2) spanning 1981 to 2021, using computer algorithms to estimate the economic impact of atmospheric rivers across California. The goal was to determine how much worse back-to-back atmospheric rivers are than single events. The study found that back-to-back atmospheric rivers caused three times the economic damage of single events, with damage rising further when the first atmospheric river was stronger.

“Our work really shows that we need to consider the likelihood for multiple, back-to-back events for predicting damages, because damage from multiple events could be far worse than from one event alone,” said Dr. Katy Serafin, a coastal scientist and assistant professor in the Department of Geography at the University of Florida and a co-author on the study.

Jan 30, 2024

The Professions of the Future (1)

Posted by in categories: automation, big data, business, computing, cyborgs, disruptive technology, education, Elon Musk, employment, evolution, futurism, information science, innovation, internet, life extension, lifeboat, machine learning, posthumanism, Ray Kurzweil, robotics/AI, science, singularity, Skynet, supercomputing, transhumanism

We are witnessing a professional revolution where the boundaries between man and machine slowly fade away, giving rise to innovative collaboration.

Photo by Mateusz Kitka (Pexels)

As Artificial Intelligence (AI) continues to advance by leaps and bounds, it’s impossible to overlook the profound transformations that this technological revolution is imprinting on the professions of the future. A paradigm shift is underway, redefining not only the nature of work but also how we conceptualize collaboration between humans and machines.

As creator of the ETER9 Project (2), I perceive AI not only as a disruptive force but also as a powerful tool to shape a more efficient, innovative, and inclusive future. As we move forward in this new world, it’s crucial for each of us to contribute to building a professional environment that celebrates the interplay between humanity and technology, where the potential of AI is realized for the benefit of all.

In the ETER9 Project, dedicated to exploring the interaction between artificial intelligences and humans, I have gained unique insights into the transformative potential of AI. Reflecting on the future of professions, it’s evident that adaptability and a profound understanding of technological dynamics will be crucial to navigate this new landscape.


Jan 30, 2024

Beijing urges breakthroughs in chips and quantum computing to command future

Posted by in categories: computing, quantum physics

Beijing is pushing a ‘whole-of-nation’ approach to focus resources on tech breakthroughs in key areas amid rising pressure from the US.

Jan 30, 2024

Harnessing synthetic active particles for physical reservoir computing

Posted by in categories: computing, particle physics

The ability of living systems to process signals and information is of vital importance. Inspired by nature, Wang and Cichos show an experimental realization of a physical reservoir computer using self-propelled active microparticles to predict chaotic time series such as the Mackey–Glass and Lorenz series.
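For readers curious how the reservoir-computing recipe works in software (as opposed to the authors' active-particle experiment), below is a minimal echo-state-network sketch in plain NumPy: a fixed random recurrent network supplies the reservoir dynamics, and only a linear readout is trained, here to predict a Mackey–Glass series one step ahead. All parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Mackey-Glass time series (Euler discretization, dt = 1) ---
beta, gamma, n, tau, T = 0.2, 0.1, 10, 17, 3000
x = np.zeros(T + tau)
x[: tau + 1] = 1.2                      # constant initial history
for t in range(tau, T + tau - 1):
    x[t + 1] = x[t] + beta * x[t - tau] / (1 + x[t - tau] ** n) - gamma * x[t]
series = x[tau:]

# --- Echo state network: a software "reservoir" ---
N = 300                                 # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, (N, 1))   # fixed random input weights
W = rng.uniform(-0.5, 0.5, (N, N))      # fixed random recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state at each step."""
    states = np.zeros((len(u), N))
    s = np.zeros(N)
    for t, ut in enumerate(u):
        s = np.tanh(W_in[:, 0] * ut + W @ s)
        states[t] = s
    return states

u, y = series[:-1], series[1:]          # input x(t), target x(t+1)
S = run_reservoir(u)

# Only the linear readout is trained (ridge regression) -- the hallmark of
# reservoir computing, whether the reservoir is software or active particles.
washout, train_len, ridge = 200, 2000, 1e-6
A, b = S[washout:train_len], y[washout:train_len]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(N), A.T @ b)

pred = S[train_len:] @ W_out
nrmse = np.sqrt(np.mean((pred - y[train_len:]) ** 2)) / np.std(y[train_len:])
print(f"one-step-ahead NRMSE on held-out data: {nrmse:.4f}")
```

The physical version described in the paper replaces the tanh network with the dynamics of driven microparticles, but the trained part stays the same: a simple linear readout on the reservoir's state.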

Jan 30, 2024

How to Build an Origami Computer

Posted by in categories: computing, mathematics

Two mathematicians have shown that origami can, in principle, be used to perform any possible computation.

Jan 30, 2024

New Microchip Breakthrough: New Era in Electronics?

Posted by in categories: computing, innovation

Ok… here we go again! (Yes, this is real. Already being tested in full wafers.)