Archive for the ‘information science’ category: Page 2

Dec 20, 2024

AI-powered algorithm enables personalized age transformation for human faces

Posted in categories: biotech/medical, cyborgs, information science, life extension

Researchers at the University of North Carolina at Chapel Hill and the University of Maryland recently developed MyTimeMachine (MyTM), a new AI-powered method for personalized age transformation that can make human faces in images or videos appear younger or older, accounting for the person-specific factors that influence how an individual ages.

This algorithm, introduced in a paper posted to the arXiv preprint server, could be used to broaden or enhance the features of consumer-facing picture-editing platforms, but could also be a valuable tool for the film, TV and entertainment industries.

Dec 19, 2024

New software unlocks secrets of cell signaling, showing realistic simulations

Posted in categories: bioengineering, biotech/medical, information science, neuroscience

Researchers at the University of California San Diego have developed and tested a new software package, called Spatial Modeling Algorithms for Reactions and Transport (SMART), that can realistically simulate cell-signaling networks—the complex systems of molecular interactions that allow cells to respond to diverse cues from their environment.

Cell-signaling networks involve many distinct steps and are also greatly influenced by the complex, three-dimensional shapes of cells and subcellular components, making them difficult to simulate with existing tools. SMART offers a solution to this problem, which could help accelerate research across the life sciences, including pharmacology.

The researchers successfully tested the new software in biological systems at several different scales: from cell signaling in response to adhesive cues, to calcium release events in subcellular regions of neurons, to the production of ATP (the energy currency in cells) within a detailed representation of a single mitochondrion.
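SMART itself is a finite-element solver built to handle realistic 3D cell geometries; as a rough illustration of the kind of spatial signaling model it solves, here is a minimal one-dimensional reaction-diffusion sketch of a calcium-like species spreading from a localized release site. All parameter values are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Minimal 1-D reaction-diffusion sketch: a calcium-like species diffuses
# along a dendrite-like domain while being pumped back toward a resting
# level. All parameter values are illustrative, not taken from SMART.

L = 10.0                      # domain length (um)
nx = 200                      # grid points
dx = L / (nx - 1)
D = 0.2                       # diffusion coefficient (um^2/ms)
k_pump = 0.05                 # first-order removal rate (1/ms)
c_rest = 0.1                  # resting concentration (uM)
dt = 0.4 * dx**2 / (2 * D)    # time step safely below the explicit limit

c = np.full(nx, c_rest)
c[nx // 2] = 5.0              # localized release event mid-domain

for _ in range(2000):
    lap = np.empty_like(c)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    lap[0] = 2 * (c[1] - c[0]) / dx**2       # reflective (no-flux) ends
    lap[-1] = 2 * (c[-2] - c[-1]) / dx**2
    c = c + dt * (D * lap - k_pump * (c - c_rest))

print(f"peak concentration after relaxation: {c.max():.3f} uM")
```

The real software couples many such species over complex 3D meshes, which is exactly what makes tools like SMART necessary.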

Dec 19, 2024

New radar algorithm reveals hidden dance of ionospheric plasma

Posted in categories: information science, particle physics, space

At night, charged particles from the sun caught by Earth’s magnetosphere rain down into the atmosphere. The impacting particles rip electrons from atoms in the atmosphere, creating both beauty and chaos. These high-energy interactions cause the northern and southern lights, but they also scatter radio signals, wreaking havoc on ground-based and satellite communications.

Scientists would like to track electrical activity in the ionosphere by measuring the distribution of plasma, the form matter takes when positive ions are separated from their electrons, to help better predict how communications will be affected by electromagnetic energy.

But analyzing plasma in the ionosphere is a challenge because its distribution changes quickly and its movements are often unpredictable. In addition, collisional physics makes detecting true motion in the lower ionosphere exceedingly difficult.
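The article does not detail the new algorithm, but the underlying measurement principle is standard Doppler radar processing: the line-of-sight drift velocity of the plasma follows from the Doppler shift of the returned signal via v = f_d · λ / 2. Below is a minimal sketch using a pulse-pair Doppler estimator; the carrier frequency, pulse rate, and simulated signal are illustrative assumptions.

```python
import numpy as np

# Estimate line-of-sight plasma drift from the Doppler shift of radar
# echoes: f_d = 2 * v_los / wavelength. All numbers are illustrative.

c0 = 3e8                       # speed of light (m/s)
f_radar = 440e6                # assumed UHF-class carrier (Hz)
lam = c0 / f_radar             # wavelength (m)

# Simulated slow-time samples: a 100 Hz Doppler tone plus complex noise.
prf = 1000.0                   # pulse repetition frequency (Hz)
t = np.arange(256) / prf
rng = np.random.default_rng(0)
noise = 0.3 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))
echo = np.exp(2j * np.pi * 100.0 * t) + noise

# Pulse-pair estimator: Doppler frequency from the phase of the lag-1
# autocorrelation of consecutive pulses.
r1 = np.mean(echo[1:] * np.conj(echo[:-1]))
f_d = np.angle(r1) * prf / (2 * np.pi)
v_los = f_d * lam / 2
print(f"estimated Doppler: {f_d:.1f} Hz -> v_los = {v_los:.1f} m/s")
```

The hard part the researchers face is not this estimate itself but disentangling true plasma motion from collisional effects in the lower ionosphere.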

Dec 18, 2024

Helping machine learning models identify objects in any pose

Posted in categories: information science, robotics/AI, space

A new visual recognition approach improved a machine learning technique's ability both to identify an object and to determine how it is oriented in space, according to a study presented in October at the European Conference on Computer Vision in Milan, Italy.

Self-supervised learning is a machine learning approach that trains on unlabeled data, which helps models generalize to real-world data. While it excels at identifying objects (a task called semantic classification), it can struggle to recognize objects in unfamiliar poses.

This weakness quickly becomes a problem in situations like autonomous vehicle navigation, where an algorithm must assess whether an approaching car is a head-on collision threat or side-oriented and just passing by.
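The summary does not spell out the paper's specific method; as a generic illustration of making self-supervised features pose-sensitive, the sketch below trains a small encoder on a rotation-prediction pretext task, where the label is the rotation applied to the image rather than a human annotation. The architecture and data are placeholders.

```python
import torch
import torch.nn as nn

# Illustrative sketch (not the paper's method): make self-supervised
# features pose-aware by asking the network to predict which of four
# rotations (0/90/180/270 degrees) was applied to each image.

class Encoder(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, dim),
        )
        self.rot_head = nn.Linear(dim, 4)   # pose (rotation) classifier

    def forward(self, x):
        z = self.net(x)
        return z, self.rot_head(z)

encoder = Encoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
images = torch.rand(16, 3, 64, 64)          # stand-in for unlabeled data

# Apply a random rotation to each image; the label is the rotation index.
ks = torch.randint(0, 4, (images.size(0),))
rotated = torch.stack([torch.rot90(img, int(k), dims=(1, 2))
                       for img, k in zip(images, ks)])

opt.zero_grad()
_, logits = encoder(rotated)
loss = nn.functional.cross_entropy(logits, ks)
loss.backward()
opt.step()
print(f"rotation-pretext loss: {loss.item():.3f}")
```

Because the pretext labels encode orientation, the learned features cannot discard pose information the way a purely semantic objective can.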

Dec 18, 2024

Retrocausal Quantum Teleportation Protocol

Posted in categories: computing, cosmology, information science, quantum physics, time travel

While classical physics presents a deterministic universe where cause must precede effect, quantum mechanics and relativity theory paint a more nuanced picture. Relativity theory already offers well-known examples such as wormholes, which are valid solutions of Einstein's field equations; quantum mechanics, in turn, offers the non-classical state of quantum entanglement (the "spooky action at a distance" that troubled Einstein), which demonstrates that quantum systems can maintain instantaneous correlations across space and, potentially, time.

Perhaps most intriguingly, the protocol suggests that quantum entanglement can be used to effectively send information about optimal measurement settings “back in time”—information that would normally only be available after an experiment is complete. This capability, while probabilistic in nature, could revolutionize quantum computing and measurement techniques. Recent advances in multipartite hybrid entanglement even suggest these effects might be achievable in real-world conditions, despite environmental noise and interference. The realization of such a retrocausal quantum computational network would, effectively, be the construction of a time machine, defined in general as a system in which some phenomenon characteristic only of chronology violation can reliably be observed.

This article explores the theoretical foundations, experimental proposals, significant improvements, and potential applications of the retrocausal teleportation protocol. From its origins in quantum mechanics and relativity theory to its implications for our understanding of causality and the nature of time itself, we examine how this cutting-edge research challenges our classical intuitions while opening new possibilities for quantum technology. As we delve into these concepts, we’ll see how the seemingly fantastic notion of time travel finds a subtle but profound expression in the quantum realm, potentially revolutionizing our approach to quantum computation and measurement while deepening our understanding of the universe’s temporal fabric.
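The retrocausal protocol builds on standard quantum teleportation, which is easy to simulate directly with statevectors. The sketch below implements only the conventional protocol (shared Bell pair, Bell-basis measurement, classically conditioned Pauli corrections); the retrocausal extensions discussed above are not modeled.

```python
import numpy as np

# Statevector simulation of *standard* quantum teleportation (the
# primitive the retrocausal protocol builds on). Qubit order: q0 holds
# the state to teleport; q1 and q2 form the shared Bell pair.

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1.0, -1.0])

def on(gate, q):
    """Lift a single-qubit gate onto qubit q of a 3-qubit register."""
    ops = [I2, I2, I2]
    ops[q] = gate
    return np.kron(np.kron(ops[0], ops[1]), ops[2])

def cnot(ctrl, targ):
    """CNOT on the 3-qubit register, built from control projectors."""
    p0, p1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
    a = [I2, I2, I2]; a[ctrl] = p0
    b = [I2, I2, I2]; b[ctrl] = p1; b[targ] = X
    return (np.kron(np.kron(a[0], a[1]), a[2])
            + np.kron(np.kron(b[0], b[1]), b[2]))

psi = np.array([0.6, 0.8])                          # state to teleport
state = np.kron(psi, np.kron([1.0, 0.0], [1.0, 0.0]))

state = cnot(1, 2) @ on(H, 1) @ state               # Bell pair on (q1, q2)
state = on(H, 0) @ cnot(0, 1) @ state               # Bell-basis rotation

# Sample a measurement outcome for q0, q1 and collapse the state.
rng = np.random.default_rng(0)
idx = rng.choice(8, p=np.abs(state) ** 2)
m0, m1 = (idx >> 2) & 1, (idx >> 1) & 1
keep = [i for i in range(8)
        if ((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1]
collapsed = np.zeros(8, dtype=complex)
collapsed[keep] = state[keep]
collapsed /= np.linalg.norm(collapsed)

# Classically conditioned Pauli corrections on q2.
if m1:
    collapsed = on(X, 2) @ collapsed
if m0:
    collapsed = on(Z, 2) @ collapsed

base = m0 * 4 + m1 * 2
print("teleported state:", np.round(collapsed[[base, base + 1]].real, 3))
```

Note that the corrections depend on the classical measurement bits, which is why even the standard protocol cannot signal faster than light.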

Dec 18, 2024

Quantum AI chip taps into "parallel universes" and completes a computation in 5 minutes that would normally take 10 septillion years

Posted in categories: cosmology, information science, quantum physics, robotics/AI

Google’s Willow chip achieves scalable quantum error correction, reducing error rates as more qubits are added and maintaining stability across a million cycles.
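The heart of the "scalable error correction" claim is below-threshold behavior: logical error rates fall as the code grows. Willow uses a surface code, but the effect is easiest to see with a toy classical repetition code, as in the Monte Carlo sketch below (the error rate and distances are illustrative).

```python
import numpy as np

# Toy illustration of below-threshold error suppression (not Willow's
# surface code): with a distance-d repetition code and physical error
# rate p < 1/2, majority voting fails only if more than d/2 bits flip,
# so the logical error rate shrinks as d grows.

rng = np.random.default_rng(1)
p = 0.05            # physical error probability (illustrative)
trials = 200_000

for d in (3, 5, 7, 9):
    flips = rng.random((trials, d)) < p
    logical_errors = flips.sum(axis=1) > d // 2
    print(f"distance {d}: logical error rate ~ {logical_errors.mean():.2e}")
```

Willow's result is the quantum analogue: adding physical qubits made the logical qubit better rather than worse, which is the prerequisite for scaling.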

Dec 18, 2024

Beyond AI: Preparing For Artificial Superintelligence

Posted in categories: business, economics, information science, robotics/AI

In 1956, a group of pioneering minds gathered at Dartmouth College to define what we now call artificial intelligence (AI). Even in the early 1990s when colleagues and I were working for early-stage expert systems software companies, the notion that machines could mimic human intelligence was an audacious one. Today, AI drives businesses, automates processes, creates content, and personalizes experiences in every industry. It aids and abets more economic activity than we “ignorant savages” (as one of the founding fathers of AI, Marvin Minsky, referred to our coterie) could have ever imagined. Admittedly, the journey is still early—a journey that may take us from narrow AI to artificial general intelligence (AGI) and ultimately to artificial superintelligence (ASI).

As business and technology leaders, it’s crucial to understand what’s coming: where AI is headed, how far off AGI and ASI might be, and what opportunities and risks lie ahead. To ignore this evolution would be like a factory owner in 1900 dismissing electricity as a passing trend.

Let’s first take stock of where we are. Modern AI is narrow AI: technologies built to handle specific tasks. Whether it’s a large language model (LLM) chatbot responding to customers, algorithms optimizing supply chains, or systems predicting loan defaults, today’s AI excels at isolated functions.

Dec 17, 2024

First Data Center-Ready Trapped-Ion Quantum Computer Outside US Is Delivered

Posted in categories: computing, employment, information science, quantum physics

Quantum computing and networking company IonQ has delivered a data center-ready trapped-ion quantum computer to the uptownBasel innovation campus in Arlesheim, Switzerland.

The IonQ Forte Enterprise quantum computer is the first of its kind to operate outside the United States, and it is Switzerland’s first quantum computer designed for commercial use.

According to IonQ, Forte Enterprise is now online and servicing compute jobs while performing at a record algorithmic qubit count of #AQ36. The algorithmic qubit count (#AQ) summarizes a quantum computer’s ability to run benchmark quantum algorithms representative of real applications, offering a practical measure of how useful it is for solving users’ problems.
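As commonly described, #AQ is the largest circuit width n at which a suite of benchmark circuits (with roughly n² two-qubit gates) still passes a result-fidelity threshold, often quoted as 1/e. The sketch below encodes that decision rule on hypothetical benchmark data; the threshold, circuit suite, and fidelity numbers are assumptions for illustration, not IonQ's published data.

```python
# Sketch of the #AQ (algorithmic qubits) decision rule as commonly
# described: #AQ is the largest width n at which every benchmark
# circuit passes a result-fidelity threshold. All data are invented.

FIDELITY_THRESHOLD = 0.368   # ~1/e, the commonly cited cutoff (assumption)

# Hypothetical benchmark results: {circuit width: measured fidelities}
results = {
    30: [0.71, 0.64, 0.58],
    33: [0.55, 0.49, 0.44],
    36: [0.43, 0.40, 0.38],
    37: [0.36, 0.31, 0.29],
}

def algorithmic_qubits(results):
    aq = 0
    for width in sorted(results):
        if all(f > FIDELITY_THRESHOLD for f in results[width]):
            aq = width       # every benchmark at this width passed
        else:
            break            # first failing width bounds #AQ
    return aq

print(f"#AQ{algorithmic_qubits(results)}")   # -> #AQ36 for this toy data
```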

Dec 17, 2024

IBM and State of Illinois to Build National Quantum Algorithm Center in Chicago with Universities and Industries

Posted in categories: information science, quantum physics, supercomputing

Anchored by a next-generation IBM Quantum System Two at the Illinois Quantum and Microelectronics Park, the new initiative will advance useful quantum applications as industries move toward quantum-centric supercomputing.

Dec 17, 2024

A Review of Synthetic-Aperture Radar Image Formation Algorithms and Implementations: A Computational Perspective

Posted in categories: computing, information science

A review of synthetic-aperture radar image formation algorithms and implementations: a computational perspective.

✍️ Helena Cruz et al.


Designing synthetic-aperture radar image formation systems can be challenging because of the numerous algorithm and device options available. There are many SAR image formation algorithms, such as backprojection, matched-filter, polar format, Range–Doppler and chirp scaling algorithms. Each algorithm presents its own advantages and disadvantages in terms of efficiency and image quality; thus, we aim to introduce some of the most common SAR image formation algorithms and compare them on these two aspects. Depending on the requirements of each individual system and implementation, there are many device options to choose from, for instance, FPGAs, GPUs, CPUs, many-core CPUs, and microcontrollers. We present a review of the state of the art in SAR imaging system implementations.
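Of the algorithms the review compares, time-domain backprojection is the most direct: for each image pixel, coherently sum every pulse's range-compressed echo at the round-trip delay from the platform to that pixel. Here is a minimal sketch with an invented point-target geometry; the carrier, sampling rate, and flight path are illustrative assumptions.

```python
import numpy as np

# Minimal time-domain backprojection sketch: for every image pixel,
# coherently sum each pulse's range-compressed echo at the round-trip
# delay from the platform to that pixel. Geometry and signal model are
# invented for illustration, not taken from the review.

c0 = 3e8
fc = 10e9                          # carrier frequency (Hz)
fs = 200e6                         # range sampling rate (Hz)
n_pulses, n_range = 128, 512
xs_plat = np.linspace(-50, 50, n_pulses)   # along-track positions (m)
alt, y0 = 500.0, 1000.0                    # platform height, swath center
r0 = np.hypot(y0, alt)                     # reference range for binning

# Simulate range-compressed echoes from one point target at (0, y0, 0).
echoes = np.zeros((n_pulses, n_range), dtype=complex)
for p, xp in enumerate(xs_plat):
    r = np.sqrt(xp**2 + y0**2 + alt**2)
    b = int(round(2 * (r - r0) / c0 * fs)) + n_range // 2
    echoes[p, b] = np.exp(-4j * np.pi * fc * r / c0)

# Backproject onto a small ground grid centered on the target.
gx, gy = np.meshgrid(np.linspace(-5, 5, 64), np.linspace(y0 - 5, y0 + 5, 64))
image = np.zeros(gx.shape, dtype=complex)
for p, xp in enumerate(xs_plat):
    r = np.sqrt((xp - gx) ** 2 + gy**2 + alt**2)
    bins = np.clip(np.rint(2 * (r - r0) / c0 * fs).astype(int) + n_range // 2,
                   0, n_range - 1)
    image += echoes[p, bins] * np.exp(4j * np.pi * fc * r / c0)

peak = np.unravel_index(np.abs(image).argmax(), image.shape)
print("image peak at grid indices:", peak)   # expect near the grid center
```

Backprojection handles arbitrary flight paths at the cost of heavy computation, which is precisely why the review's device comparison (FPGAs, GPUs, many-core CPUs) matters.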
