Nov 9, 2024
You could start smelling the roses from far away using AI
Posted by Saúl Morales Rodriguéz in category: robotics/AI
AI can “teleport” scents without human hands (or noses)
CISA warns of actively exploited vulnerabilities in Palo Alto Networks, CyberPanel, and Android, urging immediate fixes.
Starting Monday, TSMC will halt production of AI chips made on 7 nm and more advanced process nodes for Chinese firms.
This article explores how AI is revolutionizing digital companionship and why raising virtual pets together might be the future of social connection.
Wetware computing and organoid intelligence is an emerging research field at the intersection of electrophysiology and artificial intelligence. The core concept involves using living neurons to perform computations, similar to how Artificial Neural Networks (ANNs) are used today. However, unlike ANNs, where updating digital tensors (weights) can instantly modify network responses, entirely new methods must be developed for neural networks built from biological neurons. Discovering these methods is challenging and requires a system capable of conducting numerous experiments, ideally accessible to researchers worldwide. For this reason, we developed a hardware and software system that allows electrophysiological experiments at an unmatched scale. The Neuroplatform enables researchers to run experiments on neural organoids with lifetimes exceeding 100 days. To do so, we streamlined the experimental process to quickly produce new organoids, monitor action potentials 24/7, and provide electrical stimulation. We also designed a microfluidic system that allows fully automated medium flow and exchange, reducing disruptions from physical interventions in the incubator and ensuring stable environmental conditions. Over the past three years, the Neuroplatform has been used with over 1,000 brain organoids, enabling the collection of more than 18 terabytes of data. A dedicated Application Programming Interface (API) has been developed to conduct remote research directly via our Python library or through interactive computing environments such as Jupyter Notebooks. In addition to electrophysiological operations, our API also controls pumps, digital cameras, and UV lights for molecule uncaging. This allows for the execution of complex 24/7 experiments, including closed-loop strategies and processing with the latest deep learning or reinforcement learning libraries. Furthermore, the infrastructure supports entirely remote use. As of 2024, the system is freely available for research purposes, and numerous research groups have begun using it for their experiments. This article outlines the system’s architecture and provides specific examples of experiments and results.
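For readers curious what remote use of such an API might look like in practice, here is a minimal closed-loop sketch in Python. The module name neuroplatform_client, the connect/read_spikes/stimulate calls, and all parameter values are hypothetical placeholders rather than the actual Neuroplatform library; the sketch only illustrates the reported pattern of reading activity, computing a response, and stimulating in a loop.

```python
# Illustrative only: "neuroplatform_client", its methods, and all parameters below are
# hypothetical stand-ins, not the actual Neuroplatform Python library.
import time

import numpy as np
import neuroplatform_client as npc  # hypothetical client library

# Connect to a remote organoid and select electrodes (assumed API shape).
session = npc.connect(token="YOUR_API_TOKEN", organoid_id="organoid-42")
recording_electrodes = [0, 1, 2, 3]
stim_electrode = 4

def spike_rate(spikes, window_s):
    """Mean firing rate (Hz) across the recorded electrodes."""
    return len(spikes) / (len(recording_electrodes) * window_s)

# Simple closed loop: read spikes for 1 s, then adjust the stimulation amplitude
# so the measured firing rate tracks a target rate.
target_hz, amplitude_ua = 5.0, 1.0
for _ in range(60):  # one minute of closed-loop control
    spikes = session.read_spikes(recording_electrodes, duration_s=1.0)
    rate = spike_rate(spikes, window_s=1.0)
    # Proportional update of the stimulation amplitude.
    amplitude_ua = float(np.clip(amplitude_ua + 0.1 * (target_hz - rate), 0.0, 4.0))
    session.stimulate(stim_electrode, amplitude_ua=amplitude_ua, pulse_width_us=200)
    time.sleep(0.1)

session.close()
```

In a real deployment, the same loop structure could hand the recorded activity to a deep learning or reinforcement learning model instead of the simple proportional rule used here.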
The recent rise in wetware computing and consequently, artificial biological neural networks (BNNs), comes at a time when Artificial Neural Networks (ANNs) are more sophisticated than ever.
The latest generation of Large Language Models (LLMs), such as Meta’s Llama 2 or OpenAI’s GPT-4, fundamentally rely on ANNs.
Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), the University of California at Berkeley, and Aarhus University have taken an intriguing step forward by fabricating “PortaChrome,” a portable light system and design tool that can change the color and textures…
The portable light system and design tool “PortaChrome” uses UV and RGB LEDs to activate photochromic dye, reprogramming everyday objects like shirts. The MIT CSAIL researchers’ software can help users turn items into multicolor displays of fashion designs and health data.
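As a rough illustration of the idea, the sketch below maps a target color to per-LED exposure times. The linear desaturation model, the exposure_times helper, and the 60-second maximum are assumptions made for illustration only, not PortaChrome’s actual calibration.

```python
# Hypothetical sketch: the exposure model and numbers here are illustrative assumptions,
# not MIT CSAIL's actual PortaChrome calibration.
import numpy as np

def exposure_times(target_rgb, max_seconds=60.0):
    """Map a target RGB color (0-255 per channel) to per-LED exposure times.

    Assumption: after a UV pass fully saturates the photochromic dye, each
    R/G/B LED gradually desaturates its channel, so darker target channels
    need longer exposure from the corresponding LED.
    """
    target = np.asarray(target_rgb, dtype=float) / 255.0
    return (1.0 - target) * max_seconds  # seconds of R, G, B exposure

# Example: program a patch of fabric toward a teal-like color.
print(exposure_times([0, 128, 128]))  # roughly [60, 30, 30] seconds (illustrative)
```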
Since the public release of OpenAI’s ChatGPT, artificial intelligence (AI) has quickly become a driving force in innovation and everyday life, sparking both excitement and concern. AI promises breakthroughs in fields like medicine, education, and energy, with the potential to solve some of society’s toughest challenges. But at the same time, fears around job displacement, privacy, and the spread of misinformation have led many to call for tighter government control.
Many are now seeking swift government intervention to regulate AI’s development in the waning “lame duck” session before the next Congress is seated. These efforts have been led by tech giants, including OpenAI, Amazon, Google, and Microsoft, under the guise of securing “responsible development of advanced AI systems” from risks like misinformation and bias. Building on the Biden administration’s executive order to create the U.S. Artificial Intelligence Safety Institute (AISI) and mandate that AI “safety tests,” among other things, be reported to the government, the bipartisan negotiations would permanently authorize the AISI to act as the nation’s primary AI regulatory agency.
The problem is, the measures pushed by these lobbying campaigns favor large, entrenched corporations, sidelining smaller competitors and stifling innovation. If Congress moves forward with establishing a federal AI safety agency, even with the best of intentions, it risks cementing Big Tech’s dominance at the expense of startups. Rather than fostering competition, such regulation would likely serve the interests of the industry’s largest corporations, stifling entrepreneurship and limiting AI’s potential to transform America—and the world—for the better. The unintended consequences are serious: slower product improvement, fewer technological breakthroughs, and severe costs to the economy and consumers.
A detailed, 100+ page analysis of 18 LLMs for embodied decision making.
ArXiv: https://arxiv.org/abs/2410.07166 Website: https://embodied-agent-interface.github.io.
The research focuses on evaluating how well Large Language Models (LLMs) can make decisions in environments where physical actions are…
Continue reading “Embodied Agent Interface: Benchmarking LLMs for Embodied Decision Making” »
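As a rough illustration of this kind of evaluation, the sketch below asks a model for an action plan and scores it against a reference plan. The task format, the query_llm helper, and the scoring rule are illustrative assumptions, not the Embodied Agent Interface benchmark itself.

```python
# Minimal sketch of an LLM-as-planner evaluation; the prompt format, the query_llm()
# helper, and the reference plan are illustrative assumptions, not the paper's harness.

def query_llm(prompt: str) -> str:
    """Stand-in for a call to any chat-completion API (assumption)."""
    raise NotImplementedError("wire this to your LLM provider")

TASK = ("Goal: put the apple in the fridge. Available actions: walk_to(x), "
        "pick_up(x), open(x), put_in(x, y), close(x).")
REFERENCE_PLAN = ["walk_to(apple)", "pick_up(apple)", "walk_to(fridge)",
                  "open(fridge)", "put_in(apple, fridge)", "close(fridge)"]

def evaluate(task: str, reference: list[str]) -> float:
    """Ask the model for a plan and score exact step-by-step agreement."""
    prompt = f"{task}\nReturn one action per line, nothing else."
    predicted = [line.strip() for line in query_llm(prompt).splitlines() if line.strip()]
    matches = sum(p == r for p, r in zip(predicted, reference))
    return matches / len(reference)

# score = evaluate(TASK, REFERENCE_PLAN)  # fraction of correctly ordered steps
```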
❤️ Check out Lambda here and sign up for their GPU Cloud: https://lambdalabs.com/papers.
Oasis: A Universe in a Transformer — try it out now:
https://oasis.decart.ai/welcome.
Continue reading “Crazy AI Learned Minecraft — Try It Out For Free!” »
Sotheby’s says the Ai-Da Robot work “marks a moment in the history of modern and contemporary art.”