Archive for the ‘virtual reality’ category: Page 2

Nov 13, 2024

Virtual training uses generative AI to teach robots how to traverse real world terrain

Posted by in categories: physics, robotics/AI, virtual reality

MIT CSAIL researchers have developed a generative AI system, LucidSim, to train robots in virtual environments for real-world navigation. Using ChatGPT and physics simulators, robots learn to traverse complex terrains. This method outperforms traditional training approaches, suggesting a new direction for preparing robots for real-world deployment.


A team of roboticists and engineers at MIT CSAIL and the Institute for AI and Fundamental Interactions has developed a generative AI approach to teaching robots how to traverse terrain and move around objects in the real world.
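
The article's description of the pipeline lends itself to a rough illustration. The sketch below is not the released LucidSim code; it only shows the general shape of "ask an LLM for varied scene descriptions, then hand them to a simulator." The prompt wording is made up, and build_terrain_from_description is a hypothetical placeholder for whatever a physics simulator would actually consume.

```python
# Rough sketch of "LLM-generated scene descriptions feed a simulator".
# Not the LucidSim implementation; the prompt text and the placeholder
# builder below are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def describe_terrains(n: int) -> list[str]:
    """Ask an LLM for n short, varied terrain descriptions (hypothetical prompt)."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Give {n} one-line descriptions of outdoor terrain "
                       "a quadruped robot might have to cross.",
        }],
    )
    return [line for line in response.choices[0].message.content.splitlines() if line.strip()]


def build_terrain_from_description(description: str) -> None:
    """Hypothetical placeholder: a real pipeline would turn the text into
    textures and geometry for a physics simulator."""
    print("would build simulated terrain for:", description.strip())


if __name__ == "__main__":
    for text in describe_terrains(5):
        build_terrain_from_description(text)
```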

Continue reading “Virtual training uses generative AI to teach robots how to traverse real world terrain” »

Nov 11, 2024

Who’s afraid of Artificial Intelligence?

Posted by in categories: robotics/AI, transportation, virtual reality

Artificial Intelligence is everywhere in Europe.

While some are worried about its long-term impact, a team of researchers at the Vienna University of Technology (TU Wien) is working on responsible ways to use AI.

Continue reading “Who’s afraid of Artificial Intelligence?” »

Nov 10, 2024

International Conference on Holodecks: Five Key Takeaways

Posted by in categories: biotech/medical, virtual reality

Shaking hands with a character from the Fortnite video game. Visualizing a patient’s heart in 3D—and “feeling” it beat. Touching the walls of the Roman Colosseum—from your sofa in Los Angeles. What if we could touch and interact with things that aren’t physically in front of us? This reality might be closer than we think, thanks to an emerging technology: the holodeck.

The name might sound familiar. In Star Trek: The Next Generation, a holodeck was an advanced 3D virtual reality world that created the illusion of solid objects. Now, immersive technology researchers at USC and beyond are taking us one step closer to making this science fiction concept a science fact.

Continue reading “International Conference on Holodecks: Five Key Takeaways” »

Nov 8, 2024

Frontiers: They basically controlled butterflies in a virtual environment wirelessly with human organoids

Posted by in categories: biological, robotics/AI, virtual reality

Wetware computing and organoid intelligence form an emerging research field at the intersection of electrophysiology and artificial intelligence. The core concept involves using living neurons to perform computations, similar to how Artificial Neural Networks (ANNs) are used today. However, unlike ANNs, where updating digital tensors (weights) can instantly modify network responses, entirely new methods must be developed for neural networks built from biological neurons. Discovering these methods is challenging and requires a system capable of conducting numerous experiments, ideally accessible to researchers worldwide. For this reason, we developed a hardware and software system that allows electrophysiological experiments to be run on an unmatched scale. The Neuroplatform enables researchers to run experiments on neural organoids with lifetimes of more than 100 days. To do so, we streamlined the experimental process to quickly produce new organoids, monitor action potentials 24/7, and provide electrical stimulation. We also designed a microfluidic system that allows fully automated medium flow and exchange, reducing disruptions from physical interventions in the incubator and ensuring stable environmental conditions. Over the past three years, the Neuroplatform has been used with over 1,000 brain organoids, enabling the collection of more than 18 terabytes of data. A dedicated Application Programming Interface (API) has been developed to conduct remote research directly via our Python library or through interactive computing environments such as Jupyter Notebooks. In addition to electrophysiological operations, our API also controls pumps, digital cameras, and UV lights for molecule uncaging. This allows the execution of complex 24/7 experiments, including closed-loop strategies and processing with the latest deep learning or reinforcement learning libraries. Furthermore, the infrastructure supports entirely remote use. As of 2024, the system is freely available for research purposes, and numerous research groups have begun using it for their experiments. This article outlines the system’s architecture and provides specific examples of experiments and results.
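
The abstract describes a Python API for running closed-loop, 24/7 experiments remotely. As a purely schematic sketch of what such a loop could look like (the real Neuroplatform library is not shown here, and every class and method name below is a hypothetical stand-in), a minimal read-decide-stimulate cycle might be structured like this:

```python
"""Schematic closed-loop sketch. FakeOrganoidSession is a stand-in so the loop
runs on its own; it is NOT the Neuroplatform API, and all names and values
here are hypothetical."""
import random
import time


class FakeOrganoidSession:
    """Hypothetical stand-in for a remote organoid recording/stimulation session."""

    def read_spike_counts(self, window_ms: int) -> list[int]:
        # Pretend to read per-electrode spike counts for the most recent window.
        return [random.randint(0, 10) for _ in range(8)]

    def stimulate(self, electrodes: list[int], amplitude_ua: float, duration_ms: float) -> None:
        print(f"stimulating electrodes {electrodes} at {amplitude_ua} uA for {duration_ms} ms")


TARGET_RATE = 40  # arbitrary total spikes per window for this toy rule

session = FakeOrganoidSession()
for step in range(10):                                   # a few iterations for the demo
    spikes = session.read_spike_counts(window_ms=100)    # observe recent activity
    if sum(spikes) < TARGET_RATE:                        # toy closed-loop rule:
        session.stimulate([0, 1], amplitude_ua=2.0, duration_ms=1.0)  # nudge quiet cultures
    time.sleep(0.1)
```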

The recent rise of wetware computing and, consequently, of artificial biological neural networks (BNNs) comes at a time when Artificial Neural Networks (ANNs) are more sophisticated than ever.

The latest generation of Large Language Models (LLMs), such as Meta’s Llama 2 or OpenAI’s GPT-4, fundamentally rely on ANNs.

Nov 4, 2024

Coarse-Grained Simulations of Adeno-Associated Virus and Its Receptor Reveal Influences on Membrane Lipid Organization and Curvature

Posted by in categories: biotech/medical, virtual reality

Adeno-associated virus (AAV) is a well-known gene delivery tool with a wide range of applications, including as a vector for gene therapies. However, the molecular mechanism of its cell entry remains unknown. Here, we performed coarse-grained molecular dynamics simulations of the AAV serotype 2 (AAV2) capsid and the universal AAV receptor (AAVR) in a model plasma membrane environment. Our simulations show that binding of the AAV2 capsid to the membrane induces membrane curvature, along with the recruitment and clustering of GM3 lipids around the AAV2 capsid. We also found that the AAVR binds to the AAV2 capsid at the VR-I loops using its PKD2 and PKD3 domains, with binding poses that differ from those reported in previous structural studies. These first molecular-level insights into AAV2 membrane interactions suggest a complex process during the initial phase of AAV2 capsid internalization.
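
As a loose illustration of how lipid recruitment like the reported GM3 clustering could be quantified from a coarse-grained trajectory (this is not the authors' workflow), the sketch below counts GM3 residues within a fixed cutoff of the capsid at each frame using MDAnalysis. The file names, residue name, and 12 Å cutoff are assumptions.

```python
# Sketch: count GM3 lipids near the capsid over a coarse-grained trajectory.
# Input file names, the "GM3" residue name, and the 12 A cutoff are assumed
# values for illustration, not parameters from the paper.
import MDAnalysis as mda

u = mda.Universe("capsid_membrane.tpr", "traj.xtc")  # hypothetical input files

counts = []
for ts in u.trajectory:
    # GM3 beads within 12 A of any capsid (protein) bead in this frame
    near = u.select_atoms("resname GM3 and around 12 protein")
    counts.append(near.residues.n_residues)

print("mean GM3 residues near capsid:", sum(counts) / len(counts))
```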

Nov 3, 2024

Disney forms dedicated AI and XR group to coordinate company-wide use and adoption

Posted by in categories: augmented reality, business, robotics/AI, virtual reality

Disney is adding another layer to its AI and extended reality strategies. As first reported by Reuters, the company recently formed a dedicated emerging technologies unit. Dubbed the Office of Technology Enablement, the group will coordinate the company’s exploration, adoption and use of artificial intelligence, AR and VR tech.

It has tapped Jamie Voris, previously the CTO of its Studios Technology division, to oversee the effort. Before joining Disney in 2010, Voris was the chief technology officer at the National Football League. More recently, he led the development of the company’s Apple Vision Pro app. Voris will report to Alan Bergman, the co-chairman of Disney Entertainment. Reuters reports the company eventually plans to grow the group to about 100 employees.

“The pace and scope of advances in AI and XR are profound and will continue to impact consumer experiences, creative endeavors, and our business for years to come — making it critical that Disney explore the exciting opportunities and navigate the potential risks,” Bergman wrote in an email Disney shared with Engadget. “The creation of this new group underscores our dedication to doing that and to being a positive force in shaping responsible use and best practices.”

Oct 2, 2024

Researchers harness liquid crystal structures to design simple, yet versatile bifocal lenses

Posted by in categories: biological, computing, virtual reality

Researchers have developed a new type of bifocal lens that offers a simple way to achieve two foci (or spots) with intensities that can be adjusted by applying external voltage. The lenses, which use two layers of liquid crystal structures, could be useful for various applications such as optical interconnections, biological imaging, augmented/virtual reality devices and optical computing.

Sep 9, 2024

Improved virtual haptic technology enables uniform tactile sensation across displays

Posted by in categories: biotech/medical, virtual reality

A virtual haptic implementation technology that allows all users to experience the same tactile sensation has been developed. A research team led by Professor Park Jang-Ung from the Center for Nanomedicine within the Institute for Basic Science (IBS) and Professor Jung Hyun Ho from Severance Hospital’s Department of Neurosurgery has developed a technology that provides consistent tactile sensations on displays.

This research was conducted in collaboration with colleagues from Yonsei University Severance Hospital. It was published in Nature Communications on August 21, 2024.

Virtual haptic implementation technology, also known as tactile rendering technology, refers to the methods and systems that simulate the sense of touch in a virtual environment. This technology aims to create the sensation of physical contact with virtual objects, enabling users to feel textures, shapes, and forces as if they were interacting with real-world items, even though the objects are digital.

Aug 31, 2024

Virtual and augmented reality can temporarily change the way people perceive distances, finds study

Posted by in categories: augmented reality, virtual reality

Researchers at the University of Toronto have found that using virtual and augmented reality (VR and AR) can temporarily change the way people perceive and interact with the real world—with potential implications for the growing number of industries that use these technologies for training purposes.

The study, published recently in the journal Scientific Reports, not only found that people moved differently in VR and AR, but that these changes led to temporary errors in movement in the real world. In particular, participants who used VR tended to undershoot their targets by not reaching far enough, while those who used AR tended to overshoot their targets by reaching too far.

This effect was noticeable immediately after using VR or AR, but gradually disappeared as participants readjusted to the real world.

Aug 28, 2024

Scientists develop DMG equalization strategy via femtosecond laser micromachining-induced refractive index tailoring

Posted by in categories: economics, internet, virtual reality

Optical fiber, as a physical medium for information transmission, is the “highway” of modern economic and social development. However, with the continuous emergence of high-speed and high-capacity communication scenarios such as virtual reality, 5G, intelligent driving, and the Internet of Things (IoT), there is an upper limit to the communication capacity (traffic flow) of the traditional single-mode fiber-optic communication system (highway).
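
For a rough sense of why any single channel has a hard ceiling, the snippet below evaluates the classic Shannon limit, C = B·log2(1 + SNR). The 50 GHz bandwidth and 20 dB SNR are arbitrary illustrative values, and real single-mode fiber systems are further constrained by fiber nonlinearities.

```python
# Back-of-envelope Shannon limit for one wavelength channel.
# Bandwidth and SNR below are illustrative assumptions only.
import math

bandwidth_hz = 50e9   # assume a 50 GHz channel
snr_db = 20.0         # assume a 20 dB signal-to-noise ratio
snr_linear = 10 ** (snr_db / 10)

capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
print(f"Shannon limit: {capacity_bps / 1e9:.0f} Gbit/s per channel")
```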
