Archive for the ‘information science’ category: Page 5

Dec 7, 2024

Large language models can be squeezed onto your phone — rather than needing 1000s of servers to run — after breakthrough

Posted by in categories: information science, mobile phones, robotics/AI

Running massive AI models locally on smartphones or laptops may be possible after a new compression algorithm trims down their size — meaning your data never leaves your device. The catch is that it might drain your battery in an hour.
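
The coverage does not name the compression technique, but weight quantization is one common way to shrink a model enough to fit on a phone or laptop. A minimal sketch of symmetric 8-bit quantization of a single weight matrix, purely for illustration and not the published algorithm:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: store weights as int8 plus one float scale."""
    scale = np.abs(weights).max() / 127.0          # map the largest magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights at inference time."""
    return q.astype(np.float32) * scale

# A float32 layer of 4096 x 4096 weights takes ~64 MB; the int8 copy takes ~16 MB.
w = np.random.randn(4096, 4096).astype(np.float32)
q, s = quantize_int8(w)
print(w.nbytes / 1e6, "MB ->", q.nbytes / 1e6, "MB")
```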

Dec 7, 2024

From Dictation to Automation: The Rise of AI Scribes in Healthcare

Posted by in categories: biotech/medical, health, information science, robotics/AI

Despite technological advances like electronic health records (EHRs) and dictation tools, the administrative load on healthcare providers has only grown, often overshadowing the time and energy dedicated to direct patient care. This escalation in clerical tasks is a major contributor to physician burnout and dissatisfaction, affecting not only the well-being of providers but also the quality of care they deliver.

During consultations, the focus on documentation can detract from meaningful patient interactions, resulting in fragmented, rushed, and sometimes impersonal communication. The need for a solution that both streamlines documentation and restores the patient-centred nature of healthcare has never been more pressing. This is where AI-powered medical scribes come into play, offering a promising path from traditional dictation to fully automated, integrated documentation support.

AI medical scribe software utilises advanced artificial intelligence and machine learning to transcribe, in real time, entire patient-physician consultations without the need for traditional audio recordings. Leveraging sophisticated speech recognition and natural-language processing (NLP) algorithms, AI scribes are capable of interpreting and processing complex medical conversations with impressive accuracy. These systems can intelligently filter out non-essential dialogue, such as greetings and small talk, to create a streamlined and detailed clinical note.
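
Commercial AI scribes are proprietary, but the pipeline described above (speech recognition, then NLP filtering and note assembly) can be sketched. The snippet below assumes the speech-to-text step has already produced speaker-labelled utterances; `is_small_talk` and `scribe` are hypothetical placeholders, not any vendor's API:

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str   # "clinician" or "patient"
    text: str

SMALL_TALK = ("hello", "hi", "how are you", "thanks", "goodbye", "nice weather")

def is_small_talk(utterance: Utterance) -> bool:
    """Crude keyword filter standing in for an NLP classifier of non-clinical dialogue."""
    return any(phrase in utterance.text.lower() for phrase in SMALL_TALK)

def scribe(utterances: list[Utterance]) -> str:
    """Drop greetings and small talk, then assemble the remaining dialogue into a draft note."""
    clinical = [u for u in utterances if not is_small_talk(u)]
    lines = [f"{u.speaker}: {u.text}" for u in clinical]
    return "DRAFT CLINICAL NOTE\n" + "\n".join(lines)

consult = [
    Utterance("clinician", "Hello, how are you today?"),
    Utterance("patient", "I've had a dull headache for three days."),
    Utterance("clinician", "Any nausea or sensitivity to light?"),
]
print(scribe(consult))
```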

Dec 6, 2024

Algorithm analyzes multiple mammograms to improve breast cancer risk prediction

Posted by in categories: biotech/medical, information science

A new study from Washington University School of Medicine in St. Louis describes an innovative method of analyzing mammograms that significantly improves the accuracy of predicting the risk of breast cancer development over the following five years.

Using up to three years of previous mammograms, the new method identified individuals at high risk of developing breast cancer 2.3 times more accurately than the standard method, which is based on questionnaires assessing clinical risk factors alone, such as age, race and family history of breast cancer.

The study is published Dec. 5 in JCO Clinical Cancer Informatics.
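
The study's model is image-based and is not reproduced here, but the core idea of pooling up to three years of prior mammograms into a single five-year risk score can be sketched. The feature extractor and logistic model below are invented placeholders, not the published architecture:

```python
import numpy as np

def image_features(mammogram: np.ndarray) -> np.ndarray:
    """Placeholder feature extractor; the actual method would use a trained deep network."""
    return np.array([mammogram.mean(), mammogram.std()])

def five_year_risk(prior_exams: list[np.ndarray], weights: np.ndarray, bias: float) -> float:
    """Pool features across up to three prior exams, then apply a logistic risk model."""
    feats = np.mean([image_features(x) for x in prior_exams], axis=0)
    return 1.0 / (1.0 + np.exp(-(feats @ weights + bias)))

exams = [np.random.rand(256, 256) for _ in range(3)]   # three years of prior mammograms
print(five_year_risk(exams, weights=np.array([0.8, 1.2]), bias=-1.5))
```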

Dec 5, 2024

Building a “Google Maps” for Biology: Human Cell Atlas Revolutionizes Medicine

Posted by in categories: biotech/medical, genetics, health, information science, robotics/AI

New research from the Human Cell Atlas offers insights into cell development, disease mechanisms, and genetic influences, enhancing our understanding of human biology and health.

The Human Cell Atlas (HCA) consortium has made significant progress in its mission to better understand the cells of the human body in health and disease, with the recent publication of a Collection of more than 40 peer-reviewed papers in Nature and other Nature Portfolio journals.

The Collection showcases a range of large-scale datasets, artificial intelligence algorithms, and biomedical discoveries from the HCA that are enhancing our understanding of the human body. The studies reveal insights into how the placenta and skeleton form, changes during brain maturation, new gut and vascular cell states, lung responses to COVID-19, and the effects of genetic variation on disease, among others.

Dec 4, 2024

For news, algorithmic social networks are a failed experiment

Posted by in categories: information science, robotics/AI

Meta might yet teach its AI to more consistently show the right posts at the right time. Still, there’s a bigger lesson it could learn from Bluesky, though it might be an uncomfortable one for a tech giant to confront. It’s that introducing algorithms into a social feed may cause more problems than it solves—at least if timeliness matters, as it does with any service that aspires to scoop up disaffected Twitter users.

For a modern social network, Bluesky stays out of your way to a shocking degree. (So does Mastodon; I’m a fan, but it seems to be more of an acquired taste.) Bluesky’s primary view is “Following”—the most recent posts from the people you choose to follow, just as in the golden age of Twitter. (Present-day Twitter and Threads have equivalent views, but not as their defaults.) Starter Packs, which might be Bluesky’s defining feature, let anyone curate a shareable list of users. You can follow everyone in one with a single click, or pick and choose, but either way, you decide.
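
The distinction the column draws, a reverse-chronological "Following" feed versus an engagement-ranked one, is easy to state in code. A minimal sketch; the ranking weight is an arbitrary assumption, not any platform's actual formula:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float   # seconds since epoch
    likes: int

def following_feed(posts: list[Post]) -> list[Post]:
    """Chronological default: newest posts from followed accounts, nothing reordered."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def algorithmic_feed(posts: list[Post], recency_weight: float = 0.001) -> list[Post]:
    """Engagement-ranked feed: popular posts can outrank newer ones (weights are made up)."""
    return sorted(posts, key=lambda p: p.likes + recency_weight * p.timestamp, reverse=True)
```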

Dec 4, 2024

Novel framework can generate images more aligned with user expectations

Posted by in categories: information science, robotics/AI

Generative models, artificial neural networks that can generate images or texts, have become increasingly advanced in recent years. These models can also be useful for creating annotated images to train computer vision algorithms, which are designed to classify images or the objects they contain.

While many generative models, particularly generative adversarial networks (GANs), can produce synthetic images that resemble those captured by cameras, reliably controlling the content of the images they produce has proved challenging. In many cases, the images generated by GANs do not meet the exact requirements of users, which limits their use for various applications.

Researchers at Seoul National University of Science and Technology recently introduced a new image generation framework designed to incorporate the content users would like generated images to contain. This framework, introduced in a paper published on the arXiv preprint server, allows users to exert greater control over the image generation process, producing images that are more closely aligned with those they envisioned.
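
The paper's framework is not reproduced here, but the general mechanism it builds on, conditioning a generator on a user-supplied label so the requested content steers the output, can be sketched. A minimal conditional-generator skeleton in PyTorch; the layer sizes and conditioning scheme are illustrative assumptions:

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Generator that concatenates a noise vector with a class embedding,
    so the user's requested class steers what the image contains."""
    def __init__(self, noise_dim=64, num_classes=10, img_pixels=28 * 28):
        super().__init__()
        self.embed = nn.Embedding(num_classes, 16)
        self.net = nn.Sequential(
            nn.Linear(noise_dim + 16, 256),
            nn.ReLU(),
            nn.Linear(256, img_pixels),
            nn.Tanh(),                      # pixel values in [-1, 1]
        )

    def forward(self, noise, labels):
        cond = self.embed(labels)                       # (batch, 16)
        return self.net(torch.cat([noise, cond], dim=1))

gen = ConditionalGenerator()
z = torch.randn(4, 64)
labels = torch.tensor([3, 3, 7, 7])                     # user asks for classes 3 and 7
images = gen(z, labels).view(4, 28, 28)
print(images.shape)                                     # torch.Size([4, 28, 28])
```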

Dec 3, 2024

The Role Of Quantum Computing In Personalized Medicine

Posted by in categories: biotech/medical, computing, genetics, information science, quantum physics

The integration of quantum computing into personalized medicine holds great promise for revolutionizing disease diagnosis, treatment development, and patient outcomes. Quantum computers have the potential to process vast amounts of genetic data much faster than classical computers, enabling researchers to identify patterns and correlations that may not be apparent with current technology. This could lead to breakthroughs in understanding the genetic basis of complex diseases and developing targeted treatments.

Quantum computing also has the potential to revolutionize medical imaging by enabling the simulation of complex magnetic resonance imaging (MRI) and positron emission tomography (PET) scans. Quantum algorithms can efficiently process large-scale imaging data, enabling researchers to reconstruct high-resolution images that reveal subtle details about tissue structure and function. This has significant implications for disease diagnosis and treatment, where accurate imaging is critical for developing effective treatments.

The use of quantum computing in personalized medicine raises important ethical considerations, such as concerns about privacy and informed consent. The ability to rapidly analyze large amounts of genetic data also raises questions about how this information should be used and shared with patients. Regulatory frameworks will play a crucial role in shaping the development and deployment of quantum computing in personalized medicine, balancing the need to promote innovation with the need to protect patient safety and privacy.

Dec 3, 2024

Liquid AI’s new STAR model architecture outshines Transformer efficiency

Posted by in categories: information science, robotics/AI

As described in the 2017 paper that introduced it, a transformer is a deep learning neural network architecture that processes sequential data, such as text or time-series information.

Now, MIT-birthed startup Liquid AI has introduced STAR (Synthesis of Tailored Architectures), an innovative framework designed to automate the generation and optimization of AI model architectures.

The STAR framework leverages evolutionary algorithms and a numerical encoding system to address the complex challenge of balancing quality and efficiency in deep learning models.
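
Liquid AI has not released STAR's code, but the description, evolutionary algorithms operating on a numerical encoding of architectures, maps onto a standard genetic-algorithm loop. A toy sketch under that reading; the "genome" fields and fitness function are invented for illustration:

```python
import random

# A hypothetical architecture genome: (num_layers, hidden_width, attention_heads)
def random_genome():
    return [random.randint(2, 24), random.choice([256, 512, 1024, 2048]), random.choice([4, 8, 16])]

def fitness(genome):
    """Stand-in objective balancing 'quality' (more capacity) against 'efficiency' (fewer parameters)."""
    layers, width, heads = genome
    quality = layers * width * heads            # pretend bigger is better...
    cost = layers * width ** 2                  # ...but parameters are expensive
    return quality - 1e-3 * cost

def mutate(genome):
    g = genome[:]
    i = random.randrange(len(g))
    g[i] = random_genome()[i]                   # resample one field
    return g

population = [random_genome() for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]                    # keep the best-scoring architectures
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best encoding:", max(population, key=fitness))
```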

Dec 2, 2024

Novel quantum computing algorithm enhances single-cell analysis

Posted by in categories: biological, computing, information science, quantum physics

A new quantum algorithm developed by University of Georgia statisticians addresses one of the most complex challenges in single-cell analysis, signaling significant impact in both the fields of computational biology and quantum computing.

The study, “Bisection Grover’s Search Algorithm and Its Application in Analyzing CITE-seq Data,” was published in the Journal of the American Statistical Association on Sept. 20.

While traditional approaches struggle to handle the immense amount of data generated from measuring both RNA and surface proteins in individual cells, the new algorithm enables analysis of data from a single-cell technology known as CITE-seq. It allows for selection of the most important markers from billions of possible combinations—a task that would be formidable using classical methods.
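
The paper's bisection variant is not reproduced here, but the standard Grover iteration it builds on can be simulated classically on a small state vector. A minimal numpy sketch of amplitude amplification finding one marked item among N, purely illustrative rather than the authors' algorithm or a real quantum run:

```python
import numpy as np

def grover_search(n_qubits: int, marked: int) -> int:
    """Simulate Grover's algorithm on a state vector and return the most probable index."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))                 # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N))) # near-optimal number of Grover iterations
    for _ in range(iterations):
        state[marked] *= -1                            # oracle: flip the marked amplitude
        state = 2 * state.mean() - state               # diffusion: inversion about the mean
    return int(np.argmax(np.abs(state) ** 2))

print(grover_search(n_qubits=8, marked=173))           # recovers 173 with high probability
```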

Nov 30, 2024

Physicists Just Found a Quirk in Einstein’s Predictions of Space-Time

Posted by in categories: information science, physics, space

The fabric of space and time is not exempt from the effects of gravity. Plop in a mass and space-time curves around it, not dissimilar to what happens when you put a bowling ball on a trampoline.

This dimple in space-time is the result of what we call a gravity well, and it was first described over 100 years ago by Albert Einstein’s field equations in his theory of general relativity. To this day, those equations have held up. We’d love to know what Einstein was putting in his soup. Whatever it was, general relativity has remained pretty solid.
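
For reference, the field equations in question relate the curvature of space-time on the left-hand side to the energy and momentum of whatever occupies it on the right:

```latex
% Einstein field equations: space-time curvature (left) sourced by energy-momentum (right)
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}
```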

Continue reading “Physicists Just Found a Quirk in Einstein’s Predictions of Space-Time” »
