
The concept of ‘anti-realism’ is widely seen as a fact of life for many physicists studying the mysterious effects of quantum mechanics. However, it also seems to contradict the assumptions of many other fields of research. In his research, Dr William Sulis at McMaster University in Canada explores the issue from a new perspective, by using a novel mathematical toolset named the ‘process algebra model’. In suggesting that reality itself is generated by interacting processes more fundamental than quantum particles, his theories could improve researchers’ understanding of fundamental processes in a wide variety of fields.

The concept of ‘locality’ states that objects and processes can only be influenced by other objects and processes in their immediate surroundings. It is a fundamental aspect of many fields of research and underpins all of the most complex systems we observe in nature, including living organisms. “Biologists and psychologists have known for centuries that the physical world is dominated by processes which are characterized by factors including transformation, interdependence, and information”, Dr Sulis explains. “Organisms are born, develop, continually exchange physical components and information with their environment, and eventually die.”

Beyond biology, the principle of locality also extends to Einstein’s theory of special relativity. Since the speed of light sets a fundamental speed limit on all processes in the universe, the theory states that one event can only influence another if it lies in that event’s past, at a close enough distance for light to travel between the two within the time separating them. More generally, these ideas are captured by a concept which physicists call ‘realism’. Yet despite this seemingly intuitive rule, physicists have increasingly come to accept that it does not give a full description of how all processes unfold.
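To make that light-travel-time condition concrete, here is a minimal sketch (not from the article; the Earth–Moon distance in the example is just an illustration) that checks whether one event could have influenced another without any signal exceeding the speed of light.

```python
# Minimal sketch: can event A causally influence event B under special relativity?
# An influence is possible only if light could cover the spatial separation
# within the time separating the two events (the "light-cone" condition).

C = 299_792_458.0  # speed of light in m/s

def causally_connectable(dt_seconds: float, dx_meters: float) -> bool:
    """Return True if a signal travelling no faster than light
    could get from A to B: |dx| <= c * dt, with B in A's future."""
    return dt_seconds > 0 and abs(dx_meters) <= C * dt_seconds

# Illustrative example: two events separated by the Earth-Moon distance
# (~3.84e8 m). Light needs about 1.28 s to cross that gap.
print(causally_connectable(1.0, 3.84e8))  # False: not enough time
print(causally_connectable(2.0, 3.84e8))  # True: light could make it
```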

Researchers at Tohoku University, the University of Messina, and the University of California, Santa Barbara (UCSB) have developed a scaled-up version of a probabilistic computer (p-computer) with stochastic spintronic devices that is suitable for hard computational problems like combinatorial optimization and machine learning.

Moore’s law predicts that computers get faster every two years because of the evolution of semiconductor chips. While this has historically been the case, the pace of improvement is starting to lag. Meanwhile, the revolutions in machine learning and artificial intelligence mean that much higher computational ability is required. Quantum computing is one way of meeting these challenges, but significant hurdles to the practical realization of scalable quantum computers remain.

A p-computer harnesses naturally stochastic building blocks called probabilistic bits (p-bits). Unlike bits in traditional computers, p-bits fluctuate between 0 and 1. A p-computer can operate at room temperature and acts as a domain-specific computer for a wide variety of applications in machine learning and artificial intelligence. Just as quantum computers try to solve inherently quantum problems, p-computers attempt to tackle probabilistic algorithms, which are widely used for complicated computational problems in combinatorial optimization and sampling.
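As a rough software illustration of the idea, and not the spintronic hardware described above, a p-bit is commonly modeled as a binary unit whose probability of reading 1 is a sigmoid function of its input; the toy sketch below samples such a unit.

```python
# Toy software model of a probabilistic bit (p-bit): a binary unit that
# fluctuates between 0 and 1, with the probability of "1" set by a sigmoid
# of its input. Illustrative only; not the stochastic spintronic devices
# built by the Tohoku/Messina/UCSB team.
import math
import random

def p_bit(input_signal: float) -> int:
    """Sample one p-bit reading: P(state = 1) = sigmoid(input_signal)."""
    p_one = 1.0 / (1.0 + math.exp(-input_signal))
    return 1 if random.random() < p_one else 0

# With zero input the p-bit is unbiased and spends about half its time in
# each state; a positive input biases it toward 1.
unbiased = [p_bit(0.0) for _ in range(10_000)]
biased = [p_bit(2.0) for _ in range(10_000)]
print(sum(unbiased) / len(unbiased))  # ~0.5
print(sum(biased) / len(biased))      # ~0.88, since sigmoid(2) is about 0.881
```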

In 1916, Einstein finished his Theory of General Relativity, which describes how gravitational forces alter the curvature of spacetime. Among other things, this theory predicted that the Universe is expanding, which was confirmed by the observations of Edwin Hubble in 1929. Since then, astronomers have looked farther into space (and hence, back in time) to measure how fast the Universe is expanding – a rate known as the Hubble Constant. These measurements have become increasingly accurate thanks to the discovery of the Cosmic Microwave Background (CMB) and observatories like the Hubble Space Telescope.

Astronomers have traditionally done this in two ways: directly measuring it locally (using variable stars and supernovae) and indirectly based on redshift measurements of the CMB and cosmological models. Unfortunately, these two methods have produced different values over the past decade. As a result, astronomers have been looking for a possible solution to this problem, known as the “Hubble Tension.” According to a new paper by a team of astrophysicists, the existence of “Early Dark Energy” may be the solution cosmologists have been looking for.
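To see the size of the disagreement, the sketch below plugs the commonly quoted values, roughly 67.4 km/s/Mpc from CMB-based inferences and roughly 73 km/s/Mpc from local distance-ladder measurements (figures quoted widely in the literature, not taken from the paper discussed here), into Hubble’s law, v = H0 × d, for a hypothetical galaxy.

```python
# Rough illustration of the Hubble tension using commonly quoted values.
# Hubble's law: recession velocity v = H0 * d.

H0_CMB = 67.4    # km/s per megaparsec (early-universe, indirect inference)
H0_LOCAL = 73.0  # km/s per megaparsec (local, direct distance ladder)

distance_mpc = 100.0  # a hypothetical galaxy 100 Mpc away

v_cmb = H0_CMB * distance_mpc      # ~6740 km/s
v_local = H0_LOCAL * distance_mpc  # ~7300 km/s

print(f"Recession velocity (CMB value):   {v_cmb:.0f} km/s")
print(f"Recession velocity (local value): {v_local:.0f} km/s")
print(f"Relative discrepancy: {(H0_LOCAL - H0_CMB) / H0_CMB:.1%}")  # ~8.3%
```

The few-percent gap is far larger than the stated uncertainties of either method, which is why it is treated as a genuine tension rather than a measurement error.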

The study was conducted by Marc Kamionkowski, the William R. Kenan Jr. Professor of Physics and Astronomy at Johns Hopkins University (JHU), and Adam G. Riess – an astrophysicist and Bloomberg Distinguished Professor at JHU and the Space Telescope Science Institute (STScI). Their paper, titled “The Hubble Tension and Early Dark Energy,” is being reviewed for publication in the Annual Review of Nuclear and Particle Science (ARNP). As they explain in their paper, there are two methods for measuring cosmic expansion.

Establishing a moon base will be critical for the U.S. in the new space race and building safe and cost-effective landing pads for spacecraft to touch down there will be key.

These pads will have to stop dust and particles from sandblasting everything around them at more than 10,000 miles per hour as a rocket takes off or lands, since there is no air to slow the rocket plume down.

However, how to build these landing pads is not so clear, as hauling materials and heavy equipment more than 230,000 miles into space quickly becomes cost prohibitive.

Dark matter makes up about 27% of the matter and energy budget in the universe, but scientists do not know much about it. They do know that it is cold, meaning that the particles that make up dark matter are slow-moving. It is also difficult to detect dark matter directly because it does not interact with light. However, scientists at the U.S. Department of Energy’s Fermi National Accelerator Laboratory (Fermilab) have discovered a way to use quantum computers to look for dark matter.

Aaron Chou, a senior scientist at Fermilab, works on detecting dark matter through quantum science. As part of DOE’s Office of High Energy Physics QuantISED program, he has developed a way to use qubits, the main building blocks of quantum computers, in the search for dark matter. Quantum computers perform computation using quantum-mechanical phenomena such as superposition and entanglement.

Stephen Wolfram is at his jovial peak in this technical interview regarding the Wolfram Physics project (theory of everything).
Sponsors: https://brilliant.org/TOE for 20% off. http://algo.com for supply chain AI.

Link to the Wolfram project: https://www.wolframphysics.org/

Patreon: https://patreon.com/curtjaimungal.
Crypto: https://tinyurl.com/cryptoTOE
PayPal: https://tinyurl.com/paypalTOE
Twitter: https://twitter.com/TOEwithCurt.
Discord Invite: https://discord.com/invite/kBcnfNVwqs.
iTunes: https://podcasts.apple.com/ca/podcast/better-left-unsaid-wit…1521758802
Pandora: https://pdora.co/33b9lfP
Spotify: https://open.spotify.com/show/4gL14b92xAErofYQA7bU4e.
Subreddit r/TheoriesOfEverything: https://reddit.com/r/theoriesofeverything.
Merch: https://tinyurl.com/TOEmerch.

TIMESTAMPS:
00:00:00 Introduction.
00:02:26 Behind the scenes.
00:04:00 Wolfram critiques are from people who haven’t read the papers (generally)
00:10:39 The Wolfram Model (Theory of Everything) overview in under 20 minutes.
00:29:35 Causal graph vs. multiway graph.
00:39:42 Global confluence and causal invariance.
00:44:06 Rulial space.
00:49:05 How to build your own Theory of Everything.
00:54:00 Computational reducibility and irreducibility.
00:59:14 Speaking to aliens / communication with other life forms.
01:06:06 Extra-terrestrials could be all around us, and we’d never see it.
01:10:03 Is the universe conscious? What is “intelligence”?
01:13:03 Do photons experience time? (in the Wolfram model)
01:15:07 “Speed of light” in rulial space.
01:16:37 Principle of computational equivalence.
01:21:13 Irreducibility vs undecidability and computational equivalence.
01:23:47 Is infinity “real”?
01:28:08 Discrete vs continuous space.
01:33:40 Testing discrete space with the cosmic microwave background (CMB)
01:34:35 Multiple dimensions of time.
01:36:12 Defining “beauty” in mathematics, as geodesics in proof space.
01:37:29 Particles are “black holes” in branchial space.
01:39:44 New Feynman stories about his abjuring of woo woo.
01:43:52 Holographic principle / AdS CFT correspondence, and particles as black holes.
01:46:38 Wolfram’s view on cryptocurrencies, and how his company trades in crypto [Amjad Hussain]
01:57:38 Einstein field equations in economics.
02:03:04 How to revolutionize a field of study as a beginner.
02:04:50 Bonus section of Curt’s thoughts and questions.

Just wrapped (April 2021) a documentary called Better Left Unsaid http://betterleftunsaidfilm.com on the topic of “when does the left go too far?” Visit that site if you’d like to watch it.

All proton-proton data collected by the CMS experiment during LHC Run-1 (2010−2012) are now available through the CERN Open Data Portal. Today’s release of 491 TB of collision data collected during 2012 culminates the process that started in 2014 with the very first release of research-grade open data in experimental particle physics. Completing the delivery of Run-1 data within 10 years after data taking reaffirms the CMS collaboration’s commitment to its open data policy.

The newly released data consist of 42 collision datasets from CMS data taken in early and late 2012 and correspond to an additional 8.2 fb⁻¹ of integrated luminosity for anyone to study. Related data assets, such as luminosity information and validated data filters, have been updated to cover the newly released data.

To foster reusability, physics analysis code examples to extract physics objects from these data are now included as CERN Open Data Portal records. This software has been successfully used to demonstrate the intricacies of the experimental particle data in the CMS Open Data workshop during the last three years. In addition, the CMS Open Data guide covers details of accessing physics objects using this software, giving open data users the possibility to expand on this example code for studies of their own interest.
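The released analysis examples live on the portal itself; purely as an illustration of what reading such data can look like, the sketch below uses the open-source uproot library on a hypothetical NanoAOD-style file. The file name and branch names are placeholders, not the actual CMS Open Data records.

```python
# Rough sketch of reading collision data with the open-source uproot library.
# The file name and branch names are placeholders for a NanoAOD-style ROOT
# file; the real CMS Open Data examples and formats are documented on the
# CERN Open Data Portal and in the CMS Open Data guide.
import uproot

with uproot.open("cms_opendata_2012_sample.root") as f:  # hypothetical file
    events = f["Events"]
    arrays = events.arrays(["Muon_pt", "Muon_eta"], library="np")
    print("Events read:", len(arrays["Muon_pt"]))
    # Each entry holds the muon transverse momenta (GeV) for one event.
    print("Muon pT arrays for the first few events:", arrays["Muon_pt"][:5])
```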

By studying the spectra of light coming from colliding neutron stars, Japanese scientists were able to show that rare earth elements are produced in such collisions.

For the first time, Japanese scientists have found evidence that rare earth elements are indeed made when two neutron stars merge. The Astrophysical Journal just published the specifics of the scientists’ discoveries.

The first verified instance of this process, GW170817, was observed in 2017.



Many of the heavy atoms that make up our universe are created in the explosion that occurs when two neutron stars spiral inward and merge.

Imagine you are at a museum. After a long day admiring the exhibitions, you head for the way out, but to get outside you first have to pass through the gift shop. The layout of the gift shop can be set up in several ways. Maybe you can take a short and direct path to the exit, or maybe there are long winding corridors stuffed with merchandise you need to pass through. If you take the longer path, you are likely to lose more of your money before you get outside. The scientists of the CMS collaboration have recently observed a similar phenomenon in high-energy heavy-ion collisions, such as the one illustrated in the event display.

The life of the tiniest particles making up ordinary matter — quarks and gluons — is governed by the laws of quantum chromodynamics. These laws require quarks and gluons to form bound states, like protons and neutrons, under normal conditions. However, conditions like in the early universe, when the energy density and temperature far exceeded those of ordinary matter, can be achieved in giant particle accelerators. In the Large Hadron Collider at CERN this is done by colliding lead nuclei that are accelerated close to the speed of light. In these conditions, a new state of matter, called the quark-gluon plasma, is formed for a tiny fraction of a second. This new state of matter is special, since within the volume of the matter, quarks and gluons act as free particles, without the need to form bound states.

Figure 1: A schematic presentation of a non-central (left) and central (right) heavy ion collision. The outlines of the ions are presented by dashed lines, while the overlap region in which the quark-gluon plasma is produced is colored in orange. The red star shows a position where two quarks might scatter, and green and blue arrows are alternative paths the scattered quark can take to escape the quark-gluon plasma.
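As a toy version of the gift-shop picture, and not the actual CMS analysis, the sketch below places a scattered quark off-centre in a circular plasma region and compares how much energy it would lose along a short versus a long escape path, assuming a made-up constant loss rate per unit length.

```python
# Toy illustration of the "gift shop" analogy: a quark scattered inside the
# quark-gluon plasma loses more energy the longer its escape path. The plasma
# size and loss rate are invented for illustration, not measured values.
import math

PLASMA_RADIUS = 5.0  # fm, toy size of the overlap region
LOSS_PER_FM = 1.0    # GeV lost per fm travelled (illustrative only)

def escape_path_length(x: float, y: float, direction_rad: float) -> float:
    """Distance from production point (x, y) to the edge of a circular
    plasma of radius PLASMA_RADIUS, travelling along direction_rad."""
    dx, dy = math.cos(direction_rad), math.sin(direction_rad)
    # Solve |(x + t*dx, y + t*dy)| = R for the positive root t.
    b = x * dx + y * dy
    c = x * x + y * y - PLASMA_RADIUS ** 2
    return -b + math.sqrt(b * b - c)

x, y = 3.0, 0.0  # quark produced off-centre, 3 fm from the middle
short_path = escape_path_length(x, y, 0.0)      # heading straight out: 2 fm
long_path = escape_path_length(x, y, math.pi)   # heading across the plasma: 8 fm
print(f"Short path: {short_path:.1f} fm, energy lost ~ {short_path * LOSS_PER_FM:.1f} GeV")
print(f"Long path:  {long_path:.1f} fm, energy lost ~ {long_path * LOSS_PER_FM:.1f} GeV")
```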

Conventional light sources for fiber-optic telecommunications emit many photons at the same time. Photons are particles of light that move as waves. In today’s telecommunication networks, information is transmitted by modulating the properties of light waves traveling in optical fibers, similar to how radio waves are modulated in AM and FM channels.

In quantum communication, however, information is encoded in the phase of a single photon – the photon’s position in the wave in which it travels. This makes it possible to connect quantum sensors in a network spanning great distances and to connect quantum computers together.
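As a schematic illustration of why the phase matters, and not a model of the device described below, an idealized phase-encoding receiver sends a photon to one of two detectors with a probability that depends on the relative phase between the two paths as cos²(φ/2); the toy sketch samples that choice.

```python
# Toy illustration of phase encoding: in an idealized interferometric
# receiver, the probability that a single photon lands in detector 0 depends
# on the relative phase phi between the two paths as cos^2(phi/2). A sender
# can therefore encode a bit in the phase (phi = 0 for "0", phi = pi for "1").
# Schematic sketch only; not the MoTe2 single-photon device in the article.
import math
import random

def detect(phase_rad: float) -> int:
    """Simulate which detector clicks for one photon with the given phase."""
    p_detector0 = math.cos(phase_rad / 2.0) ** 2
    return 0 if random.random() < p_detector0 else 1

print(detect(0.0))      # detector 0 in the ideal case -> read as bit "0"
print(detect(math.pi))  # detector 1 in the ideal case -> read as bit "1"
```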

Researchers recently produced single-photon sources with operating wavelengths compatible with existing fiber communication networks. They did so by placing molybdenum ditelluride semiconductor layers just atoms thick on top of an array of nano-size pillars (Nature Communications, “Site-Controlled Telecom-Wavelength Single-Photon Emitters in Atomically-thin MoTe₂”).