
What is the Church-Turing Thesis?

Modern-day computers have proved to be remarkably powerful. The rise of AI has made possible things we previously only imagined, and the rate at which computational power keeps growing makes it seem as though computers will eventually be able to do almost anything. But as we have seen before, there are fundamental limits to what computers can do, regardless of the processors or algorithms they use. This naturally leads us to ask what computers are capable of at their best and where their limits lie, which in turn requires formalizing various definitions in computing.

This is exactly what happened in the early 20th century, when logicians and mathematicians were trying to formalize the foundations of mathematics through logic. One famous challenge arising from this effort was the Entscheidungsproblem, posed by David Hilbert and Wilhelm Ackermann: does there exist an algorithm that can decide, from a given set of axioms, whether any mathematical statement is true or false? Such an algorithm could be used to check whether a mathematical system is internally consistent. In 1931, Kurt Gödel's incompleteness theorems dealt the first blow to this program, showing that any consistent formal system rich enough to express arithmetic contains true statements it cannot prove.

A few years later, Alan Turing and Alonzo Church proved that the Entscheidungsproblem itself is unsolvable, through separate, independent means. Turing did so by developing Turing machines (called automatic machines at the time) and showing that the halting problem is undecidable. Church did so by developing lambda calculus. It was later proved that Turing machines and lambda calculus are equivalent in computational power. This led many mathematicians to theorize that computability could be defined by either of these systems, which in turn led Turing and Church to propose their thesis: every effectively calculable function is a computable function. In simpler terms, any computation in any reasonable model can be carried out by a Turing machine or in lambda calculus. To better understand the implications of the Church-Turing thesis, we need to explore the different kinds of computational machines.
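To make that equivalence less abstract, here is a minimal sketch in Python (an illustration, not from the original article) that performs arithmetic purely with lambda-calculus terms. Numbers are encoded as Church numerals, addition is function composition, and the result is converted back to an ordinary integer at the end, the same answer a Turing machine would produce by shuffling tape symbols:

# Church numerals: the number n is encoded as a function that
# applies f to x exactly n times. No built-in integers are used.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition: apply f "n times", then "m more times".
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Convert a Church numeral back to a Python int for display.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)

print(to_int(add(two)(three)))  # prints 5

Everything above is just nested function application, yet it computes the same function as a mechanical procedure would, which is precisely the point of the thesis.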

A chaos-modulated metasurface for physical-layer secure communications

With so many people using internet-connected devices, reliably securing wireless communications and protecting the data they exchange is of growing importance. While computer scientists have devised increasingly advanced security measures over the past decades, the most effective techniques rely on complex algorithms and intensive computations, which can consume a lot of energy.

Researchers at Peking University, Southeast University, University of Sannio and other institutes recently introduced a new approach for securing communications both effectively and energy-efficiently, which relies on a reconfigurable metasurface with properties that are modulated by chaotic patterns.
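As a toy illustration only (not the authors' actual scheme), the following Python sketch conveys the general flavor of chaos-based modulation: a logistic map driven into its chaotic regime generates an unpredictable sequence, which then sets the binary phase state of each metasurface element. All names and parameter values here are illustrative assumptions:

import numpy as np

def logistic_chaos(x0, n, r=3.99):
    """Generate n values of the logistic map x -> r*x*(1-x),
    which is chaotic for r near 4 (extremely sensitive to x0)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

# Hypothetical 8x8 reconfigurable metasurface: each element takes
# a binary phase state (0 or pi) chosen by the chaotic sequence.
n_elements = 64
seq = logistic_chaos(x0=0.123456, n=n_elements)
phase_states = (seq > 0.5).astype(int) * np.pi

print(phase_states.reshape(8, 8))

Because the sequence depends sensitively on the initial seed, an eavesdropper who does not know it cannot reproduce the modulation pattern, which is the intuition behind using chaos at the physical layer.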

This approach, outlined in a paper published in Nature Communications, is based on an idea conceived by the senior authors Vincenzo Galdi, Lianlin Li and Tie Jun Cui, who oversaw the project. The idea was then realized at Peking University and Southeast University by the junior authors JiaWen Xu, Menglin Wei and Lei Zhang.

Speed test of ‘tunneling’ electrons challenges alternative interpretation of quantum mechanics

As the photons traveled along the waveguide and tunneled into the barrier, they also tunneled into the secondary waveguide, jumping back and forth between the two at a consistent rate, which allowed the research team to calculate their speed.

By combining this element of time with measurements of the photons' rate of decay inside the barrier, the researchers were able to calculate the dwell time, which was found to be finite.

The researchers write, “Our findings contribute to the ongoing tunneling time debate and can be viewed as a test of Bohmian trajectories in quantum mechanics. Regarding the latter, we find that the measured energy–speed relationship does not align with the particle dynamics postulated by the guiding equation in Bohmian mechanics.”
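For reference (this is standard textbook material, not part of the quoted paper), the guiding equation of Bohmian mechanics prescribes that a particle's position $\mathbf{Q}$ moves along the probability flow of the wavefunction $\psi$:

$$ \frac{d\mathbf{Q}}{dt} \;=\; \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla \psi}{\psi}\right)\!(\mathbf{Q},t) $$

It is the particle speeds implied by this equation that the measured energy–speed relationship is being tested against.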

Calculating the electron’s magnetic moment: State-dependent values emerge from Dirac equation

Quantum mechanics has a reputation that precedes it. Virtually everyone who has bumped up against the quantum realm, whether in a physics class, in the lab, or in popular science writing, is left thinking something like, “Now, that is really weird.” For some, this translates to weird and wonderful. For others it is more like weird and disturbing.

Chip Sebens, a professor of philosophy at Caltech who asks foundational questions about physics, is firmly in the latter camp. “Philosophers of physics generally get really frustrated when people just say, ‘OK, here’s quantum mechanics. It’s going to be weird. Don’t worry. You can make the right predictions with it. You don’t need to try to make too much sense out of it, just learn to use it.’ That kind of thing drives me up the wall,” Sebens says.

One particularly weird and disturbing area of physics for people like Sebens is quantum field theory. Quantum field theory goes beyond quantum mechanics, incorporating the special theory of relativity and allowing the number of particles to change over time (such as when an electron and positron annihilate each other and create two photons).
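As a quick worked example of that annihilation process (standard textbook physics, not from the article): for an electron and positron at rest, energy conservation fixes each outgoing photon's energy at the electron rest energy,

$$ e^- + e^+ \;\to\; \gamma + \gamma, \qquad E_\gamma = m_e c^2 \approx 511~\text{keV per photon}, $$

so two particles with mass become two massless particles, something ordinary quantum mechanics, with its fixed particle number, cannot describe.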

Breaking the Bottleneck: All-Optical Chip Could Unlock Light-Speed Communication

A new optical chip enables ultra-fast computing and data processing, built using silicon photonics for next-generation networks. The rise of the big data era presents major challenges for information processing, particularly in handling large volumes of data and managing energy consumption.

Black-hole solutions in quantum gravity with Vilkovisky-DeWitt effective action

Physicists propose that certain aspects of quantum gravity can be calculated even without a full theory of quantum gravity itself. Essentially, they work backwards from the fact that quantum gravity must reduce to Einstein's relativity theories at macroscopic scales. This approach remains effective until one approaches the very small scales near a black hole singularity.
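For context, a well-known result from this effective-field-theory program (due to Donoghue and collaborators, and not taken from the paper below) is the leading quantum correction to the Newtonian potential between two masses:

$$ V(r) = -\frac{G m_1 m_2}{r}\left[\,1 + 3\,\frac{G(m_1+m_2)}{r c^2} + \frac{41}{10\pi}\,\frac{G\hbar}{r^2 c^3}\,\right] $$

The middle term is a classical relativistic correction, while the last term is a genuine quantum effect, tiny at macroscopic distances but calculable without knowing the full theory.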

(See my comment below for a link to a POPULAR MECHANICS article that discusses the scientific article in an accessible manner.)


We study new black-hole solutions in quantum gravity. We use the Vilkovisky-DeWitt unique effective action to obtain quantum gravitational corrections to Einstein’s equations. In full analogy to previous work done for quadratic gravity, we find new black-hole–like solutions. We show that these new solutions exist close to the horizon and in the far-field limit.

Made in China: new memristors speed up AI data processing by 7.7 times

Chinese researchers present a new memristor-based technology for data processing and sorting in AI systems.

Memristors are electronic elements whose resistance changes depending on the electric charge that has flowed through them. This lets them mimic the way the human brain both processes and stores information in the same place.
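To make that concrete, here is a minimal Python sketch based on the classic HP linear ion-drift memristor model (an illustration with assumed parameter values, not the Chinese team's devices), showing how resistance depends on the history of charge through the device:

import numpy as np

# Classic HP linear ion-drift memristor model (illustrative values).
R_ON, R_OFF = 100.0, 16e3   # fully doped / undoped resistance (ohms)
D = 10e-9                   # device thickness (m)
MU_V = 1e-14                # ion mobility (m^2 s^-1 V^-1)

def simulate(voltage, dt, w0=0.1 * D):
    """Integrate the memristor state under a voltage waveform.
    The doped-region width w, and hence the resistance, evolves
    with the current that has passed through the device."""
    w = w0
    resistances = []
    for v in voltage:
        r = R_ON * (w / D) + R_OFF * (1 - w / D)   # mixed resistance
        i = v / r                                  # Ohm's law
        w += MU_V * (R_ON / D) * i * dt            # ion drift
        w = min(max(w, 0.0), D)                    # clamp to device
        resistances.append(r)
    return np.array(resistances)

# A sinusoidal drive: resistance falls and rises with charge flow,
# tracing the memristor's signature pinched hysteresis loop.
t = np.linspace(0, 2, 2000)
v = 1.2 * np.sin(2 * np.pi * t)
r = simulate(v, dt=t[1] - t[0])
print(f"resistance swings from {r.max():.0f} to {r.min():.0f} ohms")

Because the device's state persists when the drive is removed, the same element acts as both memory and processing unit, which is what makes memristors attractive for brain-like, in-memory computing.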

Chinese scientists have combined the use of memristors with an advanced data sorting algorithm to process large amounts of information more efficiently. According to them, this approach can help overcome performance limitations not only in computing, but also in artificial intelligence systems and equipment design.

Data Science and Machine Learning: Mathematical and Statistical Methods

D.P. Kroese, Z.I. Botev, T. Taimre, R. Vaisman. Data Science and Machine Learning: Mathematical and Statistical Methods, Chapman and Hall/CRC, Boca Raton, 2019.

The purpose of this book is to provide an accessible, yet comprehensive textbook intended for students interested in gaining a better understanding of the mathematics and statistics that underpin the rich variety of ideas and machine learning algorithms in data science.