
In 1884, Edwin Abbott wrote the novel Flatland: A Romance of Many Dimensions as a satire of Victorian hierarchy. He imagined a world that existed only in two dimensions, where the beings are 2D geometric figures. The physics of such a world is somewhat akin to that of modern 2D materials, such as graphene and transition metal dichalcogenides, which include tungsten disulfide (WS2), tungsten diselenide (WSe2), molybdenum disulfide (MoS2) and molybdenum diselenide (MoSe2).

Modern 2D materials consist of single-atom layers, where electrons can move in two dimensions but their motion in the third dimension is restricted. Due to this ‘squeeze’, 2D materials have enhanced optical and electronic properties that make them promising for next-generation, ultrathin devices in the fields of energy, communications, imaging and quantum computing, among others.
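
To make the ‘squeeze’ concrete (a textbook illustration, not a calculation from the article), the particle-in-a-box model gives confinement energies that grow rapidly as the confining thickness L shrinks:

\[
E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad n = 1, 2, 3, \ldots
\]

At single-atom-layer thicknesses the level spacing becomes very large, which is one reason the optical and electronic response of a 2D material differs so sharply from that of the same material in bulk.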

Typically, for all these applications, the 2D materials are envisioned in flat-lying arrangements. However, the strength of these materials is also their greatest weakness: they are extremely thin. This means that when they are illuminated, light can interact with them only over a tiny thickness, which limits their usefulness. To overcome this shortcoming, researchers are starting to look for new ways to fold the 2D materials into complex 3D shapes.

In 2001 at the Brookhaven National Laboratory in Upton, New York, a facility used for research in nuclear and high-energy physics, scientists experimenting with a subatomic particle called a muon encountered something unexpected.

To explain the fundamental physical forces at work in the universe and to predict the results of high-energy particle experiments like those conducted at Brookhaven, at Fermilab in Illinois, and at CERN’s Large Hadron Collider in Geneva, Switzerland, physicists rely on the decades-old theory called the Standard Model. That theory should explain the precise behavior of muons when they are fired through an intense magnetic field created in a superconducting magnetic storage ring. When the muon in the Brookhaven experiment reacted in a way that differed from these predictions, researchers realized they were on the brink of a discovery that could change science’s understanding of how the universe works.

Earlier this month, after a decades-long effort that involved building more powerful sensors and improving researchers’ capacity to process 120 terabytes of data (the equivalent of 16 million digital photographs every week), a team of scientists at Fermilab announced the first results of an experiment called Muon g-2, which suggest the Brookhaven find was no fluke and that science is on the brink of an unprecedented discovery.
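
For context, the ‘g-2’ in the experiment’s name refers to the muon’s anomalous magnetic moment (the standard definition, not a number from the Fermilab analysis):

\[
a_\mu = \frac{g - 2}{2}, \qquad \omega_a \simeq a_\mu \frac{e B}{m_\mu},
\]

where g is the muon’s gyromagnetic factor, B is the storage-ring field and \omega_a is the anomalous spin-precession frequency the experiment actually measures (corrections from electric fields and beam dynamics are neglected here). A statistically significant gap between the measured a_\mu and the Standard Model prediction is what would point to new physics.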

Engineers at Duke University have developed the world’s first fully recyclable printed electronics. Their recycling process recovers nearly 100% of the materials used—and preserves most of their performance capabilities for reuse.

By demonstrating a crucial and relatively complex computer component—the transistor—created with three carbon-based inks, the researchers hope to inspire a new generation of recyclable electronics.

“Silicon-based computer components are probably never going away, and we don’t expect easily recyclable electronics like ours to replace the technology and devices that are already widely used,” said Aaron Franklin, the Addy Professor of Electrical and Computer Engineering at Duke. “But we hope that by creating new, fully recyclable, easily printed electronics and showing what they can do, that they might become widely used in future applications.”

John Martinis has done groundbreaking research on coherent superconducting devices since his PhD at the University of California, Berkeley, in 1985. These superconducting devices can be modeled as lumped-element electric circuits using Josephson junctions, capacitors and inductors as components. The fact that a superconducting phase across a Josephson junction can display coherent quantum behavior – even though it is a property of the wave function of an immense number of electrons – can be viewed as a fundamental discovery [1], kickstarting, in retrospect, the field of superconducting quantum computing.
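
For reference, the circuit relations behind these lumped-element models are the textbook Josephson equations (not details specific to Martinis’s devices):

\[
I = I_c \sin\varphi, \qquad V = \frac{\Phi_0}{2\pi}\,\frac{d\varphi}{dt}, \qquad \Phi_0 = \frac{h}{2e},
\]

so that for small oscillations the junction acts as a nonlinear inductor with L_J = \Phi_0 / (2\pi I_c \cos\varphi). That nonlinearity makes the circuit’s energy levels unequally spaced, which is what allows two of them to be singled out and operated as a qubit.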

John Martinis invented and developed the superconducting phase qubit, based on a current-biased Josephson junction, for the purpose of scalable multi-qubit quantum computing [2]. In 2002, he first demonstrated coherent Rabi oscillations and quantum measurement for such a superconducting phase qubit [3]. He has had a longstanding interest in understanding the origin of noise in superconducting electric circuits, as these sources of noise naturally limit qubit coherence. In particular, his understanding of noise sources such as dielectric loss, flux noise and the presence and dynamics of quasi-particles [4], by means of simple physical models, has been instrumental in the field. The effect and mitigation of quasi-particles, and how they are affected by radiation and cosmic rays, continues to be of high interest for the future of superconducting quantum devices [5, 6].
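
As a purely illustrative sketch of what such a time-domain measurement looks like and how noise damps it (a phenomenological two-level model with assumed numbers, not the analysis of Ref. [3]):

```python
import numpy as np

# Hypothetical, illustrative parameters (not taken from the cited experiments).
OMEGA_RABI = 2 * np.pi * 10e6   # assumed resonant Rabi frequency: 10 MHz
T2 = 500e-9                     # assumed coherence time: 500 ns

# Time axis: 0 to 2 microseconds.
t = np.linspace(0.0, 2e-6, 1001)

# Ideal resonant Rabi oscillation of the excited-state population.
p_ideal = np.sin(OMEGA_RABI * t / 2) ** 2

# Phenomenological decoherence: the oscillation decays toward 1/2 under an
# exponential envelope, the usual signature of noise sources such as
# dielectric loss, flux noise or quasi-particles in a qubit measurement.
p_damped = 0.5 + (p_ideal - 0.5) * np.exp(-t / T2)

print(f"population after half a Rabi period: {np.interp(np.pi / OMEGA_RABI, t, p_damped):.3f}")
print(f"population at t = 2 us (mostly dephased): {p_damped[-1]:.3f}")
```

Fitting the decay envelope of curves like these is one standard way experimentalists quantify how strongly the noise sources described above limit qubit coherence.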

An important step showing his leadership and commitment to building a quantum computer came in 2014 when, then a professor at UCSB, he moved to Google, where he gathered a large team of physicists and engineers to tackle the challenge of making a multi-qubit programmable processor. This team has excelled in its relentless focus on optimizing device performance by implementing successful engineering choices for qubit design, couplers and scalable I/O.

After doubling its sample size, the largest study of genetic data from autistic people has identified 255 genes associated with the condition, an increase of more than 40 genes since the researchers’ 2019 update; 71 of the genes rise above a stringent statistical bar the team had not previously used. The new analysis also adds data from people with developmental delay or schizophrenia and considers multiple types of mutations.

“It’s a really significant step forward in what we do,” said Kyle Satterstrom, a computational biologist in Mark Daly’s lab at the Broad Institute in Cambridge, Massachusetts. Satterstrom presented the findings virtually on Tuesday at the 2021 International Society for Autism Research annual meeting.

The team’s previous analyses used data from the Autism Sequencing Consortium, which enrolls families through their doctors. The researchers mainly scoured the genetic data to find rare, non-inherited mutations linked to autism.

Our physical space-time reality isn’t really “physical” at all: the apparent solidity of objects, like every other associated property such as time, is an illusion. As the renowned physicist Niels Bohr once said: “Everything we call real is made of things that cannot be regarded as real.” But what’s not an illusion is your subjective experience, i.e., your consciousness; that’s the only “real” thing, according to proponents of Experiential Realism. The view refers to interacting entangled conscious agents at various ontological levels, giving rise to conscious experience all the way down, and I’d argue all the way up, seemingly ad infinitum. It’s a “matryoshka” of embedded realities: conscious minds within larger minds.

#ExperientialRealism


So, why Experiential Realism? From the bigger-picture perspective, we are here for the experience necessary for the evolution of our conscious minds. Our limitations, such as our ego, belief traps, political correctness and our very human condition, define who we are; but the realization that we largely impose those limitations on ourselves gives us greater evolvability and the impetus to overcome these self-imposed limits and move towards higher goals and states of being.

We are what we’ve experienced: the sum of our experiences defines who we are. In this sense, as free-will agents, we are co-creators within this experiential matrix. Non-duality is the essence of Experiential Realism: experience and experiencer are one. How can you possibly separate your own existence from the world, the observer from the observed? Today, philosophers and scientists argue that information is fundamental but that consciousness is required to assign meaning to it. That makes consciousness (our experience in a broader sense) the most fundamental, irreducible ground of existence itself, while some philosophers suggest consciousness is all that is.

Experiential Realism is a non-physicalist, monistic idealism. It is not to be confused with Naïve Realism, the idea that we see the world around us objectively, as it is. We don’t; just the opposite is true. Experiential Realism is predicated on the centrality of observers and on all-encompassing quantum computational principles. The objective world, i.e., the world whose existence does not depend on the perceptions of a particular observer, consists entirely of conscious agents, or more precisely their experiences. What exists in the objective world, independent of your perceptions, is a world of conscious agents, not a world of unconscious particles and fields.

Physics has long looked to harmony to explain the beauty of the Universe. But what if dissonance yields better insights?


Quantum physics is weird and counterintuitive. For this reason, the word ‘quantum’ has become shorthand for anything powerful or mystical, whether or not it has anything whatsoever to do with quantum mechanics. As a quantum physicist, I’ve developed a reflexive eyeroll upon hearing the word applied to anything outside of physics. It’s used to describe homeopathy, dishwasher detergents and deodorant.

If I hadn’t first heard of Quantum Music from a well-respected physicist, I would have scoffed the same way I did at the other ridiculous uses of the word. But coming from Klaus Mølmer it was intriguing. In the Quantum Music project, physicists and musicians worked together to unite ‘the mysterious worlds of quantum physics and music for the first time’. They developed a device that attaches to each key of a piano so that, when the pianist plays, the information is piped to a computer and synthesiser, which plays ‘quantum’ tones in addition to the familiar reverberations in the piano.

Among the tones used are those that represent a very quantum object: a Bose-Einstein condensate (BEC). This is a cloud of atoms that have been cooled down to just above absolute zero. At this low temperature, the microscopic quantum properties of the individual particles can all be treated collectively as a single, macroscopic quantum entity. Studying BECs is a way of examining the consequences of quantum mechanics on a larger scale than is typically possible.
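
For a sense of the temperatures involved (the textbook result for a uniform, non-interacting gas, not the parameters of the Quantum Music condensate), condensation sets in below a critical temperature

\[
T_c = \frac{2\pi\hbar^2}{m k_B}\left(\frac{n}{\zeta(3/2)}\right)^{2/3},
\]

where n is the atomic number density, m the atomic mass and \zeta(3/2) \approx 2.612. For typical dilute clouds of alkali atoms this works out to roughly a hundred nanokelvin, which is why ‘just above absolute zero’ is meant quite literally.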