An international research team led by the University of Liverpool and McMaster University has made a significant breakthrough in the search for new states of matter.
Why it is dangerous to build ever larger big bang machines
CERN has revealed plans for a gigantic successor to the LHC, the giant atom smasher that is already the biggest machine ever built. Particle physicists will never stop asking for ever larger big bang machines. But where are the limits for ordinary society when it comes to costs and existential risks?
CERN boffins are already conducting a mega experiment at the LHC, a 27 km circular particle collider built at a cost of several billion euros, to study matter as it existed fractions of a second after the big bang and to find the smallest possible particle – though the question is how they could ever know they had found it. Now they profess to be a little upset because they have not found any particles beyond the Standard Model, that is, nothing they would not have expected. To pursue that search, particle physicists would like to build an even larger “Future Circular Collider” (FCC) near Geneva, where CERN enjoys extraterritorial status, with a 100 km ring – for about 24 billion euros.
Experts point out
First part in an interview series with Scott Aaronson — this one is on quantum computing — other segments cover Existential Risk, consciousness (including Scott’s thoughts on IIT) and whether the universe is discrete or continuous.
See ‘Complexity-Theoretic Foundations of Quantum Supremacy Experiments’
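The supremacy argument that paper concerns turns on sampling from the output of random quantum circuits, which is believed to be infeasible to reproduce classically at scale. As a rough, purely illustrative sketch (a toy NumPy simulation, not the paper's construction), the snippet below builds a small random circuit and samples bitstrings from it; the point is that a straightforward classical simulator has to track all 2^n amplitudes.

```python
# Toy random-circuit sampling in plain NumPy -- an illustration only, not the
# construction from the paper. A direct classical simulation must store 2**n
# amplitudes, which is what becomes intractable as qubits are added.
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(dim):
    """Random unitary via QR of a complex Gaussian matrix (standard Haar recipe)."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

n = 6                                   # qubits; the state vector has 2**6 = 64 entries
dim = 2 ** n
state = np.zeros(dim, dtype=complex)
state[0] = 1.0                          # start in |00...0>

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

for _ in range(8):                      # circuit depth
    # a layer of independent random single-qubit gates
    layer = haar_unitary(2)
    for _ in range(n - 1):
        layer = np.kron(layer, haar_unitary(2))
    state = layer @ state
    # a layer of CNOTs on neighbouring pairs (qubits 0-1, 2-3, 4-5)
    ent = CNOT
    for _ in range(n // 2 - 1):
        ent = np.kron(ent, CNOT)
    state = ent @ state

probs = np.abs(state) ** 2              # Born-rule output distribution
samples = rng.choice(dim, size=5, p=probs)
print("amplitudes a classical simulator must track:", dim)
print("sample bitstrings:", [format(int(s), f"0{n}b") for s in samples])
```

At six qubits the 64-entry vector is trivial to store; the same direct bookkeeping quickly becomes impossible as qubits are added, which is where the complexity-theoretic questions begin.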
Interview with Scott Aaronson — covering whether quantum computers could have subjective experience, whether information is physical, and what might be important for consciousness — he touches on classic philosophical conundrums and the observation that while people want to be thorough-going materialists, unlike traditional computers, brain-states are not obviously copyable. Aaronson wrote about this in his paper ‘The Ghost in the Quantum Turing Machine’ (found here https://arxiv.org/abs/1306.0159). Scott also critiques Tononi’s integrated information theory (IIT).
Questions include:
Back in the first moment of the universe, everything was hot and dense and in perfect balance. There weren’t any particles as we’d understand them, much less any stars or even the vacuum that permeates space today. The whole of space was filled with homogeneous, formless, compressed stuff.
Then, something slipped. All that monotonous stability became unstable. Matter won out over its weird cousin, antimatter, and came to dominate the whole of space. Clouds of that matter formed and collapsed into stars, which became organized into galaxies. Everything that we know about started to exist.
So, what happened to tip the universe out of its formless state?
Electronegativity is one of the most well-known models for explaining why chemical reactions occur. Now, Martin Rahm from Chalmers University of Technology, Sweden, has redefined the concept with a new, more comprehensive scale. His work, undertaken with colleagues including a Nobel Prize-winner, has been published in the Journal of the American Chemical Society.
The theory of electronegativity is used to describe how strongly different atoms attract electrons. By using electronegativity scales, one can predict the approximate charge distribution in different molecules and materials, without needing to resort to complex quantum mechanical calculations or spectroscopic studies. This is vital for understanding all kinds of materials, as well as for designing new ones. Used daily by chemists and materials researchers all over the world, the concept originates from Swedish chemist Jöns Jacob Berzelius’ research in the 19th century and is widely taught at high-school level.
Now, Martin Rahm, Assistant Professor in Physical Chemistry at Chalmers University of Technology, has developed a brand-new scale of electronegativity.
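For a sense of how an electronegativity scale gets used in everyday practice, here is a minimal sketch using the familiar Pauling values (not Rahm's new scale): Pauling's empirical formula turns the electronegativity difference of two bonded atoms into a rough estimate of the bond's ionic character, i.e. a first guess at the charge distribution without any quantum mechanical calculation.

```python
# A back-of-the-envelope use of an electronegativity scale: Pauling's empirical
# estimate of a bond's ionic character from the electronegativity difference.
# Illustrative only -- classic Pauling values, not Rahm's redefined scale.
import math

# Pauling electronegativities (dimensionless, commonly tabulated values)
chi = {"H": 2.20, "C": 2.55, "N": 3.04, "O": 3.44, "F": 3.98, "Na": 0.93, "Cl": 3.16}

def percent_ionic(a, b):
    """Pauling's estimate: 100 * (1 - exp(-(delta_chi)**2 / 4))."""
    d = chi[a] - chi[b]
    return 100.0 * (1.0 - math.exp(-(d ** 2) / 4.0))

for bond in [("H", "H"), ("H", "Cl"), ("Na", "Cl")]:
    print(f"{bond[0]}-{bond[1]}: ~{percent_ionic(*bond):.0f}% ionic character")
```

Crude as it is, this is exactly the kind of shortcut such a scale enables, the alternative being the complex quantum mechanical calculations or spectroscopic studies mentioned above.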
The production of entropy, which means increasing the degree of disorder in a system, is an inexorable tendency in the macroscopic world owing to the second law of thermodynamics. This makes the processes described by classical physics irreversible and, by extension, imposes a direction on the flow of time. However, the tendency does not necessarily apply in the microscopic world, which is governed by quantum mechanics. The laws of quantum physics are reversible in time, so in the microscopic world, there is no preferential direction to the flow of phenomena.
One of the most important aims of contemporary scientific research is knowing exactly where the transition occurs from the quantum world to the classical world and why it occurs — in other words, finding out what makes the production of entropy predominate. This aim explains the current interest in studying mesoscopic systems, which are not as small as individual atoms but nevertheless display well-defined quantum behavior.
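To make the contrast concrete, here is a minimal, generic sketch (a textbook single-qubit example, not a model of any particular experiment): under unitary evolution, which is reversible, the von Neumann entropy S = -Tr(ρ ln ρ) of a qubit stays put, while a dephasing channel standing in for environmental coupling drives it irreversibly toward its maximum of ln 2.

```python
# Minimal illustration: reversible (unitary) evolution produces no entropy,
# while a dephasing channel -- a stand-in for environmental coupling -- does.
# A generic textbook example, not a model of any specific experiment.
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # ignore numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Start in the pure superposition state |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho = np.outer(plus, plus.conj())

# 1) Unitary evolution (a rotation about Z): the state stays pure, entropy stays 0
theta = 0.3
u = np.array([[np.exp(-1j * theta / 2), 0], [0, np.exp(1j * theta / 2)]])
rho_unitary = rho.copy()
for _ in range(10):
    rho_unitary = u @ rho_unitary @ u.conj().T
print(f"after unitary evolution:  S = {von_neumann_entropy(rho_unitary):.4f}")

# 2) Dephasing with strength p: coherences shrink each step and entropy grows
#    toward its maximum ln 2 ~ 0.6931
p = 0.2
rho_dephased = rho.copy()
for step in range(1, 11):
    rho_dephased[0, 1] *= (1 - p)
    rho_dephased[1, 0] *= (1 - p)
    if step % 5 == 0:
        print(f"after {step:2d} dephasing steps: S = {von_neumann_entropy(rho_dephased):.4f}")
```

A mesoscopic system sits between these two regimes, which is why such systems are where the crossover from reversible quantum behavior to entropy-producing classical behavior can be probed.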
The quantum computing revolution is upon us. Like the first digital computers, quantum computers offer the possibility of technology exponentially more powerful than current systems. They stand to change companies, entire industries, and the world by solving problems that seem impossible today.
MIT is offering online courses in quantum computing for professionals. Learn the business implications and applications of quantum computing, and take the next step in your career.
Several experiments over the past few years have reportedly violated Bell’s inequality – last year, the first Bell’s inequality experiment was completed without loopholes, but there’s still dispute over whether or not local realism actually holds up.
The new worldwide experiment aims to settle the matter once and for all, by using a huge amount of random, user-generated data to test Bell’s inequality.
Basically, the researchers are holding what’s called the ‘BIG Bell Test: worldwide quantum experiments powered by human randomness’, and they aim to conduct a range of Bell’s inequality tests around the world, controlled by the decisions of human volunteers (whom they call Bellsters).
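For readers wondering what a Bell test actually computes, here is a minimal sketch of the generic CHSH form of the inequality (not the BIG Bell Test's specific protocol): local realism caps the CHSH combination of correlations at 2, while quantum mechanics predicts up to 2√2 ≈ 2.83 for a pair of entangled particles.

```python
# Minimal CHSH illustration -- the generic textbook form of a Bell test, not the
# BIG Bell Test's actual protocol. For a spin singlet, quantum mechanics predicts
# correlation E(a, b) = -cos(a - b) between measurements at angles a and b.
import math

def E(a, b):
    """Quantum prediction for the singlet-state correlation at angles a, b (radians)."""
    return -math.cos(a - b)

# Standard angle choices that maximize the violation
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

print(f"CHSH value |S|            = {abs(S):.3f}")
print("local-realist bound       = 2")
print(f"quantum (Tsirelson) bound = {2 * math.sqrt(2):.3f}")
```

In the worldwide experiment, the volunteers' unpredictable choices play the role of the random selection between the two measurement settings on each side of the test.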