
The Japanese electronics giant Sony has announced its first steps into quantum computing by joining other investment groups in a £42m funding round for the UK quantum computing firm Quantum Motion. The move by Sony's investment arm aims to boost the company's expertise in silicon quantum-chip development and to assist a potential roll-out of quantum computers onto the Japanese market.

Quantum Motion was founded in 2017 by scientists from University College London and the University of Oxford. It had already raised a total of £20m via "seed investment" in 2017 and a "series A" investment in 2020. Quantum Motion uses qubits based on standard silicon chip technology and can therefore exploit the same manufacturing processes that mass-produce chips such as those found in smartphones.

A full-scale quantum computer, when built, is likely to require a million logical qubits to perform quantum-based calculations, with each logical qubit needing thousands of physical qubits to allow for robust error correction. Meeting such demands will require a huge amount of associated hardware. Quantum Motion claims that its technology could tackle this problem because it develops scalable, high-density arrays of qubits based on CMOS silicon technology.
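To put those numbers in perspective, a back-of-the-envelope calculation shows the scale involved. The figures below are illustrative, using the article's one million logical qubits and an assumed 1,000 physical qubits per logical qubit:

```python
# Rough scale estimate for a fault-tolerant quantum computer.
# Both figures are illustrative assumptions, not measured requirements:
# one million logical qubits, each backed by ~1,000 physical qubits
# for error correction.
logical_qubits = 1_000_000
physical_per_logical = 1_000

total_physical = logical_qubits * physical_per_logical
print(f"Physical qubits needed: {total_physical:,}")  # 1,000,000,000
```

A billion physical qubits is why the associated control hardware, not just the qubits themselves, becomes the scaling bottleneck.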

CUPERTINO, CALIFORNIA Apple today unveiled Apple Vision Pro, a revolutionary spatial computer that seamlessly blends digital content with the physical world, while allowing users to stay present and connected to others. Vision Pro creates an infinite canvas for apps that scales beyond the boundaries of a traditional display and introduces a fully three-dimensional user interface controlled by the most natural and intuitive inputs possible — a user’s eyes, hands, and voice. Featuring visionOS, the world’s first spatial operating system, Vision Pro lets users interact with digital content in a way that feels like it is physically present in their space. The breakthrough design of Vision Pro features an ultra-high-resolution display system that packs 23 million pixels across two displays, and custom Apple silicon in a unique dual-chip design to ensure every experience feels like it’s taking place in front of the user’s eyes in real time.

“Today marks the beginning of a new era for computing,” said Tim Cook, Apple’s CEO. “Just as the Mac introduced us to personal computing, and iPhone introduced us to mobile computing, Apple Vision Pro introduces us to spatial computing. Built upon decades of Apple innovation, Vision Pro is years ahead and unlike anything created before — with a revolutionary new input system and thousands of groundbreaking innovations. It unlocks incredible experiences for our users and exciting new opportunities for our developers.”

“Creating our first spatial computer required invention across nearly every facet of the system,” said Mike Rockwell, Apple’s vice president of the Technology Development Group. “Through a tight integration of hardware and software, we designed a standalone spatial computer in a compact wearable form factor that is the most advanced personal electronics device ever.”

Daniel Lidar, the Viterbi Professor of Engineering at USC and Director of the USC Center for Quantum Information Science & Technology, and Dr. Bibek Pokharel, a Research Scientist at IBM Quantum, have achieved a quantum speedup advantage in the context of a "bitstring guessing game." They managed strings up to 26 bits long, significantly larger than previously possible, by effectively suppressing errors typically seen at this scale. (A bit is a binary digit that is either zero or one.) Their paper is published in the journal Physical Review Letters.

Quantum computers promise to solve certain problems with an advantage that increases as the problems increase in complexity. However, they are also highly prone to errors, or noise. The challenge, says Lidar, is "to obtain an advantage in the real world where today's quantum computers are still 'noisy.'" This noise-prone condition of current quantum computers is termed the "NISQ" (Noisy Intermediate-Scale Quantum) era, a term adapted from the RISC architecture used to describe classical computing devices. Thus, any present demonstration of quantum speed advantage necessitates noise reduction.

The more unknown variables a problem has, the harder it usually is for a computer to solve. Scholars can evaluate a computer's performance by playing a type of game with it to see how quickly an algorithm can guess hidden information. For instance, imagine a version of the TV game Jeopardy, where contestants take turns guessing a secret word of known length, one whole word at a time. The host reveals only one correct letter for each guessed word before changing the secret word randomly.
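A minimal sketch makes the classical cost of this kind of game concrete. The function below is a simplification of the game described (the secret stays fixed between queries, and the "host" confirms or corrects one chosen bit per query), showing that a classical player needs on the order of n queries to pin down an n-bit secret:

```python
import random

def classical_guess(secret: str) -> int:
    """Classical play of a simplified bitstring guessing game.

    Each query checks one position of the secret, so learning an
    n-bit string takes n queries. This is an illustrative toy, not
    the exact protocol used in the USC/IBM experiment.
    """
    learned = []
    queries = 0
    for i in range(len(secret)):
        # Query position i with a trial bit; the "host" confirms
        # it or reveals the correct bit for that position.
        trial = "0"
        queries += 1
        learned.append(trial if secret[i] == trial else "1")
    assert "".join(learned) == secret  # the full string is recovered
    return queries

random.seed(0)
secret = "".join(random.choice("01") for _ in range(26))
print(classical_guess(secret))  # 26 queries for a 26-bit string
```

A quantum algorithm playing the analogous oracle game can, in principle, extract the whole string with far fewer queries, which is the advantage that noise normally destroys at the 26-bit scale.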

Cactus Materials touted the emerging talent pool at local universities and the emerging ecosystem of the semiconductor industry as reasons to do business in Arizona.

The White House has designated Phoenix as a workforce hub to help meet the demand for qualified and diverse talent in semiconductors, renewable energy and electric vehicles.

Over the next five years, Cactus Materials said it intends to make further upgrades at its facility and invest up to $300 million. The company had previously been awarded grants from NASA and the U.S. Department of Energy and has applied for funding earmarked for the semiconductor sector through the CHIPS and Science Act.

University of Washington researchers have discovered they can detect atomic “breathing,” or the mechanical vibration between two layers of atoms, by observing the type of light those atoms emitted when stimulated by a laser. The sound of this atomic “breath” could help researchers encode and transmit quantum information.

The researchers also developed a device that could serve as a new type of building block for quantum technologies, which are widely anticipated to have many future applications in fields such as computing, communications and sensor development.

The researchers published these findings June 1 in Nature Nanotechnology.

Whether it’s baking a cake, building a house, or developing a quantum device, the quality of the end product significantly depends on its ingredients or base materials. Researchers working to improve the performance of superconducting qubits, the foundation of quantum computers, have been experimenting using different base materials in an effort to increase the coherent lifetimes of qubits.

The coherence time is a measure of how long a qubit retains quantum information, and thus a primary measure of performance. Recently, scientists discovered that using tantalum in superconducting qubits makes them perform better, but no one has been able to determine why—until now.
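Coherence is commonly modeled as an exponential decay governed by a characteristic time. The sketch below assumes a simple exp(-t/T2) dephasing model with an illustrative T2 value, not a measured figure for any particular device:

```python
import math

def coherence(t_us: float, t2_us: float) -> float:
    """Fraction of phase coherence remaining after t_us microseconds,
    assuming a simple exponential decay model exp(-t/T2).

    The T2 value passed in is an illustrative assumption."""
    return math.exp(-t_us / t2_us)

# Example: with an assumed T2 of 300 microseconds, a qubit retains
# about 72% of its coherence after a 100-microsecond gate sequence.
print(round(coherence(100, 300), 2))  # 0.72
```

Doubling T2 (for example, by switching to a better base material such as tantalum) directly extends how long a computation can run before the stored information degrades.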

Scientists from the Center for Functional Nanomaterials (CFN), the National Synchrotron Light Source II (NSLS-II), the Co-design Center for Quantum Advantage (C2QA), and Princeton University investigated the fundamental reasons that these qubits perform better by decoding the chemical profile of tantalum.

Mindfulness-based awareness training can help people learn to better control brain-computer interfaces. But a new study has found that a single guided mindfulness meditation exercise isn’t enough to boost performance. The findings, published in Frontiers in Human Neuroscience, suggest that a longer period of meditation is needed in order for people to experience observable improvements.

The authors of the research are interested in exploring the potential benefits of using mindfulness meditation as a training tool to improve the performance of brain-computer interfaces, which allow individuals to control machines or computers directly from their brain, bypassing the traditional neuromuscular pathway. These devices have the potential to greatly benefit people with conditions such as spinal cord injuries, stroke, and neurodegenerative diseases like amyotrophic lateral sclerosis (ALS).

Previous studies have shown that one of the most effective signals for brain-computer interface control is the sensorimotor rhythm produced in the primary sensorimotor areas during motor imagery. However, not everyone is able to effectively control brain-computer interfaces, with approximately 20% of the population being “BCI-inefficient” even with extensive training. Therefore, researchers are looking for ways to improve performance, and one potential method is through meditation.
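The sensorimotor rhythm mentioned above is typically quantified as spectral power in the mu band (roughly 8–13 Hz) over sensorimotor cortex. The sketch below is a minimal FFT-based estimate on synthetic data; real BCI pipelines use Welch's method, spatial filtering, and per-subject calibration:

```python
import numpy as np

def mu_band_power(signal: np.ndarray, fs: float) -> float:
    """Estimate power in the mu band (8-13 Hz), the sensorimotor
    rhythm commonly used for motor-imagery BCI control.

    A minimal periodogram sketch, not a production EEG pipeline."""
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= 8) & (freqs <= 13)
    return float(psd[band].sum())

# Synthetic example: a 10 Hz oscillation (inside the mu band) riding
# on low-level noise yields far more mu-band power than noise alone.
fs = 250.0  # assumed sampling rate in Hz
t = np.arange(0, 2, 1 / fs)
rest = 0.1 * np.random.default_rng(0).standard_normal(len(t))
imagery = rest + np.sin(2 * np.pi * 10 * t)
print(mu_band_power(imagery, fs) > mu_band_power(rest, fs))  # True
```

In practice it is the *suppression* of this rhythm during motor imagery (event-related desynchronization) that the decoder tracks, so the band-power estimate above is the raw feature either way.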

When you’re putting together a computer workstation, what would you say is the cleanest setup? Wireless mouse and keyboard? Super-discreet cable management? How about no visible keeb, no visible mouse, and no obvious display?

That’s what [Basically Homeless] was going for. Utilizing a Flexispot E7 electronically raisable standing desk, an ASUS laptop, and some other off-the-shelf parts, this project is taking the idea of decluttering to the extreme, with no visible peripherals and no visible wires.

There was clearly a lot of learning and much painful experimentation involved, and the guy kind of glossed over how the keyboard was embedded in the desk surface. By forming a thin layer of resin in-plane with the desk surface, mounting the keyboard just below it, and then carefully fettling the openings, he made sure the keys could still be depressed. Because they don’t stand proud of the surface, the keys are practically invisible once painted. After all, you need that tactile feedback, and a projection keeb just isn’t right.