
Researchers at the quantum computing firm D-Wave Systems have shown that their quantum processor can simulate the behaviour of an “untwisting” quantum magnet much faster than a classical machine. Led by D-Wave’s director of performance research Andrew King, the team used the new low-noise quantum processor to show that the quantum speed-up increases for harder simulations. The result shows that even near-term quantum simulators could have a significant advantage over classical methods for practical problems such as designing new materials.

The D-Wave simulators are specialized quantum computers known as quantum annealers. To perform a simulation, the quantum bits, or qubits, in the annealer are initialized in a classical ground state and allowed to interact and evolve under conditions programmed to mimic a particular system. The final state of the qubits is then measured to reveal the desired information.
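To make the workflow concrete, here is a minimal sketch using D-Wave's open-source dimod library (an assumption for illustration; the article does not name the software stack or the specific problem programmed). It sets up a small Ising problem of the kind an annealer is configured with; on real hardware the brute-force ExactSolver would be replaced by a QPU sampler.

```python
# Minimal sketch of the annealing workflow described above, using D-Wave's
# open-source dimod library (illustrative only; not the experiment in the article).
import dimod

# "Program" the problem: per-qubit biases h and pairwise couplings J define
# the energy landscape the qubits are set up to explore.
h = {0: -1.0, 1: 1.0, 2: -0.5}
J = {(0, 1): -1.0, (1, 2): 0.5}

# On hardware this would be a QPU sampler; ExactSolver enumerates all states
# so the example runs anywhere.
sampler = dimod.ExactSolver()
sampleset = sampler.sample_ising(h, J)

# The lowest-energy sample plays the role of the measured final state.
print(sampleset.first.sample, sampleset.first.energy)
```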

King explains that the quantum magnet they simulated experiences both quantum fluctuations (which lead to entanglement and tunnelling) and thermal fluctuations. These competing effects create exotic topological phase transitions in materials, which were the subject of the 2016 Nobel Prize in Physics.
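The kind of competition King describes is conventionally captured by a transverse-field Ising Hamiltonian (offered here as context; the excerpt does not spell out the exact model used):

$$
H = -\sum_{\langle i,j \rangle} J_{ij}\,\sigma_i^z \sigma_j^z \;-\; \Gamma \sum_i \sigma_i^x
$$

The programmable couplings $J_{ij}$ define the classical energy landscape that thermal fluctuations explore, while the transverse field $\Gamma$ supplies the quantum fluctuations (tunnelling between spin configurations); tuning $\Gamma$ against temperature is what drives the system through the phase transitions mentioned above.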

The question holds special interest for neuroscientists: since computer programming has been around for only a few decades, the brain cannot have evolved any special region to handle it. It must be repurposing a region of the brain normally used for something else.

So late last year, neuroscientists at MIT set out to see which parts of the brain people use when dealing with computer programming. “The ability to interpret computer code is a remarkable cognitive skill that bears parallels to diverse cognitive domains, including general executive functions, math, logic, and language,” they wrote.

Since coding can be learned as an adult, they figured it must rely on some pre-existing cognitive system in our brains. Two brain systems seemed like likely candidates: the brain’s language system, or the system that tackles complex cognitive tasks such as solving math problems or crossword puzzles. The latter is known as the “multiple demand network.”

With powerful engines, near-photorealistic graphics, and the ability to build incredible, immersive worlds, it’s hard to imagine what the next big technological advance in gaming might be.

If a recent tweet by Neuralink co-founder and president Max Hodak is any indication, the term might not even apply. In it, he hinted (vaguely, to be fair) that whatever forms of entertainment get programmed into neural implants and brain-computer interfaces will represent a paradigm shift that moves beyond the current terminology.

“We’re gonna need a better term than ‘video game’ once we start programming for more of the sensorium,” Hodak tweeted.

Catastrophic collapse of materials and structures is the inevitable consequence of a chain reaction of locally confined damage—from solid ceramics that snap after the development of a small crack to metal space trusses that give way after the warping of a single strut.

In a study published this week in Advanced Materials, engineers at the University of California, Irvine and the Georgia Institute of Technology describe the creation of a new class of mechanical metamaterials that delocalize deformations to prevent failure. They did so by turning to tensegrity, a century-old design principle in which isolated rigid bars are integrated into a flexible mesh of tethers to produce very lightweight, self-tensioning truss structures.

Starting with 950-nanometer-diameter members, the team used a sophisticated direct laser writing technique to generate elementary cells sized between 10 and 20 microns. These were built up into eight-unit supercells that could be assembled with others to make a continuous structure. The researchers then conducted computational modeling and laboratory experiments and observed that the constructs exhibited uniquely homogeneous deformation behavior free from localized overstress or underuse.

By the middle of the decade, PsiQuantum expects to have a commercial quantum computer, according to the Financial Times. The founders also indicate they are ready to emerge from stealth.

PsiQuantum has been mostly silent about its quantum computer development, but with a scientific bench of leading UK physicists and nearly $300 million in venture capital funding, according to The Quantum Insider, that silence has been deafening.

CAMBRIDGE, Mass. – Engineers from HyperLight, a leader in the commercialization of thin-film lithium niobate (LN) photonic integrated circuits (PICs), have achieved breakthrough voltage-bandwidth performance in integrated electro-optic modulators. The broadband electro-optic PIC could lead to orders-of-magnitude reductions in energy consumption for next-generation optical networking.

“We believe the significantly improved electro-optic modulation performance in our integrated LN platform will lead to a paradigm shift for both analog and digital ultra-high-speed RF links.”

Energy consumption in optical networking for Ethernet, data centers and 5G is soaring as a result of rapidly growing data traffic. This is because of the limited performance of existing electro-optic modulators, the key elements that convert data from the electrical to the optical domain at high speed in optical networks. Current electro-optic modulators require extremely high radio-frequency (RF) driving voltages (5 V) as the analog bandwidth of Ethernet ports approaches 100 GHz for future terabit-per-second transceivers. In comparison, a typical CMOS RF modulator driver delivers less than 0.5 V at such frequencies. Compound-semiconductor modulator drivers can deliver 1 V at significantly increased cost and energy consumption, but still fall short of the optimum driving voltage. The limited voltage-bandwidth performance of electro-optic modulators poses a serious challenge to meeting the tight power consumption requirements set by network builders.
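To see why the driving voltage dominates the energy budget, here is a back-of-the-envelope sketch (an illustration only, assuming a sinusoidal drive into a standard 50-ohm terminated traveling-wave electrode; the figures are not HyperLight's):

```python
# Back-of-the-envelope RF drive power for a terminated modulator electrode.
# Assumes a sinusoidal drive into 50 ohms: average power = Vpp^2 / (8 * Z).
def drive_power_mw(vpp_volts: float, z_ohms: float = 50.0) -> float:
    return (vpp_volts ** 2) / (8.0 * z_ohms) * 1e3  # milliwatts

for label, vpp in [("legacy modulator, 5 V drive", 5.0),
                   ("compound-semiconductor driver, 1 V", 1.0),
                   ("CMOS driver, 0.5 V", 0.5)]:
    print(f"{label}: ~{drive_power_mw(vpp):.2f} mW")
```

Because the power scales with the square of the voltage, dropping the required drive from 5 V to the 0.5 V a CMOS driver can supply cuts the RF drive power by roughly two orders of magnitude, which is the energy argument the announcement is making.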

Transistors, devices that can amplify, conduct or switch electronic signals or electric current, are key components of many electronics on the market today. These devices can be fabricated using a variety of inorganic and organic semiconducting materials.

Metals are generally considered unsuitable for fabricating transistors, as they screen electric fields and thus make it difficult to realize devices with tunable electrical conductivity. A possible way to create transistors based on metals is to use gradients of counterions in films of metal nanoparticles functionalized with charged organic ligands.

In the past, engineers have successfully used this strategy to create a variety of devices, ranging from resistors to diodes and sensors. Nonetheless, modulating the electrical conductivity of these devices has often proved to be very challenging.

The State of the Edge report is based on a bottom-up analysis, modeled by Tolaga Research, of the potential growth of edge infrastructure across multiple sectors. The forecast evaluates 43 use cases spanning 11 vertical industries.

The one thing these use cases have in common is a growing need to process and analyze data at the point where it is being created and consumed. Historically, IT organizations have deployed applications that process data in batch mode overnight. As organizations embrace digital business transformation initiatives, it’s becoming more apparent that data needs to be processed and analyzed at the edge in near real time.

Of course, there are multiple classes of edge computing platforms, ranging from smartphones and internet of things (IoT) gateways to complete hyperconverged infrastructure (HCI) platforms that are being employed to process data at scale at the edge of a telecommunications network.