BLOG

Archive for the ‘computing’ category: Page 2

Dec 11, 2024

Google says its new quantum chip indicates that multiple universes exist

Posted in categories: computing, cosmology, quantum physics

Google on Monday announced Willow, its latest, greatest quantum computing chip. The speed and reliability claims Google made about the chip were newsworthy in themselves, but what really caught the tech industry’s attention was an even wilder claim tucked into the blog post announcing it.

Google Quantum AI founder Hartmut Neven wrote in the post that the chip was so mind-bogglingly fast that it must have borrowed computational power from other universes.

Ergo, the chip’s performance indicates that parallel universes exist and that “we live in a multiverse.”

Dec 11, 2024

Quantum computing’s next step: New algorithm boosts multitasking

Posted in categories: computing, information science, quantum physics

Quantum computers differ fundamentally from classical ones. Instead of using bits (0s and 1s), they employ “qubits,” which can exist in multiple states simultaneously due to quantum phenomena like superposition and entanglement.

For a quantum computer to simulate dynamic processes or process data, among other essential tasks, it must translate complex input data into “quantum data” that it can understand. This process is known as quantum compilation.

Essentially, quantum compilation “programs” the quantum computer by converting a particular goal into an executable sequence. Just as a GPS app converts your desired destination into a sequence of actionable steps you can follow, quantum compilation translates a high-level goal into a precise sequence of quantum operations that the quantum computer can execute.
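As a rough illustration of what that compilation step looks like in practice, here is a minimal sketch using Qiskit’s general-purpose transpile function (not the new algorithm described in this study): a small high-level circuit is converted into a gate sequence restricted to a hypothetical machine’s native gates and connectivity.

```python
# A minimal sketch of quantum compilation using Qiskit (not the algorithm
# from the article): a high-level circuit is "compiled" into a sequence of
# gates that a specific machine can actually execute.
from qiskit import QuantumCircuit, transpile

# High-level goal: prepare an entangled (Bell) state on two qubits.
circuit = QuantumCircuit(2)
circuit.h(0)
circuit.cx(0, 1)

# Compile the circuit to a restricted hardware basis ('rz', 'sx', 'x', 'cx')
# with connectivity only between qubits 0 and 1 (hypothetical hardware).
compiled = transpile(
    circuit,
    basis_gates=["rz", "sx", "x", "cx"],
    coupling_map=[[0, 1]],
    optimization_level=2,
)

print(compiled)  # the executable gate sequence the hardware understands
```

The printed circuit is the hardware-level equivalent of the “actionable steps” in the GPS analogy.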

Dec 11, 2024

Graphene Interconnects to Moore’s Law’s Rescue

Posted in categories: computing, materials

The semiconductor industry’s long-held imperative—Moore’s Law, which dictates that transistor densities on a chip should double roughly every two years—is getting more and more difficult to maintain. The ability to shrink down transistors, and the interconnects between them, is hitting some basic physical limitations. In particular, when copper interconnects are scaled down, their resistivity skyrockets, which decreases how much information they can carry and increases their energy draw.
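To get a feel for why this matters, here is a back-of-the-envelope sketch with illustrative numbers and a simplified Fuchs-Sondheimer-style correction (assuming fully diffuse surface scattering and ignoring grain-boundary effects) of how a copper line’s effective resistivity grows as its width approaches the electron mean free path.

```python
# Back-of-the-envelope sketch (illustrative numbers, simplified physics):
# as a copper line's cross-section approaches the electron mean free path,
# surface scattering raises the effective resistivity. First-order
# Fuchs-Sondheimer-style correction for diffuse scattering:
#   rho_eff ~ rho_bulk * (1 + 3*lambda / (8*w))
RHO_BULK = 1.7e-8       # ohm*m, bulk copper resistivity at room temperature
MEAN_FREE_PATH = 39e-9  # m, electron mean free path in copper (~39 nm)

def effective_resistivity(width_m: float) -> float:
    """Approximate resistivity of a copper line of the given width."""
    return RHO_BULK * (1.0 + 3.0 * MEAN_FREE_PATH / (8.0 * width_m))

for width_nm in (100, 40, 20, 10):
    rho = effective_resistivity(width_nm * 1e-9)
    print(f"{width_nm:>4} nm line: ~{rho / RHO_BULK:.2f}x bulk resistivity")
```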

The industry has been looking for alternative interconnect materials to prolong the march of Moore’s Law a bit longer. Graphene is a very attractive option in many ways: the sheet-thin carbon material offers excellent electrical and thermal conductivity and is stronger than diamond.

However, researchers have struggled to incorporate graphene into mainstream computing applications for two main reasons. First, depositing graphene requires high temperatures that are incompatible with traditional CMOS manufacturing. And second, the charge carrier density of undoped, macroscopic graphene sheets is relatively low.


Dec 11, 2024

Beyond Silicon: How DNA Is Powering Next-Gen Computers

Posted in categories: biotech/medical, computing

Researchers have developed a new, fast, and rewritable method for DNA computing that promises smaller, more powerful computers.

This method mimics the sequential and simultaneous gene expression in living organisms and incorporates programmable DNA circuits with logic gates. The improved process places DNA on a solid glass surface, enhancing efficiency and reducing the need for manual transfers, culminating in a 90-minute reaction time in a single tube.
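As a purely conceptual sketch (the actual chemistry and the surface-based protocol in the study are far richer), DNA strand-displacement circuits are often abstracted as Boolean logic gates in which an output strand is released only when the required input strands are present; the toy model below, with hypothetical strand names, evaluates gates layer by layer the way genes are expressed sequentially.

```python
# Conceptual sketch only: DNA strand-displacement circuits are often
# abstracted as Boolean logic gates, where an output strand is released
# only when the required input strands are present. This toy model uses
# sets of strand names; the real wet-lab chemistry is far richer.
def and_gate(inputs: set[str], a: str, b: str, output: str) -> set[str]:
    """Release `output` only if both input strands are present."""
    return {output} if {a, b} <= inputs else set()

def or_gate(inputs: set[str], a: str, b: str, output: str) -> set[str]:
    """Release `output` if either input strand is present."""
    return {output} if {a, b} & inputs else set()

# Sequential, "gene-expression-like" evaluation: gates fire in order, and
# the strands they release become inputs to downstream gates.
strands = {"in1", "in2"}                            # hypothetical inputs
strands |= and_gate(strands, "in1", "in2", "x")     # layer 1
strands |= or_gate(strands, "x", "in3", "out")      # layer 2
print("out" in strands)  # True: the circuit's output strand was released
```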

Advancements in DNA-Based Computation.

Dec 11, 2024

Bastrop County approves incentives for billion-dollar data center campus

Posted in category: computing

The project is described in public documents as a “four-building data center campus facility,” with the costs of improvements estimated at $1.4 billion.

Dec 11, 2024

Leaner Large Language Models could enable Efficient Local Use on Phones and Laptops

Posted in categories: computing, engineering, information science, mobile phones

Large language models (LLMs) are increasingly automating tasks like translation, text classification and customer service. But tapping into an LLM’s power typically requires users to send their requests to a centralized server—a process that’s expensive, energy-intensive and often slow.

Now, researchers have introduced a technique for compressing an LLM’s reams of data, which could increase privacy, save energy and lower costs. Their findings are published on the arXiv preprint server.

The new algorithm, developed by engineers at Princeton and Stanford Engineering, works by trimming redundancies and reducing the precision of an LLM’s layers of information. This type of leaner LLM could be stored and accessed locally on a device like a phone or laptop and could provide performance nearly as accurate and nuanced as an uncompressed version.
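The paper’s exact algorithm isn’t spelled out here, but the two ideas mentioned, trimming redundancy and reducing precision, can be sketched generically as a low-rank approximation plus 8-bit quantization of a single weight matrix; the sizes and numbers below are illustrative only, not the researchers’ method.

```python
# Generic sketch of the two ideas mentioned above, not the researchers'
# algorithm: (1) trim redundancy with a low-rank approximation of a weight
# matrix, and (2) reduce precision by quantizing the remainder to int8.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512)).astype(np.float32)  # a stand-in "layer"

# (1) Truncated SVD keeps the dominant structure in a low-rank factor.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
rank = 64
W_lowrank = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

# (2) Quantize the residual to 8-bit integers with a single scale factor.
residual = W - W_lowrank
scale = np.abs(residual).max() / 127.0
residual_q = np.round(residual / scale).astype(np.int8)

# Reconstruction used at inference time: low-rank part + dequantized residual.
W_hat = W_lowrank + residual_q.astype(np.float32) * scale
rel_error = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(f"relative reconstruction error: {rel_error:.4f}")
```

Stored this way, the layer needs only the small low-rank factors in full precision plus one byte per remaining weight, which is the kind of footprint a phone or laptop can hold locally.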

Dec 11, 2024

Ultrafast Control of Nonlinear Hot Dirac Electrons in Graphene: An International Collaboration

Posted in categories: biotech/medical, computing, quantum physics

Nonlinear optics explores how powerful light (e.g. lasers) interacts with materials, resulting in the output light changing colour (i.e. frequency) or behaving differently based on the intensity of the incoming light. This field is crucial for developing advanced technologies such as high-speed communication systems and laser-based applications. Nonlinear optical phenomena enable the manipulation of light in novel ways, leading to breakthroughs in fields like telecommunications, medical imaging, and quantum computing.

Two-dimensional (2D) materials, such as graphene—a single layer of carbon atoms in a hexagonal lattice—exhibit unique properties due to their thinness and high surface area. Graphene’s exceptional electronic properties, related to relativistic-like Dirac electrons and strong light-matter interactions, make it promising for nonlinear optical applications, including ultrafast photonics, optical modulators, saturable absorbers in ultrafast lasers, and quantum optics.

Dr. Habib Rostami, from the Department of Physics at the University of Bath, has co-authored pioneering research published in Advanced Science. This study involved an international collaboration between an experimental team at Friedrich Schiller University Jena in Germany and theoretical teams at the University of Pisa in Italy and the University of Bath in the UK. The research aimed to investigate the ultrafast opto-electronic and thermal tuning of nonlinear optics in graphene.

This study discovers a new way to control high-harmonic generation in a graphene-based field-effect transistor. The team investigated the impact of lattice temperature, electron doping, and all-optical ultrafast tuning of third-harmonic generation in a hexagonal boron nitride-encapsulated graphene opto-electronic device. They demonstrated up to 85% modulation depth along with gate-tuneable ultrafast dynamics, a significant improvement over previous static tuning. Furthermore, by changing the lattice temperature of graphene, the team could enhance the modulation of its optical response, achieving a modulation factor of up to 300%. The experimental fabrication and measurement took place at Friedrich Schiller University Jena. Dr. Rostami played a crucial role in the study by crafting theoretical models. These models were developed in collaboration with another theory team at the University of Pisa to elucidate new effects observed in graphene.

Dec 11, 2024

Rethinking the quantum chip: Engineers present new design for superconducting quantum processor

Posted in categories: computing, engineering, quantum physics

Researchers at the UChicago Pritzker School of Molecular Engineering (UChicago PME) have realized a new design for a superconducting quantum processor, aiming at a potential architecture for the large-scale, durable devices the quantum revolution demands.

Unlike the typical quantum chip design that lays the information-processing qubits onto a 2D grid, the team from the Cleland Lab has designed a modular quantum processor comprising a reconfigurable router as a central hub. This enables any two qubits to connect and entangle, whereas in the older layout qubits can only talk to the qubits physically nearest to them.
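A conceptual sketch (not the actual UChicago PME device layout) of why a central router helps: on a nearest-neighbour grid, the number of hops between two qubits grows with their separation, while a star-shaped layout with one router keeps any pair two hops apart.

```python
# Conceptual sketch only: compare how far apart two qubits can be on a
# nearest-neighbour 2D grid versus a design where every qubit hangs off
# one central router, so any pair is at most two hops apart.
def grid_distance(a, b):
    """Manhattan distance between qubits at grid coordinates a and b."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def router_distance(a, b):
    """With a central router, any qubit reaches any other in two hops."""
    return 2 if a != b else 0

side = 8  # an 8x8 grid of qubits
corner_a, corner_b = (0, 0), (side - 1, side - 1)
print("grid, worst case: ", grid_distance(corner_a, corner_b))    # 14 hops
print("router, any pair: ", router_distance(corner_a, corner_b))  # 2 hops
```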

“A quantum computer won’t necessarily compete with a classical computer in things like memory size or CPU size,” said UChicago PME Prof. Andrew Cleland.

Dec 11, 2024

Scientists develop cost-effective lasers for extended short-wave infrared applications

Posted in categories: chemistry, computing, quantum physics

Current laser technologies for the extended short-wave infrared (SWIR) spectral range rely on expensive and complex materials, limiting their scalability and affordability. To address these challenges, ICFO researchers have presented a novel approach based on colloidal quantum dots in an Advanced Materials article. The team managed to emit coherent light (a necessary condition to create lasers) in the extended SWIR range with large colloidal quantum dots made of lead sulfide (PbS).

This new CQD-based technology offers a solution to the aforementioned challenges while maintaining compatibility with silicon CMOS platforms (the technology used for constructing integrated circuit chips) for on-chip integration.

Their PbS colloidal quantum dots are the first semiconductor lasing material to cover such a broad wavelength range. Remarkably, the researchers accomplished this without altering the dots’ chemical composition. These results pave the way towards the realization of more practical and compact lasers.

Dec 10, 2024

New quantum computing milestone smashes entanglement world record

Posted in categories: computing, quantum physics

Researchers have set a new record for quantum entanglement — bringing reliable quantum computers a step closer to reality. The scientists successfully entangled 24 “logical qubits” — low-error quantum bits of information created by combining multiple physical qubits. This is the highest number achieved to date.

They also demonstrated that logical qubits can maintain error correction as the number of qubits increases, a crucial step toward larger, more fault-tolerant quantum systems. The researchers detailed their work in a study published Nov. 18 on the preprint database arXiv.
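As a toy illustration of why combining physical qubits helps (the study uses genuine quantum error-correcting codes, not this classical stand-in), a repetition code with majority voting already shows how several noisy physical bits can behave as one more reliable logical bit.

```python
# Toy illustration only: the study's logical qubits use real quantum
# error-correcting codes, but a classical repetition code with majority
# voting shows the core idea that combining several noisy physical bits
# can yield one more reliable logical bit.
import random

def logical_error_rate(p_physical: float, n_copies: int, trials: int = 100_000) -> float:
    """Estimate how often majority voting over n noisy copies gets the bit wrong."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p_physical for _ in range(n_copies))
        if flips > n_copies // 2:   # majority of copies were corrupted
            errors += 1
    return errors / trials

for n in (1, 3, 5, 7):
    print(f"{n} physical copies -> logical error ~ {logical_error_rate(0.05, n):.4%}")
```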

Page 2 of 873