
In today’s AI news, Mukesh Ambani’s Reliance Industries is set to build the world’s largest data center in Jamnagar, Gujarat, according to a *Bloomberg News* report. The facility would dwarf the current largest data center, Microsoft’s 600-megawatt site in Virginia. The project could cost between $20 billion and $30 billion.

Meanwhile, ByteDance, the Beijing-based company behind China’s most popular consumer-facing AI app, introduced its closed-source multimodal model Doubao 1.5 Pro, emphasizing a “resource-efficient” training approach that it said does not sacrifice performance.

And, OpenAI’s CEO Sam Altman announced that the free tier of ChatGPT will now use the o3-mini model, marking a significant shift in how the popular AI chatbot serves its user base. In the same tweet announcing the change, Altman revealed that paid subscribers to ChatGPT Plus and Pro plans will enjoy “tons of o3-mini usage,” giving people an incentive to move to a paid account with the company.

Then, researchers at Sakana AI, an AI research lab focusing on nature-inspired algorithms, have developed a self-adaptive language model that can learn new tasks without the need for fine-tuning. Called Transformer², the model uses mathematical tricks to align its weights with user requests during inference.
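Sakana reportedly achieves this by decomposing each weight matrix with a singular value decomposition and rescaling the singular values with small task-specific vectors chosen at inference time. The NumPy sketch below illustrates only that core idea; the random matrix and the scaling vector z stand in for a real layer and a learned adapter, and the task-selection machinery is omitted entirely.

```python
import numpy as np

# Minimal sketch of SVD-based weight adaptation (the general idea behind
# inference-time "singular value" rescaling); an illustration only, not
# Sakana AI's actual implementation.

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))          # a frozen weight matrix from a pretrained layer

U, s, Vt = np.linalg.svd(W, full_matrices=False)

# A task-specific vector z rescales the singular values at inference time;
# here z is random purely for demonstration.
z = 1.0 + 0.1 * rng.normal(size=s.shape)

W_adapted = U @ np.diag(s * z) @ Vt    # adapted weights for the current task

x = rng.normal(size=64)
print("base output:   ", (W @ x)[:4])
print("adapted output:", (W_adapted @ x)[:4])
```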

In videos, Demis Hassabis, CEO of Google DeepMind, joins the Big Technology Podcast with Alex Kantrowitz to discuss the cutting edge of AI and where the research is heading. In this conversation, they cover the path to artificial general intelligence, how long it will take to get there, how to build world models, and much more.

Then, join IBM’s Meredith Mante as she takes you on a deep dive into Lag Llama, an open-source foundation model, and shows you how to harness its power for time series forecasting. Learn how to load and preprocess data, train a model, and evaluate its performance, gaining a deeper understanding of how to leverage Lag Llama for accurate predictions.
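As a rough sketch of the workflow the video covers (load data, build a predictor, evaluate forecasts), the snippet below uses GluonTS conventions; the `LagLlamaEstimator` import, checkpoint path, CSV file, and constructor arguments are assumptions based on the open-source Lag-Llama project’s examples and may not match the current API, so check the repository before using it.

```python
import pandas as pd
from gluonts.dataset.pandas import PandasDataset
from gluonts.evaluation import make_evaluation_predictions

# Assumed import path and interface from the open-source Lag-Llama project;
# verify against the repository's examples for the exact, current API.
from lag_llama.gluon.estimator import LagLlamaEstimator

# Load and lightly preprocess a univariate time series (hypothetical CSV).
df = pd.read_csv("energy_demand.csv", parse_dates=["timestamp"]).set_index("timestamp")
dataset = PandasDataset(df, target="demand", freq="H")

# Build a zero-shot predictor from a pretrained checkpoint (assumed arguments).
estimator = LagLlamaEstimator(
    ckpt_path="lag-llama.ckpt",
    prediction_length=24,   # forecast the next 24 hours
    context_length=96,      # condition on the previous 4 days
)
predictor = estimator.create_predictor(
    estimator.create_transformation(),
    estimator.create_lightning_module(),
)

# Evaluate probabilistic forecasts against held-out data.
forecasts, targets = make_evaluation_predictions(dataset=dataset, predictor=predictor)
for forecast in forecasts:
    print(forecast.mean[:5])
```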

Quantum entanglement—a phenomenon where particles are mysteriously linked no matter how far apart they are—presents a long-standing challenge in the physical world, particularly in understanding its behavior within complex quantum systems.

A research team from the Department of Physics at The University of Hong Kong (HKU) and their collaborators have recently developed a novel algorithm in quantum physics known as ‘entanglement microscopy’ that enables visualization and mapping of this extraordinary phenomenon at a microscopic scale.

By zooming in on the intricate interactions of entangled particles, one can uncover the hidden structures of quantum matter, revealing insights that could transform technology and deepen the understanding of the universe.

Summary: A study reveals that London taxi drivers prioritize complex and distant junctions during their initial “offline thinking” phase when planning routes, rather than sequentially considering streets. This efficient, intuitive strategy leverages spatial awareness and contrasts with AI algorithms, which typically follow step-by-step approaches.

The findings highlight the unique planning abilities of expert human navigators, influenced by their deep memory of London’s intricate street network. Researchers suggest that studying human expert intuition could improve AI algorithms, especially for tasks involving flexible planning and human-AI collaboration.

As artificial intelligence models become increasingly advanced, electronics engineers have been trying to develop new hardware that is better suited for running these models, while also limiting power consumption and boosting the speed at which they process data. Some of the most promising solutions designed to meet the needs of machine learning algorithms are platforms based on memristors.

Memristors, or memory resistors, are electrical components that can retain their resistance even in the absence of electrical power, adjusting their resistance based on the electrical charge passing through them. This means that they can simultaneously support both the storage and processing of information, which could be advantageous for running machine learning algorithms.
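To make the “storage plus processing” point concrete, here is a small NumPy sketch of the textbook way memristor crossbars are used: the device conductances store a weight matrix, and Ohm’s and Kirchhoff’s laws compute a matrix-vector product in place. The array size and values are illustrative only.

```python
import numpy as np

# Idealized memristor crossbar: each device stores a conductance G[i, j] (in siemens).
# Applying voltages V[j] to the columns produces row currents I[i] = sum_j G[i, j] * V[j],
# i.e. the crossbar computes a matrix-vector product where the data is stored.

rng = np.random.default_rng(1)
G = rng.uniform(1e-6, 1e-4, size=(4, 8))   # stored weights, encoded as conductances
V = rng.uniform(0.0, 0.2, size=8)          # input vector, encoded as read voltages

I = G @ V                                  # Kirchhoff's current law sums per-row currents
print("output currents (A):", I)
```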

Memristor-based devices could be used to develop more compact and energy-efficient hardware for running AI models, including emerging distributed computing solutions referred to as edge computing systems. Despite their advantages, many existing memristor-based platforms have been found to have notable limitations, adversely impacting their reliability and endurance.

The challenge for researchers is to develop the often complicated systems of equations needed to describe these phenomena, and to ensure that they can be solved to recover information on the location of the objects over time. Often these systems are built from partial differential equations; a set of equations describing the location and time-evolution of a system in this way is known as a distributed parameter system.

Mathematical models can help us not just understand historical behaviour but predict where the smoke particles will spread next.

Professor Francisco Jurado at the Tecnológico Nacional de México has been working on ways to solve distributed parameter systems that describe diffusion–convection processes. He recently developed an approach combining the Sturm–Liouville differential operator with the regulator problem to obtain a model of diffusion–convection behaviour that remains stable and free of external disturbances. Importantly, this approach yields meaningful information for real systems.
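As a concrete, simplified example of the kind of system being described (not Professor Jurado’s regulator-based method), the sketch below integrates a one-dimensional diffusion–convection equation, ∂u/∂t = D ∂²u/∂x² − v ∂u/∂x, with an explicit finite-difference scheme to track how an initial puff of material spreads and drifts. All parameter values are illustrative only.

```python
import numpy as np

# Explicit finite-difference solution of a 1D diffusion-convection equation
#   du/dt = D * d2u/dx2 - v * du/dx
# with zero boundary values; parameters are illustrative only.

D, v = 0.01, 0.5          # diffusion coefficient and convection (advection) speed
L_dom, nx = 1.0, 101
dx = L_dom / (nx - 1)
dt = 0.2 * min(dx * dx / (2 * D), dx / abs(v))   # stable step for this explicit scheme

x = np.linspace(0.0, L_dom, nx)
u = np.exp(-((x - 0.3) ** 2) / 0.002)            # initial "puff" of smoke-like concentration

for _ in range(500):
    diffusion = D * (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    convection = -v * (u - np.roll(u, 1)) / dx   # upwind difference for stability
    u = u + dt * (diffusion + convection)
    u[0] = u[-1] = 0.0                           # Dirichlet boundary conditions

print("peak has drifted to x =", x[np.argmax(u)])
```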

A new era in computing is emerging as researchers overcome the limitations of Moore’s Law through photonics.

This cutting-edge approach boosts processing speeds and slashes energy use, potentially revolutionizing AI and machine learning.

Machine learning is a subset of artificial intelligence (AI) that deals with the development of algorithms and statistical models that enable computers to learn from data and make predictions or decisions without being explicitly programmed to do so. Machine learning is used to identify patterns in data, classify data into different categories, or make predictions about future events. It can be categorized into three main types of learning: supervised, unsupervised and reinforcement learning.
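As a minimal illustration of the first two categories, the short scikit-learn sketch below fits a supervised classifier and an unsupervised clustering model on the same toy data; the dataset and model choices are arbitrary examples, not a recommendation.

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Toy data: 200 two-dimensional points drawn from 2 clusters, with labels y.
X, y = make_blobs(n_samples=200, centers=2, random_state=0)

# Supervised learning: use the labels to train a classifier.
clf = LogisticRegression().fit(X, y)
print("classifier accuracy:", clf.score(X, y))

# Unsupervised learning: ignore the labels and discover structure in X alone.
km = KMeans(n_clusters=2, random_state=0).fit(X)
print("cluster assignments for first 5 points:", km.labels_[:5])

# Reinforcement learning, the third category, instead learns from rewards
# obtained by acting in an environment, so it requires a different setup.
```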

Observing the effects of special relativity doesn’t necessarily require objects moving at a significant fraction of the speed of light. In fact, length contraction in special relativity explains how electromagnets work. A magnetic field is just an electric field seen from a different frame of reference.

So, when an electron moves in the electric field of another electron, this special relativistic effect means the moving electron experiences a magnetic field, which couples to the electron’s spin angular momentum.

The interaction of spin with a magnetic field was, after all, how spin was discovered, in the Stern–Gerlach experiment of 1922. The pair spin–orbit interaction was made explicit by Gregory Breit in 1928 and then found in Dirac’s special relativistic quantum mechanics. This confirmed an equation for the splitting of atomic energy levels developed by Llewellyn Thomas in 1926, arising from 1) the special relativistic magnetic field seen by the electron due to its movement (“orbit”) around the positively charged nucleus, and 2) the electron’s spin magnetic moment interacting with this field.
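For reference, the resulting spin–orbit term in the atomic Hamiltonian, including the factor of 1/2 supplied by Thomas’s analysis, takes the standard textbook form below, where V(r) is the electron’s potential energy in the nuclear field and L and S are the orbital and spin angular momentum operators (quoted here for context, not from the article):

```latex
H_{\mathrm{SO}} \;=\; \frac{1}{2 m_e^{2} c^{2}}\,\frac{1}{r}\frac{dV}{dr}\;\mathbf{L}\cdot\mathbf{S}
```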

Artificial intelligence (AI) once seemed like a fantastical construct of science fiction, enabling characters to deploy spacecraft to neighboring galaxies with a casual command. Humanoid AIs even served as companions to otherwise lonely characters. Now, in the very real 21st century, AI is becoming part of everyday life, with tools like chatbots available and useful for everyday tasks like answering questions, improving writing, and solving mathematical equations.

AI does, however, have the potential to revolutionize scientific research, in ways that can feel like science fiction but are within reach.

At the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, scientists are already using AI to automate experiments and discover new materials. They’re even designing an AI scientific companion that communicates in ordinary language and helps conduct experiments. Kevin Yager, the Electronic Nanomaterials Group leader at the Center for Functional Nanomaterials (CFN), has articulated an overarching vision for the role of AI in scientific research.

Physically Intuitive Anisotropic Model of Hardness https://arxiv.org/abs/2412.


Skoltech researchers have presented a simple new physical model for predicting the hardness of materials based on information about the shear modulus and equations of state of crystal structures. The model is useful for a wide range of practical applications: all parameters in it can be determined through basic calculations or measured experimentally.

The results of the study are presented in the Physical Review Materials journal.

Hardness is an important property of materials that determines their ability to resist deformation and other damage (dents, scratches) from external forces. It is typically determined by pressing an indenter into the test sample; the indenter must be made of a harder material, usually diamond.
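The post does not reproduce the Skoltech model’s equations. As a rough illustration of the general idea of estimating hardness from elastic data, the sketch below evaluates the well-known Chen–Niu empirical relation, Hv = 2(k²G)^0.585 − 3 with k = G/B, using approximate literature moduli for diamond; this is an older, different model than the one in the paper.

```python
# Chen-Niu empirical hardness model (Chen et al., Intermetallics, 2011):
#   Hv [GPa] = 2 * (k**2 * G)**0.585 - 3,   with k = G / B (Pugh's ratio)
# A widely used shear-modulus-based estimate, shown here only to illustrate
# predicting hardness from elastic data; it is not the new Skoltech model.

def vickers_hardness(G_gpa: float, B_gpa: float) -> float:
    """Estimate Vickers hardness (GPa) from shear modulus G and bulk modulus B."""
    k = G_gpa / B_gpa
    return 2.0 * (k**2 * G_gpa) ** 0.585 - 3.0

# Approximate elastic moduli for diamond (rounded literature values).
print(f"diamond: ~{vickers_hardness(535.0, 443.0):.0f} GPa")
```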