
MIT’s Optical AI Chip That Could Revolutionize 6G at the Speed of Light

As more connected devices require greater bandwidth for activities like teleworking and cloud computing, managing the limited wireless spectrum shared by all users is becoming increasingly difficult.

To address this, engineers are turning to artificial intelligence.

Elon Musk: Digital Superintelligence, Multiplanetary Life, How to Be Useful

A fireside with Elon Musk at AI Startup School in San Francisco.

Before rockets and robots, Elon Musk was drilling holes through his office floor to borrow internet. In this candid talk, he walks through the early days of Zip2, the Falcon 1 launches that nearly ended SpaceX, and the “miracle” of Tesla surviving 2008.

He shares the thinking that guided him—building from first principles, doing useful things, and the belief that we’re in the middle of an intelligence big bang.

Chapters:

00:00 — Intro
01:25 — His origin story
02:00 — Dream to help build the internet
04:40 — Zip2 and lessons learned
08:00 — PayPal
14:30 — Origin of SpaceX
18:30 — Building rockets from first principles
23:50 — Lessons in leadership
27:10 — Building up xAI
39:00 — Superintelligence and synthetic data
39:30 — Multi-planetary future
43:00 — Neuralink, AI safety and the singularity

Andrej Karpathy: Software Is Changing (Again)

Andrej Karpathy’s keynote at AI Startup School in San Francisco. Slides provided by Andrej: https://drive.google.com/file/d/1a0h1mkwfmV2PlekxDN8isMrDA5evc4wW

Drawing on his work at Stanford, OpenAI, and Tesla, Andrej sees a shift underway. Software is changing, again. We’ve entered the era of “Software 3.0,” where natural language becomes the new programming interface and models do the rest.

He explores what this shift means for developers, users, and the design of software itself—that we’re not just using new tools, but building a new kind of computer.

More content from Andrej: @andrejkarpathy.

Chapters and Thoughts (From Andrej Karpathy!)
0:00 — Imo fair to say that software is changing quite fundamentally again. LLMs are a new kind of computer, and you program them *in English*. Hence I think they are well deserving of a major version upgrade in terms of software.
6:06 — LLMs have properties of utilities, of fabs, and of operating systems → New LLM OS, fabbed by labs, and distributed like utilities (for now). Many historical analogies apply — imo we are computing circa ~1960s.
14:39 — LLM psychology: LLMs = …

Scientists propose blueprint for ‘universal translator’ in quantum networks

UBC researchers are proposing a solution to a key hurdle in quantum networking: a device that can “translate” microwave to optical signals and vice versa.

The technology could serve as a universal translator for quantum computers—enabling them to talk to one another over long distances and converting up to 95% of a signal with virtually no noise. And it all fits on a silicon chip, the same material found in everyday computers.

“It’s like finding a translator that gets nearly every word right, keeps the message intact and adds no background chatter,” says study author Mohammad Khalifa, who conducted the research during his Ph.D. at UBC’s Faculty of Applied Science and the Stewart Blusson Quantum Matter Institute (SBQMI).

Neuron–astrocyte associative memory

For decades, scientists believed that glial cells—the brain’s “support staff”—were just passive helpers to the neurons that do the heavy lifting of thinking and remembering. But that view is rapidly changing.


Astrocytes, the most abundant type of glial cell, play a fundamental role in memory. Despite most hippocampal synapses being contacted by an astrocyte, there are no current theories that explain how neurons, synapses, and astrocytes might collectively contribute to memory function. We demonstrate that fundamental aspects of astrocyte morphology and physiology naturally lead to a dynamic, high-capacity associative memory system. The neuron–astrocyte networks generated by our framework are closely related to popular machine learning architectures known as Dense Associative Memories. Adjusting the connectivity pattern, the model developed here leads to a family of associative memory networks that includes a Dense Associative Memory and a Transformer as two limiting cases.
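The abstract notes that the neuron–astrocyte framework includes a Dense Associative Memory (a modern Hopfield network) as one limiting case. As a rough illustration of what that limiting case does, here is a minimal sketch of a Dense Associative Memory retrieval step — recovering a stored pattern from a corrupted cue via a softmax attention update. All function and variable names are illustrative and not taken from the paper's code.

```python
import numpy as np

def dam_retrieve(patterns, query, beta=8.0, steps=5):
    """Iteratively sharpen `query` toward the closest stored pattern.

    patterns: (K, D) array of stored memories
    query:    (D,) noisy or partial cue
    beta:     inverse temperature; higher beta gives sharper retrieval
    """
    x = query.astype(float)
    for _ in range(steps):
        # Similarity of the current state to every stored pattern
        scores = beta * patterns @ x
        # Softmax attention over memories (the Transformer-like limit)
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        # New state is an attention-weighted blend of the memories
        x = weights @ patterns
    return x

# Usage: recover a stored +/-1 pattern from a partially flipped cue
rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(10, 64))
cue = patterns[3].copy()
cue[:16] *= -1                      # corrupt a quarter of the bits
recovered = dam_retrieve(patterns, cue)
```

The softmax update is the same operation that makes these networks a limiting case of Transformer attention: the cue acts as a query, the stored patterns as both keys and values.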

Traversal Emerges From Stealth With $48 Million From Sequoia And Perkins To Reimagine Site Reliability In The AI Era

With more code created by AI, there is more surface area to troubleshoot. There is a need for AI that can autonomously troubleshoot, remediate, and even prevent complex incidents at scale—self-healing codegen.

Artificial neural networks reveal how peripersonal neurons represent the space around the body

The brains of humans and other primates are known to execute various sophisticated functions, one of which is the representation of the space immediately surrounding the body. This area, also sometimes referred to as “peripersonal space,” is where most interactions between people and their surrounding environment typically take place.

Researchers at the Chinese Academy of Sciences, the Italian Institute of Technology (IIT) and other institutes recently investigated the neural processes through which the brain represents the area around the body, using brain-inspired computational models. Their findings, published in Nature Neuroscience, suggest that receptive fields surrounding different parts of the body contribute to building a modular model of the space immediately surrounding a person or artificial intelligence (AI) agent.

“Our journey into this field began truly serendipitously, during unfunded experiments done purely out of curiosity,” Giandomenico Iannetti, senior author of the paper, told Medical Xpress. “We discovered that the hand-blink reflex, which is evoked by electrically shocking the hand, was strongly modulated by the position of the hand with respect to the eye.”

Electron microscopy technique captures nanoparticle organizations to forge new materials

A research team including members from the University of Michigan has unveiled a new observational technique that’s sensitive to the dynamics of the intrinsic quantum jiggles of materials, or phonons.

This work will help scientists and engineers better design metamaterials—substances that possess exotic properties that rarely exist in nature—that are reconfigurable and made from solutions containing nanoparticles that self-assemble into larger structures, the researchers said. These materials have wide-ranging applications, from shock absorption to devices that guide acoustic and optical energy in high-powered computer applications.

“This opens a new research area where nanoscale building blocks—along with their intrinsic optical and electromagnetic properties—can be incorporated into mechanical metamaterials, enabling emerging technologies in multiple fields from robotics and mechanical engineering to information technology,” said Xiaoming Mao, U-M professor of physics and co-author of the new study.