
How to See the Dead

Looking over the bench and the readings, he concluded that the previous night’s firmware update had introduced a timing mismatch. The wires hadn’t burnt out, but the clock that told them when to fire had been off by a microsecond, so the expected voltage response never lined up. He suspected half the channels had dropped out even though the hardware itself wasn’t damaged. Fifteen minutes and a simple firmware rollback later, everything worked perfectly.

Now, Lyre and I swapped the saline for neuron cultures to check whether the wires could stimulate and record real biological activity. While we ran those tests, Aux fine-tuned his AI encoder and processed April’s data.

We were finally ready to test the integrated system without yet risking its insertion into April’s brain. We built something we only half-jokingly called a “phantom cortex,” a benchtop stand-in: a synthetic cortical sheet of cultured neurons on a chip, designed to act as April’s visual cortex. On one side, we mounted a lab-grown retinal implant that carried live sensory input. On the other, Aux’s playback device pushed reconstructed memories. The phantom cortex’s visual field was rendered on a lab monitor so that we could assess the pattern projections. The rig buzzed faintly in the background, its gelled neuron sheets twitching under the microscope with each ripple of charge.

A New Era Of Trusted Payments

If you read my last post, you may have had the same reaction as the legendary fintech blogger Chris Skinner. Commenting on the post, entitled “Fintech’s New Power Couple: AI and Trust,” he politely corrected me: “AI, trust and DLT sir.”

As soon as I read his comment, I knew he was right. I had to write a follow-up post to correct my glaring omission. Since there are three forces converging here rather than two, I will update the title to make it both more contemporary and more accurate…

Fintech’s New Power Throuple is the convergence of AI, Trust, and Distributed Ledger Technology (DLT).

If I drew a diagram of the relationships between the three factors, it would take the form of a triangle. From my viewpoint, Trust would hold the uppermost position, with Blockchain and Artificial Intelligence occupying the two lower positions.

They are, in a sense, the technology layer that makes Trust possible.

Trust, after all, isn’t a technology… or is it? 🤔

(https://fintechconfidential-newsletter.beehiiv.com/p/m2020-a…-payments)

Traumatic Brain Injury and Artificial Intelligence: Shaping the Future of Neurorehabilitation—A Review

AI has emerged as a pivotal tool in redefining TBI rehabilitation, bridging gaps in traditional care with innovative, data-driven approaches. While its potential to enhance diagnostic accuracy, outcome prediction, and individualized therapy is evident, challenges such as bias in datasets and ethical implications must be addressed. Continued research and multidisciplinary collaboration will be key to harnessing AI’s full potential, ensuring equitable access and optimizing recovery outcomes for TBI patients.

Overall, the integration of AI in TBI rehabilitation presents numerous opportunities to advance patient care and enhance the effectiveness of therapeutic interventions.

The simulated Milky Way: 100 billion stars using 7 million CPU cores

Researchers have successfully performed the world’s first Milky Way simulation that accurately represents more than 100 billion individual stars over the course of 10,000 years. This feat was accomplished by combining artificial intelligence (AI) with numerical simulations. Not only does the simulation represent 100 times more individual stars than previous state-of-the-art models, but it was produced more than 100 times faster.

Published in Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, the study represents a breakthrough at the intersection of astrophysics, high-performance computing, and AI. Beyond astrophysics, the new methodology can be used to model other complex, multi-scale phenomena.
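The article does not detail how AI and simulation were combined, but a common pattern in this line of work is a learned surrogate standing in for an expensive physics step inside a conventional time-stepping loop. The sketch below is a minimal illustration under that assumption; `expensive_physics_step`, the linear surrogate, and all shapes are hypothetical placeholders, not the study’s code.

```python
import numpy as np

def expensive_physics_step(state: np.ndarray) -> np.ndarray:
    """Stand-in for a costly numerical computation (hypothetical placeholder)."""
    return state + 0.01 * np.sin(state)

class LinearSurrogate:
    """Toy 'AI' surrogate: least-squares fit of next_state ≈ state @ W."""
    def fit(self, X: np.ndarray, Y: np.ndarray) -> None:
        self.W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    def predict(self, X: np.ndarray) -> np.ndarray:
        return X @ self.W

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))                        # sampled simulation states
Y = np.stack([expensive_physics_step(x) for x in X])  # labels from the slow model

surrogate = LinearSurrogate()
surrogate.fit(X, Y)

# Main loop: the cheap surrogate replaces the expensive call at each step.
state = rng.normal(size=(1, 8))
for _ in range(100):
    state = surrogate.predict(state)
```

The speedup in such schemes comes from amortizing the expensive computation into training data once, then reusing the fast learned model millions of times.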

MAKER: Large Language Models (LLMs) have achieved remarkable breakthroughs in reasoning, insight generation, and tool use

They can plan multi-step actions, generate creative solutions, and assist in complex decision-making. Yet these strengths fade when tasks stretch over long, dependent sequences. Even small per-step error rates compound quickly, turning an impressive short-term performance into complete long-term failure.
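The arithmetic behind that compounding is worth making explicit: if each step succeeds independently with probability 1 − ε, the probability of finishing n steps flawlessly is (1 − ε)^n. A minimal sketch in Python, with illustrative numbers that are not from the article:

```python
# Independent per-step success with probability (1 - eps) compounds as
# P(all n steps succeed) = (1 - eps) ** n.
for eps in (0.001, 0.0001):            # 99.9% and 99.99% per-step accuracy
    for n in (100, 10_000, 1_000_000):
        p = (1 - eps) ** n
        print(f"eps={eps}, steps={n:>9,}: success probability = {p:.6f}")

# With 99.9% per-step accuracy, (0.999) ** 1_000_000 underflows to ~0:
# impressive short-term performance, near-certain long-term failure.
```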

That fragility poses a fundamental obstacle for real-world systems. Most large-scale human and organizational processes – from manufacturing and logistics to finance, healthcare, and governance – depend on millions of actions executed precisely and in order. A single mistake can cascade through an entire pipeline. For AI to become a reliable participant in such processes, it must do more than reason well. It must maintain flawless execution over time, sustaining accuracy across millions of interdependent steps.

Apple’s recent study, The Illusion of Thinking, captured this challenge vividly. Researchers tested advanced reasoning models such as Claude 3.7 Thinking and DeepSeek-R1 on structured puzzles like Towers of Hanoi, where each additional disk doubles the number of required moves. The results revealed a sharp reliability cliff: models performed perfectly on simple problems but failed completely once the task crossed about eight disks, even when token budgets were sufficient. In short, more “thinking” led to less consistent reasoning.
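The doubling follows from the classic recurrence: moving n disks means moving n − 1 disks aside, moving one disk, then moving the n − 1 disks back on top, so moves(n) = 2·moves(n − 1) + 1 = 2^n − 1. A quick check of the scale at the reported cliff:

```python
def hanoi_moves(n: int) -> int:
    """Minimum moves for n disks: moves(n) = 2 * moves(n - 1) + 1 = 2**n - 1."""
    return 2 ** n - 1

for n in range(1, 13):
    print(f"{n:>2} disks -> {hanoi_moves(n):>5,} moves")

# At the reported ~8-disk cliff the optimal solution already takes 255 moves;
# every added disk roughly doubles the length of the required move sequence.
```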

AI at the speed of light just became a possibility

Researchers at Aalto University have demonstrated single-shot tensor computing at the speed of light, a remarkable step towards next-generation artificial general intelligence hardware powered by optical computation rather than electronics.

Tensor operations are the kind of arithmetic that forms the backbone of nearly all modern technologies, especially artificial intelligence, yet they extend beyond the simple math we’re familiar with. Imagine the mathematics behind rotating, slicing, or rearranging a Rubik’s cube along multiple dimensions. While humans and classical computers must perform these operations step by step, light can do them all at once.

Today, every task in AI, from image recognition to language processing, relies on tensor operations. However, the explosion of data has pushed conventional digital computing platforms, such as GPUs, to their limits in terms of speed, scalability and energy consumption.
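To make “tensor operations” concrete, here is a minimal NumPy sketch of a batched contraction of the kind that dominates neural-network workloads. It illustrates the arithmetic a digital processor executes step by step, not the optical hardware itself; shapes and names are illustrative.

```python
import numpy as np

# out[b, i, k] = sum_j A[b, i, j] * B[b, j, k]: a batched tensor contraction,
# the workhorse of neural-network inference on digital hardware.
rng = np.random.default_rng(42)
A = rng.normal(size=(32, 64, 128))   # e.g., a batch of activation matrices
B = rng.normal(size=(32, 128, 256))  # e.g., a batch of weight matrices

out = np.einsum('bij,bjk->bik', A, B)
print(out.shape)  # (32, 64, 256)

# Digitally this costs 32 * 64 * 128 * 256 multiply-accumulate operations,
# executed step by step; the optical scheme aims to perform the whole
# contraction in a single shot as light propagates through the device.
```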

AI Bubble

A lot of people are talking about an AI bubble, since it is normal for tech to explode in growth for a while, collapse a bit, and then eventually move forward again.

WE ARE NOT IN AN AI BUBBLE. THE SINGULARITY HAS BEGUN.

There will not be a year between now and the upcoming AI takeover in which worldwide AI data center spending declines.
