New Research Suggests That We Are Nearing the Limit of Human Life Expectancy

Life expectancy growth has slowed since 1990, with average gains of only 6.5 years in the longest-living populations, suggesting a possible biological limit. A new study emphasizes shifting focus from merely extending life to improving the quality of life through advancements in aging science. Life expectancy saw dramatic increases throughout…

What If We Became A Type 3 Civilization? 15 Predictions

This video explores what life would be like if we became a Type 3 Civilization.

Bio-Circuitry Mimics Synapses and Neurons — Accelerates Routes to Brain-Like Computing

Researchers at the Department of Energy’s Oak Ridge National Laboratory, the University of Tennessee, and Texas A&M University demonstrated bio-inspired devices that accelerate routes to neuromorphic, or brain-like, computing.

Results published in Nature Communications report the first example of a lipid-based “memcapacitor,” a charge storage component with memory that processes information much like synapses do in the brain. Their discovery could support the emergence of computing networks modeled on biology for a sensory approach to machine learning.

“Our goal is to develop materials and computing elements that work like biological synapses and neurons—with vast interconnectivity and flexibility—to enable autonomous systems that operate differently than current computing devices and offer new functionality and learning capabilities,” said Joseph Najem, a recent postdoctoral researcher at ORNL’s Center for Nanophase Materials Sciences, a DOE Office of Science User Facility, and current assistant professor of mechanical engineering at Penn State.
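
To make the “charge storage with memory” idea concrete, here is a minimal toy simulation of a generic memcapacitor: the stored charge follows q = C(x)·v, where an internal state x integrates the applied voltage and slowly decays, so the capacitance depends on the device's stimulation history, loosely like short-term synaptic plasticity. This is a generic sketch with made-up parameters, not the lipid-based device reported in Nature Communications.

```python
import numpy as np

# Toy volatile memcapacitor: the capacitance depends on an internal state x
# that integrates the applied voltage and decays over time, loosely mimicking
# short-term synaptic facilitation. This is a generic illustrative model,
# NOT the ORNL/UT/Texas A&M lipid device; all parameters are invented.

C_MIN, C_MAX = 1e-12, 10e-12   # capacitance range (farads), illustrative
ALPHA, BETA = 50.0, 5.0        # state gain and decay rate (1/s), illustrative
DT = 1e-4                      # integration time step (s)

def simulate(voltages):
    """Integrate dx/dt = ALPHA*|v| - BETA*x with x clipped to [0, 1]."""
    x, states, charges = 0.0, [], []
    for v in voltages:
        x += DT * (ALPHA * abs(v) - BETA * x)
        x = min(max(x, 0.0), 1.0)
        c = C_MIN + (C_MAX - C_MIN) * x   # state-dependent capacitance
        states.append(x)
        charges.append(c * v)             # stored charge q = C(x) * v
    return np.array(states), np.array(charges)

# A train of short 100 mV pulses: each pulse nudges the state upward, and the
# state (hence the capacitance) partially decays between pulses. That history
# dependence is the "memory" that makes the element synapse-like.
t = np.arange(0.0, 0.5, DT)
pulses = (np.sin(2 * np.pi * 20 * t) > 0.95).astype(float) * 0.1
states, charges = simulate(pulses)
print(f"internal state after pulse train: {states[-1]:.3f}")
print(f"peak stored charge: {charges.max():.2e} C")
```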

Overcoming ‘catastrophic forgetting’: Algorithm inspired by brain allows neural networks to retain knowledge

Neural networks have a remarkable ability to learn specific tasks, such as identifying handwritten digits. However, these models often experience “catastrophic forgetting” when taught additional tasks: they can successfully learn the new assignments but “forget” how to complete the original ones. For many artificial neural networks, like those that guide self-driving cars, learning additional tasks thus requires that they be fully reprogrammed.

Biological brains, on the other hand, are remarkably flexible. Humans and animals can easily learn how to play a new game, for instance, without having to re-learn how to walk and talk.

Inspired by the flexibility of human and animal brains, Caltech researchers have now developed a new type of algorithm that enables neural networks to be continuously updated with new data, learning from it without having to start from scratch. The algorithm, called the functionally invariant path (FIP) algorithm, has wide-ranging applications, from improving recommendations in online stores to fine-tuning self-driving cars.
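
The forgetting effect itself is easy to reproduce. The sketch below trains a tiny logistic-regression “network” on one toy task, then on a second, conflicting task, and measures how accuracy on the first collapses. It illustrates the problem the FIP algorithm targets, not the FIP method itself; all data and parameters here are invented for illustration.

```python
import numpy as np

# Minimal demonstration of catastrophic forgetting: naive sequential gradient
# descent on two conflicting toy tasks. This shows the PROBLEM only; the
# Caltech FIP algorithm instead moves along paths in weight space that keep
# previously learned behavior intact, which plain training does not do.

rng = np.random.default_rng(0)

def make_task(center):
    """Two Gaussian blobs separated along a task-specific direction."""
    X0 = rng.normal(-np.asarray(center), 0.5, size=(200, 2))
    X1 = rng.normal(+np.asarray(center), 0.5, size=(200, 2))
    return np.vstack([X0, X1]), np.array([0] * 200 + [1] * 200)

def train(w, X, y, lr=0.5, steps=200):
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))    # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)      # logistic-loss gradient step
    return w

def accuracy(w, X, y):
    return np.mean(((X @ w) > 0) == y)

# Task A separates classes along the x-axis, task B along the y-axis, so the
# gradients of the two tasks pull the weights in incompatible directions.
XA, yA = make_task([1.5, 0.0])
XB, yB = make_task([0.0, 1.5])

w = np.zeros(2)
w = train(w, XA, yA)
print(f"task A accuracy after training on A: {accuracy(w, XA, yA):.2f}")
w = train(w, XB, yB)   # no safeguards: task A knowledge gets overwritten
print(f"task A accuracy after training on B: {accuracy(w, XA, yA):.2f}")
print(f"task B accuracy after training on B: {accuracy(w, XB, yB):.2f}")
```

On a typical run, task A accuracy falls from near 1.0 toward chance once the network has been trained on task B.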

Scientists invent artificial plant that cleans indoor air and generates electricity

Scientists have invented an artificial plant that can clean indoor air while generating enough electricity to power a smartphone.

A team from Binghamton University in New York created an artificial leaf “for fun” using five biological solar cells containing photosynthetic bacteria, before realising that the device could be used for practical applications.

A proof-of-concept plant with five artificial leaves was capable of generating electricity and oxygen while removing CO2 far more efficiently than natural plants.

DoD launches new biological defense supercomputer at Lawrence Livermore Lab

The US government has launched a new supercomputer in Livermore, California.

The Department of Defense (DoD) and National Nuclear Security Administration (NNSA) this month inaugurated a new supercomputing system dedicated to biological defense at the Lawrence Livermore National Laboratory (LLNL).

Specifications were not shared, but the machine uses the same architecture as the upcoming El Capitan system.

Decoding Nature’s Hidden Messages

Living organisms constantly navigate dynamic and noisy environments, where they must efficiently sense, interpret, and respond to a wide range of signals. The ability to accurately process information is vital both for executing interspecies survival strategies and for maintaining stable cellular functions, which operate across multiple temporal and spatial scales [1] (Fig. 1). However, these systems often have access to only limited information. They interact with their surroundings through a subset of observable variables, such as chemical gradients or spatial positions, all while operating within constrained energy budgets. In this context, Giorgio Nicoletti of the Swiss Federal Institute of Technology in Lausanne (EPFL) and Daniel Maria Busiello of the Max Planck Institute for the Physics of Complex Systems in Germany applied information theory and stochastic thermodynamics to provide a unified framework addressing this topic [2]. Their work has uncovered potentially fundamental principles behind transduction mechanisms that extract information from a noisy environment.

Bacteria, cells, swarms, and other organisms have been observed acquiring information about the environment at extraordinarily high precision. Bacteria can read surrounding chemical gradients to reach regions of high nutrients consistently [3], and cells form patterns during development repetitively and stably by receiving information on the distribution and concentration of external substances, called morphogens [4]. In doing so, they must interact with a noisy environment where the information available is scrambled and needs to be retrieved without corrupting the relevant signal [5]. All this comes at a cost.
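
As a rough illustration of what high-precision sensing costs, consider the simplest version of the problem: estimating a concentration from stochastic molecule arrivals. In the Poisson-counting sketch below, the relative error of the estimate falls only as the square root of the integration time, so each additional order of magnitude in precision costs a hundredfold more measuring time. The numbers are illustrative, not drawn from the papers cited here.

```python
import numpy as np

# Why extracting a clean signal from a noisy environment takes resources:
# a cell estimating a chemical concentration from stochastic (Poisson)
# molecule-arrival counts. Longer integration time T improves relative
# precision only like 1/sqrt(T), so precision is paid for in time (and,
# in a real cell, energy). The rate below is hypothetical.

rng = np.random.default_rng(1)

TRUE_RATE = 100.0   # mean molecule arrivals per unit time (illustrative)

def relative_error(integration_time, trials=10_000):
    """Estimate the arrival rate from one Poisson count per trial."""
    counts = rng.poisson(TRUE_RATE * integration_time, size=trials)
    estimates = counts / integration_time
    return estimates.std() / TRUE_RATE

for T in [0.1, 1.0, 10.0, 100.0]:
    print(f"integration time {T:6.1f} -> relative error {relative_error(T):.3f}")
# Expected scaling: 1 / sqrt(TRUE_RATE * T), i.e. 10x more time buys
# only ~3.2x more precision.
```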

The idea that precision is not free is an old one in the field of stochastic thermodynamics, and the cost usually comes in the form of energy dissipation [6]. This trade-off is even more relevant for biological systems that have limited access to energy sources. Living systems are pushed to find optimal strategies to achieve maximum precision while minimizing energy consumption. Consequently, a complete quantitative description of how these strategies are implemented requires the simultaneous application of information theory and stochastic—that is, noisy—thermodynamics.
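
A back-of-the-envelope version of this trade-off can be assembled from two textbook results: the mutual information of a Gaussian channel, I = 0.5 * log2(1 + SNR) bits, and Landauer's bound of kT ln 2 joules per bit erased, a standard benchmark for the minimum energetic cost of information processing. The sketch below is generic information theory, not the specific Nicoletti-Busiello framework.

```python
import numpy as np

# Precision is not free: sharper (less noisy) readouts of an environmental
# signal carry more bits, and each bit has a thermodynamic price tag.
# Mutual information of a Gaussian channel (m = s + noise) is the classic
# 0.5 * log2(1 + SNR); Landauer's principle puts the minimum dissipation
# at k_B * T * ln(2) joules per bit erased. Generic textbook sketch, not
# the framework of Nicoletti and Busiello.

K_B = 1.380649e-23   # Boltzmann constant (J/K)
T = 300.0            # temperature (K), roughly room temperature

def channel_bits(signal_var, noise_var):
    """Mutual information (bits) between a Gaussian signal and its readout."""
    return 0.5 * np.log2(1.0 + signal_var / noise_var)

signal_var = 1.0
for noise_var in [10.0, 1.0, 0.1, 0.01]:
    bits = channel_bits(signal_var, noise_var)
    floor = bits * K_B * T * np.log(2)   # Landauer lower bound on dissipation
    print(f"noise variance {noise_var:5.2f} -> {bits:5.2f} bits, "
          f"minimum dissipation >= {floor:.2e} J")
```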