An ancient gene stolen from bacteria set the stage for human sight

Hoping to improve on those earlier efforts, Matthew Daugherty, a biochemist at the University of California San Diego, and colleagues used sophisticated computer software to trace the evolution of hundreds of human genes by searching for similar sequences in hundreds of other species. Genes that seemed to have appeared first in vertebrates and had no predecessors in earlier animals were good candidates for having jumped across from bacteria, particularly if they had counterparts in modern microbes. Among the dozens of potentially alien genes, one “blew me away,” Daugherty recalls.

The gene, called IRBP (for interphotoreceptor retinoid-binding protein), was already known to be important for seeing. The protein it encodes resides in the space between the retina and the retinal pigment epithelium, a thin layer of cells overlying the retina. In the vertebrate eye, when light hits a light-sensitive photoreceptor in the retina, kinked vitamin A complexes straighten out, setting off an electrical pulse that activates the optic nerve. IRBP then shifts these molecules to the epithelium to be re-kinked. Finally, it shuttles the restored molecules back to the photoreceptor. “IRBP,” Zhu explains, “is essential for the vision of all vertebrates.”
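The shuttle the passage describes is a repeating cycle, which can be sketched as a toy state machine. This is a hypothetical illustration of the step order only; the location and form labels are simplified stand-ins, not a biochemical model.

```python
# Toy sketch of the retinoid shuttle: one retinal molecule cycling between
# the photoreceptor and the retinal pigment epithelium for each photon.

def visual_cycle(photons):
    """Yield (location, form) of one retinal molecule per absorbed photon."""
    for _ in range(photons):
        # Light isomerizes the vitamin A derivative, triggering the pulse.
        yield ("photoreceptor", "photoactivated")
        # IRBP ferries the spent molecule across to the epithelium...
        yield ("epithelium", "being restored")
        # ...then shuttles the restored molecule back, ready for more light.
        yield ("photoreceptor", "restored")

states = list(visual_cycle(1))
```

Each pass through the loop corresponds to one photon absorbed and one round trip of the IRBP shuttle.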

Vertebrate IRBP most closely resembles a class of bacterial genes called peptidases, whose proteins recycle other proteins. Since IRBP is found in all vertebrates but generally not in their closest invertebrate relatives, Daugherty and his colleagues propose that more than 500 million years ago microbes transferred a peptidase gene into an ancestor of all living vertebrates. Once the gene was in place, the protein’s recycling function was lost and the gene duplicated itself twice, explaining why IRBP has four copies of the original peptidase DNA. Even in its microbial forebears, this protein may have had some ability to bind to light-sensing molecules, Daugherty suggests. Other mutations then completed its transformation into a molecule that could escape from cells and serve as a shuttle.

Sarah Bakewell on Posthumanism, Transhumanism, and What it Actually Means to Be “Human”

Every time a person dies, writes Russian novelist Vasily Grossman in Life and Fate, the entire world that has been built in that individual’s consciousness dies as well: “The stars have disappeared from the night sky; the Milky Way has vanished; the sun has gone out… flowers have lost their color and fragrance; bread has vanished; water has vanished.” Elsewhere in the book, he writes that one day we may engineer a machine that can have human-like experiences; but if we do, it will have to be enormous—so vast is this space of consciousness, even within the most “average, inconspicuous human being.”

And, he adds, “Fascism annihilated tens of millions of people.” Trying to think those two thoughts together is a near-impossible feat, even for the immense capacities of our consciousness. But will machine minds ever acquire anything like our ability to have such thoughts, in all their seriousness and depth? Or to reflect morally on events, or to equal our artistic and imaginative reach? Some think that this question distracts us from a more urgent one: we should be asking what our close relationship with our machines is doing to us.

Jaron Lanier, himself a pioneer of computer technology, warns in You Are Not a Gadget that we are allowing ourselves to become ever more algorithmic and quantifiable, because this makes us easier for computers to deal with. Education, for example, becomes less about the unfolding of humanity, which cannot be measured in units, and more about tick boxes.

Intelligence Explosion — Part 1/3

The GPT phenomenon and the future of humanity in the face of advances in Artificial Intelligence.

The Age of Artificial Intelligence is an increasingly present reality in our daily lives. With the rise of technologies such as Natural Language Processing (NLP) and Artificial Neural Networks (ANN), the possibility of creating machines capable of performing tasks that were previously exclusive to humans has emerged.

One of these technologies is the Generative Pre-trained Transformer, better known as GPT. It is a Large Language Model (LLM) developed by OpenAI.

OpenAI was founded in San Francisco, California in 2015 by Sam Altman, Reid Hoffman, Jessica Livingston, Elon Musk, Ilya Sutskever, Peter Thiel, among others, who collectively pledged $1 billion. Musk resigned from the board in 2018, but continued to be a donor to the project.

Quantum cyber-physical systems

This paper aims to promote a quantum framework that analyzes Industry 4.0 cyber-physical systems more efficiently than traditional simulations used to represent integrated systems. The paper proposes a novel configuration of distributed quantum circuits in multilayered complex networks that enables the evaluation of industrial value creation chains. In particular, two different mechanisms for the integration of information between circuits operating at different layers are proposed, and their behavior is analyzed and compared with the classical conditional probability tables linked to Bayesian networks. With the proposed method, both linear and nonlinear behaviors become possible while the complexity remains bounded. Applications in the case of Industry 4.0 are discussed when a component’s health is under consideration, where the effect of integration between different quantum cyber-physical digital twin models appears as a relevant implication.
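The comparison with conditional probability tables (CPTs) can be made concrete with a toy example. The sketch below is illustrative and not taken from the paper: a two-node Bayesian network with a perfectly correlated parent and child, next to a two-qubit Bell-state circuit whose measurement statistics reproduce the same joint distribution. The variable names ("parent"/"child") and the choice of circuit are assumptions made for illustration.

```python
import numpy as np

# Classical side: a CPT P(child | parent) with prior P(parent=0) = 0.5.
cpt = {0: {0: 1.0, 1: 0.0},   # parent = 0 -> child = 0 with certainty
       1: {0: 0.0, 1: 1.0}}   # parent = 1 -> child = 1 with certainty
prior = {0: 0.5, 1: 0.5}
joint_classical = {(a, b): prior[a] * cpt[a][b]
                   for a in (0, 1) for b in (0, 1)}

# Quantum side: |00> -> H on qubit 0 -> CNOT(0 -> 1) gives a Bell state.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],    # basis order: |00>, |01>, |10>, |11>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = CNOT @ np.kron(H, I) @ np.array([1.0, 0, 0, 0])
probs = np.abs(state) ** 2        # Born rule: measurement probabilities

joint_quantum = {(0, 0): probs[0], (0, 1): probs[1],
                 (1, 0): probs[2], (1, 1): probs[3]}
```

Both sides assign probability 0.5 to the outcomes (0, 0) and (1, 1) and zero to the rest; the entangling gate plays the role the CPT plays classically, which is the kind of correspondence the paper's integration mechanisms generalize.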

Subject terms: Quantum simulation, Qubits.

Cyber-physical systems (CPS) are integrations of computational and physical components that can interact with humans through new and different modalities. A key to future technological development is precisely this new capacity for interaction, together with the new possibilities that these systems open up for expanding the capabilities of the physical world through computation, communication, and control [1]. When CPS are understood within industrial practice, fueled by additional technologies such as the Internet of Things (IoT), people refer to the Industry 4.0 paradigm [2]. The design of many industrial engineering systems has been performed by considering the control system design separately from the hardware and/or software implementation details.

(Extra) Quantum Computing Explained and Overview

Playlist: https://www.youtube.com/playlist?list=PLnK6MrIqGXsJfcBdppW3CKJ858zR8P4eP
Download PowerPoint: https://github.com/hywong2/Intro_to_Quantum_Computing
Book (Free with institution subscription): https://link.springer.com/book/10.1007/978-3-030-98339-0
Book: https://www.amazon.com/Introduction-Quantum-Computing-Layper…atfound-20

Can quantum computing replace classical computing? State, Superposition, Measurement, Entanglement, Qubit Implementation, No-cloning Theorem, Error Correction, Caveats.
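The first three topics on that list — state, superposition, and measurement — can be demonstrated in a few lines without any quantum SDK. This is a minimal sketch using plain NumPy; the qubit is just a two-component vector and measurement is sampling from the squared amplitudes.

```python
import numpy as np

rng = np.random.default_rng(0)

ket0 = np.array([1.0, 0.0])                 # the state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

plus = H @ ket0                             # superposition (|0> + |1>)/sqrt(2)
probs = np.abs(plus) ** 2                   # Born rule: [0.5, 0.5]

# Measurement "collapses" the state: each shot yields 0 or 1 with those
# probabilities, so repeated shots approximate a fair coin.
shots = rng.choice([0, 1], size=10_000, p=probs)
freq1 = shots.mean()                        # close to 0.5 for many shots
```

The no-cloning theorem and error correction from the same list do not reduce to a few lines like this, which is part of why they get their own sections in the lectures.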

“Can Consciousness be Explained?” — Royal Institute of Philosophy Annual Debate 2023

How can flesh and blood brains give rise to pains and pleasures, dreams and desires, sights and sounds? Some believe this ‘hard problem’ of consciousness can never be solved. Can we expect any breakthroughs as the science of the mind progresses?

Our annual debate this year considers whether the problem of consciousness really is intractable. Our illustrious panel comprises neuroscientist Anil Seth and philosophers Louise Antony, Maja Spener, and Philip Goff, with the BBC’s Ritula Shah chairing.

Speakers.
Anil Seth is Professor of Cognitive and Computational Neuroscience at the University of Sussex.
Louise Antony is Professor Emerita at the University of Massachusetts, Amherst.
Maja Spener is Associate Professor in Philosophy at the University of Birmingham.
Philip Goff is Associate Professor in the Department of Philosophy at Durham University.

Chair.
Ritula Shah is a journalist and presenter of The World Tonight on BBC Radio 4.

Bio-Inspired Quantum Technologies

The Oxford Martin Programme on Bio-Inspired Technologies is investigating the possibility of making quantum computers real.

We aim to develop a completely new methodology for overcoming the extreme fragility of quantum memory. By learning how biological molecules shield fragile quantum states from the environment, we hope to create the building blocks of future quantum computers.

The unique power of quantum computers comes from their ability to carry out many calculations in parallel.

Multiscale quantum algorithms for quantum chemistry

As quantum advantage has been demonstrated on different quantum computing platforms using Gaussian boson sampling [1–3], quantum computing is moving to the next stage, namely demonstrating quantum advantage in solving practical problems. Two typical problems of this kind are computer-aided material design and drug discovery, in which quantum chemistry plays a critical role in answering questions such as “Which one is the best?”. Many recent efforts have been devoted to the development of advanced quantum algorithms for solving quantum chemistry problems on noisy intermediate-scale quantum (NISQ) devices [2,4–14], while implementing these algorithms for complex problems is limited by available qubit counts, coherence time, and gate fidelity. Specifically, without error correction, quantum simulations of quantum chemistry are viable only if low-depth quantum algorithms are implemented to suppress the total error rate. Recent advances in error mitigation techniques enable us to model many-electron problems with a dozen qubits and circuit depths in the tens on NISQ devices [9], but such circuit sizes and depths are still a long way from practical applications.
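The need for low-depth circuits without error correction follows from simple arithmetic: if each gate succeeds with fidelity f, a circuit of n gates runs error-free with probability roughly f to the power n. The numbers below are illustrative, not taken from the paper.

```python
# Back-of-envelope model of why circuit depth matters on NISQ hardware:
# the no-error probability decays exponentially in the gate count.

def success_probability(gate_fidelity: float, n_gates: int) -> float:
    """Probability that every gate in the circuit executes without error."""
    return gate_fidelity ** n_gates

shallow = success_probability(0.999, 100)     # a 100-gate circuit
deep = success_probability(0.999, 10_000)     # a 10,000-gate circuit
```

With a per-gate fidelity of 99.9%, a 100-gate circuit still succeeds about 90% of the time, while a 10,000-gate circuit almost never does — which is why algorithm depth, not just qubit count, gates what NISQ devices can simulate.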

The gap between the available and the actually required quantum resources in practical quantum simulations has renewed interest in divide-and-conquer (DC) based methods [15–19]. Realistic material and (bio)chemistry systems often involve complex environments, such as surfaces and interfaces. For such systems, the Schrödinger equation is much too complicated to be solvable exactly, so it becomes desirable to develop approximate practical methods of applying quantum mechanics [20]. One popular scheme is to divide the complex problem under consideration into as many parts as possible until these become simple enough for an adequate solution, namely the philosophy of DC [21]. The DC method is particularly suitable for NISQ devices since the sub-problem for each part can in principle be solved with fewer computational resources [15–18,22–25]. One successful application of DC is the estimation of the ground-state potential energy surface of a ring of 10 hydrogen atoms using density matrix embedding theory (DMET) on a trapped-ion quantum computer, in which a 20-qubit problem is decomposed into ten 2-qubit problems [18].
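The DC bookkeeping in that hydrogen-ring example can be sketched in a few lines. The fragment "solver" below is a stand-in (exact diagonalization of a made-up two-site Hamiltonian), not DMET, and the coupling values are arbitrary; the point is only the shape of the computation: solve ten small 2-qubit problems instead of one 20-qubit problem, then sum the fragment energies.

```python
import numpy as np

def fragment_energy(coupling: float) -> float:
    """Ground-state energy of a toy two-site Hamiltonian -J * (X (x) X)."""
    X = np.array([[0.0, 1.0], [1.0, 0.0]])      # Pauli-X on one site
    H_frag = -coupling * np.kron(X, X)          # 4x4: a 2-qubit problem
    return float(np.linalg.eigvalsh(H_frag)[0]) # lowest eigenvalue

# Ten identical 2-qubit fragments stand in for a decomposed 20-qubit system.
couplings = [1.0] * 10
total_energy = sum(fragment_energy(J) for J in couplings)
```

Each fragment needs only a 4-dimensional diagonalization, whereas the undivided 20-qubit problem lives in a space of dimension 2**20 — the exponential saving that makes DC attractive on resource-limited hardware.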

DC often treats all subsystems at the same computational level and estimates physical observables by summing up the corresponding quantities of the subsystems, while in practical simulations of complex systems the particle–particle interactions may exhibit completely different characteristics within and between subsystems. Long-range Coulomb interactions can be well approximated as quasiclassical electrostatic interactions, so empirical methods, such as empirical force field (EFF) approaches [26], are promising for describing them. As the distance between particles decreases, the repulsive exchange interactions between electrons of the same spin become important, so quantum mean-field approaches, such as Hartree–Fock (HF), are needed to characterize these electronic interactions.
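The multiscale idea — pick the cheapest adequate theory for each interaction based on distance — amounts to a tiered dispatch. The sketch below is purely illustrative: the distance thresholds and the third, fully correlated tier are assumptions for the example, not values or levels prescribed by the paper.

```python
# Illustrative multiscale dispatch: choose a level of theory per particle
# pair by separation, from cheap classical to expensive quantum treatments.

def pair_model(distance_angstrom: float) -> str:
    if distance_angstrom > 10.0:
        # Far apart: quasiclassical electrostatics suffices.
        return "empirical force field (quasiclassical Coulomb)"
    elif distance_angstrom > 3.0:
        # Closer: same-spin exchange matters, use a mean-field method.
        return "Hartree-Fock (mean-field exchange)"
    else:
        # Chemically bonded range: a correlated quantum solver is needed.
        return "correlated quantum treatment"

levels = [pair_model(d) for d in (15.0, 5.0, 1.0)]
```

A real multiscale quantum algorithm would route the expensive innermost tier to the quantum device and keep the outer tiers classical, concentrating scarce qubits where correlation actually matters.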

Amazon Looks to Grow Diamonds in Bid to Boost Computer Networks

Quantum networking uses subatomic matter to deliver data in a way that goes beyond today’s fiber-optic systems. Amazon wants to grow diamonds that would form part of a component letting the data travel farther without breaking down.

Pretty futuristic!


Amazon.com Inc. is teaming up with a unit of De Beers Group to grow artificial diamonds, betting that custom-made gems could help revolutionize computer networks.