Silence for Thought: Special Interneuron Networks in the Human Brain

Summary: The human cortex has evolved a novel type of neural network that relies on abundant connections between inhibitory interneurons.

Source: Max Planck Institute.

The analysis of the human brain is a central goal of neuroscience. However, for methodological reasons, research has largely focused on model organisms, in particular the mouse.

Lyme Disease Is Even More Common Than Experts Realized, New Research Finds

It’s extremely important to check yourself for ticks this summer.


Up to 14.5% of the global population may have already had Lyme disease, according to a new meta-analysis published in BMJ Global Health. The researchers behind the report analyzed 89 previously published studies to calculate the figure, which sheds a harrowing light on the worldwide toll of the tick-borne illness.

From 1991 to 2018, the incidence of Lyme disease in the United States nearly doubled, according to data from the United States Environmental Protection Agency (EPA). In 1991, there were nearly four reported cases per 100,000 people; that number jumped to about seven cases per 100,000 people by 2018. The Centers for Disease Control and Prevention (CDC) estimates that about 470,000 Americans are diagnosed and treated for Lyme disease each year.

The bacterium that most commonly causes Lyme, Borrelia burgdorferi, is transmitted to humans via the bite of an infected black-legged tick, also known as a deer tick. These especially tiny ticks are often found in the Northeast, Mid-Atlantic, Upper Midwest, and Pacific Coast of the United States, per the U.S. National Library of Medicine (NLM). Once a person has been infected, they may develop short-term, flu-like symptoms including fever, headache, and fatigue, as well as a signature bull’s-eye-shaped rash that appears in up to 80% of Lyme disease cases, according to the CDC. In rare instances, when Lyme is left untreated, a person may experience long-term, potentially life-threatening complications, including joint pain, severe headaches and neck stiffness, heart issues, and inflammation of the brain and spinal cord, among others.

Researchers discover two important novel aspects of APOE4 gene in Alzheimer’s patients

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder and the most common cause of dementia, affecting more than 5.8 million individuals in the U.S. Scientists have discovered some genetic variants that increase the risk of developing Alzheimer’s; the most well-known of these for people over the age of 65 is the APOE ε4 allele. Although the association between APOE4 and increased AD risk is well-established, the mechanisms responsible for the underlying risk in human brain cell types have been unclear until now.

Researchers from Boston University School of Medicine (BUSM) have discovered two important novel aspects of the gene: 1) the human genetic background inherited with APOE4 is unique to APOE4 patients, and 2) the mechanistic defects due to APOE4 are unique to human cells.

Our study demonstrated what the APOE4 gene does and which brain cells get affected the most in humans by comparing human and mouse models. These are important findings as we can find therapeutics if we understand how and where this risk gene is destroying our brain.

Beyond longevity: The DIY quest to cheat death and stop aging

At 79, Scott has already outlived the CDC’s official life expectancy by two years, and he has no intention of dying — or even slowing down — anytime soon. An active man, he jets between his homes in upstate New York and Florida, flies to exotic locations such as Panama City for business and still finds time for the odd cruise. His secret? A DIY regime of self-experimentation and untested therapies he believes will keep him going well past the next century.

Self-experimenters litter the history of medical science. Dentist Horace Wells dosed himself with nitrous oxide in 1844 to see if it could kill pain, Nicholas Senn inflated his innards with hydrogen a few decades later to work out if it could diagnose a ruptured bowel, and more recently, Barry Marshall drank a solution containing H. pylori in 1985 to prove the bacterium caused ulcers.

These scientists risked their own health to make a medical breakthrough or prove a theory, but Scott is not a scientist. He’s an amateur enthusiast, also known as a biohacker. Biohackers engage in DIY biology, experimenting on themselves to enhance their brain and body. And many of them — like Scott — see longevity as the ultimate prize.

Discrete Wavelet Transform Analysis of the Electroretinogram in Autism Spectrum Disorder and Attention Deficit Hyperactivity Disorder

Background: To evaluate the electroretinogram waveform in autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD) using a discrete wavelet transform (DWT) approach.

Methods: A total of 55 ASD, 15 ADHD and 156 control individuals took part in this study. Full-field light-adapted electroretinograms (ERGs) were recorded using a Troland protocol, accounting for pupil size, with five flash strengths ranging from −0.12 to 1.20 log photopic cd.s.m⁻². A DWT analysis using the Haar wavelet was performed on the waveforms to examine the energy within the time windows of the a- and b-waves and the oscillatory potentials (OPs), which yielded six DWT coefficients related to these parameters. The central frequency bands ranged from 20 to 160 Hz, relating to the a-wave, b-wave and OPs and represented by the coefficients a20, a40, b20, b40, op80, and op160, respectively. In addition, the b-wave amplitude and the percentage energy contribution of the OPs (%OPs) to the total ERG broadband energy were evaluated.
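
As an illustration of the band-energy idea, the sketch below runs a multilevel Haar DWT on a synthetic waveform with PyWavelets and sums the squared detail coefficients per level. The sampling rate, epoch length, toy signal, and the mapping of decomposition levels onto the 20–160 Hz bands are assumptions for illustration only, not the authors’ exact pipeline.

```python
# Minimal sketch of the band-energy idea using a multilevel Haar DWT on a
# synthetic waveform. Sampling rate, epoch length, the toy signal, and the
# level-to-band mapping are assumptions, not the authors' exact pipeline.
import numpy as np
import pywt

fs = 1000                       # assumed sampling rate (Hz)
t = np.arange(0, 0.25, 1 / fs)  # 250 ms post-flash epoch
erg = np.sin(2 * np.pi * 28 * t) * np.exp(-t / 0.05)  # toy waveform, not real ERG data

# Multilevel Haar decomposition: detail level k spans roughly
# fs / 2**(k + 1) .. fs / 2**k Hz, so levels 2-5 cover ~16-250 Hz here.
cA5, cD5, cD4, cD3, cD2, cD1 = pywt.wavedec(erg, "haar", level=5)

def band_energy(detail_coeffs):
    """Sum of squared detail coefficients = signal energy in that band."""
    return float(np.sum(detail_coeffs ** 2))

# Energies in progressively higher-frequency bands; which band corresponds to
# the a-wave, b-wave or OPs depends on the sampling rate and epoch used.
for label, d in [("~16-31 Hz", cD5), ("~31-62 Hz", cD4),
                 ("~62-125 Hz", cD3), ("~125-250 Hz", cD2)]:
    print(label, round(band_energy(d), 3))
```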

Results: There were significant group differences (p < 0.001) in the coefficients corresponding to energies in the b-wave (b20, b40) and OPs (op80 and op160), as well as in the b-wave amplitude. Notable differences between the ADHD and control groups were found in the b20 and b40 coefficients, whereas the greatest differences between the ASD and control groups were found in the op80 and op160 coefficients. For the b-wave amplitude, both the ASD and ADHD groups differed significantly from the control participants at flash strengths greater than 0.4 log photopic cd.s.m⁻² (p < 0.001).

Neocortex saves energy

Despite constituting less than 2% of the body’s mass, the human brain consumes approximately 20% of total caloric intake, with 50% of the energy being used by cortex (Herculano-Houzel, 2011). The majority of this energy is spent by neurons to reverse the ion fluxes associated with electrical signaling via Na+/K+ ATPase (Attwell and Laughlin, 2001; Harris et al., 2012). Excitatory synaptic currents and action potentials are particularly costly in this regard, accounting for approximately 57% and 23% of the energy budget for electrical signaling in gray matter, respectively (Harris et al., 2012; Sengupta et al., 2010). Given this cost, and the scarcity of resources, the brain is thought to have evolved an energy-efficient coding strategy that maximizes information transmission per unit energy (i.e., ATP) (Barlow, 2012; Levy and Baxter, 1996). This strategy accounts for a number of cellular features, including the low mean firing rate of neurons and the high failure rate of synaptic transmission, as well as higher order features, such as the structure of neuronal receptive fields (Albert et al., 2008; Attwell and Laughlin, 2001; Harris et al., 2015; Levy and Baxter, 1996; Olshausen and Field, 1997; Sterling and Laughlin, 2015). Scarcity of food, therefore, appears to have strongly sculpted information coding in the brain throughout evolution.
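
For a rough sense of scale, the back-of-the-envelope calculation below combines the percentages quoted above with an assumed 2,000 kcal/day intake; the intake figure is an illustrative assumption, not a value from the cited studies.

```python
# Back-of-the-envelope check of the energy figures quoted above. The
# 2,000 kcal/day intake is an assumed reference value, not from the citations.
daily_intake_kcal = 2000
brain_share = 0.20            # ~20% of total caloric intake
cortex_share_of_brain = 0.50  # ~50% of brain energy used by the cortex

brain_kcal = daily_intake_kcal * brain_share       # ~400 kcal/day for the brain
cortex_kcal = brain_kcal * cortex_share_of_brain   # ~200 kcal/day for the cortex

# Split of the gray-matter signaling budget (Harris et al., 2012):
synaptic_share = 0.57  # excitatory synaptic currents
spike_share = 0.23     # action potentials
other_share = 1.0 - synaptic_share - spike_share   # ~20% for everything else

print(f"brain: {brain_kcal:.0f} kcal/day, cortex: {cortex_kcal:.0f} kcal/day, "
      f"other signaling costs: {other_share:.0%}")
```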

Energy intake is not fixed but can vary substantially across individuals, environments, and time (Hladik, 1988; Knott, 1998). Given that the brain is energy limited, one hypothesis is that in times of food scarcity, neuronal networks should save energy by reducing information processing. There is some evidence to suggest that this is the case in invertebrates (Kauffman et al., 2010; Longden et al., 2014; Plaçais et al., 2017; Plaçais and Preat, 2013). In Drosophila, food deprivation inactivates neural pathways required for long-term memory to preserve energy (Plaçais et al., 2017; Plaçais and Preat, 2013). Experimental re-activation of these pathways restores memory formation but significantly reduces survival rates (Plaçais and Preat, 2013). Similar memory impairments are seen with reduced food intake in C. elegans (Kauffman et al., 2010). Moreover, in the blowfly, food deprivation reduces visual interneuron responses during locomotion, consistent with energy savings (Longden et al., 2014). However, it remains unclear whether and how the mammalian brain, and cortical networks in particular, regulate information processing and energy use in times of food scarcity.

Here we used the mouse primary visual cortex (V1) as a model system to examine how food restriction affects information coding and energy consumption in cortical networks. We assessed neuronal activity and ATP consumption using whole-cell patch-clamp recordings and two-photon imaging of V1 layer 2/3 excitatory neurons in awake, male mice. We found that food restriction, resulting in a 15% reduction of body weight, led to a 29% reduction in ATP expenditure associated with excitatory postsynaptic currents, which was mediated by a decrease in single-channel AMPA receptor (AMPAR) conductance. Reductions in AMPAR current were compensated by an increase in input resistance and a depolarization of the resting membrane potential, which preserved neuronal excitability; neurons were therefore able to generate a comparable rate of spiking as controls, while spending less ATP on the underlying excitatory currents. This energy-saving strategy, however, had a cost to coding precision. Indeed, we found that an increase in input resistance and depolarization of the resting membrane potential also increased the subthreshold variability of visual responses, which increased the probability for small depolarizations to cross spike threshold, leading to a broadening of orientation tuning by 32%. Broadened tuning was associated with reduced coding precision of natural scenes and behavioral impairment in fine visual discrimination. We found that these deficits in visual coding under food restriction correlated with reduced circulating levels of leptin, a hormone secreted by adipocytes in proportion to fat mass (Baile et al., 2000), and were restored by exogenous leptin supplementation. Our findings reveal key metabolic state-dependent mechanisms by which the mammalian cortex regulates coding precision to preserve energy in times of food scarcity.
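
A toy membrane calculation can make the compensation logic concrete: a smaller synaptic conductance draws less current (and hence demands less Na+/K+-ATPase work), while a higher input resistance and a depolarized resting potential keep the cell near spike threshold. All parameter values below are illustrative assumptions, not measurements from the study.

```python
# Toy illustration of the compensation described above. All values are
# illustrative assumptions, not measurements from the study.

def epsp(g_syn_nS, R_in_MOhm, V_rest_mV, E_syn_mV=0.0, V_thr_mV=-50.0):
    """Return EPSP amplitude (mV), gap to spike threshold (mV), and synaptic current (nA)."""
    i_syn_nA = g_syn_nS * (E_syn_mV - V_rest_mV) * 1e-3  # nS * mV = pA; *1e-3 -> nA
    dV_mV = i_syn_nA * R_in_MOhm                          # nA * MOhm = mV
    gap_mV = V_thr_mV - (V_rest_mV + dV_mV)
    return dV_mV, gap_mV, i_syn_nA

# Control: larger AMPAR conductance, lower input resistance, more negative V_rest
dV_c, gap_c, i_c = epsp(g_syn_nS=2.0, R_in_MOhm=100, V_rest_mV=-70.0)
# Food restricted: ~30% smaller conductance, higher R_in, depolarized V_rest
dV_f, gap_f, i_f = epsp(g_syn_nS=1.4, R_in_MOhm=140, V_rest_mV=-66.0)

print(f"control:    EPSP {dV_c:.1f} mV, gap to threshold {gap_c:.1f} mV, I_syn {i_c:.3f} nA")
print(f"restricted: EPSP {dV_f:.1f} mV, gap to threshold {gap_f:.1f} mV, I_syn {i_f:.3f} nA")
# The restricted neuron reaches a similar depolarization and sits at least as
# close to threshold, while its synaptic current (a proxy for ATP cost) is
# roughly a third lower -- but operating closer to threshold also makes small
# fluctuations more likely to trigger spikes, degrading tuning precision.
```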

Dr Thomas V Johnson III, MD, PhD — Neuro-Protection & Neuro-Regeneration R&D For Optic Pathologies

Dr. Thomas V. Johnson III, M.D., Ph.D. (https://www.hopkinsmedicine.org/profiles/details/thomas-johnson) is a glaucoma specialist and the Allan and Shelley Holt Rising Professor in Ophthalmology at Wilmer Eye Institute, at Johns Hopkins University. He is also a member of the Retinal ganglion cell (RGC) Repopulation, Stem cell Transplantation, and Optic nerve Regeneration (RReSTORe) consortium (https://www.hopkinsmedicine.org/wilmer/research/storm/rrestore/index.html), an initiative focused on advancing translational development of vision restoration therapies for glaucoma and other primary optic neuropathies by assembling an international group of more than 100 leading and emerging investigators from related fields.

Dr. Johnson received his BA (summa cum laude) in Biological Sciences from Northwestern University in 2005. As a Gates-Cambridge Scholar and an NIH-OxCam Scholar, he earned his PhD in Clinical Neuroscience from the University of Cambridge (UK) in 2010. He completed his medical training (AOA) at the Johns Hopkins School of Medicine in 2014 and served as an intern on the Johns Hopkins Osler Medical Service prior to completing his ophthalmology residency and glaucoma fellowship at the Wilmer Eye Institute.

Dr. Johnson’s research interests are focused on understanding the pathophysiology of retinal and optic nerve neurodegenerative disorders, and on the development of neuroprotective and neuroregenerative therapies for these conditions. His doctoral thesis work evaluated intraocular stem and progenitor cell transplantation as a possible neuroprotective therapy for glaucoma. His research contributions have been recognized with a World Glaucoma Association Award nomination, the National Eye Institute’s Scientific Director’s Award, and the Association for Research in Vision and Ophthalmology’s Merck Innovative Ophthalmology Research Award. He also founded and served as director of the Student Sight Savers Program, a program that provides vision screening services to low-income residents of Baltimore, and helps them obtain access to clinical ophthalmological care.

Presently, Dr. Johnson is interested in the neurobiological processes that lead to retinal ganglion cell death and dysfunction in glaucoma and other optic neuropathies. In particular, he seeks to better understand the molecular mechanisms underlying axonal degeneration, dendrite retraction and afferent synapse loss, and cell body death in glaucoma. His goal is to use knowledge of these processes to develop targeted neuroprotective strategies to slow or halt RGC death and preserve vision for patients with glaucoma. He is also leading new investigations into the use of stem cell transplantation to achieve retinal ganglion cell replacement, as a potential regenerative treatment for optic nerve disease, with a focus on anatomic incorporation of cell grafts, neurite growth and synapse formation, and electrophysiological retinal circuit integration.

What Is It About the Human Brain That Makes Us Smarter Than Other Animals? New Research

Humans are unrivaled in the area of cognition. After all, no other species has sent probes to other planets, produced lifesaving vaccines, or created poetry. How information is processed in the human brain to make this possible is a question that has drawn endless fascination, yet no definitive answers.

Our understanding of brain function has changed over the years. But current theoretical models describe the brain as a “distributed information-processing system.” This means it has distinct components that are tightly networked through the brain’s wiring. To interact with each other, regions exchange information through a system of input and output signals.

However, this is only a small part of a more complex picture. In a study published last week in Nature Neuroscience, using evidence from different species and multiple neuroscientific disciplines, we show that there isn’t just one type of information processing in the brain. How information is processed also differs between humans and other primates, which may explain why our species’ cognitive abilities are so superior.

Light-activated “photoimmunotherapy” kills brain cancer, reduces relapse

Scientists at the Institute of Cancer Research in London have developed a new light-activated “photoimmunotherapy” that could help treat brain cancer. The key is a compound that glows under light to guide surgeons to the tumor, while near-infrared light activates a cancer-killing mechanism.

The new study builds on a common technique called fluorescence-guided surgery (FGS), which involves introducing to the body a fluorescent agent that glows under exposure to light. This agent is paired with a synthetic molecule that binds to a specific protein, such as those expressed by cancer cells. The end result is tumors that glow under certain lighting conditions or imaging, guiding surgeons to remove the affected cells more precisely.

For the new study, the researchers gave the technique an extra ability – killing the cancer as well. They added a new molecule that binds to a protein called EGFR, which is often mutated in cases of the brain cancer glioblastoma. After the fluorescence has helped surgeons remove the bulk of the tumor, they can shine near-infrared light on the site, which switches the compound into a tumor-killing mode by releasing reactive oxygen species. The idea is to kill off any remaining cells that could – and often do – stage an aggressive comeback after surgery.
