21. May 2017 · Comments Off on Cryofixation and Chemopreservation · Categories: Cryonics, Neuroscience

The most common modern protocol for imaging brain structure at high magnification is to chemically fix the brain with aldehydes (formaldehyde, glutaraldehyde) and heavy metals like osmium and then prepare it for electron microscopy imaging. Using this method, a tremendous amount of detailed anatomical information about the structure of the brain in its healthy and pathological state has been obtained, including the effects of (prolonged) ischemia.

Almost from its inception, however, the limitations of this method have been recognized. In particular, when fixatives are introduced to the brain through the process of perfusion, a number of distinct artifacts are produced, notably shrinking of the brain and a reduction of the extracellular space. While different solutions and protocols have been developed to reduce these artifacts, the gold standard for ultrastructural analysis is a method that does not use aldehydes at all: cryofixation.

In cryofixation small tissue samples are rapidly cooled (without freezing) and then prepared for electron microscopy. This method produces the most realistic images of the ultrastructure of the brain, as evidenced by papers that compared this method with aldehyde fixation or used advanced tools to understand the properties of the brain without doing electron microscopy.

Although the word “vitrification” is rarely used in the context of cryofixation, the pristine images in this method can only be achieved when ice formation is avoided through ultra-rapid cooling. Vitrification without the use of high concentrations of (toxic) cryoprotectants would be quite attractive if it could be scaled to the size of organs (or even humans!) but unfortunately this method can only be used on very small tissue samples.

The pristine images obtained from cryofixation raise some important issues. Does conventional aldehyde fixation produce only predictable distortions or is identity-specific information irreversibly lost? What are the ultrastructural effects of the heavy metal exposure when cryofixed samples are prepared for electron microscopy? In a more general sense, to what degree can we be confident that a technology can produce a completely realistic image of the ultrastructure of the brain?

Will computer simulations of scanned fixed brains need extensive correction if they are to serve as a simulation of the brain? One clear advantage of using viability assays in addition to electron microscopy is that we can test brain slices or whole brains for resumption of function (or retention of memory) after subjecting them to experimental protocols. This gives cryopreservation technologies a clear advantage over chemical fixation. In a cryonics case we can monitor the patient from the start of our procedures to the point of long-term care and collect data and viability information. In the case of chemopreservation no such feedback is possible, and taking brain biopsies for electron microscopy is all we can do to assess the effects of our preservation procedures.

It is tempting for a cryonics organization to choose the method of preservation that produces the most crisp electron micrographs. In reality, however, there are challenges and unknown issues. Cryofixation cannot be scaled to work for cryonics. What is the effect of conventional aldehyde perfusion in ischemic brains? How do aldehyde fixed brains look on the molecular level compared to cryopreserved brains? How can we know that identity-critical information is not irreversibly altered? And, last but not least, any preservation technology that renders tissue dead by conventional criteria cannot be considered as a means for achieving true human suspended animation.

Originally published as a column (Quod incepimus conficiemus) in Cryonics magazine, September, 2015

10. May 2017 · Comments Off on Medical Myopia and Brain Death · Categories: Cryonics, Death, Neuroscience

Recently someone sent me a number of papers that discussed the biophilosophical underpinnings of brain death. Medical doctors increasingly find themselves in the midst of heated debates about what constitutes death by neurological criteria. It is not hard to understand how controversies can occur in this area. Whenever a patient who satisfies the criteria for brain death shows signs of improvement or recovery, these criteria are called into question. Or, perhaps more troublesome, some people will simply not concede that a patient is dead because recovery can be envisioned. In such cases, the concept of death becomes more like a subjective “decision” than an objective property of the brain.

To someone sympathetic to cryonics these debates are mildly infuriating because it shows the reckless medical myopia with which matters of life and death are approached. When bioethicists debate what constitutes “permanent and irreversible loss of the capacity for consciousness and self-awareness” there is little recognition of the possibility that what looks hopeless and irreversible by contemporary medical technologies may be rather straightforward to repair or recover by future medical technologies. Would we abandon a patient if a cure would be available tomorrow? What about next month? Next year? 50 years?

The standard rejoinder to this position is that cryopreservation of the patient (cryonics) itself produces irreversible damage to the brain and is thus not suitable to stabilize the patient long-term until more advanced treatments are available. But how can we know what will be considered irreversible damage in the future? Should we simply pull the plug based on our guesswork about the limits of future technologies? Would it not be more prudent to let future doctors make that determination?

This does look a lot like saying that cryonics is just an argument in favor of prudence based on ignorance. A sophisticated way of saying, “well, you never know!” Not quite. If a healthy brain without damage gives rise to consciousness and identity, it follows that if the original state of the brain can be inferred from the damaged state, the capacity to restore consciousness and identity is preserved in principle. Ice formation undeniably alters the structure of the brain but it does not make the ultrastructure “disappear.” In fact, at cryogenic temperatures nothing “disappears,” a point that is not even sufficiently recognized by many cryonics advocates. Today we can do better than freezing, though, and use vitrification agents, which solidify into a glass upon cooling to cryogenic temperatures. While these vitrification agents exhibit some toxicity, at the ultrastructural level this expresses itself at most as alteration of cell membranes, protein denaturation, etc., not wholesale destruction.

Where does this leave us on the issue of brain death? For starters, looking at a monitor and concluding that the patient is dead because of the absence of organized electrical activity will tell us little about the ultrastructure of the brain (case in point, at 15 degrees Celsius even a healthy brain will show a flat EEG). It is true that in some cases of brain death absence of electrical activity corresponds to substantial decomposition of brain tissue but it is important to recognize that in many such cases the brain has been permitted to self-destruct at body temperature as a result of trauma and ischemia. When a hospital is faced with a traumatic event of such magnitude that profound cell death can be expected, the most prudent action is to quickly cool the patient and prevent “information-theoretic death.” If the capacity for consciousness and awareness resides in the neuroanatomy of the brain, the first mandate of medicine is to preserve this.

Originally published as a column (Quod incepimus conficiemus) in Cryonics magazine, March, 2015

27. March 2017 · Comments Off on The Case for Brain Cryopreservation · Categories: Cryonics, Neuroscience

Cryopreservation of just the head is as old as Alcor itself. In fact, some people identify Alcor with its “neuro-preservation” option. It is important, however, to recognize that the objective of preserving the head is really to preserve what is inside the head, i.e. the brain. While I am aware of (contrived) technical arguments that prefer head preservation over brain preservation for information-theoretical reasons, I suspect that no advocate of neuro-preservation is anxious about the prospect of having only his/her brain preserved in a pristine state.

This raises an important question – one that is not immediately evident to the general public. Why not just preserve the naked brain instead? I am aware of at least three major arguments against it and I think that these arguments are based on incomplete information or a lack of imagination.

Myth 1: The isolated brain is not a stable organ and will collapse upon itself in a jellylike state if it is removed from the skull.

Answer: In human cryopreservation, the brain would only be extracted at low temperatures, which provide a lot more stability to the brain. In addition, in a good case the brain will also be loaded with a cryoprotectant and exist in a dehydrated state, which will provide even more stability.

Myth 2: Removing the brain from the skull will damage the brain and will erase identity-critical information.

Answer: It is correct that morticians typically remove the brain with little regard for its ultrastructural integrity but there is no reason why a cryonics organization should engage in such traumatic brain removal. Safe brain removal protocols are technically possible and cryonics organizations have a strong incentive to develop and refine such techniques.

Myth 3: The skull is necessary to provide protection to the brain.

Answer: It is undeniable that the skull provides robust protection to the brain but from that it does not follow that a cryonics organization cannot design a long-term enclosure and maintenance method that provides strong protection of the naked brain, too.

I do not claim that brain preservation is equal in all respects to neuro-preservation. For example, extraction of the brain from the skull requires additional time after completion of cryoprotectant perfusion and during this time the brain will be exposed to high levels of cryoprotectant (strictly speaking, isolated brain perfusion is possible but this requires a very advanced surgical procedure). Keeping the brain temperature low and uniform during brain removal is also a challenge.

On the other hand, there are potential advantages as well. An isolated brain can be placed in the cryoprotectant to allow diffusion of the vitrification agent prior to cryogenic cooldown to compensate for any ischemia-induced cortical perfusion impairment. In fact, if perfusion is no longer an option, immersion of the (fixed) brain in cryoprotectant is the only means to mitigate ice formation during cryostasis. Another advantage is a decrease in long-term care costs (at least 50%), which allows for lower cryopreservation minimums.

But the most important advantage of brain preservation is that negative public perception and PR risk would be substantially lower than with neuro-preservation. Even if the procedure were a little riskier (technically speaking) one could still argue that it is safer in general because images of cryopreserved brains do not risk the kind of visceral response that neuro-preservation triggers.

I cannot do justice to all the technical, logistical, and financial issues associated with brain-only cryopreservation here, but the topic requires more study, if only because cryonics organizations occasionally receive fixed brains, or patients with long ischemic times, for whom immersion cryoprotection could be superior to straight freezing. Brain cryopreservation does not exist as an option yet, but it has been the reality for a number of patients.

Originally published as a column (Quod incepimus conficiemus) in Cryonics magazine, January, 2014

27. March 2017 · Comments Off on Multiple Sclerosis and Human Enhancement · Categories: Health, Neuroscience

Multiple sclerosis is a disease that raises a lot of interesting questions for people interested in biogerontology, human enhancement, and even cryonics. It raises questions about immunosenescence and draws attention to possible immune improvements for biological human enhancement. Biotechnologies to induce myelin repair may even be useful for the repair of cryopreserved brains. Before I discuss multiple sclerosis from these perspectives, let us take a closer look at this medical condition.

Multiple sclerosis (MS) is an inflammatory autoimmune disorder of the central nervous system that results in axonal degeneration in the brain and spinal cord. In simple terms, multiple sclerosis is a disease wherein the body’s immune system attacks and damages the myelin sheath, the fatty tissue that surrounds axons in the central nervous system. The myelin sheath is important because it facilitates the conduction of electrical signals along neural pathways. Like electrical wires, neuronal axons require insulation to ensure that they are able to transmit a signal accurately and at high speeds. It is these millions of nerves that carry messages from the brain to other parts of the body and vice versa.

More specifically, MS involves the loss of oligodendrocytes, the cells responsible for creating and maintaining the myelin sheath. This results in a thinning or complete loss of myelin (i.e., demyelination) and, as the disease advances, the breakdown of the axons of neurons. A repair process, called remyelination, takes place in early phases of the disease, but the oligodendrocytes are unable to completely rebuild the axon’s myelin sheath. Repeated attacks lead to successively less effective remyelinations, until a scar-like plaque is built up around the damaged axons.

The name multiple sclerosis refers to the scars (scleroses—better known as plaques or lesions) that form in the nervous system. These scars most commonly affect the white matter in the optic nerve, brain stem, basal ganglia, and spinal cord or white matter tracts close to the lateral ventricles of the brain. The peripheral nervous system is rarely involved. These lesions are the origin of the symptoms during an MS “attack.”

In addition to immune-mediated loss of myelin, which is thought to be carried out by T lymphocytes, B lymphocytes, and macrophages, another characteristic feature of MS is inflammation caused by a class of white blood cells called T cells, a kind of lymphocyte that plays an important role in the body’s defenses. In MS, T cells enter the brain via disruptions in the blood-brain barrier. The T cells recognize myelin as foreign and attack it, which is why these cells are also called “autoreactive lymphocytes.”

The attack of myelin starts inflammatory processes which trigger other immune cells and the release of soluble factors like cytokines and antibodies. Further breakdown of the blood–brain barrier in turn causes a number of other damaging effects such as swelling, activation of macrophages, and more activation of cytokines and other destructive proteins. These inflammatory factors could lead to or enhance the loss of myelin, or they may cause the axon to break down completely.

Because multiple sclerosis is not selective for specific neurons, and can progress through the brain and spinal cord at random, each patient’s symptoms may vary considerably. When a patient experiences an “attack” of increased disease activity, the impairment of neuronal communication can manifest as a broad spectrum of symptoms affecting sensory processing, locomotion, and cognition.

Some of the most common symptoms include: numbness and/or tingling of the limbs, like pins and needles; extreme and constant fatigue; slurring or stuttering; dragging of feet; vision problems, especially blurred vision; loss of coordination; inability to walk without veering and bumping into things; weakness; tremors; pain, especially in the legs; dizziness; and insomnia. There are many other symptoms, as well, such as loss of bowel or bladder control, the inability to process thoughts (which leads to confusion), and passing out. Some MS patients lose their vision and many lose their ability to walk. The symptoms are not necessarily the same for all patients and, in fact, an individual MS patient does not always have the same symptoms from day to day or even from minute to minute.

One of the most prevalent symptoms of MS is extreme and chronic fatigue. Assessment of fatigue in MS is difficult because it may be multifactorial, caused by immunologic abnormalities as well as other conditions that contribute to fatigue such as depression and disordered sleep (Braley and Chervin, 2010). Pharmacologic treatments such as amantadine and modafinil have shown favorable results for subjective measures of fatigue. Both drugs are well tolerated and have a mild side-effect profile (Life Extension Foundation, 2013).

It is estimated that multiple sclerosis affects approximately 85 out of every 100,000 people (Apatoff, 2002). The number of known patients is about 400,000 in the United States and about 2.5 million worldwide (Braley & Chervin, 2010). In recent years, there has been an increase in identified multiple sclerosis patients, with about 50 percent more women reporting the disease. Indeed, between two and three times as many women have MS as men. Most patients are diagnosed between the ages of 20 and 50 but MS can strike at any age (National Multiple Sclerosis Society, 2013).
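As a sanity check on figures like these, a prevalence rate per 100,000 can be converted into an expected case count for a population of a given size. A minimal sketch of the arithmetic, using the 85-per-100,000 rate quoted above with a purely hypothetical population:

```python
# Convert a prevalence rate per 100,000 people into an expected case count.
# The 85-per-100,000 rate is from the text; the population is hypothetical.

def expected_cases(rate_per_100k: float, population: int) -> float:
    """Expected number of cases in a population, given prevalence per 100,000."""
    return rate_per_100k * population / 100_000

# For an illustrative city of 1 million people:
print(expected_cases(85, 1_000_000))  # → 850.0
```

Regional variation in prevalence (discussed below) means such an estimate is only a rough baseline, not a prediction for any particular population.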

Incidence of multiple sclerosis varies by geographic region and certain demographic groups (Apatoff, 2002; Midgard, 2001). There is evidence that worldwide distribution of MS may be linked to latitude (Midgard, 2001). In the U.S., for instance, there is a lower rate of MS in the South than in other regions (Apatoff, 2002). Data regarding race shows 54 percent of MS patients are white, 25 percent are black and 19 percent are classified as other (Apatoff, 2002).

There are four disease courses identified in MS:

Relapsing-Remitting: Patients have clearly defined acute attacks or flare-ups that are referred to as relapses. During the relapse, the patient experiences worsening of neurologic function—the body or mind will not function properly. The relapse is followed by either partial or total recovery, called remissions, when symptoms are alleviated. About 85 percent of MS patients fall into this category (National Multiple Sclerosis Society, 2013).

Primary-Progressive: The disease slowly and consistently gets worse with no relapses or remissions. Progression of the disease occurs over time and the patient may experience temporary slight improvements of functioning. About 10 percent of MS patients fall into this category (National Multiple Sclerosis Society, 2013).

Secondary-Progressive: Patient appears to have relapsing-remitting MS, but after time the disease becomes steadily worse. There may or may not be plateaus, flare-ups, or remissions. About half the people originally diagnosed with relapsing-remitting will move into this category within 10 years (National Multiple Sclerosis Society, 2013).

Progressive-Relapsing: Quick disease progression with few, if any, remissions. About 5 percent of MS patients fall into this category at diagnosis (National Multiple Sclerosis Society, 2013).

The cause(s) of multiple sclerosis remain unknown although research suggests that both genetic and environmental factors contribute to the development of the disease (National Multiple Sclerosis Society, 2013; Compston and Coles, 2002). The current prevailing theory is that MS is a complex multifactorial disease based on a genetic susceptibility but requiring an environmental trigger, and which causes tissue damage through inflammatory/immune mechanisms. Widely varying environmental factors have been found to be associated with the disease, ranging from infectious agents to Vitamin D deficiency and smoking. The debate these days revolves primarily around whether immune pathogenesis is primary, or acts secondarily to some other trigger (Braley & Chervin, 2010).

Risk factors for multiple sclerosis include genetics and family history, though it is believed that up to 75% of MS risk must be attributable to non-genetic or environmental factors. Infection is one of the more widely suspected non-genetic risk factors. A commonly held theory is that viruses involved in the development of autoimmune diseases could mimic the proteins found on nerves, making those nerves a target for antibodies. The potential roles of several viruses have been investigated including herpes simplex virus (HSV), rubella, measles, mumps, and Epstein-Barr virus (EBV). The strongest correlation between a virus and MS exists with EBV—virtually 100% of patients who have MS are seropositive for EBV (the rate in the general public is about 90%)—but potential causality remains strongly debated (Ludwin and Jacobson, 2011).

It is important to keep in mind that infectious agents such as viruses may, in fact, have nothing to do with causing MS. The association of a virus with MS is based on increased antibody response and may be an epiphenomenon of a dysregulated global immune response. “Proving” causality will require consistent molecular findings as well as consistent results from well-controlled clinical trials of virus-specific antiviral therapies (as yet to be developed). In the end, any theory concerning causality in MS should also account for the strong association with other environmental factors such as Vitamin D deficiency and smoking. Indeed, a landmark study found that, compared to those with the highest levels of vitamin D, those with the lowest blood levels were 62% more likely to develop MS. Additionally, a literature review evaluating more than 3000 MS cases and 45,000 controls indicates that smoking increases the risk of developing MS by approximately 50% (Life Extension Foundation, 2013).
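Figures like “62% more likely” are relative risks: the ratio of disease incidence in an exposed group to that in an unexposed group. A minimal sketch of the calculation, using purely hypothetical cohort numbers chosen only to reproduce a roughly 1.6× risk ratio:

```python
# Relative risk = (incidence among exposed) / (incidence among unexposed).
# Cohort sizes and case counts below are hypothetical, for illustration only.

def relative_risk(cases_exposed: int, n_exposed: int,
                  cases_unexposed: int, n_unexposed: int) -> float:
    """Ratio of incidence in the exposed group to that in the unexposed group."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical: 81 cases among 100,000 low-vitamin-D subjects vs.
# 50 cases among 100,000 high-vitamin-D subjects:
print(relative_risk(81, 100_000, 50, 100_000))  # → 1.62, i.e. 62% more likely
```

A relative risk of 1.0 would mean no association; values well above 1.0, as in the vitamin D and smoking findings cited above, suggest (but do not prove) a causal contribution.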

Recently, researchers have pinpointed a specific toxin they believe may be responsible for the onset of MS. Epsilon toxin—a byproduct of the bacterium Clostridium perfringens—is able to permeate the blood-brain barrier and has been demonstrated to kill oligodendrocytes and meningeal cells. Loss of oligodendrocytes and meningeal inflammation are both part of the MS disease process, and may be triggered by exposure to epsilon toxin.

The fact that females are more susceptible to inflammatory autoimmune diseases, including multiple sclerosis, points to the potential role of hormones in the etiology of multiple sclerosis. Interestingly, the course of disease is affected by the fluctuation of steroid hormones during the female menstrual cycle and female MS patients generally experience clinical improvements during pregnancy (Life Extension Foundation, 2013). Additionally, pregnancy appears to be protective against the development of MS. A study in 2012 demonstrated that women who have been pregnant two or more times had a significantly reduced risk of developing MS, while women who have had five or more pregnancies had one-twentieth the risk of developing MS compared to women who were never pregnant. (The increase in MS prevalence over the last few decades could reflect the fact that women are having fewer children.) A growing body of evidence supports the therapeutic potential of hormones (both testosterone and estrogens) in animal models of multiple sclerosis, but more research is needed to understand the pathways and mechanisms underlying the beneficial effects of sex hormones on MS pathology (Gold and Voskuhl, 2009).

No single test gives a definitive diagnosis for MS, and variable symptoms and disease course make early diagnosis a challenge. Most diagnoses are presumptive and are based on the clinical symptoms seen in an acute attack. Supporting evidence of these presumptions is then sought, usually from a combination of magnetic resonance imaging (MRI) of the brain, testing the cerebrospinal fluid (CSF) for antibodies, measuring the efficiency of nerve impulse conduction, and monitoring symptoms over time.

As there is still much work to be done in understanding the nature of multiple sclerosis, a cure has yet to be discovered. Conventional medical treatment typically focuses on strategies to treat acute attacks, to slow the progression of the disease, and to treat symptoms. Corticosteroids such as methylprednisolone are the first line of defense against acute MS attacks and are administered in high doses to suppress the immune system and decrease the production of proinflammatory factors. Plasma exchange is also used to physically remove antibodies and proinflammatory factors from the blood.

The use of beta interferons is a longstanding MS treatment strategy, originally envisioned as an antiviral compound. Beta interferons reduce inflammation and slow disease progression, but the mechanism of action is poorly understood. Other immunosuppressant drugs such as mitoxantrone and fingolimod also slow disease progression, but are not used as first-line treatments due to their severe side effects. More recently, researchers at Oregon Health & Science University have noted that an antioxidant called MitoQ has been shown to significantly reverse symptoms in a mouse model of MS (Mao, Manczak, Shirendeb, and Reddy, 2013).

Besides pharmacological treatments, MS patients may benefit from therapies (such as physical and speech therapy) and from an optimized nutritional protocol. Supplementation with Vitamin D, Omega-3 and -6 fatty acids, Vitamin E, lipoic acid, Vitamin B12, and Coenzyme Q10 appear to be of particular potential benefit (Life Extension Foundation, 2013). Until a definitive cause for MS can be defined and a cure developed, such strategies, including hormone therapy, offer possible ways to improve quality of life over the course of disease progression.

Unlike Alzheimer’s disease, there does not appear to be a Mendelian variant of MS that will invariably produce the disease in people who have the gene. A somewhat puzzling variable is that MS predominantly tends to occur between the ages of 20 and 50. This appears to exclude approaching MS as a form of immunosenescence. After all, if MS were a function of the aging immune system, we would see progressively more cases of MS as people get older (or in AIDS patients), ultimately involving many very old people. More likely, MS is a non-age-related form of dysfunction of the immune system that is triggered by environmental factors (such as a viral infection). While many discussions about the role of viruses in debilitating diseases like Alzheimer’s and MS still suffer from an incomplete understanding of cause and effect, it seems reasonable to conclude that enhancement of the human immune system can greatly reduce disease and improve the quality of life, even in healthy humans.

One potential treatment for MS is to induce remyelination (or inhibit processes that interfere with efficient remyelination). Stem cells can be administered to generate oligodendrocyte precursor cells, which in turn produce the oligodendrocyte glial cells that are responsible for remyelination of axons. While the myelin sheaths of these remyelinated axons are not as thick as the myelin sheaths that are formed during development, remyelination can improve conduction velocity and prevent the destruction of axons. While the dominant repair strategies envisioned for cryonics involve molecular nanotechnologies that can build any biochemical structures that physical law permits, it is encouraging to know that specific stem cell therapies will be available to repair and restore myelin function in cryonics patients as damage to myelin should be expected as a result of (prolonged) ischemia and cryoprotectant toxicity.

An interesting possibility is that remyelination therapies may also be used for human enhancement if these therapies can be tweaked to improve conduction velocity in humans or to induce certain desirable physiological responses by varying the composition and strength of the myelin sheath in various parts of the central nervous system.

References

Apatoff, Brian R. (2002). MS on the rise in the US. Neurology Alert 20(7), 55(2).

Braley, Tiffany J., Chervin, Ronald D. (2010). Fatigue in Multiple Sclerosis: Mechanisms, evaluation, and treatment. Sleep 33(8), 1061-1067.

Compston, Alastair, and Coles, Alasdair (2002). Multiple sclerosis. The Lancet 359(9313), 1221.

Gold, Stefan M., and Voskuhl, Rhonda R. (2009). Estrogen and testosterone therapies in multiple sclerosis. Progress in Brain Research 175: 239-251.

Life Extension Foundation (2013). Multiple Sclerosis, in: Disease Prevention and Treatment, 5th edition, 947-956.

Ludwin, SK, and Jacobson, S. (2011). Epstein-Barr Virus and MS: Causality or association? The International MS Journal 17(2), 39-43.

Mao, Peizhong, Manczak, Maria, Shirendeb, Ulziibat P., and Reddy, P. Hemachandra (2013). MitoQ, a mitochondria-targeted antioxidant, delays disease progression and alleviates pathogenesis in an experimental autoimmune encephalomyelitis mouse model of multiple sclerosis. Biochimica et Biophysica Acta Molecular Basis of Disease 1832(12): 2322-2331.

Midgard, R. (2001). Epidemiology of multiple sclerosis: an overview. Journal of Neurology, Neurosurgery and Psychiatry 71(3), 422.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, February, 2014

09. February 2016 · Comments Off on Recent developments relevant to cryonics · Categories: Cryonics, Neuroscience

A lot of interesting pieces related to cryonics have appeared over the last few months that I thought I would share:

Four professors conclude in MIT Technology Review that there is a significant and growing body of evidence in support of human cryopreservation: “The Science Surrounding Cryonics”

New York Times cover story by a Pulitzer Prize-winning journalist on a dying 23-year-old woman: “A Dying Young Woman’s Hope in Cryonics and a Future”

Skeptic Michael Shermer writes a piece in Scientific American called “Can Our Minds Live Forever?”

Here are three recent important peer reviewed papers:

Dr. Greg Fahy and Robert McIntyre of 21st Century Medicine describe here a new cryobiological and neurobiological technique, aldehyde-stabilized cryopreservation (ASC), which demonstrates the relevance and utility of advanced cryopreservation science for the neurobiological research community. The ASC technology is now also competing against Dr. Mikula at the Max Planck Institute in the Brain Preservation Prize.

The Grand Challenges of Organ Banking and Its Potential is described by a large group of the world’s leading cryobiology scientists: The first Organ Banking Summit was convened from Feb. 27 – March 1, 2015 in Palo Alto, CA, with events at Stanford University, NASA Research Park, and Lawrence Berkeley National Labs. Experts at the summit outlined the potential public health impact of organ banking, discussed the major remaining scientific challenges that need to be overcome in order to bank organs, and identified key opportunities to accelerate progress toward this goal. Many areas of public health could be revolutionized by the banking of organs and other complex tissues, including transplantation, oncofertility, tissue engineering, trauma medicine and emergency preparedness, basic biomedical research and drug discovery – and even space travel.

Persistence of Long-Term Memory in Vitrified and Revived Caenorhabditis elegans. Two scientists ask the question: “Can memory be retained after cryopreservation?” and then demonstrate that a form of long-term memory in C. elegans is not modified by the process of vitrification or slow freezing.

13. February 2015 · Comments Off on Though She Isn’t Really Ill, There’s a Little Yellow Pill… · Categories: Health, Neuroscience, Society

Humans have been ingesting mind- and mood-altering substances for millennia, but it has only rather recently become possible to begin to elucidate drug mechanisms of action and to use this information, along with our burgeoning knowledge of neuroscience, to design drugs intended to have a specific effect. And though most people think of pharmaceuticals as “medicine,” it has become increasingly popular to discuss the possibilities for the use of drugs in enhancement, or improvement of “human form or functioning beyond what is necessary to sustain or restore good health” (E.T. Juengst; in Parens, 1998, p 29).

Some (transhumanists) believe that enhancement may not only be possible, but that it may even be a moral duty. Others (bioconservatives) fear that enhancement may cause us to lose sight of what it means to be human altogether. It is not the intention of this article to advocate enhancement or to denounce it. Instead, let’s review some of the drugs (and/or classes of drugs) that have been identified as the most promising cognitive or mood enhancers. Many of the drugs we will cover can be read about in further depth in Botox for the brain: enhancement of cognition, mood and pro-social behavior and blunting of unwanted memories (Jongh, R., et al., Neuroscience and Biobehavioral Reviews 32 (2008): 760-776).

The most important thing to keep in mind when considering potential cognitive-enhancing drugs is that, to date, no “magic bullets” appear to exist. That is, there are no drugs exhibiting such specificity as to have only the primary, desired effect. Indeed, a general principle of trade-offs (particularly in the form of side effects) appears to exist when it comes to drug administration for any purpose, whether treatment or enhancement. Such facts may constitute barriers to the practical use of pharmacological enhancers and should be taken into consideration when discussing the ethics of enhancement.

Some currently available cognitive enhancers include donepezil, modafinil, dopamine agonists, guanfacine, and methylphenidate. There are also efforts underway to develop memory-enhancing drugs, and we will discuss a few of the mechanisms by which they are proposed to act. Besides cognitive enhancement, the enhancement of mood and prosocial behavior in normal individuals are other types of enhancement that may be affected pharmacologically, most usually by antidepressants or oxytocin. Let’s briefly cover the evidence for the efficacy of each of these in enhancing cognition and/or mood before embarking on a more general discussion of the general principles of enhancement and ethical concerns.

One of the most widely cited cognitive enhancement drugs is donepezil (Aricept®), an acetylcholinesterase inhibitor. In 2002, Yesavage et al. reported the improved retention of training in healthy pilots tested in a flight simulator. In this study, after training in a flight simulator, half of the 18 subjects took 5 mg of donepezil for 30 days and the other half were given a placebo. The subjects returned to the lab to perform two test flights on day 30. The donepezil group was found to perform similarly to the initial test flight, while the placebo group’s performance declined. These results were interpreted as an improvement in the ability to retain a practiced skill. However, it seems possible that the better performance of the donepezil group was instead due to improved attention or working memory during the test flights on day 30.

Another experiment by Gron et al. (2005) looked at the effects of donepezil (5 mg/day for 30 days) on performance of healthy male subjects on a variety of neuropsychological tests probing attention, executive function, visual and verbal short-term and working memory, semantic memory, and verbal and visual episodic memory. They reported a selective enhancement of episodic memory performance, and suggested that the improved performance in Yesavage et al.’s study is not due to enhanced visual attention, but to increased episodic memory performance.

Ultimately, there is scarce evidence that donepezil improves retention of training. Better designed experiments need to be conducted before we can come to any firm conclusions regarding its efficacy as a cognitive enhancer.

The wake-promoting agent modafinil (Provigil®) is another currently available drug that is purported to have cognitive enhancing effects. Provigil® is indicated for the treatment of excessive daytime sleepiness and is often prescribed to those with narcolepsy, obstructive sleep apnea, and shift work sleep disorder. Its mechanisms of action are unclear, but it is supposed that modafinil increases hypothalamic histamine release, thereby promoting wakefulness by indirect activation of the histaminergic system. However, some suggest that modafinil works by inhibiting GABA release in the cerebral cortex.

In normal, healthy subjects, modafinil (100-200 mg) appears to be an effective countermeasure for sleep loss. In several studies, it sustained alertness and performance of sleep-deprived subjects (for up to 54.5 hours) and has also been found to improve subjective attention and alertness, spatial planning, stop signal reaction time, digit-span and visual pattern recognition memory. However, at least one study (Randall et al., 2003) reported “increased psychological anxiety and aggressive mood” and failed to find an effect on more complex forms of memory, suggesting that modafinil enhances performance only in very specific, simple tasks.

The dopamine agonists d-amphetamine, bromocriptine, and pergolide have all been shown to improve cognition in healthy volunteers, specifically working memory and executive function. Historically, amphetamines have been used by the military during World War II and the Korean War, and more recently as a treatment for ADHD (Adderall®). But usage statistics suggest that it is commonly used for enhancement by normal, healthy people—particularly college students.

Interestingly, there appears to be an inverted-U relationship between endogenous dopamine levels and working-memory performance. Several studies have provided evidence for this by demonstrating that individuals with low working-memory capacity show greater improvements after taking a dopamine receptor agonist, while high-span subjects either do not benefit at all or show a decline in performance.

Guanfacine (Intuniv®) is an α2 adrenoceptor agonist, also indicated for the treatment of ADHD symptoms in children, though it acts by increasing norepinephrine signaling in the brain. In healthy subjects, guanfacine has been shown to improve visuospatial memory (Jakala et al., 1999a, Jakala et al., 1999b), but the beneficial effects were accompanied by sedative and hypotensive effects (i.e., side effects). Other studies have failed to replicate these cognitive enhancing effects, perhaps due to differences in dosages and/or subject selection.

Methylphenidate (Ritalin®) is a well-known stimulant that works by blocking the reuptake of dopamine and norepinephrine. In healthy subjects, it has been found to enhance spatial working-memory performance. Interestingly, as with dopamine agonists, an inverted-U relationship was seen, with subjects with lower baseline working-memory capacity showing the greatest improvement after methylphenidate administration.

Future targets for enhancing cognition are generally focused on enhancing plasticity by targeting glutamate receptors (responsible for the induction of long-term potentiation) or by increasing CREB (known to strengthen synapses). Drugs targeting AMPA receptors, NMDA receptors, or the expression of CREB have all shown some promise in cognitive enhancement in animal studies, but few experiments have been carried out to determine their effectiveness in normal, healthy humans.

Beyond cognitive enhancement, there is also the potential for enhancement of mood and pro-social behavior. Antidepressants are the first drugs that come to mind when discussing the pharmacological manipulation of mood, including selective serotonin reuptake inhibitors (SSRIs). Used for the treatment of mood disorders such as depression, SSRIs are not indicated for normal people of stable mood. However, some studies have shown that administration of SSRIs to healthy volunteers resulted in a general decrease of negative affect (such as sadness and anxiety) and an increase in social affiliation in a cooperative task. Such decreases in negative affect also appeared to induce a positive bias in information processing, resulting in decreased perception of fear and anger from facial expression cues.

Another potential use for pharmacological agents in otherwise healthy humans would be to blunt unwanted memories by preventing their consolidation. This may be accomplished by post-training disruption of noradrenergic transmission (as with the β-adrenergic receptor antagonist propranolol). Propranolol has been shown to impair the long-term memory of emotionally arousing stories (but not emotionally neutral stories) by blocking the enhancing effect of arousal on memory (Cahill et al., 1994). In a particularly interesting study making use of patients admitted to the emergency department, post-trauma administration of propranolol reduced physiologic responses during mental imagery of the event 3 months later (Pitman et al., 2002). Further investigations have supported the memory blunting effects of propranolol, possibly by blocking the reconsolidation of traumatic memories.

GENERAL PRINCIPLES

Reviewing these drugs and their effects leads us to some general principles of cognitive and mood enhancement. The first is that many drugs have an inverted U-shaped dose-response curve, where low doses improve and high doses impair performance. This is potentially problematic for the practical use of cognition enhancers in healthy individuals, especially when doses that are most effective in facilitating one behavior simultaneously exert null or detrimental effects on other behaviors.

Second, a drug’s effect can be “baseline dependent,” where low-performing individuals experience greater benefit from the drug while higher-performing individuals do not see such benefits (which might simply reflect a ceiling effect), or may, in fact, see a deterioration in performance (which points to an inverted U-model). In the case of an inverted U-model, low performing individuals are found on the up slope of the inverted U and thus benefit from the drug, while high-performing individuals are located near the peak of the inverted U already and, in effect, experience an “overdose” of neurotransmitter that leads to a decline in performance.
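The baseline dependence described above can be made concrete with a toy numerical sketch. This is purely illustrative: the quadratic performance curve, the numbers, and the idea that a fixed dose shifts everyone's "effective level" by the same amount are assumptions for demonstration, not a real pharmacological model.

```python
# Toy illustration of the inverted-U model of baseline dependence.
# All numbers are hypothetical; performance is maximal at an intermediate
# "effective level" and falls off on either side of the peak.

PEAK = 1.0        # effective level at which performance is maximal (arbitrary units)
DOSE_SHIFT = 0.4  # assumed upward shift in effective level produced by the drug

def performance(effective_level: float) -> float:
    """Quadratic inverted U: peaks at PEAK, declines on either side."""
    return 100.0 - 50.0 * (effective_level - PEAK) ** 2

for label, baseline in [("low-span subject", 0.5), ("high-span subject", 1.0)]:
    before = performance(baseline)
    after = performance(baseline + DOSE_SHIFT)
    print(f"{label}: {before:.1f} -> {after:.1f} ({after - before:+.1f})")
```

Under these assumptions, the low-span subject sits on the up slope and improves (87.5 to 99.5), while the high-span subject starts at the peak and is pushed past it, declining (100.0 to 92.0), which is the "overdose" effect described above.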

Trade-offs exist in the realm of cognitive enhancing drugs as well. As mentioned, unwanted “side effects” are often experienced with drug administration, ranging from mild physiological symptoms such as sweating to more concerning issues like increased agitation, anxiety, and/or depression.

More specific trade-offs may come in the form of impairment of one cognitive ability at the expense of improving another. Some examples of this include the enhancement of long-term memory but deterioration of working memory with the use of drugs that activate the cAMP/protein kinase A (PKA) signaling pathway. Another trade-off could occur between the stability versus the flexibility of long-term memory, as in the case of certain cannabinoid receptor antagonists which appear to lead to more robust long-term memories, but which also disrupt the ability of new information to modify those memories. Similarly, a trade-off may exist between stability and flexibility of working memory. Obviously, pharmacological manipulations that increase cognitive stability at the cost of a decreased capacity to flexibly alter behavior are potentially problematic in that one generally does not wish to have difficulty in responding appropriately to change.

Lastly, there is a trade-off involving the relationship between cognition and mood. Many mood-enhancing drugs, such as alcohol and even antidepressants, impair cognitive functioning to varying degrees. Cognition-enhancing drugs may also impair emotional functions. Because cognition and emotion are intricately regulated through interconnected brain pathways, inducing change in one area may have effects in the other. Much more research remains to be performed to elucidate these interactions before we can come to any firm conclusions.

ETHICAL CONCERNS

Again, though it is not the place of this article to advocate or denounce the use of drugs for human enhancement, obviously there are considerable ethical concerns when discussing the administration of drugs to otherwise healthy human beings. First and foremost, safety is of paramount importance. The risks and side-effects, including physical and psychological dependence, as well as long-term effects of drug use should be considered and weighed heavily against any potential benefits.

Societal pressure to take cognitive enhancing drugs is another ethical concern, especially in light of the fact that many may not actually produce benefits to the degree desired or expected. In the same vein, the use of enhancers may give some a competitive advantage, thus leading to concerns regarding fairness and equality (as we already see in the case of physical performance-enhancing drugs such as steroids). Additionally, it may be necessary, but very difficult, to make a distinction between enhancement and therapy in order to define the proper goals of medicine, to determine health-care cost reimbursement, and to “discriminate between morally right and morally problematic or suspicious interventions” (Parens, 1998). Of particular importance will be determining how to deal with drugs that are already used off-label for enhancement. Should they be provided by physicians under certain conditions? Or should they be regulated in the private commercial domain?

There is an interesting argument that using enhancers might change one’s authentic identity—that enhancing mood or behavior will lead to a personality that is not really one’s own (i.e., inauthenticity), or even dehumanization—while others argue that such drugs can help users to “become who they really are,” thereby strengthening their identity and authenticity. Lastly, according to the President’s Council on Bioethics, enhancement may “threaten our sense of human dignity and what is naturally human” (The President’s Council, 2003). According to the Council, “the use of memory blunters is morally problematic because it might cause a loss of empathy if we would habitually ‘erase’ our negative experiences, and because it would violate a duty to remember and to bear witness of crimes and atrocities.” On the other hand, many people believe that we are morally bound to transcend humans’ basic biological limits and to control the human condition. But even they must ask: what is the meaning of trust and relationships if we are able to manipulate them?

These are all questions without easy answers. It may be some time yet before the ethical considerations of human cognitive and mood enhancement really come to a head, given the apparently limited benefits of currently available drugs. But we should not avoid dealing with these issues in the meantime; for there will come a day when significant enhancement, whether via drugs or technological means, will be possible and available. And though various factions may disagree about the morality of enhancement, one thing is for sure: we have a moral obligation to be prepared to handle the consequences of enhancement, both positive and negative.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, December, 2013

27. January 2015 · Comments Off on Brain Fitness · Categories: Health, Neuroscience

Book Review: The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age by Alvaro Fernandez

Of all the organs in the human body, a cryonicist should be most concerned about the health and integrity of his or her brain. Thousands of books have been written about physical health and fitness, but very few address the topic of how to keep the brain fit and healthy. Happily, interest in brain fitness, once relegated to academics and gerontologists, is now taking root across America and the world.

The importance of lifelong learning and mental stimulation as a component of healthy aging has long been recognized and touted as a way to stay mentally alert and to stave off dementia in old age. As with physical exercise, “use it or lose it” appears to apply to our brains too. And now that scientists are learning more about neuroplasticity and how brains change as a result of aging, they have begun to test the effects of various factors on brain health and cognitive ability across the lifespan.

Unfortunately, like much health-related research, the results reported by the media have often been convoluted, confusing, and even contradictory. Products developed by overzealous entrepreneurs make outlandish claims and frequently don’t deliver the purported results. Consumers and professionals alike are left wondering what works and what doesn’t when it comes to maintaining our brains in optimal working condition.

To aid all those navigating the murky waters of brain fitness, enter SharpBrains—a company dedicated to tracking news, research, technology, and trends in brain health and to disseminating information about the applications of brain science innovation. In so doing, they “maintain an annual state-of-the-market consumer report series, publish consumer guides to inform decision-making, produce an annual global and virtual professional conference,” and maintain SharpBrains.com, a leading educational blog and website with over 100,000 monthly readers.

Most recently, SharpBrains has published a book on brain fitness called The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age. A compilation and condensation of information accumulated over the lifespan of the company, The SharpBrains Guide to Brain Fitness emphasizes credible research and goes to great lengths to provide the most up-to-date research results in specific areas of brain fitness, followed by interviews with scientists doing work in those fields. The goal of the guide is to help the reader begin to “cultivate a new mindset and master a new toolkit that allow us appreciate and take full advantage of our brain’s incredible properties…[by] providing the information and understanding to make sound and personally relevant decisions about how to optimize your own brain health and performance.”

The Guide begins by emphasizing that the brain’s many neuronal networks serve distinct functions including various types of memory, language, emotional regulation, attention, and planning. Plasticity of the brain is defined as its lifelong capacity to change and reorganize itself in response to the stimulation of learning and experience—the foundation upon which “brain training” to improve cognitive performance at any age, and to maintain brain health into old age, is predicated.

The difficulty of making sense of the scientific findings on brain health and neuroplasticity is discussed at length, with the finger of blame pointed squarely at the media for reporting only fragments of the research and for often reporting those results which are not most meaningful. The authors stress that “it is critical to complement popular media sources with independent resources, and above all with one’s own informed judgment.”

The following chapters go on to review what is known today about how physical exercise, nutrition, mental challenge, social engagement, and stress management can positively affect brain health. Along the way they provide dozens of relevant research results (as well as the design of each study) to support their recommendations. Reporting on all of those experiments is beyond the scope of this review, so if you are interested in examining them (and you should be!) please obtain a copy of the Guide for yourself or from your local library.

Physical exercise is discussed first because of the very strong evidence that exercise, especially aerobic (or “cardio”) exercise, slows atrophy of the brain associated with aging, actually increasing the brain’s volume of neurons (i.e., “gray matter”) and connections between neurons (i.e., “white matter”). While much of the initial research supporting the effects of exercise on the brain came from animal studies, the authors report that “several brain imaging studies have now shown that physical exercise is accompanied by increased brain volume in humans.”

Staying physically fit improves cognition across all age groups, with particularly large benefits for so-called “executive” functions such as planning, working memory, and inhibition. A 2010 meta-analysis by the NIH also concluded that physical exercise is a key factor in postponing cognitive decline and/or dementia, while other studies have found physical exercise to lower the risk of developing Parkinson’s disease, as well.

But don’t think that just any moving around will do the trick. When it comes to providing brain benefits, a clear distinction is drawn between physical activity and physical exercise. Only exercise will trigger the biochemical changes in the brain that spur neurogenesis and support neuroplasticity. It doesn’t need to be particularly strenuous, but to be most beneficial it should raise your heart rate and increase your breathing rate.

Of course, adequate nutrition is also imperative in obtaining and maintaining optimal brain health. The SharpBrains Guide to Brain Fitness primarily highlights the well-known benefits of the Mediterranean diet, which consists of a high intake of vegetables, fruit, cereals, and unsaturated fats, a low intake of dairy products, meat, and saturated fats, a moderate intake of fish, and regular but moderate alcohol consumption. But I think it is safe to say that the jury is still out on the best diet for the brain, as evidenced by the recent popularity of the Paleo diet among life extensionists. And, of course, ethnicity and genetics are important, too. The authors do stress the importance of omega-3 fatty acids and antioxidants obtained from dietary sources, stating firmly that “to date, no supplement has conclusively been shown to improve cognitive functioning, slow down cognitive decline, or postpone Alzheimer’s disease symptoms beyond placebo effect.” This includes herbal supplements such as Ginkgo biloba and St. John’s wort.

Beyond what we normally do to keep our bodies healthy, the Guide also discusses the relative effectiveness of different forms of “mental exercise.” Perhaps you’ve heard that doing crossword or Sudoku puzzles will keep you sharp and alert into old age, or that speaking multiple languages is associated with decreased risk of Alzheimer’s disease. The good news is that these things are true—to a degree. The part that is often left out is that it’s the challenge of these activities that is important. As with physical activity vs. physical exercise, mental exercise refers to the subset of mental activities that are effortful and challenging.

Puzzles and games may be challenging at first, but they (and other mental exercises) can quickly become routine and unchallenging. In order to reap the most benefit from mental exercise, the goal is to be exposed to novelty and increasing levels of challenge. Variety is important for stimulating all aspects of cognitive ability and performance, so excessive specialization is not the best strategy for maintaining long-term brain health. If you are an artist, try your hand at strategy-based games. If you’re an economist, try an artistic activity. Get out of your comfort zone in order to stimulate skills that you rarely use otherwise.

The SharpBrains Guide states that “lifelong participation in cognitively engaging activities results in delayed cognitive decline in healthy individuals and in spending less time living with dementia in people diagnosed with Alzheimer’s disease.” This is hypothesized to be because doing so builds up one’s “cognitive reserve”—literally an extra reservoir of neurons and neuronal connections—which may be utilized so that a person continues to function normally even in the face of underlying Alzheimer’s or other brain pathology. This observation raises another important point on which neuroscientists and physiologists do not yet fully agree. Will we all eventually get dementia if we live long enough without credible brain rejuvenation biotechnologies? This is a topic I would like to return to in a future installment of Cooler Minds Prevail.

Social engagement also appears to provide brain benefits. The NIH meta-analysis mentioned earlier concluded that higher social engagement in mid- to late life is associated with higher cognitive functioning and reduced risk of cognitive decline. Brain imaging studies indicate an effect of social stimulation on the volume of the amygdala, a structure that plays a major role in our emotional responses and which is closely connected to the hippocampus, which is important for memory.

Yet again, not all activity is equal. When it comes to social stimulation, “you can expect to accrue more benefits within groups that have a purpose (such as a book club or a spiritual group) compared to casual social interactions (such as having a drink with a friend to relax after work).” To keep socially engaged across the lifespan, seek out interactions that naturally involve novelty, variety, and challenge such as volunteering and participating in social groups.

“The lifelong demands on any person have changed more rapidly in the last thousand years than our genes and brains have,” The SharpBrains Guide explains in the intro to the chapter on stress management. The result? It has become much more difficult to regulate stress and emotions. It is great that we have such amazing and complex brains, but humans are among the few animals that can get stressed from their own thoughts. And while there are some (potentially) beneficial effects of short bursts of stress, high and sustained levels of stress can have a number of negative consequences. Those of note include: increased levels of blood cortisol, which can lead to sugar imbalances, high blood pressure, loss of muscle tissue and bone density, lower immunity, and damage to the brain; a reduction of certain neurotransmitters, such as serotonin and dopamine, which has been linked to depression; and a hampering of our ability to make changes to reduce the stress, resulting in General Adaptation Syndrome (aka “burnout”).

Research-based lifestyle solutions to combat stress include exercise, relaxation, socialization, humor and laughter, and positive thinking. In particular, targeted, capacity-building techniques such as biofeedback and meditation are recommended to manage stress and build resilience. Mindfulness Based Stress Reduction (MBSR) programs have provided evidence that meditative techniques can help manage stress and research shows that MBSR can lead to decreases in the density of an area of the amygdala which is correlated with reduction in reported stress.

So it appears that multiple approaches are necessary to develop a highly fit brain capable of adapting to new situations and challenges throughout life. “Consequently,” The SharpBrains Guide to Brain Fitness states, “we expect cross-training the brain to soon become as mainstream as cross-training the body is today, going beyond unstructured mental activity in order to maximize specific brain functions.”

There is growing evidence that brain training can work, but in evaluating what “works” we are mostly looking at two things: how successful the training program is (i.e., does it actually improve the skill(s) being trained?) and the likelihood of transfer from training to daily life. Building on an analysis of documented examples of brain training techniques that “work” or “transfer,” SharpBrains suggests the following five conditions need to be met for brain training to be likely to translate into meaningful real world improvements (condensed excerpt):

  1. Training must engage and exercise a core brain-based capacity or neural circuit identified to be relevant to real-life outcomes.
  2. The training must target a performance bottleneck.
  3. A minimum “dose” of 15 hours total per targeted brain function, performed over 8 weeks or less, is necessary for real improvement.
  4. Training must be adaptive to performance, require effortful attention, and increase in difficulty.
  5. Over the long-term, the key is continued practice for continued benefits.

Meditation, biofeedback, and/or cognitive therapy in concert with cognitive training to optimize targeted brain functions appear to be winning combinations in terms of successful techniques facilitating transfer from training to real life benefits. Top brain training software programs, based on SharpBrains’ analysis and a survey of their users, include Lumosity, Brain games, BrainHQ, Cogmed, and emWave.

In the end, brain fitness needs are unique to each individual and brain fitness claims should be evaluated skeptically. SharpBrains recommends asking several questions when evaluating brain fitness claims, particularly whether there is clear and credible evidence of the program’s success documented in peer-reviewed scientific papers published in mainstream scientific journals that analyze the effects of the specific product.

Of course, your own individual experience with the product is ultimately the most important evaluation of all. If you are ready to take the plunge into the emerging brain fitness market, The SharpBrains Guide to Brain Fitness is a good place to start, and I’m sure they’d appreciate your feedback as this field continues to develop.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, August, 2013

26. January 2015 · Comments Off on HIV, Immunosenescence, and Accelerated Aging · Categories: Health, Neuroscience, Science

After a few articles considering Alzheimer disease from several angles, I would like to switch gears this month and talk more generally about the interaction between the immune system and aging.

In his 2012 paper[1], Caleb E. Finch documents the evolution of life expectancy in the course of human history. The life expectancy at birth of our shared ape ancestor 6 million years ago is hypothesized to approximate that of a chimpanzee, 15 years. The first Homo species appeared 1-2 million years ago and had a life expectancy of ~20 years, while H. sapiens came onto the scene ~100,000 years ago and could expect about 30 years of life. But starting around 200 years ago, concurrent with industrialization, human life expectancy jumped rapidly, to somewhere between 70 and 80 years today.

As many readers are likely aware, the huge recent increases in life expectancy are commonly attributed to improvements in hygiene, nutrition, and medicine during the nineteenth and twentieth centuries that reduced mortality from infections at all ages. Finch hypothesizes, generally, that early age mortality over the course of human history is primarily due to (acute) infection, while old age mortality is primarily due to (chronic) inflammation. Further analysis of mortality rates over the last several hundred years leads him to further hypothesize that aging has been slowed in proportion to the reduced exposure to infections in early life. These hypotheses are supported by twentieth century examples which strongly demonstrate influences of the early life environment on adult health, such as the effects of prenatal and postnatal developmental influences (e.g., nutrition, exposure to infection) on adult chronic metabolic and vascular disorders as well as physical traits and mental characteristics. This leads Finch to suggest “broadening the concept of ‘developmental origins’ to include three groups of factors: nutritional deficits, chronic stress from socioeconomic factors, and direct and indirect damage from infections.”

Finch also considers the effects of inflammation and diet on human evolution, proposing several environmental and foraging factors that may have been important in the genetic basis for evolving lower basal mortality through interactions with chronic inflammation, in particular: dietary fat and caloric content; infections from pathogens ingested from carrion and from exposure to excreta; and noninfectious inflammagens such as those in aerosols and in cooked foods. He hypothesizes that exposure to these proinflammatory factors, which one would expect to shorten life expectancy, actually resulted in humans evolving lower mortality and longer lifespans in response to highly inflammatory environments.

A means for this, he argues, was the development of the apoE4 genotype. Noting that the apoE4 allele favors advantageous fat accumulation and is also associated with enhanced inflammatory responses, Finch argues that heightened inflammatory response and more efficient fat storage would have been adaptive in a pro-inflammatory environment and during times of uncertain nutrition. As has been discussed in prior articles in Cooler Minds Prevail, the apoE alleles also influence diverse chronic non-infectious degenerative diseases and lifespan. “Thus,” Finch concludes, “the apoE allele system has multiple influences relevant to evolution of brain development, metabolic storage, host defense, and longevity.”

With the general relationship between inflammation and the evolution of human aging and life expectancy in mind, let us now consider immune system involvement in more detail, and the relationship between HIV and immunosenescence more specifically.

Immunosenescence refers to the age-associated deterioration of the immune system. As an organism ages it gradually becomes deficient in its ability to respond to infections and experiences a decline in long-term immune memory. This is due to a number of specific biological changes, such as diminished self-renewal capacity of hematopoietic stem cells, a decline in the total number of phagocytes, impairment of Natural Killer (NK) and dendritic cells, and a reduction in the B-cell population. There is also a decline in the production of new naïve lymphocytes and in the functional competence of memory cell populations. As a result, advanced age is associated with increased frequency and severity of pathological health problems, as well as increased morbidity due to an impaired ability to respond to infections, diseases, and disorders.

It is not hard to imagine that an increased viral load leading to a chronic inflammatory response may accelerate aging and immunosenescence. Evidence for this has accumulated rapidly since the advent of antiretroviral therapies for the treatment of HIV infection. An unforeseen consequence of these successful therapies is that HIV patients are living longer, but a striking number of them appear to be aging faster, in particular showing early signs of dementia usually seen in the elderly. In one study, slightly more than 10% of older patients (average age 56.7 years) with well-controlled HIV infection had cerebrospinal fluid (CSF) marker profiles consistent with Alzheimer disease[2] – more than 10 times the prevalence in the general population at the same age. HIV patients also register higher rates of insulin resistance and cholesterol imbalances, suffer elevated rates of melanoma and kidney cancers, and have seven times the rate of other non-HIV-related cancers. Ultimately, long-term treated HIV-infected individuals also die at an earlier age than HIV-uninfected individuals[3].

Recent research is beginning to explore and unravel the interplay between HIV infection and other environmental factors (such as co-infection with other viruses) in the acceleration of the aging process of the immune system, leading to immunosenescence. In the setting of HIV infection, the immune response is associated with abnormally high levels of activation, leading to a cascade of continued viral spread and cell death, and accelerating the physiologic steps associated with immunosenescence. Despite clear improvements associated with effective antiretroviral therapy, some subjects show persistent alterations in T cell homeostasis, especially constraints on T cell recovery, which are further exacerbated in the setting of co-infection and increasing age.

Unsurprisingly, it has been observed that markers of immunosenescence might predict morbidity and mortality in HIV-infected adults as well as in the general population. In both HIV infection and aging, immunosenescence is marked by an increased proportion of CD28−, CD57+ memory CD8+ T cells with reduced capacity to produce interleukin 2 (IL-2), increased production of interleukin 6 (IL-6), resistance to apoptosis, and shortened telomeres. Levels of markers of inflammation are elevated in HIV-infected patients, and elevations in markers such as high-sensitivity C-reactive protein, D-dimer, and IL-6 have been associated with increased risk of cardiovascular disease, opportunistic conditions, and all-cause mortality[4].

But even as we are beginning to identify markers that appear to be associated with risk of poor outcome in HIV infection, it is still unclear how patients should be treated on the basis of this information. To that end, several trials are underway to evaluate the effects of modulation of immune activation and inflammation in HIV infection. At the same time, clinicians at the forefront of advancing knowledge and clinical care are performing research aimed at optimizing care for aging HIV patients.

The implications for such research may be far-reaching. In fact, many HIV clinicians and researchers think that HIV may be key to understanding aging in general. Dr. Eric Verdin states, “I think in treated, HIV-infected patients the primary driver of disease is immunological. The study of individuals who are HIV-positive is likely to teach us things that are really new and important, not only about HIV infection, but also about normal aging.”

Dr. Steven Deeks stresses the collaborative efforts of experts across fields. “I think there is a high potential for tremendous progress in understanding HIV if we can assemble a team of experts from the world of HIV immunology and the world of gerontology,” he says. “Each field can dramatically inform the other. I believe HIV is a well described, well studied, distinct disease that can be used as a model by the larger community to look at issues of aging.”

References

[1] Finch, C (2012). Evolution of the Human Lifespan, Past, Present, and Future: Phases in the Evolution of Human Life Expectancy in Relation to the Inflammatory Load. Proceedings of the American Philosophical Society, 156:1, 9-44.

[2] Mascolini, M (2013). Over 10% in Older HIV Group Fit Alzheimer’s Biomarker Risk Profile. Conference Reports for NATAP: 20th Conference on Retroviruses and Opportunistic Infections, March 3-6, 2013.

[3] Aberg, J (2012). Aging, Inflammation, and HIV Infection. Topics in Antiviral Medicine, 20:3, 101-105.

[4] Deeks, S, Verdin, E, and McCune, JM (2012). Immunosenescence and HIV. Current Opinion in Immunology, 24: 1-6.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, June, 2013

23. January 2015 · Comments Off on Apolipoprotein E Genotype and Viral Infections · Categories: Neuroscience, Science

Last month this column considered current and future progress in Alzheimer Disease (AD) diagnosis, management, and treatment. Because AD is a terrible brain disease with an increasing rate of prevalence with age, and because it represents one of – if not the – worst conditions that can afflict a person with cryopreservation arrangements, I would like to continue our consideration of this well-known and widely-feared neurodegenerative disease. Specifically, our focus will be on apolipoprotein E (apoE) and research regarding its role in the modulation of physiological responses to certain viral infections.

ApoE protein is primarily synthesized peripherally in the liver and mediates cholesterol metabolism systemically, but it is also made in the central nervous system by astroglia and microglia (non-neuronal cell types), where it transports cholesterol to neurons. In the CNS, neurons express receptors for apoE that are part of the low density lipoprotein receptor gene family. Historically, apoE has been recognized for its role in lipoprotein metabolism and its importance in cardiovascular disease. Of course, apoE carrier status is also widely known as the major factor determining one’s risk of developing late-onset Alzheimer disease (AD). But more recent research has indicated that the various isoforms of apoE may also have significant immunological impact by conferring different susceptibilities to other diseases.

The human apoE gene is located on chromosome 19 and is composed of 79 individual single nucleotide polymorphisms (SNPs). The three major alleles of apoE, named Epsilon-2 (Ɛ2), Epsilon-3 (Ɛ3), and Epsilon-4 (Ɛ4), are determined by differences in SNPs rs429358 and rs7412. The products of these alleles are the protein isoforms apoE2, apoE3, and apoE4, which differ only by a single amino acid at two residues (amino acid 112 and amino acid 158). These amino acid substitutions affect noncovalent “salt bridge” formation within the proteins, which ultimately impacts lipoprotein preference, stability of the protein, and receptor binding activities of the isoforms (see Table 1).

Isoform | Amino acid 112 | Amino acid 158 | Relative charge | Lipoprotein preference | LDL receptor binding ability
apoE2   | cysteine       | cysteine       | 0               | HDL                    | low
apoE3   | cysteine       | arginine       | +1              | HDL                    | high
apoE4   | arginine       | arginine       | +2              | VLDL, chylomicrons     | high

Table 1. ApoE isoform amino acid differences and resulting chemical and physiological changes.

There are also two minor alleles, Epsilon-1 (Ɛ1) and Epsilon-5 (Ɛ5), which are present in less than 0.1% of the population. The three major alleles are responsible for three homozygous (Ɛ2/Ɛ2, Ɛ3/Ɛ3, Ɛ4/Ɛ4) and three heterozygous (Ɛ2/Ɛ3, Ɛ2/Ɛ4, Ɛ3/Ɛ4) genotypes. [I will pause to mention here that it is now quite easy to determine one’s genotype through services such as 23andme.com.]
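In fact, services like 23andme report exactly these two SNPs, and the allele can be read off from them directly. The sketch below shows the standard mapping from (rs429358, rs7412) haplotypes to the three major alleles; the T/C-to-residue correspondence is the commonly reported one, but the function and dictionary names are illustrative, not from any particular genotyping library.

```python
# Each SNP base determines the amino acid at one of the two residues
# from Table 1: rs429358 sets residue 112, rs7412 sets residue 158.
RESIDUE_112 = {"T": "cysteine", "C": "arginine"}  # rs429358
RESIDUE_158 = {"T": "cysteine", "C": "arginine"}  # rs7412

# Each (rs429358, rs7412) haplotype corresponds to one major allele.
HAPLOTYPE_TO_ALLELE = {
    ("T", "T"): "e2",  # Cys112, Cys158 -> apoE2
    ("T", "C"): "e3",  # Cys112, Arg158 -> apoE3
    ("C", "C"): "e4",  # Arg112, Arg158 -> apoE4
}

def apoe_allele(rs429358: str, rs7412: str) -> str:
    """Return the major apoE allele implied by one haplotype of the two SNPs."""
    return HAPLOTYPE_TO_ALLELE[(rs429358.upper(), rs7412.upper())]

# A person carrying a (T, C) haplotype and a (C, C) haplotype is e3/e4.
genotype = (apoe_allele("T", "C"), apoe_allele("C", "C"))
```

Note that a per-haplotype mapping like this assumes the two SNPs can be phased; a raw genotype file reports the two SNPs independently, and the rare double-heterozygous case is ambiguous without phasing.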

An interesting document in the field is the literature review by Inga Kuhlmann, et al. (Lipids in Health and Disease 2010, 9:8), which assesses hepatitis C, HIV, and herpes simplex disease risk by apoE genotype. An important finding is that the Ɛ4 allele is found less frequently in populations as they age (e.g., 14% of the general German population vs. 5% in centenarians), indicating that Ɛ4 is a major mortality factor in the elderly. This is assumed to be a result of the Ɛ4 allele’s well-known predisposition to Alzheimer and cardiovascular diseases.

The authors explain that “apoE4 carriers have a tendency for 5-10% higher fasting total cholesterol, LDL-cholesterol and triglyceride levels relative to homozygote Ɛ3/Ɛ3” and that this tendency towards higher lipid levels is probably responsible for the 40-50% greater cardiovascular disease risk in Ɛ4 carriers. They also point out that “although the molecular basis of the pathology is poorly understood, and likely to be in part due to apoE genotype associated differences in brain lipid metabolism, an apoE4 genotype has been highly consistently associated with the risk of an age-related loss of cognitive function, in an allele dose fashion.” This means, of course, that Ɛ4/Ɛ4 carriers are at greatest risk for cognitive dysfunction with increasing age.

In the field of immune regulation, a growing number of studies point to apoE’s interaction with many immunological processes. In their article, Kuhlmann, et al., summarize the impact of the Ɛ4 allele on susceptibility to specific infectious viral diseases. The authors review a number of studies of the effects of apoE4 genotype on hepatitis C (HCV), human immunodeficiency virus (HIV), and herpes simplex (HSV) infection and outcome in humans.

In general, apoE4 was found to be protective against hepatitis C infection vs. (Ɛ3/Ɛ3) controls. Though the exact mechanisms of apoE genotype-specific effects on HCV life cycle remain uncertain, apoE seems to be involved because “available data indicate that the outcome of chronic HCV infection is better among Ɛ4 carriers due to slower fibrosis progression.”

Concerning the possible influence of apoE genotype on HIV infection and HIV-associated dementia, the authors call attention to the fact that “cholesterol is a crucial component of the HIV envelope and essential for viral entry and assembly.” Given that apoE is essential for cholesterol transport, they hypothesize that apoE genotype influences HIV-induced effects on neurological function. Subsequent review of available research suggests that the Ɛ4 allele is associated with higher steady-state viral load and faster disease progression due to accelerated virus entry in Ɛ4 carriers, but a correlation between apoE4 and HIV-associated dementia “remains controversial and needs to be clarified by further studies.”

Lastly, a review of the literature regarding the effects of apoE4 genotype on herpes simplex virus (HSV)-1 infection and outcome in humans indicates that apoE4 enhances the susceptibility for HSV-1 “as well as the neuroinvasiveness of HSV-1 compared to other apoE variants” (i.e., HSV-1 is found more frequently in the CNS of Ɛ4 carriers). Importantly, the authors also note that “the combination of apoE4 and HSV-1 may lead to a higher risk of Alzheimer disease (AD) than either factor in isolation.”

Given that the Ɛ4 allele is generally associated with higher risk of cardiovascular disease and dementia, and with increased susceptibility to and/or accelerated progression of various viral infections, one may wonder why it has not been eliminated by evolutionary selection. This may be explained, in part, by the protective and beneficial effects it exhibits against certain harmful infectious diseases, as demonstrated for hepatitis C.

The exact mechanisms by which apoE influences susceptibility to and the course of viral infection remain obscure. Because the mechanisms of HCV, HIV, and HSV infection are quite similar (i.e., all three viruses compete with apoE for cell attachment and receptor binding), it is interesting to find such different apoE4 effects among them.

The interaction among the immune system, cognition, and brain diseases such as AD is a still largely unexplored field of inquiry. Further elucidation of the mechanisms by which apoE may influence the pathogenesis of infectious viral diseases could lead to new developments in the treatment of disease based on an individual’s apoE genotype.

Aside from the role that ApoE plays in susceptibility and progression of infectious disease, there is growing interest in the role that infection or a compromised immune system plays in the development of dementia. For example, despite the successful management of HIV with antiretroviral drugs, some patients are showing signs of memory impairment and dementia at a relatively young age. Interestingly, these people seem to show accelerated aging, too, which raises important questions about the relationship between the immune system, immunosenescence, and aging.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, May, 2013

14. November 2014 · Comments Off on Alzheimer Disease in 2020 · Categories: Health, Neuroscience

Any terminal illness is a terrible thing; but to a cryonics member, a brain-destroying neurodegenerative disease is the worst contemporary medical “death sentence” one can receive. There are several flavors of neurodegenerative disorders, many of which primarily affect the patient’s movement, strength, coordination, or the peripheral nervous system. And there are numerous contributory mechanisms in the causation of neurodegeneration, including prion infection and toxin related disease. But the most common – and the most feared – neurodegenerative disease is one that affects not movement, but cognition.

Of course, I am speaking of Alzheimer disease (AD). Originally described in a 51-year-old woman by the Bavarian psychiatrist Alois Alzheimer in 1906, AD is now recognized by neuropathologists as the most common basis for late-life cognitive failure. It culminates in neuronal dystrophy and death, leading to the progressive loss of memory and other cognitive functions (i.e., dementia), and it affects individuals of both sexes and of all races and ethnic groups, at a rate of occurrence in the U.S. ranging from approximately 1.3% (ages 65-74) to 45% (ages 85-93). It is easy to see why AD has generated so much intense scientific interest in recent years.

In the recently published work “The Biology of Alzheimer Disease” (2012), most of what is known about AD today is described in detail in chapters covering topics such as the neuropsychological profile and neuropathological alterations in AD, biomarkers of AD, the biochemistry and cell biology of the various proteins involved in AD, animal models of AD, the role of inflammation in AD, the genetics of AD, and treatment strategies. The editors’ selection of contributions has resulted in the most current and comprehensive compendium on Alzheimer disease available.

The book culminates in a chapter called Alzheimer Disease in 2020, where the editors extol “the remarkable advances in unraveling the biological underpinnings of Alzheimer disease…during the last 25 years,” and yet also recognize that “we have made only the smallest of dents in the development of truly disease-modifying treatments.” So what can we reasonably expect over the course of the next 7 years or so? Will we bang our heads against the wall of discovery, or will there be enormous breakthroughs in identification and treatment of AD?

Though a definitive diagnosis of AD is only possible upon postmortem histopathological examination of the brain, a thorough review of the book leads me to believe that the greatest progress currently being made is in developing assays to diagnose AD at earlier stages. It is now known that neuropathological changes associated with AD may begin decades before symptoms manifest. This, coupled with the uncertainty inherent in a clinical diagnosis of AD, has driven a search for diagnostic markers. Two particular approaches have shown the most promise: brain imaging and the identification of fluid biomarkers of AD.

Historically, imaging was used only to exclude potentially surgically treatable causes of cognitive decline. Over the last few decades, imaging has moved from this minor role to a central position of diagnostic value with ever-increasing specificity. The ability to differentiate AD from alternative or contributory pathologies is of significant value now, but the need for an earlier and more certain diagnosis will only increase as disease-modifying therapies are identified. This will be particularly true if these therapies work best (or only) when initiated at the preclinical stage. Improvements in imaging have also greatly increased our understanding of the biology and progression of AD temporally and spatially. Importantly, the clinical correlations of these changes and their relationships to other biomarkers and to prognosis can be studied.

The primary modalities that have contributed to progress in AD imaging are structural magnetic resonance imaging (MRI), functional MRI, fluorodeoxyglucose (FDG) positron emission tomography (PET), and amyloid PET. Structural MRI, which is used to image the structure of the brain, has obvious utility in visualizing the progressive cerebral atrophy characteristic of AD. Such images can be used as a marker of disease progression and as a means of measuring effective treatments (which would slow the rate of atrophy). Functional MRI, on the other hand, measures changes in the blood oxygen level-dependent (BOLD) MR signal. This signal, which can be acquired during cognitive tasks, may provide the clinician with a tool to compare brain activity across conditions in order to assess and detect early brain dysfunction related to AD and to monitor therapeutic response over relatively short time periods.

FDG PET primarily indicates brain metabolism and synaptic activity by measuring glucose analog fluorodeoxyglucose (which can be detected by PET after labeling it with Fluorine-18). A large body of FDG-PET work has identified an endophenotype of AD – that is, a signature set of regions that are typically hypometabolic in AD patients. FDG hypometabolism parallels cognitive function along the trajectory of normal, preclinical, prodromal, and established AD. Over the course of three decades of investigation, FDG PET has emerged as a robust marker of brain dysfunction in AD. Imaging of β-amyloid (Aβ) – the peptide that makes up the plaques found in the brains of AD patients – is accomplished via amyloid PET to determine brain Aβ content. Historically, this has only been possible upon postmortem examination, so the utility of amyloid imaging is in moving this assessment from the pathology laboratory to the clinic. Because amyloid deposition begins early on, however, amyloid PET is not useful as a marker of disease progression.

The well-known hallmarks of AD, the plaques and neurofibrillary tangles first described by Alois Alzheimer in 1906, were discovered in 1985 to be composed primarily of β-amyloid and hyperphosphorylated tau protein, respectively. Advances in our knowledge of Aβ generation and tau protein homeostasis have led to substantial research into disease-modifying drugs aimed at decreasing overall plaque and tangle load in an effort to halt neurodegeneration. Such treatments will likely be most effective if started early in the disease process, making sensitive and accurate fluid biomarkers of Aβ and tau especially important.

Outside of imaging, progress in AD diagnostics stems primarily from the assessment of fluid biomarkers of AD. These biomarkers are generally procured from the cerebrospinal fluid (CSF) and blood plasma and include total tau (T-tau), phosphorylated tau (P-tau), and the 42 amino acid form of β-amyloid (Aβ42). These core biomarkers reflect AD pathology and have high diagnostic accuracy, which is especially useful in diagnosing AD in prodromal and mild cognitive impairment cases.

Because the CSF is in direct contact with the extracellular space of the brain, biochemical changes in the brain can be detected in the CSF. Assays to detect Aβ42 led to the discovery that Aβ42 in AD is decreased to approximately 50% of control levels, making the measurement of Aβ42 a useful clinical tool. Measurements of T-tau (around 300% of control in AD patients) and P-tau biomarkers (a marked increase in AD patients) in combination with Aβ42, however, provide an even more powerful diagnostic assay.
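The power of the combination comes from the fact that the two core markers move in opposite directions. The sketch below makes this concrete by expressing each CSF level as a multiple of a control mean and forming a T-tau/Aβ42 ratio; the control means and example values are hypothetical placeholders chosen only to match the ~300% and ~50% figures above, not clinical cutoffs.

```python
def normalized(value: float, control_mean: float) -> float:
    """Express a measured CSF level as a multiple of the control mean."""
    return value / control_mean

def tau_abeta_ratio(t_tau: float, abeta42: float,
                    t_tau_control: float, abeta42_control: float) -> float:
    """Ratio of normalized T-tau to normalized Abeta42.

    The ratio rises sharply in AD because T-tau goes up (~3x control)
    while Abeta42 falls (~0.5x control), so the two changes multiply
    rather than cancel."""
    return normalized(t_tau, t_tau_control) / normalized(abeta42, abeta42_control)

# Hypothetical control means: T-tau 300, Abeta42 500 (units arbitrary).
# An AD-like profile (T-tau at 300% of control, Abeta42 at 50%) gives
# a ratio of 3.0 / 0.5 = 6.0, versus 1.0 for a control-like profile.
ratio_ad = tau_abeta_ratio(t_tau=900, abeta42=250,
                           t_tau_control=300, abeta42_control=500)
ratio_ctrl = tau_abeta_ratio(t_tau=300, abeta42=500,
                             t_tau_control=300, abeta42_control=500)
```

A sixfold separation between the two profiles, versus threefold or twofold for either marker alone, is the intuition behind why the combined assay is the more powerful diagnostic.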

Fluid biomarkers for AD other than Aβ and tau have been posited, but positive results have been difficult to replicate. Novel biomarkers with the most promise include the amyloid precursor proteins sAPPβ and sAPPα, β-site APP cleaving enzyme-1 (BACE1), Aβ oligomers, and other Aβ isoforms. Additionally, neuronal and synaptic proteins as well as various inflammatory molecules and markers of oxidative stress may prove valuable as CSF biomarkers. Studies of plasma biomarkers such as those investigating plasma Aβ have yielded contradictory results, but promising novel blood biomarkers for AD may be found in certain signaling and inflammatory proteins.

Taken together, progress in brain imaging and identification of fluid biomarkers hold great promise in improved diagnosis of AD cases. When combined with expected drug therapies we may be able to delay the onset of neurodegeneration and associated cognitive impairment significantly. In the meantime, early diagnosis is helpful in stratifying AD cases, monitoring potential treatments for safety, and monitoring the biochemical effect of drugs. For cryonicists, early diagnosis can help guide treatment and end-of-life care decisions in order to optimize cryopreservation of the brain.

So – back to the original question. What can we predict about the AD landscape in 2020?

Besides continued progress in early diagnosis through brain imaging and fluid biomarkers, the authors anticipate that advances in whole-genome and exome sequencing will lead to a better understanding of all of the genes that contribute to overall genetic risk of AD. They also expect an improved ability to detect the proteins that aggregate in AD, to distinguish their different assembly forms, and to correlate the various conformations with cellular, synaptic, and brain network dysfunction. Lastly, we will continue to improve our understanding of the cell biology of neurodegeneration, cell-cell interactions, and inflammation, providing new insights into what is and is not important in AD pathogenesis and how it differs across individuals, which will lead, in turn, to improved clinical trials and treatment strategies.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, April, 2013