13. February 2015 · Though She Isn’t Really Ill, There’s a Little Yellow Pill… · Categories: Health, Neuroscience, Society

Humans have been ingesting mind- and mood-altering substances for millennia, but it has only rather recently become possible to begin to elucidate drug mechanisms of action and to use this information, along with our burgeoning knowledge of neuroscience, to design drugs intended to have a specific effect. And though most people think of pharmaceuticals as “medicine,” it has become increasingly popular to discuss the possibilities for the use of drugs in enhancement, or improvement of “human form or functioning beyond what is necessary to sustain or restore good health” (E.T. Juengst; in Parens, 1998, p. 29).

Some (transhumanists) believe that enhancement may not only be possible, but that it may even be a moral duty. Others (bioconservatives) fear that enhancement may cause us to lose sight of what it means to be human altogether. It is not the intention of this article to advocate enhancement or to denounce it. Instead, let’s review some of the drugs (and/or classes of drugs) that have been identified as the most promising cognitive or mood enhancers. Many of the drugs we will cover can be read about in further depth in Botox for the brain: enhancement of cognition, mood and pro-social behavior and blunting of unwanted memories (de Jongh, R., et al., Neuroscience and Biobehavioral Reviews 32 (2008): 760-776).

Most important when considering potential cognitive enhancers is to keep in mind that, to date, no “magic bullets” appear to exist. That is, there are no drugs exhibiting such specificity as to have only the primary, desired effect. Indeed, a general principle of trade-offs (particularly in the form of side effects) appears to exist when it comes to drug administration for any purpose, whether treatment or enhancement. Such facts may constitute barriers to the practical use of pharmacological enhancers and should be taken into consideration when discussing the ethics of enhancement.

Some currently available cognitive enhancers include donepezil, modafinil, dopamine agonists, guanfacine, and methylphenidate. There are also efforts underway to develop memory-enhancing drugs, and we will discuss a few of the mechanisms by which they are proposed to act. Besides cognitive enhancement, the enhancement of mood and prosocial behavior in normal individuals may also be effected pharmacologically, most commonly with antidepressants or oxytocin. Let’s briefly cover the evidence for the efficacy of each of these in enhancing cognition and/or mood before embarking on a discussion of the general principles of enhancement and the associated ethical concerns.

One of the most widely cited cognitive enhancement drugs is donepezil (Aricept®), an acetylcholinesterase inhibitor. In 2002, Yesavage et al. reported improved retention of training in healthy pilots tested in a flight simulator. In this study, after training in a flight simulator, half of the 18 subjects took 5 mg of donepezil for 30 days and the other half were given a placebo. The subjects returned to the lab to perform two test flights on day 30. The donepezil group was found to perform similarly to the initial test flight, while placebo group performance declined. These results were interpreted as an improvement in the ability to retain a practiced skill. However, it seems possible that the better performance of the donepezil group could instead have been due to improved attention or working memory during the test flights on day 30.

Another experiment by Gron et al. (2005) looked at the effects of donepezil (5 mg/day for 30 days) on the performance of healthy male subjects on a variety of neuropsychological tests probing attention, executive function, visual and verbal short-term and working memory, semantic memory, and verbal and visual episodic memory. They reported a selective enhancement of episodic memory performance, and suggested that the improved performance in Yesavage et al.’s study was not due to enhanced visual attention, but rather to increased episodic memory performance.

Ultimately, there is scarce evidence that donepezil improves retention of training. Better designed experiments need to be conducted before we can come to any firm conclusions regarding its efficacy as a cognitive enhancer.

The wake-promoting agent modafinil (Provigil®) is another currently available drug that is purported to have cognitive enhancing effects. Provigil® is indicated for the treatment of excessive daytime sleepiness and is often prescribed to those with narcolepsy, obstructive sleep apnea, and shift work sleep disorder. Its mechanisms of action are unclear, but it has been proposed that modafinil increases hypothalamic histamine release, thereby promoting wakefulness by indirect activation of the histaminergic system. However, some suggest that modafinil works by inhibiting GABA release in the cerebral cortex.

In normal, healthy subjects, modafinil (100-200 mg) appears to be an effective countermeasure for sleep loss. In several studies, it sustained alertness and performance of sleep-deprived subjects (up to 54.5 hours) and has also been found to improve subjective attention and alertness, spatial planning, stop signal reaction time, digit-span and visual pattern recognition memory. However, at least one study (Randall et al., 2003) reported “increased psychological anxiety and aggressive mood” and failed to find an effect on more complex forms of memory, suggesting that modafinil enhances performance only in very specific, simple tasks.

The dopamine agonists d-amphetamine, bromocriptine, and pergolide have all been shown to improve cognition in healthy volunteers, specifically working memory and executive function. Historically, amphetamines were used by the military during World War II and the Korean War, and more recently as a treatment for ADHD (Adderall®). But usage statistics suggest that amphetamines are commonly used for enhancement by normal, healthy people—particularly college students.

Interestingly, dopaminergic augmentation appears to follow an inverted U-shaped relationship between endogenous dopamine levels and working memory performance. Several studies have provided evidence for this by demonstrating that individuals with a low working-memory capacity show greater improvements after taking a dopamine receptor agonist, while high-span subjects either do not benefit at all or show a decline in performance.

Guanfacine (Intuniv®) is an α2 adrenoceptor agonist, also indicated for the treatment of ADHD symptoms in children, though it acts by increasing norepinephrine levels in the brain. In healthy subjects, guanfacine has been shown to improve visuospatial memory (Jakala et al., 1999a, Jakala et al., 1999b), but the beneficial effects were accompanied by sedative and hypotensive effects (i.e., side effects). Other studies have failed to replicate these cognitive enhancing effects, perhaps due to differences in dosages and/or subject selection.

Methylphenidate (Ritalin®) is a well-known stimulant that works by blocking the reuptake of dopamine and norepinephrine. In healthy subjects, it has been found to enhance spatial working memory performance. Interestingly, as with dopamine agonists, an inverted U-relationship was seen, with subjects with lower baseline working memory capacity showing the greatest improvement after methylphenidate administration.

Future targets for enhancing cognition are generally focused on enhancing plasticity by targeting glutamate receptors (responsible for the induction of long-term potentiation) or by increasing CREB (known to strengthen synapses). Drugs targeting AMPA receptors, NMDA receptors, or the expression of CREB have all shown some promise in cognitive enhancement in animal studies, but few experiments have been carried out to determine their effectiveness in normal, healthy humans.

Beyond cognitive enhancement, there is also the potential for enhancement of mood and pro-social behavior. Antidepressants are the first drugs that come to mind when discussing the pharmacological manipulation of mood, including selective serotonin reuptake inhibitors (SSRIs). Used for the treatment of mood disorders such as depression, SSRIs are not indicated for normal people of stable mood. However, some studies have shown that administration of SSRIs to healthy volunteers resulted in a general decrease of negative affect (such as sadness and anxiety) and an increase in social affiliation in a cooperative task. Such decreases in negative affect also appeared to induce a positive bias in information processing, resulting in decreased perception of fear and anger from facial expression cues.

Another potential use for pharmacological agents in otherwise healthy humans would be to blunt unwanted memories by preventing their consolidation. This may be accomplished by post-training disruption of noradrenergic transmission (as with the β-adrenergic receptor antagonist propranolol). Propranolol has been shown to impair the long-term memory of emotionally arousing stories (but not emotionally neutral stories) by blocking the enhancing effect of arousal on memory (Cahill et al., 1994). In a particularly interesting study making use of patients admitted to the emergency department, post-trauma administration of propranolol reduced physiologic responses during mental imagery of the event 3 months later (Pitman et al., 2002). Further investigations have supported the memory blunting effects of propranolol, possibly by blocking the reconsolidation of traumatic memories.


Reviewing these drugs and their effects leads us to some general principles of cognitive and mood enhancement. The first is that many drugs have an inverted U-shaped dose-response curve, where low doses improve and high doses impair performance. This is potentially problematic for the practical use of cognition enhancers in healthy individuals, especially when doses that are most effective in facilitating one behavior simultaneously exert null or detrimental effects on other behaviors.

Second, a drug’s effect can be “baseline dependent,” where low-performing individuals experience greater benefit from the drug while higher-performing individuals do not see such benefits (which might simply reflect a ceiling effect), or may, in fact, see a deterioration in performance (which points to an inverted U-model). In the case of an inverted U-model, low-performing individuals are found on the up slope of the inverted U and thus benefit from the drug, while high-performing individuals are located near the peak of the inverted U already and, in effect, experience an “overdose” of neurotransmitter that leads to a decline in performance.
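This baseline dependence can be made concrete with a toy model. The sketch below is purely illustrative (the quadratic curve, the numbers, and the fixed "drug boost" are all made-up assumptions, not pharmacological data): performance is modeled as an inverted-U function of neurotransmitter signaling, and the drug shifts each individual's level by the same amount.

```python
# Toy illustration of the inverted-U / baseline-dependence principle.
# All values are in arbitrary, hypothetical units.

def performance(level, peak=5.0):
    """Inverted-U: performance is maximal at `peak` and falls off on both sides."""
    return 100 - (level - peak) ** 2

drug_boost = 2.0  # assumed fixed increase in signaling produced by the drug

subjects = [
    ("low-span", 2.0),   # starts on the up slope of the inverted U
    ("high-span", 5.0),  # already sits at the peak
]

for name, baseline in subjects:
    before = performance(baseline)
    after = performance(baseline + drug_boost)
    print(f"{name}: {before:.0f} -> {after:.0f} ({after - before:+.0f})")
# low-span improves (91 -> 99), while high-span is "overdosed" and declines (100 -> 96)
```

The same mechanism thus produces a benefit for one subject and a deficit for the other, which is exactly why a single recommended "enhancement dose" is problematic.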

Trade-offs exist in the realm of cognitive enhancing drugs as well. As mentioned, unwanted “side effects” are often experienced with drug administration, ranging from mild physiological symptoms such as sweating to more concerning issues like increased agitation, anxiety, and/or depression.

More specific trade-offs may come in the form of improvement of one cognitive ability at the expense of another. Some examples of this include the enhancement of long-term memory but deterioration of working memory with the use of drugs that activate the cAMP/protein kinase A (PKA) signaling pathway. Another trade-off could occur between the stability versus the flexibility of long-term memory, as in the case of certain cannabinoid receptor antagonists which appear to lead to more robust long-term memories, but which also disrupt the ability of new information to modify those memories. Similarly, a trade-off may exist between stability and flexibility of working memory. Obviously, pharmacological manipulations that increase cognitive stability at the cost of a decreased capacity to flexibly alter behavior are potentially problematic in that one generally does not wish to have difficulty in responding appropriately to change.

Lastly, there is a trade-off involving the relationship between cognition and mood. Many mood-enhancing drugs, such as alcohol and even antidepressants, impair cognitive functioning to varying degrees. Cognition-enhancing drugs may also impair emotional functions. Because cognition and emotion are intricately regulated through interconnected brain pathways, inducing change in one area may have effects in the other. Much more research remains to be performed to elucidate these interactions before we can come to any firm conclusions.


Again, though it is not the place of this article to advocate or denounce the use of drugs for human enhancement, obviously there are considerable ethical concerns when discussing the administration of drugs to otherwise healthy human beings. First and foremost, safety is of paramount importance. The risks and side effects, including physical and psychological dependence, as well as the long-term effects of drug use, should be considered and weighed heavily against any potential benefits.

Societal pressure to take cognitive enhancing drugs is another ethical concern, especially in light of the fact that many may not actually produce benefits to the degree desired or expected. In the same vein, the use of enhancers may give some a competitive advantage, thus leading to concerns regarding fairness and equality (as we already see in the case of physical performance-enhancing drugs such as steroids). Additionally, it may be necessary, but very difficult, to make a distinction between enhancement and therapy in order to define the proper goals of medicine, to determine health-care cost reimbursement, and to “discriminate between morally right and morally problematic or suspicious interventions” (Parens, 1998). Of particular importance will be determining how to deal with drugs that are already used off-label for enhancement. Should they be provided by physicians under certain conditions? Or should they be regulated in the private commercial domain?

There is an interesting argument that using enhancers might change one’s authentic identity—that enhancing mood or behavior will lead to a personality that is not really one’s own (i.e., inauthenticity), or even dehumanization—while others argue that such drugs can help users to “become who they really are,” thereby strengthening their identity and authenticity. Lastly, according to the President’s Council on Bioethics, enhancement may “threaten our sense of human dignity and what is naturally human” (The President’s Council, 2003). According to the Council, “the use of memory blunters is morally problematic because it might cause a loss of empathy if we would habitually ‘erase’ our negative experiences, and because it would violate a duty to remember and to bear witness of crimes and atrocities.” On the other hand, many people believe that we are morally bound to transcend humans’ basic biological limits and to control the human condition. But even they must ask: what is the meaning of trust and relationships if we are able to manipulate them?

These are all questions without easy answers. It may be some time yet before the ethical considerations of human cognitive and mood enhancement really come to a head, given the apparently limited benefits of currently available drugs. But we should not avoid dealing with these issues in the meantime; for there will come a day when significant enhancement, whether via drugs or technological means, will be possible and available. And though various factions may disagree about the morality of enhancement, one thing is for sure: we have a moral obligation to be prepared to handle the consequences of enhancement, both positive and negative.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, December, 2013

05. February 2015 · An End to the Virus · Categories: Health, Science

Breakthroughs in medicine have increased substantially over the last hundred years, and most would agree that the introduction of antibiotics in 1942 has been one of the largest milestones in the history of medicine thus far. The success in treating bacterial infection has only accentuated the glaring lack of progress in developing effective therapeutics for those other enemies of the immune system, viruses. But Dr. Todd Rider and his team at MIT have dropped a bombshell with their announcement of a new broad spectrum antiviral therapeutic, DRACO, which appears not only to cure the common cold, but to halt or prevent infections by all known viruses.

Before talking specifically about this exciting news, let us first review viral biology and why viral infections have been so difficult to treat.

As you may recall from your early education, a virus particle, or virion, consists of DNA or RNA surrounded only by a protein coat (i.e., naked virus) or, occasionally, a protein coat and a lipid membrane (i.e., enveloped virus). Viruses have no organelles or metabolism and do not reproduce on their own, so they cannot function without using the cellular machinery of a host (bacteria, plant, or animal).

Viruses can be found all throughout our environment and are easily picked up and transferred to areas where they may enter our bodies, usually through the nose, mouth, or breaks in the skin. Once inside the host, the virus particle finds a host cell to infect so it can reproduce.

There are two ways that viruses reproduce. The first is by attaching to the host cell and entering it, or injecting viral DNA/RNA into the cell. This causes the host cell to make copies of the viral genetic material and to transcribe and translate it into viral proteins. The host cell assembles new viruses and releases them when the cell breaks apart and dies, or it buds the new viruses off, which preserves the host cell. This approach is called the lytic cycle.

The second way that viruses reproduce is to use the host cell’s own materials. A viral enzyme called reverse transcriptase makes a segment of DNA from the viral RNA using host materials. The DNA segment gets incorporated into the host cell’s DNA. There, the viral DNA lies dormant and gets reproduced with the host cell. When some environmental cue happens, the viral DNA takes over, makes viral RNA and proteins, and uses the host cell machinery to assemble new viruses. The new viruses bud off. This approach is called the lysogenic cycle; viruses that use reverse transcriptase are called retroviruses and include HIV. (Herpes viruses, though not retroviruses, establish a similar dormant, or latent, phase.)

Once free from the host cell, the new viruses can attack other cells and produce thousands more virus particles, spreading quickly throughout the body. The immune system responds quickly by producing proteins to interfere with viral replication and pyrogenic chemicals to raise body temperature, and by inducing cell death (apoptosis). In some cases simply continuing the natural immune response is enough to eventually halt viral infection. But the virus kills many host cells in the meantime, leading to symptoms ranging from the characteristic runny nose and sore throat of a cold (rhinovirus) to the muscle aches and coughing associated with the flu (influenza virus).

Any virus can be deadly, especially to hosts with a weakened immune system, such as the elderly, small children, and persons with AIDS (though death is actually often due to a secondary bacterial infection). And any viral infection will cause pain and suffering, making treatment a very worthwhile goal. So far, the most successful approach to stopping viral infections has been prevention through the ubiquitous use of vaccines. The vaccine—either a weakened form of a particular virus or a mimic of one—stimulates the immune system to produce antibodies specific to that virus, thereby preventing infection when the virus is encountered in the environment. In another approach, antiviral medications are administered post-infection and work by targeting some of the specific ways that viruses reproduce.

However, viruses are very difficult to defeat. They vary enormously in genetic composition and physical conformation, making it difficult to develop a treatment that works for more than one specific virus. The immense number of viral types in nature makes even their classification a monumental job, given the enormous structural diversity among viruses. Viruses have been evolving much longer than any cells have even existed, and they have evolved methods to avoid detection and to overcome attempts to block replication. So, while we have made some progress in individual battles, those pesky viruses have definitely been winning the war.

Which is why the announcement of a broad spectrum antiviral therapeutic agent is such huge news. In their paper, Rider et al. describe a drug that is able to identify cells infected by any type of virus and which is then able to specifically kill only the infected cells to terminate the infection. The drug, named DRACO (which stands for Double-stranded RNA (dsRNA) Activated Caspase Oligomerizer), was tested against 15 viruses including rhinoviruses, H1N1 influenza, polio virus, and several types of hemorrhagic fever. And it was effective against every virus it was pitted against.

Dr. Rider looked closely at living cells’ own defense mechanisms in order to design DRACO. First, he observed that all known viruses make long strings of double-stranded RNA (dsRNA) during replication inside of a host cell, and that dsRNA is not found in human or other cells. As part of the natural immune response, human cells have proteins that latch onto dsRNA and start a biochemical cascade that prevents viral replication. But many viruses have evolved to overcome this response quite easily. So Rider combined dsRNA detection with a more potent weapon: apoptosis, or cell suicide.

Basically, the DRACO consists of two ends. One end identifies dsRNA and the other end induces cells to undergo apoptosis. When the DRACO binds to dsRNA it signals the other end of the DRACO to initiate cell suicide, thus killing the infected cell and terminating the infection. Beautifully, the DRACO also carries a protein that allows it to cross cell membranes and enter any human or animal cell. But if no dsRNA is present, it simply does nothing, leaving the cell unharmed.

An interesting question is whether any viruses are actually beneficial and whether wiping all viruses out of an organismal system may have negative consequences (as happens when antibiotic treatment eradicates both invading pathogenic bacteria and non-pathogenic flora, often leading to symptoms such as digestive upset). After his recent presentation at the 6th Strategies for Engineered Negligible Senescence (SENS) conference in September 2013, Dr. Rider fielded this question and stated quite adamantly that there are no known beneficial, symbiotic, or non-harmful viruses. This point is further emphasized in a recently published interview in which he is asked whether DRACO-triggered cell death could lead to a lesion in a tissue or organ. Rider responds that “Virtually all viruses will kill the host cell on the way out. Of the handful that don’t, your own immune system will try to kill those infected cells. So we’re not really killing any more cells with our approach than we already have been. It’s just that we’re killing them at an early enough stage before they infect and ultimately kill more cells. So, if anything, this limits the amount of cell death.”

So far, DRACO has been tested in cellular culture and in mouse models against a variety of very different virus types. Rider hopes to license DRACO to a pharmaceutical company so that it can be assessed in larger animal trials and, ultimately, human trials. Unfortunately, it may take a decade or more to complete this process and make the drug available for human therapeutic purposes, and that’s only if there is enough interest to do so. Amazingly, the DRACO project was started over 11 years ago and has barely survived during that period due to lack of interest and funding. Even now, after the DRACOs have been successfully engineered, produced, and tested, no one has yet reached out to Rider about taking them beyond the basic research stage. Let us hope that those of us who do find this work unbelievably exciting can make enough noise that Rider’s work continues to the benefit of all mankind.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, November, 2013

03. February 2015 · No More Couch Potato · Categories: Health

In my review of The SharpBrains Guide to Brain Fitness a couple of months ago, the importance of certain lifestyle choices—particularly physical exercise— to maintain and enhance brain health was emphasized at length. Intuitively, we all know that physical activity is good for us. The metaphorical “couch potato” is assumed to be a person in poor health, precisely because of his or her lack of movement (and, of course, lazily consumed snacks and mind-numbing television). But even those of us who admonish the couch potato are moving our bodies a lot less these days due to an increase in the number of jobs requiring long periods of sitting. And current research is clear: all that sitting is taking a toll on our health.

So we know we need to get up and get moving. But what kind of exercise is best? So far, cardiovascular, or aerobic, exercise has received most of the attention in the literature. Because it is light-to-moderate in intensity and long in duration, aerobic exercise increases heart rate and circulation for extended periods, which is presumed to trigger biochemical changes in the brain that spur neuroplasticity—the production of new connections between neurons and even of new neurons themselves. It appears that the best regimen of aerobic exercise incorporates, at a minimum, three 30 to 60 minute sessions per week. In short, plenty of research has found that myriad positive physical and cognitive health benefits are correlated with aerobic exercise.

But what about non-aerobic exercise, such as strength training? The truth is that very little is known about the effects of non-aerobic exercise on cognitive health. What few studies exist show a positive effect of strength training on cognitive health, but the findings are definitely less conclusive than the plethora of evidence supporting aerobic exercise.

However, a lack of research should not be interpreted as negative results. I think non-aerobic exercise has received less research attention because, well, it is harder and appears less accessible than aerobic exercise. It is probably easier to get research participants to commit to a straightforward exercise regimen that doesn’t involve a lot of explanation or study to figure out. Let’s face it: pushing pedals on a stationary bike requires less mental effort than figuring out how to perform weight-bearing exercises with good form.

At worst, we may ultimately discover that non-aerobic exercise has no cognitive benefits. But let’s not throw the baby out with the bathwater. Because strength training does, in fact, promote a number of physical effects that are of great overall benefit to health, especially to the aging individual. Indeed, one would be remiss to omit strength training from any exercise regimen designed to promote healthy aging and a long, physically fit life.

The primary, and most obvious, effect of strength training is that of muscle development, or hypertrophy. Muscles function to produce force and motion and skeletal muscles are responsible for maintaining and changing posture, locomotion, and balance. Anyone who wishes to look and feel strong, physically capable, and well-balanced would do well to develop the appropriate muscles to reach these goals. Muscle mass declines with age, so it is smart to build a reserve of muscle in a relatively youthful state and to maintain it with regular workouts for as long as possible. Doing so will stave off the functional decline known as frailty, a recognized geriatric syndrome associated with weakness, slowing, decreased energy, lower activity, and unintended weight loss.

Those who know me know that I am very, very thin. At 5 foot 9 inches, it has always been a struggle to maintain my weight above 90 lbs.—a full 40 lbs. underweight for a woman of my height. This is almost certainly due, in large part, to genetics (my parents are both rail-thin), and no amount of eating has ever worked to put on additional pounds. Over the years, I grew more concerned about what being underweight meant in terms of disease risks as I age. In particular, dual energy x-ray absorptiometry (DEXA) scans for bone mineral density at ages 27 and 33 showed accelerated bone loss beyond what is normal for my age. I was on a trajectory for a diagnosis of osteoporosis by my mid-40s.

Besides ensuring adequate calcium intake, I knew that the best prescription for slowing down bone loss is to perform weight-bearing exercises. Strength training causes the muscles to pull on the bone, resulting in increased bone strength. Strength training also increases muscle strength and flexibility, which reduces the likelihood of falling—the number-one risk factor for hip fracture.

So I dusted off my long-unused gym pass and started strength training 3 to 4 times a week. I was too weak to even lift weights in the beginning, so I started with body weight exercises and gradually progressed to weight machines. Weight machines allow you to build strength and to gain an understanding of how an exercise works a particular muscle or group of muscles. Many machines also have a limited range of motion within which to perform the exercise, providing some guidance on how to perform the movement. As I made improvements in strength, I began reading about strength training exercises online and downloaded some apps to help me in the gym.

For a basic “how-to,” nothing beats a video. There are plenty of exercise demonstration videos on YouTube.com and several other sites, but I prefer the definitive (and straight-to-the-point) visual aids provided by Bodybuilding.com. They offer short instructional videos for just about every strength training exercise in existence. The videos also download quickly and play easily on a mobile device, in case you need a refresher in the gym.

There are a lot of great apps out there, too. My favorites so far include PerfectBody (and associated apps by the same developer), GymPact, and Fitocracy. PerfectBody provides weekly workout routines, complete with illustrated descriptions of exercises and the ability to track your progress by documenting weight lifted and number of repetitions (reps) for each exercise. It is an all-in-one fitness program for learning foundational exercises and building strength and confidence in the gym.

If you have a hard time committing to a workout schedule, GymPact may help. One of the latest in a series of apps that make you put your money where your mouth is, GymPact has you agree to go to the gym a minimum number of times per week in order to earn monetary rewards for doing so. The catch is that you are charged money if you fail to meet your pact (which helps to pay all those committed gym-goers who didn’t renege on their promises). For many, the thought of losing money can provide quite the incentive to get your tail to the gym.

Now that you’ve got exercise examples, progress tracking, and motivation to actually get to the gym, how about some fun? Fitocracy is an app that turns exercise into a game, letting you track your exercise in return for points and “level ups” like a video game. There are challenges to meet and quests to conquer, adding to the competitive game-play element. But there’s also a nice social aspect, with friends and groups enabling people to “prop” one another and to provide support and advice.

Once you start pumping iron, you may quickly realize a need for nutrition adequate to meet your new muscle-building goals. As we all know, protein is the most important nutrient for building muscle. And while I will not attempt to provide advice regarding the appropriate nutrient ratio for the calories you consume each day, I can tell you that it is generally recommended to get at least 1 gram of protein per pound of body weight per day if you want to support muscle growth.
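The 1 gram per pound rule of thumb quoted above makes the arithmetic easy; a minimal sketch (the body weights below are made-up examples, not recommendations for any particular person):

```python
# Back-of-the-envelope protein target using the common "1 g per lb" guideline.

def daily_protein_grams(body_weight_lb, grams_per_lb=1.0):
    """Rough daily protein target: body weight (lb) times grams-per-lb guideline."""
    return body_weight_lb * grams_per_lb

# Illustrative body weights only
for weight in (130, 160, 190):
    print(f"{weight} lb -> at least {daily_protein_grams(weight):.0f} g protein/day")
```

So a 160 lb lifter following this guideline would aim for roughly 160 g of protein per day, spread across meals.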

Adequate protein consumption is necessary even if you are not strength training, and it becomes even more important as you age. Reduced appetite and food intake, impaired nutrient absorption, and age-related medical and social changes often result in malnourishment in older adults. An insufficient intake of protein, in particular, can lead to loss of muscle mass, reduced strength, and other deficits that contribute to frailty.

It seems that whey protein provides the greatest benefits in this arena. Whey, which is derived from milk, is a high-quality protein supplement rich in branched-chain amino acids (BCAAs), which stimulate protein synthesis and inhibit protein breakdown, helping to prevent age-related muscle wasting (i.e., sarcopenia). Beyond muscle support, a growing number of studies indicate other positive, anti-aging effects of whey such as antioxidant enhancement, anti-hypertensive and hypoglycemic effects, and the promotion of bone formation and suppression of bone resorption. Life Extension Foundation recently reported that these effects mimic the benefits of calorie restriction without a reduction of food intake, playing roles in hormone secretion and action, intracellular signaling, and regulation of gene transcription and translation.

There are many whey protein powder supplements on the market in a variety of formulations and flavors. Whey protein isolate is quickly absorbed and incorporated into muscles, making it a good post-workout option, whereas whey protein concentrate is absorbed and incorporated more slowly, making it ideal for consumption just before bedtime. A whey protein powder may consist of isolate only, concentrate only, or both. Choose what best meets your needs and purposes.

Flavor is an important factor to consider, as well. Most major brands offer a variety of flavors such as vanilla, chocolate, strawberry, and some exotic options. Unflavored powders are sometimes available and are a great neutral protein base for mixing into (green) smoothies or other recipes. Some whey protein powders may actually include sugars to “improve” taste, so make sure to read the ingredients. Even many zero carb powders are still quite sweet. Many brands offer sample size packets which can be very helpful in determining whether or not you like a particular flavor or overall taste prior to buying an entire container.

Lastly, consider the sources of whey protein powder ingredients carefully. Not all whey is created equal, and many commercial brands derive their ingredients from dubious sources or from animals treated with hormones and living in less-than-stellar conditions. But there are many great products out there, including Life Extension’s New Zealand Whey Protein Concentrate, which is derived from grass-fed, free-range cows living healthy lives in New Zealand and not treated with recombinant bovine growth hormone (rBST). If you have reservations about whey protein, there are also alternative protein powders derived from plants or egg white.

In summary, while the jury is still out regarding the cognitive benefits of non-aerobic exercise, such exercise is still a very important part of an overall plan to support health and longevity. Adequate nutritional support in the form of whey protein supplementation is generally indicated for its many health benefits, and is absolutely integral to muscle-building efforts. At the very least, strength training should complement brain-boosting aerobic exercise and will help to stave off bone loss and frailty as you age. So erase any preconceived notions you may have had about bodybuilding and start lifting today!

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, October, 2013

27. January 2015 · Comments Off on Brain Fitness · Categories: Health, Neuroscience

Book Review: The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age by Alvaro Fernandez

Of all the organs in the human body, a cryonicist should be most concerned about the health and integrity of his or her brain. Thousands of books have been written about physical health and fitness, but very few address the topic of how to keep the brain fit and healthy. Happily, interest in brain fitness, once relegated to academics and gerontologists, is now taking root across America and the world.

The importance of lifelong learning and mental stimulation as a component of healthy aging has long been recognized and touted as a way to stay mentally alert and to stave off dementia in old age. As with physical exercise, “use it or lose it” appears to apply to our brains too. And now that scientists are learning more about neuroplasticity and how brains change as a result of aging, they have begun to test the effects of various factors on brain health and cognitive ability across the lifespan.

Unfortunately, like much health-related research, the results reported by the media have often been convoluted, confusing, and even contradictory. Products developed by overzealous entrepreneurs make outlandish claims and frequently don’t deliver the purported results. Consumers and professionals alike are left wondering what works and what doesn’t when it comes to maintaining our brains in optimal working condition.

To aid all those navigating the murky waters of brain fitness, enter SharpBrains—a company dedicated to tracking news, research, technology, and trends in brain health and to disseminating information about the applications of brain science innovation. In so doing, they “maintain an annual state-of-the-market consumer report series, publish consumer guides to inform decision-making, produce an annual global and virtual professional conference,” and maintain SharpBrains.com, a leading educational blog and website with over 100,000 monthly readers.

Most recently, SharpBrains has published a book on brain fitness called The SharpBrains Guide to Brain Fitness: How to Optimize Brain Health and Performance at Any Age. A compilation and condensation of information accumulated over the lifespan of the company, The SharpBrains Guide to Brain Fitness emphasizes credible research and goes to great lengths to provide the most up-to-date research results in specific areas of brain fitness, followed by interviews with scientists doing work in those fields. The goal of the guide is to help the reader begin to “cultivate a new mindset and master a new toolkit that allow us to appreciate and take full advantage of our brain’s incredible properties…[by] providing the information and understanding to make sound and personally relevant decisions about how to optimize your own brain health and performance.”

The Guide begins by emphasizing that the brain’s many neuronal networks serve distinct functions including various types of memory, language, emotional regulation, attention, and planning. Plasticity of the brain is defined as its lifelong capacity to change and reorganize itself in response to the stimulation of learning and experience—the foundation upon which “brain training” to improve cognitive performance at any age, and to maintain brain health into old age, is predicated.

The difficulty of making sense of the scientific findings on brain health and neuroplasticity is discussed at length, with the finger of blame pointed squarely at the media for reporting only fragments of the research and for often reporting those results which are not most meaningful. The authors stress that “it is critical to complement popular media sources with independent resources, and above all with one’s own informed judgment.”

The following chapters go on to review what is known today about how physical exercise, nutrition, mental challenge, social engagement, and stress management can positively affect brain health. Along the way they provide dozens of relevant research results (as well as the design of each study) to support their recommendations. Reporting on all of those experiments is beyond the scope of this review, so if you are interested in examining them (and you should be!) please obtain a copy of the Guide for yourself or from your local library.

Physical exercise is discussed first because of the very strong evidence that exercise (especially aerobic, or “cardio,” exercise) slows atrophy of the brain associated with aging, actually increasing the brain’s volume of neurons (i.e., “gray matter”) and connections between neurons (i.e., “white matter”). While much of the initial research supporting the effects of exercise on the brain came from animal studies, the authors report that “several brain imaging studies have now shown that physical exercise is accompanied by increased brain volume in humans.”

Staying physically fit improves cognition across all age groups, with particularly large benefits for so-called “executive” functions such as planning, working memory, and inhibition. A 2010 meta-analysis by the NIH also concluded that physical exercise is a key factor in postponing cognitive decline and/or dementia, while other studies have found physical exercise to lower the risk of developing Parkinson’s disease, as well.

But don’t think that just any moving around will do the trick. When it comes to providing brain benefits, a clear distinction is drawn between physical activity and physical exercise. Only exercise will trigger the biochemical changes in the brain that spur neurogenesis and support neuroplasticity. It doesn’t need to be particularly strenuous, but to be most beneficial it should raise your heart rate and increase your breathing rate.

Of course, adequate nutrition is also imperative in obtaining and maintaining optimal brain health. The SharpBrains Guide to Brain Fitness primarily highlights the well-known benefits of the Mediterranean diet, which consists of a high intake of vegetables, fruit, cereals, and unsaturated fats, a low intake of dairy products, meat, and saturated fats, a moderate intake of fish, and regular but moderate alcohol consumption. But I think it is safe to say that the jury is still out on the best diet for the brain, as evidenced by the recent popularity of the Paleo diet among life extensionists. And, of course, ethnicity and genetics are important, too. The authors do stress the importance of omega-3 fatty acids and antioxidants obtained from dietary sources, while stating firmly that “to date, no supplement has conclusively been shown to improve cognitive functioning, slow down cognitive decline, or postpone Alzheimer’s disease symptoms beyond placebo effect.” This includes herbal supplements such as Ginkgo biloba and St. John’s wort.

Beyond what we normally do to keep our bodies healthy, the Guide also discusses the relative effectiveness of different forms of “mental exercise.” Perhaps you’ve heard that doing crossword or Sudoku puzzles will keep you sharp and alert into old age, or that speaking multiple languages is associated with decreased risk of Alzheimer’s disease. The good news is that these things are true—to a degree. The part that is often left out is that it’s the challenge of these activities that is important. As with physical activity vs. physical exercise, mental exercise refers to the subset of mental activities that are effortful and challenging.

Puzzles and games may be challenging at first, but they (and other mental exercises) can quickly become routine and unchallenging. In order to reap the most benefit from mental exercise, the goal is to be exposed to novelty and increasing levels of challenge. Variety is important for stimulating all aspects of cognitive ability and performance, so excessive specialization is not the best strategy for maintaining long-term brain health. If you are an artist, try your hand at strategy-based games. If you’re an economist, try an artistic activity. Get out of your comfort zone in order to stimulate skills that you rarely use otherwise.

The SharpBrains Guide states that “lifelong participation in cognitively engaging activities results in delayed cognitive decline in healthy individuals and in spending less time living with dementia in people diagnosed with Alzheimer’s disease.” This is hypothesized to be because doing so builds up one’s “cognitive reserve”—literally an extra reservoir of neurons and neuronal connections—which may be utilized so that a person continues to function normally even in the face of underlying Alzheimer’s or other brain pathology. This observation raises another important point on which neuroscientists and physiologists do not yet fully agree. Will we all eventually get dementia if we live long enough without credible brain rejuvenation biotechnologies? This is a topic I would like to return to in a future installment of Cooler Minds Prevail.

Social engagement also appears to provide brain benefits. The NIH meta-analysis mentioned earlier concluded that higher social engagement in mid- to late life is associated with higher cognitive functioning and reduced risk of cognitive decline. Brain imaging studies indicate an effect of social stimulation on the volume of the amygdala, a structure that plays a major role in our emotional responses and which is closely connected to the hippocampus, which is important for memory.

Yet again, not all activity is equal. When it comes to social stimulation, “you can expect to accrue more benefits within groups that have a purpose (such as a book club or a spiritual group) compared to casual social interactions (such as having a drink with a friend to relax after work).” To keep socially engaged across the lifespan, seek out interactions that naturally involve novelty, variety, and challenge such as volunteering and participating in social groups.

“The lifelong demands on any person have changed more rapidly in the last thousand years than our genes and brains have,” The SharpBrains Guide explains in the intro to the chapter on stress management. The result? It has become much more difficult to regulate stress and emotions. It is great that we have such amazing and complex brains, but humans are among the few animals that can get stressed by their own thoughts. And while short bursts of stress can have some (potentially) beneficial effects, high and sustained levels of stress can have a number of negative consequences. Those of note include: increased levels of blood cortisol, which can lead to blood sugar imbalances, high blood pressure, loss of muscle tissue and bone density, lowered immunity, and damage to the brain; reduced levels of certain neurotransmitters, such as serotonin and dopamine, which has been linked to depression; and a hampered ability to make changes to reduce the stress, resulting in General Adaptation Syndrome (aka “burnout”).

Research-based lifestyle solutions to combat stress include exercise, relaxation, socialization, humor and laughter, and positive thinking. In particular, targeted, capacity-building techniques such as biofeedback and meditation are recommended to manage stress and build resilience. Mindfulness Based Stress Reduction (MBSR) programs have provided evidence that meditative techniques can help manage stress and research shows that MBSR can lead to decreases in the density of an area of the amygdala which is correlated with reduction in reported stress.

So it appears that multiple approaches are necessary to develop a highly fit brain capable of adapting to new situations and challenges throughout life. “Consequently,” The SharpBrains Guide to Brain Fitness states, “we expect cross-training the brain to soon become as mainstream as cross-training the body is today, going beyond unstructured mental activity in order to maximize specific brain functions.”

There is growing evidence that brain training can work, but in evaluating what “works” we are mostly looking at two things: how successful the training program is (i.e., does it actually improve the skill(s) being trained?) and the likelihood of transfer from training to daily life. Building on an analysis of documented examples of brain training techniques that “work” or “transfer,” SharpBrains suggests the following five conditions need to be met for brain training to be likely to translate into meaningful real world improvements (condensed excerpt):

  1. Training must engage and exercise a core brain-based capacity or neural circuit identified to be relevant to real-life outcomes.
  2. The training must target a performance bottleneck.
  3. A minimum “dose” of 15 hours total per targeted brain function, performed over 8 weeks or less, is necessary for real improvement.
  4. Training must be adaptive to performance, require effortful attention, and increase in difficulty.
  5. Over the long-term, the key is continued practice for continued benefits.
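Condition 3’s minimum “dose” is easy to translate into a weekly schedule; a quick sketch of the arithmetic (the 15-hour and 8-week figures come straight from the list above):

```python
# Minimum brain-training "dose" per SharpBrains' condition 3:
TOTAL_HOURS = 15  # minimum total hours per targeted brain function
MAX_WEEKS = 8     # to be completed over 8 weeks or less

# Spread evenly, that works out to roughly two hours of training per week.
minutes_per_week = TOTAL_HOURS * 60 / MAX_WEEKS
print(f"~{minutes_per_week:.0f} minutes of training per week")
```

In other words, a realistic floor is on the order of 15-20 minutes of focused training most days of the week, sustained for about two months.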

Meditation, biofeedback, and/or cognitive therapy in concert with cognitive training to optimize targeted brain functions appear to be winning combinations in terms of successful techniques facilitating transfer from training to real life benefits. Top brain training software programs, based on SharpBrains’ analysis and a survey of their users, include Lumosity, Brain games, BrainHQ, Cogmed, and emWave.

In the end, brain fitness needs are unique to each individual and brain fitness claims should be evaluated skeptically. SharpBrains recommends asking several questions when evaluating brain fitness claims, particularly whether there is clear and credible evidence of the program’s success documented in peer-reviewed scientific papers published in mainstream scientific journals that analyze the effects of the specific product.

Of course, your own individual experience with the product is ultimately the most important evaluation of all. If you are ready to take the plunge into the emerging brain fitness market, The SharpBrains Guide to Brain Fitness is a good place to start, and I’m sure they’d appreciate your feedback as this field continues to develop.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, August, 2013

26. January 2015 · Comments Off on HIV, Immunosenescence, and Accelerated Aging · Categories: Health, Neuroscience, Science

After a few articles considering Alzheimer disease from several angles, I would like to switch gears this month and talk more generally about the interaction between the immune system and aging.

In his 2012 paper[1], Caleb E. Finch documents the evolution of life expectancy in the course of human history. The life expectancy at birth of our shared ape ancestor 6 million years ago is hypothesized to approximate that of a chimpanzee, about 15 years. The first Homo species appeared 1-2 million years ago with a life expectancy of ~20 years, while H. sapiens came onto the scene ~100,000 years ago and could expect about 30 years of life. But starting around 200 years ago, concurrent with industrialization, human life expectancy jumped rapidly, to somewhere between 70 and 80 years today.

As many readers are likely aware, the huge recent increases in life expectancy are commonly attributed to improvements in hygiene, nutrition, and medicine during the nineteenth and twentieth centuries that reduced mortality from infections at all ages. Finch hypothesizes, generally, that early age mortality over the course of human history is primarily due to (acute) infection, while old age mortality is primarily due to (chronic) inflammation. Further analysis of mortality rates over the last several hundred years leads him to further hypothesize that aging has been slowed in proportion to the reduced exposure to infections in early life. These hypotheses are supported by twentieth century examples which strongly demonstrate influences of the early life environment on adult health, such as the effects of prenatal and postnatal developmental influences (e.g., nutrition, exposure to infection) on adult chronic metabolic and vascular disorders as well as physical traits and mental characteristics. This leads Finch to suggest “broadening the concept of ‘developmental origins’ to include three groups of factors: nutritional deficits, chronic stress from socioeconomic factors, and direct and indirect damage from infections.”

Finch also considers the effects of inflammation and diet on human evolution, proposing several environmental and foraging factors that may have been important in the genetic basis for evolving lower basal mortality through interactions with chronic inflammation, in particular: dietary fat and caloric content; infections from pathogens ingested from carrion and from exposure to excreta; and noninfectious inflammagens such as those in aerosols and in cooked foods. He hypothesizes that exposure to these proinflammatory factors, which one would expect to shorten life expectancy, actually resulted in humans evolving lower mortality and longer lifespans in response to highly inflammatory environments.

A means for this, he argues, was the development of the apoE4 genotype. Noting that the apoE4 allele favors advantageous fat accumulation and is also associated with enhanced inflammatory responses, Finch argues that heightened inflammatory response and more efficient fat storage would have been adaptive in a pro-inflammatory environment and during times of uncertain nutrition. As has been discussed in prior articles in Cooler Minds Prevail, the apoE alleles also influence diverse chronic non-infectious degenerative diseases and lifespan. “Thus,” Finch concludes, “the apoE allele system has multiple influences relevant to evolution of brain development, metabolic storage, host defense, and longevity.”

With the general relationship between inflammation and the evolution of human aging and life expectancy in mind, let us now consider immune system involvement in more detail, and the relationship between HIV and immunosenescence more specifically.

Immunosenescence refers to the age-associated deterioration of the immune system. As an organism ages it gradually becomes deficient in its ability to respond to infections and experiences a decline in long-term immune memory. This is due to a number of specific biological changes such as diminished self-renewal capacity of hematopoietic stem cells, a decline in total number of phagocytes, impairment of Natural Killer (NK) and dendritic cells, and a reduction in B-cell population. There is also a decline in the production of new naïve lymphocytes and the functional competence of memory cell populations. As a result, advanced age is associated with increased frequency and severity of pathological health problems as well as an increase in morbidity due to impaired ability to respond to infections, diseases, and disorders.

It is not hard to imagine that an increased viral load leading to a chronic inflammatory response may accelerate aging and immunosenescence. Evidence for this has accumulated rapidly since the advent of antiretroviral therapies for the treatment of HIV infection. An unforeseen consequence of these successful therapies is that HIV patients are living longer, but a striking number of them appear to be getting older faster, particularly showing early signs of dementia usually seen in the elderly. In one study, slightly more than 10% of older patients (avg = 56.7 years) with well-controlled HIV infection had cerebrospinal fluid (CSF) marker profiles consistent with Alzheimer disease[2] – more than 10 times the prevalence in the general population at the same age. HIV patients also register higher rates of insulin resistance and cholesterol imbalances, suffer elevated rates of melanoma and kidney cancers, and experience seven times the rate of other non-HIV-related cancers. And ultimately, long-term treated HIV-infected individuals die at an earlier age than HIV-uninfected individuals[3].

Recent research is beginning to explore and unravel the interplay between HIV infection and other environmental factors (such as co-infection with other viruses) in the acceleration of the aging process of the immune system, leading to immunosenescence. In the setting of HIV infection, the immune response is associated with abnormally high levels of activation, leading to a cascade of continued viral spread and cell death, and accelerating the physiologic steps associated with immunosenescence. Despite clear improvements associated with effective antiretroviral therapy, some subjects show persistent alterations in T cell homeostasis, especially constraints on T cell recovery, which are further exacerbated in the setting of co-infection and increasing age.

Unsurprisingly, it has been observed that markers of immunosenescence might predict morbidity and mortality in HIV-infected adults as well as in the general population. In both HIV infection and aging, immunosenescence is marked by an increased proportion of CD28−, CD57+ memory CD8+ T cells with reduced capacity to produce interleukin 2 (IL-2), increased production of interleukin 6 (IL-6), resistance to apoptosis, and shortened telomeres. Levels of markers of inflammation are elevated in HIV-infected patients, and elevations in markers such as high-sensitivity C-reactive protein, D-dimer, and IL-6 have been associated with increased risk for cardiovascular disease, opportunistic conditions, and all-cause mortality[4].

But even as we are beginning to identify markers that appear to be associated with risk of poor outcome in HIV infection, it is still unclear how patients should be treated on the basis of this information. To that end, several trials are underway to evaluate the effects of modulation of immune activation and inflammation in HIV infection. At the same time, clinicians at the forefront of advancing knowledge and clinical care are performing research aimed at optimizing care for aging HIV patients.

The implications for such research may be far-reaching. In fact, many HIV clinicians and researchers think that HIV may be key to understanding aging in general. Dr. Eric Verdin states, “I think in treated, HIV-infected patients the primary driver of disease is immunological. The study of individuals who are HIV-positive is likely to teach us things that are really new and important, not only about HIV infection, but also about normal aging.”

Dr. Steven Deeks stresses the collaborative efforts of experts across fields. “I think there is a high potential for tremendous progress in understanding HIV if we can assemble a team of experts from the world of HIV immunology and the world of gerontology,” he says. “Each field can dramatically inform the other. I believe HIV is a well described, well studied, distinct disease that can be used as a model by the larger community to look at issues of aging.”


[1] Finch, C (2012). Evolution of the Human Lifespan, Past, Present, and Future: Phases in the Evolution of Human Life Expectancy in Relation to the Inflammatory Load. Proceedings of the American Philosophical Society, 156:1, 9-44.

[2] Mascolini, M (2013). Over 10% in Older HIV Group Fit Alzheimer’s Biomarker Risk Profile. Conference Reports for NATAP: 20th Conference on Retroviruses and Opportunistic Infections, March 3-6, 2013.

[3] Aberg, X (2012). Aging, Inflammation, and HIV Infection. Topics in Antiviral Medicine, 20:3, 101-105.

[4] Deeks, S., Verdin, E., and McCune, J.M. (2012). Immunosenescence and HIV. Current Opinion in Immunology, 24: 1-6.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, June, 2013

29. November 2014 · Comments Off on Deficiencies in the SENS Approach to Rejuvenation · Categories: Health, Science

This article was originally published in Cryonics Magazine, 2011 Issue #1

I am an ardent supporter of Dr. Aubrey de Grey and his work to advance rejuvenation science. The man is priceless and unique in his concepts, brilliance, dedication, organizational abilities, and networking skill. His impact on anti-aging science has been powerful. I have attended all four of the conferences he has organized at Cambridge University in England. For the February 2006 issue of LIFE EXTENSION magazine I interviewed Dr. de Grey, and for the December 2007 issue of LIFE EXTENSION I wrote a review of ENDING AGING, the book he co-authored with Michael Rae.

Dr. de Grey asserts that aging is the result of seven kinds of damage – and that technologies that repair all seven types of damage will result in rejuvenation. His seven-fold program for damage repair is called SENS: “Strategies for Engineered Negligible Senescence”. Dr. de Grey asserts that repairing aging damage is a more effective approach than attempting to slow or prevent aging, and I agree with him. Being an ardent supporter of SENS has not stopped me from simultaneously being a critic of aspects of his program that I think are flawed or deficient. I will attempt to outline some of my criticisms in simple language, assuming that my readers have some knowledge of basic science.

Two SENS strategies cannot justly be described as damage-repair, in my opinion. To protect mitochondrial DNA from free radical damage he wants to make copies of mitochondrial DNA in the nucleus – and import the resulting proteins back into the mitochondria. I would call this an attempt to slow or prevent aging – it cannot be called repair.

Similarly, SENS aims to eliminate cancer by deletion of genes that contribute to cancer, specifically telomerase and ALT (Alternate Lengthening of Telomeres) genes. I am not convinced that this is the best way to eliminate cancer, and I do not believe that deleting cancer-producing genes can properly be called damage-repair.

These criticisms of a procrustean attempt to force two strategies into a model purportedly concerned only with damage and repair are minor, however, compared to a more fundamental concern: a significant form of aging damage may be overlooked by SENS. I have written a review expressing this concern, entitled “Nuclear DNA Damage as a Direct Cause of Aging,” which was published in the June 2009 issue of the peer-reviewed journal Rejuvenation Research, [note 1] a journal of which Dr. de Grey is Editor-in-Chief. A PDF of my review is available in the life extension section of my website BENBEST.COM. Those interested in all the citations for the claims I make in this essay are encouraged to read my review. In this essay, I limit my citations to a few critical articles.

There are many types of DNA damage, but for the purposes of this essay I will focus on breakage of both DNA strands – resulting in a gap in a chromosome. There are two mechanisms for repairing double-strand DNA breaks: Homologous Recombination (HR) and Non-Homologous End-Joining (NHEJ). HR usually results in perfect repair, but HR can only operate when cells are dividing. NHEJ is the more frequent form of double-strand break repair, but it is error-prone. NHEJ is the only DNA repair mechanism available for non-dividing cells. Even in cells that divide, 75% of double-strand breaks are repaired by NHEJ. [note 2]

It is hard to believe that it could be a coincidence that the most notorious “accelerated aging” diseases are due to defective DNA repair. The two most prominent of these diseases are Werner’s syndrome (“adult progeria”) and Hutchinson-Gilford syndrome (“childhood progeria”), both of which are caused by defective nuclear DNA repair, mainly HR. In both diseases the “aging phenotype” is apparently due to high levels of apoptosis and cellular senescence. Apoptosis (“cell suicide”) and cellular senescence (cessation of cell division) are both mechanisms that are induced in cells experiencing nuclear DNA damage that the cell is unable to repair. It is not surprising that victims suffering massive depletion of properly functioning cells should exhibit “accelerated aging”. Mice that are genetically altered to show increased apoptosis and cellular senescence also show an “accelerated aging phenotype”.

Elimination of senescent cells and stem-cell replenishment of cells depleted in tissues by this elimination – as well as depleted by apoptosis – are part of SENS. But these strategies are only applicable to cells that divide – not to non-dividing cells such as neurons. Cryonicists are acutely aware that organs – and even whole bodies – can be replaced, but brains (neurons, axons, dendrites, and synapses, particularly) must be preserved if we are not to lose memory and personal identity. The ability of future medicine to replace all organs and tissues other than the brain would render most of SENS unnecessary – except for the brain.

There is considerable evidence of a significant role for DNA damage in brain aging. There are nearly twice as many double-strand nuclear DNA breaks in the cerebral cortex of adult (180 days) rats as in young rats (4 days) – and old (over 780 days) rats have more than twice as many double-strand breaks as adult rats. [note 3] Adult rats show a 28% decrease in NHEJ activity in cerebral cortex neurons compared to neonatal rats – and old rats show a 40% decrease. [note 4] Declining NHEJ activity with age is at least partially due to ATP decline and cellular damage that SENS is intended to fix. But even if NHEJ activity did not decline with age, nuclear DNA damage in neurons would still increase, at least in part because NHEJ is so error-prone.

Nuclear DNA damage typically leads to mutation or DNA repair – or apoptosis or cellular senescence when DNA repair fails (a mechanism that is believed to have evolved for protection against cancer). But not all DNA damage is repaired, and NHEJ repair is often defective. Accumulating DNA damage and mutation can lead to increasingly dysfunctional cells.
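The logic of the preceding paragraphs – breaks keep occurring, NHEJ is error-prone, and unrepaired damage accumulates in non-dividing cells – can be made concrete with a toy simulation. All rates here (breaks per year, repair probability, error fraction, rate of repair decline) are illustrative assumptions for the sake of the sketch, not measured values:

```python
import random

def simulate_neuron(years, breaks_per_year=50, repair_decline=0.01,
                    error_rate=0.25, seed=0):
    """Toy model of double-strand break handling in a non-dividing cell.

    Each break is either repaired correctly, misrepaired by error-prone
    NHEJ (counted as a mutation), or left unrepaired. Repair efficiency
    declines slowly with age. All parameters are illustrative.
    """
    rng = random.Random(seed)
    mutations = unrepaired = 0
    for year in range(years):
        repair_prob = max(0.0, 0.99 - repair_decline * year)
        for _ in range(breaks_per_year):
            if rng.random() < repair_prob:      # NHEJ attempts a repair
                if rng.random() < error_rate:   # error-prone rejoining
                    mutations += 1
            else:                               # break never repaired
                unrepaired += 1
    return mutations, unrepaired

print("after 10 years:", simulate_neuron(10))
print("after 80 years:", simulate_neuron(80))
```

Even with repair succeeding most of the time, both mutations and unrepaired breaks grow monotonically with age in this model – which is the essay's point: repair slows the accumulation of damage in post-mitotic cells, but does not stop it.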

Cancer is due to nuclear DNA damage, mutations, and epimutations. Dr. de Grey has written that “only cancer matters” for mutation and epimutation to nuclear DNA. His mutation terminology does not even acknowledge DNA damage. He has assumed that damaged DNA either is or becomes a mutation. He has assumed that DNA damage that does not become a mutation is either repaired – or leads to apoptosis or cellular senescence.

Dr. de Grey has made the claim that evolution has required such strong defenses against cancer that residual mutation (and, implicitly, DNA damage) is negligible. But cancer incidence increases exponentially with age up to age 80, so it is likely that the residual mutation burden increases exponentially as well.

As recently as the 1980s it was widely believed that normal aging is associated with extensive neuron loss. Now it is established that functional decline in the aging brain is associated with increased neural dysfunction rather than neurodegeneration. [note 5] This neural dysfunction may or may not be mostly due to cellular damage that SENS is intended to fix – including causes of declining NHEJ activity. How much neuron dysfunction associated with aging is due to accumulating mutations or unrepairable nuclear DNA damage is unknown. SENS assumes without proof that nuclear DNA damage and mutation are negligible as causes of aging (apart from cancer, apoptosis, and cellular senescence). This may be right or it may be wrong. I believe that without definitive proof, nothing should be assumed, and active investigation to determine the facts should not be neglected.

I believe the situation is not hopeless if nuclear DNA damage proves to be a significant cause of brain aging. Future molecular technologies for detection and repair of nuclear DNA damage could be significantly better than natural DNA repair enzymes. And, to simplify the required effort, the DNA repair technologies could be restricted to genes that are actively transcribed in neurons, rather than needing to repair the whole genome.


1: Best BP. Nuclear DNA damage as a direct cause of aging. Rejuvenation Res. 2009 Jun;12(3):199-208.

2: Mao Z, Bozzella M, Seluanov A, Gorbunova V. Comparison of nonhomologous end joining and homologous recombination in human cells. DNA Repair (Amst). 2008 Oct 1;7(10):1765-71.

3: Mandavilli BS, Rao KS. Neurons in the cerebral cortex are most susceptible to DNA-damage in aging rat brain. Biochem Mol Biol Int. 1996 Oct;40(3):507-14.

4: Vyjayanti VN, Rao KS. DNA double strand break repair in brain: reduced NHEJ activity in aging rat neurons. Neurosci Lett. 2006 Jan 23;393(1):18-22.

5: Morrison JH, Hof PR. Life and death of neurons in the aging brain. Science. 1997 Oct 17;278(5337):412-9.

14. November 2014 · Comments Off on Alzheimer Disease in 2020 · Categories: Health, Neuroscience

Any terminal illness is a terrible thing; but to a cryonics member, a brain-destroying neurodegenerative disease is the worst contemporary medical “death sentence” one can receive. There are several flavors of neurodegenerative disorders, many of which primarily affect the patient’s movement, strength, coordination, or the peripheral nervous system. And there are numerous contributory mechanisms in the causation of neurodegeneration, including prion infection and toxin-related disease. But the most common – and the most feared – neurodegenerative disease is one that affects not movement, but cognition.

Of course, I am speaking of Alzheimer disease (AD). First described by the Bavarian psychiatrist Alois Alzheimer in 1906 in a 51-year-old woman, AD has increasingly been recognized by neuropathologists as the most common basis for late-life cognitive failure. Culminating in neuronal dystrophy and death leading to the progressive loss of memory and other cognitive functions (i.e., dementia), and affecting individuals of both sexes and of all races and ethnic groups at a rate of occurrence in the U.S. ranging from approximately 1.3% (age 65-74) to 45% (age 85-93), it is easy to see why AD has generated so much intense scientific interest in recent years.

In the recently published work “The Biology of Alzheimer Disease” (2012), most of what is known about AD today is described in detail in the various chapters covering topics such as the neuropsychological profile and neuropathological alterations in AD, biomarkers of AD, the biochemistry and cell biology of the various proteins involved in AD, animal models of AD, the role of inflammation in AD, the genetics of AD, and treatment strategies. The editors’ selection of contributions has resulted in the most up-to-date compendium on Alzheimer disease to date.

The book culminates in a chapter called Alzheimer Disease in 2020, where the editors extol “the remarkable advances in unraveling the biological underpinnings of Alzheimer disease…during the last 25 years,” and yet also recognize that “we have made only the smallest of dents in the development of truly disease-modifying treatments.” So what can we reasonably expect over the course of the next 7 years or so? Will we bang our heads against the wall of discovery, or will there be enormous breakthroughs in identification and treatment of AD?

Though a definitive diagnosis of AD is only possible upon postmortem histopathological examination of the brain, a thorough review of the book leads me to believe that the greatest progress currently being made is in developing assays to diagnose AD at earlier stages. It is now known that neuropathological changes associated with AD may begin decades before symptoms manifest. This, coupled with the uncertainty inherent in a clinical diagnosis of AD, has driven a search for diagnostic markers. Two particular approaches have shown the most promise: brain imaging and the identification of fluid biomarkers of AD.

Historically, imaging was used only to exclude potentially surgically treatable causes of cognitive decline. Over the last few decades, imaging has moved from this minor role to a central position of diagnostic value with ever-increasing specificity. The ability to differentiate AD from alternative or contributory pathologies is of significant value now, but the need for an earlier and more certain diagnosis will only increase as disease-modifying therapies are identified. This will be particularly true if these therapies work best (or only) when initiated at the preclinical stage. Improvements in imaging have also greatly increased our understanding of the biology and progression of AD temporally and spatially. Importantly, the clinical correlations of these changes and their relationships to other biomarkers and to prognosis can be studied.

The primary modalities that have contributed to progress in AD imaging are structural magnetic resonance imaging (MRI), functional MRI, fluorodeoxyglucose (FDG) positron emission tomography (PET), and amyloid PET. Structural MRI, which is used to image the structure of the brain, has obvious utility in visualizing the progressive cerebral atrophy characteristic of AD. Such images can be used as a marker of disease progression and as a means of measuring effective treatments (which would slow the rate of atrophy). Functional MRI, on the other hand, measures changes in the blood oxygen level-dependent (BOLD) MR signal. This signal, which can be acquired during cognitive tasks, may provide the clinician with a tool to compare brain activity across conditions in order to assess and detect early brain dysfunction related to AD and to monitor therapeutic response over relatively short time periods.

FDG PET primarily indicates brain metabolism and synaptic activity by measuring uptake of the glucose analog fluorodeoxyglucose (which can be detected by PET after labeling it with fluorine-18). A large body of FDG-PET work has identified an endophenotype of AD – that is, a signature set of regions that are typically hypometabolic in AD patients. FDG hypometabolism parallels cognitive function along the trajectory of normal, preclinical, prodromal, and established AD. Over the course of three decades of investigation, FDG PET has emerged as a robust marker of brain dysfunction in AD. Imaging of β-amyloid (Aβ) – the peptide that makes up the plaques found in the brains of AD patients – is accomplished via amyloid PET to determine brain Aβ content. Historically, this assessment has only been possible upon postmortem examination, so the utility of amyloid imaging is in moving it from the pathology laboratory to the clinic. Because amyloid deposition begins early on, however, amyloid PET is not useful as a marker of disease progression.

The well-known hallmarks of AD, the plaques and neurofibrillary tangles first described by Alois Alzheimer in 1906, were discovered in 1985 to be composed primarily of β-amyloid and hyperphosphorylated tau protein, respectively. Advances in our knowledge of Aβ generation and tau protein homeostasis have led to substantial research into disease-modifying drugs aimed at decreasing overall plaque and tangle load in an effort to halt neurodegeneration. Such treatments will likely be most effective if started early in the disease process, making sensitive and accurate fluid biomarkers of Aβ and tau especially important.

Outside of imaging, progress in AD diagnostics stems primarily from the assessment of fluid biomarkers of AD. These biomarkers are generally procured from the cerebrospinal fluid (CSF) and blood plasma and include total tau (T-tau), phosphorylated tau (P-tau), and the 42 amino acid form of β-amyloid (Aβ42). These core biomarkers reflect AD pathology and have high diagnostic accuracy, which is especially useful in diagnosing AD in prodromal and mild cognitive impairment cases.

Because the CSF is in direct contact with the extracellular space of the brain, biochemical changes in the brain can be detected in the CSF. Assays to detect Aβ42 led to the discovery that Aβ42 in AD is decreased to approximately 50% of control levels, making the measurement of Aβ42 a useful clinical tool. Measurements of T-tau (around 300% of control in AD patients) and P-tau biomarkers (a marked increase in AD patients) in combination with Aβ42, however, provide an even more powerful diagnostic assay.
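The combined use of the core biomarkers can be sketched as a simple decision rule: flag a CSF profile as AD-like when Aβ42 is well below control levels and T-tau is well above them. The function and cutoffs below (0.7 and 1.5) are hypothetical illustration only – not clinical thresholds – and the measurement values are invented for the example:

```python
def ad_biomarker_profile(abeta42, t_tau, control_abeta42, control_t_tau):
    """Flag an AD-like CSF profile from the two core biomarkers.

    The text notes Abeta42 falls to roughly 50% of control levels in AD
    while T-tau rises to roughly 300%. This sketch compares each
    measurement to its control mean; the 0.7 and 1.5 cutoffs are
    hypothetical, chosen only to illustrate the combined rule.
    """
    abeta_ratio = abeta42 / control_abeta42   # < 1 suggests amyloid pathology
    tau_ratio = t_tau / control_t_tau         # > 1 suggests neurodegeneration
    return abeta_ratio < 0.7 and tau_ratio > 1.5

# Hypothetical measurements (arbitrary units)
print(ad_biomarker_profile(abeta42=250, t_tau=900,
                           control_abeta42=500, control_t_tau=300))  # True
print(ad_biomarker_profile(abeta42=480, t_tau=310,
                           control_abeta42=500, control_t_tau=300))  # False
```

Requiring both conditions at once is what makes the combined assay more powerful than either marker alone: a low Aβ42 or a high T-tau by itself is weaker evidence than the two together.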

Fluid biomarkers for AD other than Aβ and tau have been posited, but positive results have been difficult to replicate. Novel biomarkers with the most promise include the amyloid precursor proteins sAPPβ and sAPPα, β-site APP cleaving enzyme-1 (BACE1), Aβ oligomers, and other Aβ isoforms. Additionally, neuronal and synaptic proteins as well as various inflammatory molecules and markers of oxidative stress may prove valuable as CSF biomarkers. Studies of plasma biomarkers such as those investigating plasma Aβ have yielded contradictory results, but promising novel blood biomarkers for AD may be found in certain signaling and inflammatory proteins.

Taken together, progress in brain imaging and identification of fluid biomarkers hold great promise in improved diagnosis of AD cases. When combined with expected drug therapies we may be able to delay the onset of neurodegeneration and associated cognitive impairment significantly. In the meantime, early diagnosis is helpful in stratifying AD cases, monitoring potential treatments for safety, and monitoring the biochemical effect of drugs. For cryonicists, early diagnosis can help guide treatment and end-of-life care decisions in order to optimize cryopreservation of the brain.

So – back to the original question. What can we predict about the AD landscape in 2020?

Besides continued progress in early diagnosis through brain imaging and fluid biomarkers, the authors anticipate that advances in whole-genome and exome sequencing will lead to a better understanding of all of the genes that contribute to overall genetic risk of AD. Additionally, an improved ability to sense and detect the proteins that aggregate in AD – to distinguish their different assembly forms and to correlate the various conformations with cellular, synaptic, and brain network dysfunction – should be forthcoming in the next few years. Lastly, we will continue to improve our understanding of the cell biology of neurodegeneration, of cell-cell interactions, and of inflammation, providing new insights into what is and is not important in AD pathogenesis and how it differs across individuals. This, in turn, will lead to improved clinical trials and treatment strategies.

Originally published as an article (in the Cooler Minds Prevail series) in Cryonics magazine, April, 2013

08. September 2014 · Comments Off on Social Benefits of Rejuvenation Biotechnologies · Categories: Health, Society

When advocates of radical life extension discuss the social benefits of humans having much longer lifespans, it is often just a footnote to a personal desire to prolong life. As a consequence, cynicism from critics is often encountered. It is hard to counter such skepticism effectively because people may believe you are just trying to make an essentially selfish desire look socially desirable.

There is an alternative. We can approach the topic from the other direction if we ask what kind of lifespans would be desirable if we want to increase social welfare and reduce human suffering. Let’s look at a number of issues.

There is a large literature about coping with the death of loved ones, relatives, and friends. While many people find support from such self-help books, most would agree that no amount of anticipation or coping can eliminate the suffering and devastation that follows the death of a loved one. Is there an upside? I am not aware of any serious writer pontificating about the positive aspects of a person dear to you dying or suffering from aging-related disabilities. A society in which humans have control over the aging process would be desirable because it would eliminate the dominant cause of death (age-associated diseases) and the suffering it brings to survivors.

It is not uncommon to hear people being accused of not caring about the effects of their actions on future generations. This complaint is particularly prominent in discussions about the environment and the use of natural resources. If humans were not born to die on a predictable schedule this whole dynamic would change because the distinction between current and future generations would cease to exist. If consideration of the long-term consequences of our actions requires a prominent place in human life, we should not want humans to replace each other but generations to coexist in time and space.

Age discrimination is the differential treatment of individuals on the basis of their age. In most instances, however, this discrimination concerns biological age and its effects on appearance, physical health, and mental skills. Biological age is not hard to observe and can usually be inferred from chronological age. If we prefer that people not be treated differently because of their date of birth, we should want to live in a society where rejuvenation biotechnologies sever the link between chronological age and biological age.

What about economic welfare? Ageless people would be able to remain productive and generous, medical costs associated with the debilitating health and mental effects of biological aging would be substantially reduced, and highly talented people would not cease to exist.

Reasoning backwards from what morality and welfare would “dictate” about human lifespans is not just a talking point in discussions about the bioethics of life extension. One can imagine the rise of a social movement that seeks to educate the general public about the social benefits of biological control over the aging process. Such a social movement would not be in the business of making excuses for eccentric individual desires but would argue that the reduction of suffering, sustainable growth, and more virtuous conduct require that humans not have a fixed expiration date.

Originally published as a column (Quod incepimus conficiemus) in Cryonics magazine, December, 2013

17. August 2014 · Comments Off on Is Aging a Choice? · Categories: Health, Science

The idea that aging is a choice will strike many readers as preposterous and I will admit at the outset that such a position can ultimately not be maintained. But in a milder sense, it should be recognized that we can make decisions in life regarding diet and lifestyle that can mitigate or accelerate the aging process. This “wiggle room” may turn out to be of great importance for reaching a time when serious rejuvenation biotechnologies will become available.

According to biologist Michael R. Rose (see the interview in Cryonics magazine, September 2013), aging is not an immutable process of wear and tear that unfolds through iron logic, insensitive to lifestyle and diet. Aging begins after the start of reproduction, and the forces of natural selection decline with chronological age, eventually stopping at late ages – which raises the possibility that aging itself stops.

Some things that we associate with aging are not inevitable physiological processes but choices or decisions to conform to expectations. For example, when people reach adulthood and pursue a family and career, they often conform to a lifestyle that involves more time sitting at a desk or in cars, more time spent inside, less time socializing with friends, and increasing amounts of stress and sleep deprivation.

As the physiological consequences of such a lifestyle (obesity, higher blood pressure, declining free hormone levels) express themselves many people tell themselves such things are the inevitable effects of getting older. But alternative scenarios may be possible if we remain aware of our environment, lifestyle, and diet.

In the case of diet, the dominant opinion remains that a healthy diet can be identified regardless of age, sex, and population group. There is increasing evidence, however, that such a perspective leaves a lot to be desired and that too much reductionism in these matters is not a good thing. Still, a number of observations can be made. Restriction of calories (or intermittent fasting or meal skipping) seems to trigger a beneficial stress response that improves health and perhaps even extends life. Similarly, adopting a diet that more closely mimics that of hunter-gatherers, in conjunction with giving up a sedentary lifestyle, has been successful in improving the lives (and looks!) of many people, in particular in the case of obesity.

What makes such lifestyle changes difficult to adopt is that we are almost continuously exposed to an environment that works against them. Most of our food is highly processed, loaded with carbs and sugar, and served in portion sizes that always seem to increase. When we move from one location to another the emphasis is on minimizing energy expenditure and eliminating resistance. We work in dark and confined spaces during the day and are exposed to light until we go to sleep (or sometimes even during sleep!). When we come home we turn on the television or the computer to “socialize.” It should not surprise us that such an “unnatural” lifestyle translates into the classic signs of aging and functional deterioration.

There is a lot at stake here. As daunting as it may seem, the idea that aging is not a uniform “process” that swallows us up at a constant rate opens up the possibilities of positive change. Armed with the latest findings in evolutionary biology and medicine we can start pushing back, stabilize the situation as best as we can, and reach a time when more radical rejuvenation biotechnologies will become available. Start moving, start lifting, go camping, make new friends, eat organic and fermented foods, skip the occasional meal, and cut the sugar!

Originally published as a column (Quod incepimus conficiemus) in Cryonics magazine, October, 2013

22. March 2013 · Comments Off on Iatrogenesis and Cryonics · Categories: Cryonics, Health

Wikipedia tells us that iatrogenesis is “an inadvertent adverse effect or complication resulting from medical treatment or advice…” The key word in this definition is “inadvertent.” A doctor who exposes a patient to a bacterial infection by accidentally donning non-sterile gloves commits iatrogenesis; a doctor who deliberately administers a lethal dose of an anesthetic does not. Adverse effects are one common source of iatrogenesis.

A defining characteristic of contemporary human cryopreservation is that it is not possible to stabilize patients at very low temperatures without producing additional damage. Forms of injury in cryonics include ice formation, cryoprotectant toxicity, and fracturing. The relevance of the concept of iatrogenic diseases to cryonics was first recognized by Thomas Donaldson in his article “Neural Archeology” (Cryonics, February 1987). What sets cryonics apart is that cost-benefit analysis favors cryopreservation in a sense not encountered in ordinary medicine. Cryonics is the last hope to save the life of the patient and the alternative course of action is irreversible death.

One could say that the adverse effects of cryonics are a form of iatrogenic injury, but since the major adverse effects of cryonics are known and recognized, cryonics cannot be brought under the rubric of iatrogenesis. But just as medical researchers and pharmaceutical companies allocate resources to developing drugs with fewer or less serious adverse effects, Alcor aims to improve procedures to eliminate these forms of injury. Examples include vitrification agents to eliminate ice formation, intermediate temperature storage to eliminate (or reduce) fracturing, rapid cooling devices to decrease ischemic injury, etc. The ultimate goal is to create a low temperature stabilization procedure that does not induce any additional injury. Such an achievement would constitute true human suspended animation. We would not yet be able to treat the disease of the patient, but we could induce biostasis and reverse it without any adverse effects.

There is a narrower application of the idea of iatrogenic injury to specific elements of cryonics procedures. For example, if a multiperson team is present at the bedside with a portable ice bath, ice, and a functioning chest compression device, but later analysis of the temperature data reveals negligible cooling, negligence or error may be involved. This is a rather dramatic example, and most examples of non-intrinsic iatrogenic injury in cryonics have a subtler character. Cryonics is particularly vulnerable to iatrogenic injury because of the lack of clear objectives for the individual procedures and the lack of consistent and comprehensive monitoring.

A rather disappointing excuse for permitting additional injury is the view that since cryonics patients will require advanced repair technologies in the future anyway it is not of great importance to minimize adverse effects of the cryonics procedures themselves. Such an attitude encourages recklessness, makes a mockery of the idea of human cryopreservation as medicine, and is not the kind of cryonics that is going to win over scientists, medical professionals, and the educated public. We do not know at which point injury translates into irreversible identity destruction, but we do know that the closer our procedures conform to reversible human suspended animation the less likely it is that we are wandering into that territory.

Cryonics cannot be disqualified merely because it introduces adverse effects. We know it does and we have no choice but to accept this. But an aggressive pursuit of human suspended animation will eliminate these adverse effects step-by-step so a future doctor will no longer need to worry about the effects of the cryonics procedure itself.

Originally published as a column (Quod incepimus conficiemus) in Cryonics magazine, February, 2013