
Stimulating the brain / Gary Stix

Will taking a pill at breakfast improve concentration and memory? And will it cause damage to health in the long run?

Brain stimulants. Illustration: shutterstock

The symbol +H is the code used by some futurists to denote an improved version of humanity (+humanity). This improved version of the human race will use a combination of advanced technologies, including stem cells, robots, cognition-enhancing drugs and the like, to overcome basic mental and physical limitations. The idea of enhancing mental functions by swallowing a pill that improves attention, memory and planning - the foundations of cognition - is no longer just a futurist's fantasy. If the last decade of the 20th century was named "the decade of the brain" by the then president of the US, George H. W. Bush, the current decade could be called the "decade of the improved brain".

The pursuit of cognition-enhancing substances is evident in news articles that enthusiastically report on the development of substances also known as "smart drugs", neuroenhancers, nootropics, or even "Viagra for the brain". Judging by these articles, the age of brain enhancement is already here. University students routinely borrow a few Ritalin pills from a friend with a prescription in order to study all night. Programmers who must keep to a rigid schedule, or managers trying to maintain a mental edge, gobble up modafinil, a pill from the new generation of stimulants. Adherents of these drugs swear that they don't just increase alertness, like coffee, but instill the kind of focused concentration needed to absorb the intricacies of organic chemistry or to work through a mortgage amortization table.

The scientists and pharmaceutical manufacturers working hard to translate research on the molecular basis of cognition into drugs that improve mental function - especially for people suffering from dementia - also contribute to the era of enhancement. It seems inevitable that doctors will eventually begin to prescribe drugs intended for Alzheimer's or Parkinson's patients to a wider population of elderly people with milder problems. Widespread public debate about the ethics of brain enhancement reinforces the feeling that cognitive enhancement pills will one day be available to everyone.

Academic and news articles have asked whether cognition-enhancing drugs already give some students an unfair advantage in university entrance exams, and whether employers would be crossing a red line if they required their employees to take such substances in order to meet the schedules to which the company is committed. But even as the press publishes articles about "the boss who became a drug dealer", some doubt whether drugs that boost "brain power" are realistic at all. Do the drugs available today for attention problems or excessive sleepiness actually help a student do better on an exam, or a company manager to perform flawlessly under the harsh questioning of the board of directors? Will a drug that interferes with basic brain functions ever be safe enough to sit on the shelf alongside over-the-counter pain relievers and antacids? All of these questions are now hotly debated among neuroscientists, doctors and ethicists.

Ethical dissonance

If we set aside for the moment the debates about safety, fairness and coercion, there is a genuine need for cognitive enhancers in conditions such as attention deficit hyperactivity disorder (ADHD). According to US government data from 2007, more than 1.6 million people in the US used stimulant drugs for non-medical purposes during the 12 months preceding the survey. Legal drugs in this category include methylphenidate (Ritalin), the amphetamine Adderall, and modafinil (Provigil). On some campuses, a quarter of students reported using these drugs. An informal online survey conducted by the journal Nature in 2008 among its readers found that 20% of 1,427 respondents from 60 countries, asked about their personal habits, said they had used methylphenidate, modafinil, or beta-blockers (the latter for stage fright). The most common reason given was the need to improve concentration. Most often, people obtain the drugs online or from doctors, who are permitted to prescribe a drug approved for one purpose to treat another condition (though drug manufacturers are legally prohibited from promoting such "off-label" uses).

The consumption of these chemicals will probably rise as the population ages and the economy globalizes. "If you're 65 years old, living in Boston, your retirement savings have dropped sharply and you have to stay in the workforce and compete with a 23-year-old in Bombay, you may feel pressured to turn to these compounds to stay alert and efficient," says Zack Lynch, director of the Neurotechnology Industry Organization. The newly felt need for ethical guidelines assumes, of course, that these drugs work better than a placebo - that they really do improve some cognitive faculty, such as attention, memory or "executive function" (planning and abstract reasoning, for example). Starting from this premise, many argue, ethicists must consider the possible consequences of the drugs' popularity. For this reason a new academic field, neuroethics, was founded in 2002; its purpose is, in part, to address the moral and social questions raised by cognition-enhancing drugs and devices (brain implants and the like).

A group of ethicists and neuroscientists published a provocative position paper in Nature last year, arguing that in the future such drugs should not be used only to treat disease. The article proposed allowing a broad public of mentally healthy people to take stimulant drugs to improve performance in the classroom or the boardroom, provided the drugs are tested and found safe and effective enough for healthy people. The authors cited studies showing the drugs' beneficial effects on memory and various forms of mental processing, and likened pharmacological enhancement to "education, good health habits and information technology - ways that our uniquely innovative species tries to improve itself."

One of the article's authors, John Harris, a bioethicist at the University of Manchester in England, went further in an opinion piece published six months later in the British Medical Journal. Harris, editor of the "Journal of Medical Ethics" and author of the book "Enhancing Evolution", argued that if methylphenidate is found safe enough to give to children, it should be considered harmless for adults who wish to stimulate their minds. In a later interview, Harris said he foresees a gradual loosening of the restrictions; if no safety issues arise, the drug (currently a controlled substance in the US) may eventually end up on the non-prescription shelf, next to the aspirin.

These ideas did not go unanswered. Other researchers and ethicists have questioned whether the safety of mind-altering drugs could ever be verified to a degree that would justify selling them as freely as coffee, tea or over-the-counter pain relievers.

"People think that cognitive enhancement is like improving vision with glasses," says James Swanson, a researcher at the University of California, Irvine, who has been involved in clinical studies of Adderall and modafinil for ADHD. "I think people don't understand the dangers of making these drugs available to large numbers of people. A small percentage of users will become addicted, and some will see their mental function deteriorate. That is why I oppose their general use." In this spirit, the British Home Office is awaiting the report of an advisory committee before deciding whether the potential harm of non-medical use of cognitive enhancers warrants new regulation.

Other scientists claim that the debate is unnecessary, because it is not clear that intelligence can be improved by anything other than the tedious practice of cramming for a math exam. Some who have tried to develop drugs to reverse the memory loss of dementia doubt that the brain of a healthy person can be improved at all. "I would not be overly concerned about the consequences of healthy people using cognitive enhancers, simply because there is no such thing as a cognitive enhancer," says Rusiko Bourtchouladze, author of a popular book on neuroscience and a researcher who contributed to the work that earned Eric R. Kandel the Nobel Prize in 2000. "It is still early, far too early, to talk about cognitive enhancers, and such drugs may not appear in our lifetime. This topic causes too much fuss."

According to this view, the complex system of chemical signals, enzymes and proteins that together build memory is held in a tightly controlled balance that resists change unless disrupted by disease. The decline of thinking and of the sense of self that accompanies dementia can perhaps be treated by compensating for the loss of key chemicals, and then the risk of unwanted side effects is justified. But disrupting this fragile balance in a healthy person may have unexpected results. For example, any improvement in long-term memory (where our childhood memories and last summer's vacation reside) might come at the expense of working memory (the short-term mental register where the brain temporarily stores phone numbers).

Some criticize the very debate over the ethics of brain enhancement, attributing today's uproar to what they call "speculative ethics". Nanotechnology and other emerging technologies suffer from the same tendency: ethicists, scientists and policy makers are dragged into discussing the social consequences of technologies that have not yet been invented, from smart pills to runaway nanorobots. "A considerable part of the debate on human enhancement... suffers from inflated expectations and media hype," note Maartje Schermer of Erasmus University Rotterdam and her colleagues in the journal "Neuroethics".

Ups and downs

The idea that existing drugs might improve cognition in healthy people is about a century old, and it has produced mixed results. In 1929, the chemist Gordon Alles began the medical use of amphetamine, a synthetic substance similar in structure to a compound extracted from the Chinese plant ephedra (ma huang). (Alles was also an early investigator of ecstasy, itself an amphetamine.) Various forms of the drug were given to soldiers on both sides in World War II, to keep them alert and bolster their courage. The Germans and Japanese used methamphetamine, while the British and Americans used Benzedrine, a drug similar to Adderall.

Scientists soon sought to find out whether the effects were real. Psychological evaluations of British and American soldiers in the 1940s revealed that users of the drug rated their own performance highly on tests of reading speed, multiplication and other tasks. Yet on most tasks their actual scores were no better than those of subjects who received caffeine, and on more complex tasks performance even declined. "Amphetamines improve mood and therefore make us feel that we are functioning particularly well, when in fact we are not," says Nicholas Rasmussen, a historian of science at the University of New South Wales in Sydney, Australia, and author of the book On Speed (New York University Press, 2008). "In simple laboratory tests of performance on boring tasks, they raise scores by increasing diligence, but that says nothing about their effect on performance in a law school exam or a combat flight."

Methylphenidate, a chemical relative of the amphetamines, appeared in 1956 as a seemingly milder stimulant ("the golden mean in psychomotor stimulation", in the manufacturer's words), but at comparable doses its biochemical and psychological effects proved similar. The heyday of amphetamines was about 40 years ago: US consumption reached 10 billion pills in the late 1960s, before the Food and Drug Administration ended the party and classified them as controlled substances requiring a special prescription.

Neuroscientist Michael S. Gazzaniga of the University of California, Santa Barbara, one of the authors of the Nature paper, remembers his father sending him Benzedrine to improve his studying when he was in college in the early 1960s. In the mid-1990s, the growing use of methylphenidate to treat ADHD prompted researchers to apply innovative brain imaging methods and sophisticated neuropsychological tests to examine the drug's effects on healthy individuals. These studies served as a baseline for comparing a healthy population with people who have ADHD and other neuropsychiatric disorders. In an article published in 1997 in the journal "Psychopharmacology", Barbara Sahakian, Trevor Robbins and their colleagues at the University of Cambridge showed that methylphenidate improved cognitive function on several measures (notably spatial working memory and planning) in a group of healthy, well-rested young men, but not on others, including attention and verbal fluency.

As the testing sessions went on, the volunteers' responses actually became more error-prone, perhaps because of impulsivity induced by the drug.

The same researchers found almost no cognitive benefit from the drugs in elderly but healthy men. And in 2005, a group at the University of Florida College of Medicine in Gainesville found no cognitive improvement in the performance of 20 sleep-deprived medical students who took the drug. Another obstacle to placing methylphenidate on the shelf next to the caffeine pills is its potential to cause heart arrhythmias, as well as its potential for recreational abuse. Addiction is rare at prescribed doses, but in the 1970s methylphenidate addiction was common among people who snorted or injected the drug, known colloquially as "West Coast".

Always awake

Given the checkered history of the amphetamines, neuroscientists and doctors welcomed the arrival of modafinil, an alertness-promoting drug whose side effects seemed milder and whose abuse potential seemed lower. The ability of modafinil (introduced in the US in 1998) to allow long hours of work without a break has made it the drug of choice for jet-lagged travelers trying to live in four time zones at once.

Jamais Cascio, a fellow at the Institute for the Future in Palo Alto, California, heard about the drug from frequent-flyer friends, and asked for and received a modafinil prescription from his doctor. During his trips overseas he noticed that besides being alert, he also felt sharper. "The increased focus and clarity I felt was a big surprise, but a very pleasant one," says Cascio, who has mentioned the drug in several of his articles. "My experience was not that I became a super-brain. It was more a feeling of slipping more easily into a state of cognitive flow, of being able to work without distractions."

Laboratory tests have confirmed some of Cascio's impressions. In 2003, Sahakian and Robbins found that 60 healthy, well-rested men improved on some neuropsychological measures, such as remembering digit sequences, while on other measures their results were unchanged. Researchers elsewhere have also found the drug to be of some benefit, although, as Cascio noted, it won't turn a slob into a genius. Moreover, these studies did not examine effects on cognition over extended periods.

It is unlikely that modafinil or methylphenidate will be deregulated any time soon, partly because the drugs affect different people differently. Modafinil substantially improves cognitive function in users with lower IQs, but has little, if any, effect on those with higher innate ability. Methylphenidate significantly improved working memory in subjects with poor memory, but only slightly in subjects whose memory was good to begin with.

Like the amphetamines, modafinil was not developed from a thorough understanding of the biological basis of brain function. But current research shows that the drug affects several neurotransmitters, the molecules that trigger the firing of particular groups of neurons. The drug's exact mechanism is still unclear, but Nora D. Volkow, director of the US National Institute on Drug Abuse, and her colleagues recently discovered that one of these neurotransmitters is dopamine, the same substance whose levels amphetamines raise and which gives those drugs their addictive power. "It seems that methylphenidate and modafinil are very similar in what they do to the brain's dopamine system, unlike what we thought," says Volkow, although she adds that modafinil cannot easily be inhaled or injected to produce a strong "high", so the risk of abuse is lower. Another barrier to widespread use appeared in 2006, when the FDA rejected the substance as an ADHD drug for children, because of reports of severe skin rashes.

Repackaging attention drugs as cognitive enhancers for students, executives and programmers may yield only marginal benefits compared with a double espresso. The question of what exactly counts as a cognitive enhancer prompted a group at the American College of Neuropsychopharmacology to convene and discuss the criteria a drug must meet to earn that label. Ultimately, breakthrough drugs may come from another line of research: insights into how the brain converts a baby's face or a friend's name into a permanent memory have laid the foundations for new drugs aimed at improving function in patients with Alzheimer's and other dementias.

Optimism about a new generation of drugs stems in part from advances in basic research on the biochemistry of memory formation. More than 30 strains of genetically modified mice have shown an improved ability, relative to the average mouse, to acquire information and store it in long-term memory. "This is the first time in the history of neuroscience that we understand the molecular and cellular biology underlying memory," says Alcino J. Silva, a neurobiologist at the University of California, Los Angeles. "What this means for society is that for the first time we can use this foundation to start changing the way we learn and remember."

But truly effective memory drugs remain elusive, in part because of the scientific challenges. Of the 200 mutations that researchers worldwide have introduced into mouse genes, the vast majority caused defects. Silva remembers one mouse strain in his lab that illustrated the trade-offs researchers may encounter in developing cognitive enhancers. The animals learned faster than normal mice, but could not complete a complex task the researchers set them. "If you taught them something simple, they learned it quickly, but anything a little more complex they couldn't learn at all," says Silva. He estimates it will be decades before drugs emerging from this research are in routine use.

The logistical challenges are also daunting. Quite a few of the first companies to enter the race, including some founded by leading academic researchers, have failed. In 2004, the journal Science cited four young companies - Sention, Cortex Pharmaceuticals, Memory Pharmaceuticals and Helicon Therapeutics - as representative of the trend. Sention has closed. Cortex is faltering and desperately seeking a business partner. Last year, after several rounds of layoffs and clinical-trial failures, Hoffmann-La Roche bought Memory Pharmaceuticals, one of whose founders is Nobel laureate Eric Kandel, for pennies a share. Helicon has survived thanks to the generosity of the billionaire Kenneth Dart, the foam-cup magnate, who was captivated by the idea of memory drugs; the company is developing a drug that acts on a pathway related to glutamate, a neurotransmitter that stimulates a complex cellular signaling cascade involved in forming long-term memories [see box on the opposite page].

A sister company, Dart Neuroscience, now handles the discovery of new compounds with drug potential, leaving the clinical-trial work to Helicon. Helicon has so far received more than 100 million dollars, but has not yet taken any of its drugs into advanced clinical trials. "The way I explain this to audiences when I lecture is that when Helicon was founded, I was developing memory enhancers for my parents, and I wasn't gray-haired," says Tim Tully, Helicon's chief scientist, who helped found the company while working at Cold Spring Harbor Laboratory. "My parents are now dead, and I am gray and well aware that the race is now for me, not for them."

Tully, 55, doesn't expect his creations ever to become the next Viagra or Prozac. "What the media like to do is completely ignore expected side effects and jump straight into wild speculation about lifestyle drugs," Tully says. "And I think they miss the mark. The reality is that if you have a memory impairment that is making your life difficult, these drugs might help, but for anyone else they would probably be too dangerous."

Despite such words of warning, drug makers continue trying to develop cognitive enhancers for Alzheimer's disease and other dementias [see table on opposite page]. Among the compounds being tested are some that modulate neurotransmitters other than glutamate - including ones that activate the receptors targeted by the nicotine in tobacco (though not the receptors associated with addiction). One reason people smoke is that nicotine helps sharpen attention.

Lessons learned from dementia drugs may lead to substances that alleviate the milder cognitive problems of normal aging, assuming the compounds don't carry a burden of intolerable side effects. Such pills, if found harmless enough, may find their way into student dormitories and executive suites. "In the pharmaceutical world, people realize that a successful cognitive enhancer could be the biggest blockbuster drug of all time," says Peter B. Reiner, professor of neuroethics at the University of British Columbia.

Facing the market

Despite the scientific satisfaction of discovering cognition-enhancing drugs through detailed investigation of the molecular processes underlying cognition, the first substances to reach the market for treating dementia and other cognitive disorders will probably not come from a deep understanding of neural function. They may come from the accidental discovery that a compound approved for another purpose also affects cognition. For example, one candidate drug that recently entered advanced clinical trials for the cognitive impairment of Alzheimer's disease was developed in Russia as an antihistamine for hay fever; only later was it found to affect dementia. The huge market potential has prompted some companies to try to break in through unconventional routes - for example, by "reviving" a drug that failed or never completed clinical trials and selling it as a nutritional supplement or as a "medical food", categories subject to less strict oversight.

Similarly, new drugs may emerge as regulatory agencies approve expanded uses for a drug already known to affect cognition. Cephalon, the manufacturer of modafinil, took this path, obtaining FDA approval to market the substance to shift workers, a much larger group than the narcoleptics (people with uncontrollable sleep attacks) for whom it was first approved. (Cephalon also paid about $425 million to two states and the federal government for promoting three drugs, including modafinil, for unapproved uses.) The urge to improve cognition - whether to sharpen mental focus or to remember a friend's phone number - may prove so tempting to producers and consumers alike that it makes us forget the inevitable risks of tampering with the neural circuits that give us our basic sense of self.


The article was published with the permission of Scientific American Israel

3 comments

  1. Every time the same nonsense.
    What will happen is known in advance - the same stages as in the "unnecessary purchase process":
    1. Stand at the seller's stand and start wondering whether I need the product - those who really need a product don't have to think; they know, they come, and they buy (no journey of self-persuasion is needed).
    2. Begin the unnecessary journey of self-persuasion (I don't think I need it, but maybe...).
    The first two steps are where the ethics happens.
    3. Be persuaded and buy, in most cases (here - the mass release of "brain-enhancing" drugs onto the drug market).
    And the clinching excuse will probably be that the least ideal situation is one in which, by circumstance, only a handful of people can use such drugs because they know where to get them (people close to illegal research labs).
    Imagine that of all the people in the world, only a thousand (not you) could use a smartphone!
    Wouldn't you feel cheated?
    I have a smartphone and I'm studying psychology, so when I'm not at a computer I can watch the lectures on my smartphone, and the same goes for my e-mail and so on.
    Would you allow such discrimination - that I have one and you don't - when financially we can all afford a smartphone?
    What about an apartment, or an air conditioner, or a shower?
    In some fields people leave ethics and rationality aside.
    The first example that comes to mind:
    "Violent" computer games and "violent" movies - for decades now, scientific studies, or at least most of them, have reasserted that these games and movies make people more violent and maybe even turn some into murderers.
    Where are the "ethical sciences" on this issue? Where is the ethics of building atomic bombs?
    People's lives - isn't that a topic for ethics?
    If ethics has been abandoned where people's lives and deaths are concerned, there is no need at all to discuss "eugenics by choice" (even if the choice is between taking the drug and being fired - it is still a choice).
    Besides, what does ethics have to say about the current situation, in which each person's mental abilities are at a different level?
    Is it okay for one person to have an IQ of 70, another 100, and a third 140?
    Is it okay that human society as a whole is not prepared to understand the first and third "types of people" I mentioned?
    So yes, when the "brain-enhancing" drugs reach the market, the gap between levels in the population will grow, but at least the "stupid" will become "smarter"!
    If it is possible to turn a person with a severe intellectual disability into a person of average mental level (an IQ of 90-110 today), allowing him a better life with more options to choose from, who cares if people with an IQ of 100 become geniuses?!
    Why are ethics committees egotistical?
    Is being egotistical ethical?

  2. As a nursing aide with the Mata nursing company, I am really happy that there may be a drug that makes life easier for the elderly.
    I just wish it were proven to work.
