Gene fads: research bias has neglected most of the human genome

Most genes remain a mystery nearly 15 years after scientists sequenced the human genome


When the human genome was first sequenced in 2003, scientists were optimistic that a medical revolution was on the horizon. The promise of personalized medicine seemed within reach. 

But 15 years later, this revolution hasn’t materialized, and researchers are still uncovering the meaning of the genome. 

The research bias

Although the human genome contains about 20,000 genes, researchers have focused most of their work on a small minority of them. 

In 2011, Gary Bader, a professor at the Donnelly Centre for Cellular and Biomolecular Research, and his colleagues contributed to an article in Nature that highlighted this gap in research. 

More recently, a study led by Thomas Stoeger at Northwestern University reported that the gap remains and scientists are still studying only a fraction of the genome. 

Based on Stoeger’s work, approximately one quarter of genes have never been the subject of a dedicated publication and remain poorly characterized.

Gene trends through the decades 

Different genes have been popular over the decades, falling in and out of fashion with time. The National Library of Medicine (NLM) in the United States has been tracking publications on genes in its PubMed database, which revealed these trends. 

In the early 1980s, a significant chunk of genetic research focused on HBB, a gene critical for the development of hemoglobin, the molecule responsible for carrying oxygen in red blood cells.

Interest in hemoglobin was spurred by the work of researchers in the 1940s and 1950s who discovered the role of abnormal hemoglobin in sickle cell disease, a disorder in which individuals are at risk of developing multiple infections and pain episodes over their lifetime. 

But hemoglobin’s popularity was short-lived. The 1980s brought about new medical concerns that shifted genetic research to different diseases. 

In particular, an unknown immune system disease that was striking apparently healthy individuals at alarming rates and overwhelmingly affecting gay men shook the public and the medical community to their core. 

Scientists soon discovered that the mysterious illness was attributed to human immunodeficiency virus (HIV), a virus that targets CD4 cells, which are a type of mature T cell that help coordinate the immune response to an infection. 

The outbreak of HIV across the world garnered attention from politicians, policymakers, and the research community. By 1987, the CD4 gene dominated genetic research and retained its popularity until the mid-1990s. 

By 2000, the TP53 gene was gaining traction. Dubbed the ‘guardian of the genome’ by some, TP53 is a tumour suppressor gene that is mutated in nearly half of all human cancers. 

While completing his doctoral studies at the University of Vienna, Peter Kerpedjiev sifted through the NLM records and generated a list of the most studied genes. His work showed that TP53 is not the only popular cancer gene: four of the top 10 most studied genes of all time — TP53, TNF, EGFR, and ESR1 — play some role in cancer development or are targets for cancer drugs.

TP53 was briefly dethroned by APOE, a gene that was initially associated with cholesterol but whose popularity exploded when researchers linked variants in the gene to the risk of Alzheimer’s disease. 

Kerpedjiev’s work showed that both genes remain popular in research today. 

Why are some genes more popular than others?

“Researchers usually first study these genes since they seem most important, and this is the answer why only a ‘minority’ have been studied so far,” wrote Stephen Scherer, director of U of T’s McLaughlin Centre and The Centre for Applied Genomics at The Hospital for Sick Children, in an email to The Varsity. 

Steven Narod, director of the Familial Breast Cancer Research Unit at Women’s College Hospital, whose research focuses on BRCA1 and BRCA2, two well-characterized genes, proposed that other factors could also be at play, such as the prevalence of mutations in the genes. 

He further explained that research tends to focus on genes “where the clinical implications are clear and the interpretation [of mutations] is straightforward.” 

There are also significant barriers that deter novice researchers from studying unknown genes, according to Bader. In an email, he explained that “it can be difficult for researchers to take risks and explore new territory because if they don’t succeed, they may not be able to continue being funded.” 

Based on Bader’s commentary, funding agencies are generally risk-averse and are less likely to support studies on lesser-known genes, which poses challenges to researchers interested in studying such genes. 

What does the future of genomics hold?

Scherer is hopeful that change will come over time. 

“[The genes] will all be studied but there are only so many resources (human and financial) available, and this will take some time,” wrote Scherer. 

Bader stressed that funding agencies can be part of the shift, by encouraging researchers to explore unknown regions of the human genome.  

For example, the US National Institutes of Health has established funding opportunities targeted at researchers investigating poorly characterized genes. 

The advances in genomic technologies will also likely play a role in the future. 

“New genomics technologies are accelerating progress and making it easier to discover interesting genes,” wrote Bader. He also encouraged researchers to “consider devoting a percentage of their time to exploring new territory, if they are not already doing this, in addition to the major projects that they work on.”

With advancing technologies and support from granting agencies, perhaps the rest of the human genome will become less of a mystery.

Study finds ‘lost’ memories in mice can be recovered

SickKids researchers use light to control neurons and aid memory recovery


Why is it that we can’t completely remember events from our childhood?

Previous studies have shown that infants are unlikely to remember event-based memories. As infants, we lack the cognitive abilities to consolidate and store autobiographical memories. As we grow older and our brains develop, new neural pathways are made in place of old ones, leading to a near-total loss of memories from the first few years of life.

One form of this circuit recalibration, and the one most implicated in childhood forgetting, is hippocampal neurogenesis: the generation of new neurons in the hippocampus, the region of the brain primarily responsible for memory consolidation.

A study published in Current Biology outlines how memory loss in infants occurs, and how scientists induced memory recovery in mice using optogenetic stimulation, a technique that uses light to trigger neurons.

Researchers in the Frankland Lab at the Hospital for Sick Children have been studying patterns of neural activity during autobiographical memory formation. After mapping the patterns of neural ensembles active during encoding, the researchers reactivated specific neurons in the same pattern to test whether the subject could remember the encoded memory.

“Successful memory retrieval occurs when some specific spatial-temporal pattern of neural firing is engaged,” wrote Axel Guskjolen, a PhD candidate in the Frankland Lab and lead author of the study, in an email to The Varsity. “If that specific pattern of neural firing fails to occur, then the animal fails to retrieve the memory, resulting in forgetting.”

In the lab, fear memories were encoded in mice of varying ages, both infant and adult, by exposing them to small conditioning shocks. When returned to the training context days after training, the older mice retained the memory and froze where they expected the footshock. The infant mice, however, were a different story.

“In our experiment, infant mice successfully encode a memory but fail to retrieve it when [tested] at long retention delays (i.e. infantile amnesia). Using memory-tagging and optogenetic techniques, we were able to bring the memory back by forcing the neurons that were involved with memory encoding to become active again,” wrote Guskjolen.

When their neurons were treated with light, the infant mice were more likely to remember where to freeze. The stimulation of the neurons in the hippocampus led to artificial memory expression, even 90 days after initial training.

“This finding is a bit of an enigma because we forget the earliest experiences of our lives, a [phenomenon] known as infantile amnesia,” added Guskjolen. “The finding that the physical basis of these memories still exists in the brain in a ‘silent’ state might explain how these forgotten memories continue to influence our thoughts and behaviours as adults.”

Initially, the researchers questioned whether the loss of memory in the infant mice was due to storage failure, where there isn’t enough space in the brain to retain memories, or retrieval failure, where memories are retained but the brain isn’t able to access them. Throughout the study, however, the mice that had been encoded with memories and then opto-stimulated were able to experience those same memories again. The memory loss was therefore a case of retrieval failure.

According to Guskjolen, the implications of these findings for human medicine are hugely significant, as the many “commonalities across mammalian brains in terms of neural subtypes, structure, and function” suggest that these results will be translatable to humans.

“Many disorders that afflict humans are at their heart disorders of forgetting. Sometimes these disorders are characterized by too much forgetting (Alzheimer’s disease) and sometimes by too little forgetting (Post Traumatic Stress Disorder),” wrote Guskjolen. “To find cures [for] these disorders, it is important that we first understand the mechanisms of forgetting under normal circumstances.”