As it turns out, plants have a memory of their own, sort of. Researchers at the John Innes Centre in Norwich, England, recently found supporting evidence for epigenetic memory in plants. The finding emerged as they set out to trace the mechanism by which plants use a memory of the most recent winter's length to determine when to flower.

The answer was found in the behaviour of histones, the proteins around which DNA is wound and organized; without histones, DNA would be unravelled and considerably longer. When the researchers examined the histones located near FLC (FLOWERING LOCUS C), the gene responsible for flowering time, they found that exposure to cold changed the chemical marks on those histones and coaxed the FLC gene in individual cells into either an “on” or an “off” state. The longer the plant was kept in the cold, the larger the fraction of cells in which FLC was switched off, and the sooner the plant flowered once it was returned to warmth. What was revealed was an adaptive and quantitatively predictable form of plant ‘memory.’ Since epigenetics revolves around heritable changes in gene function that occur without any change to the DNA sequence, it seems that plants have arrived at an entirely different method of “learning” and adaptation.
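
To make the idea of a quantitative, cell-by-cell memory concrete, here is a minimal toy simulation in Python. It is only an illustrative sketch, not the model used by the John Innes Centre group: it assumes that each cell flips its copy of FLC into the “off” state with a fixed probability per week of cold and stays off afterwards, so that the fraction of silenced cells, the plant's “memory” of winter, grows with the length of the cold spell. The probability, cell count, and function name are invented for the example.

```python
import random

def fraction_silenced(weeks_of_cold, switch_prob_per_week=0.15,
                      n_cells=10_000, seed=0):
    """Toy model of quantitative epigenetic memory.

    Each cell independently switches its FLC copy 'off' with a fixed
    probability per week of cold, and once off it stays off (the stable,
    heritable mark). Returns the fraction of cells silenced afterwards.
    """
    rng = random.Random(seed)
    silenced = 0
    for _ in range(n_cells):
        # A cell remains 'on' only if it avoids switching in every week of cold.
        stays_on = all(rng.random() > switch_prob_per_week
                       for _ in range(weeks_of_cold))
        silenced += 0 if stays_on else 1
    return silenced / n_cells

if __name__ == "__main__":
    for weeks in (0, 2, 4, 6, 8):
        print(f"{weeks} weeks of cold -> "
              f"{fraction_silenced(weeks):.0%} of cells with FLC off")
```

In this sketch, a longer winter leaves a larger fraction of cells with FLC stably switched off, the kind of graded, predictable response the researchers observed.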

This type of memory has obvious survival benefits. Premature flowering would, in most cases, lead to the death of the plant, whereas well-timed flowering leads to optimal pollen and seed dispersal. Such memory works similarly to our own in that we must first learn about dangers in order to avoid them and survive. There are many dangers that human offspring do not automatically know how to avoid, forcing each generation to cycle through the same learning processes for itself. This is where a very interesting branch of epigenetics steps in: transgenerational epigenetics.

Previous theories holding that traits acquired during a parent’s life can be passed on to offspring were long dismissed as implausible and written off as “Lamarckian,” after an abandoned idea pioneered by, and named for, the French naturalist Jean-Baptiste Lamarck before the discovery of the genome. According to a paper by Eva Jablonka and Gal Raz published in the Quarterly Review of Biology, however, there are now numerous studies that provide evidence of transgenerational epigenetic inheritance.

One example is a study published in the European Journal of Human Genetics of random samples of males born during three different time periods in Överkalix, Sweden. The researchers aimed to measure how strongly the childhood circumstances of each proband, the individual serving as the starting point for the genetic study of a family, influenced the data they collected. Notably, the study found that ancestors’ nutrition was the biggest influence on the longevity of later generations.
However, not all effects are generations away. One study by Robert A. Waterland and Randy L. Jirtle explored the effects of early nutrition in viable yellow ‘agouti’ mice, so named because they carry a transposable element inserted in the agouti gene. The researchers found that feeding mothers a diet rich in methyl-donor supplements increased DNA methylation at the transposable-element insertion site in their offspring, altering how the nearby gene was expressed. This suggests that dietary supplements are more than just beneficial: early nutrition can actually shape epigenetic gene regulation, quite possibly in humans as well.
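
As a rough illustration of how a nutritional signal could be converted into an epigenetic one, here is a short Python sketch. It is purely hypothetical, with made-up probabilities rather than anything measured in the Waterland and Jirtle study: it assumes that a mother’s methyl-donor intake raises the chance that each CpG site at the transposable-element insertion becomes methylated in her pups, and it counts how many pups end up with the element heavily methylated, i.e., with the nearby gene effectively silenced.

```python
import random

def pup_methylation(methyl_donor_level, n_cpg_sites=9, rng=None):
    """Toy model: return how many CpG sites at the transposable-element
    insertion end up methylated in one pup. A higher maternal methyl-donor
    level raises the per-site methylation probability (numbers are made up).
    """
    rng = rng or random.Random()
    p_site = 0.2 + 0.6 * methyl_donor_level   # methyl_donor_level in [0, 1]
    return sum(rng.random() < p_site for _ in range(n_cpg_sites))

if __name__ == "__main__":
    rng = random.Random(42)
    for level, label in ((0.0, "unsupplemented diet"),
                         (1.0, "methyl-supplemented diet")):
        litter = [pup_methylation(level, rng=rng) for _ in range(200)]
        # Treat a pup as 'silenced' when most of its sites carry the methyl mark.
        silenced = sum(m >= 7 for m in litter) / len(litter)
        print(f"{label}: {silenced:.0%} of pups with the element heavily methylated")
```

Under these invented numbers, supplemented litters show far more heavily methylated pups, mirroring the direction, though not the magnitude, of the effect described above.
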
Until recently, we had no basis in biology to explain the mechanisms behind the phenomenon: the idea of a non-sentient organism, such as a plant, retaining “memory” seemed too far-fetched. Yet when we examine these structures at the molecular level, we see that all the components are in place to make it possible. Given what the theory of evolution has taught us, it is not a stretch to conclude that such beneficial mechanisms were likely to evolve. The next step for science is to create a road map to more of these predictable epigenetic responses.