The invisible fraction: Seed mortality during storage and the interpretation of resurrection experiments
The resurrection paradigm is a powerful tool for detecting recent adaptive evolutionary change. When zygotes from ancestral generations of a population are grown side-by-side with their descendants (and with ancestor-descendant hybrids), divergence between generations in a focal trait can be confidently attributed to genetic change. The tool is not foolproof, however. Apparent adaptive change in a focal trait could be due to immigration or drift—alternative causes that can be assessed by genomic methods. A more difficult problem is that of “the invisible fraction”. When reviving “stored” dormant seeds from soil or resting eggs from sediments, not all survive. This raises the question: Are the successfully revived zygotes a genetically random sample of their generation, or has some selective process acted during the storage period? This becomes important if the focal trait is genetically correlated with a trait influencing survival during storage and revival. It could introduce a bias clouding the interpretation of resurrection experiments by future users of the Project Baseline seed bank. Calculations based on the multivariate breeder's equation can reveal the potential range in bias due to storage selection.
The worst-case scenario entails the following: all storage mortality is selective on a seed trait; heritability of the seed trait is 1.0; and the seed trait is perfectly genetically correlated with the focal trait. The first point implies truncation selection, so that the intensity of selection is a function of the seed mortality rate. Suppose selection in the wild has been zero. Under the worst-case scenario, 10% seed mortality would cause ancestors and descendants to diverge by 0.19 standard deviation units (sdu). A 50% mortality rate would lead to a 0.80 sdu divergence in the focal trait. To use a less extreme example (seed trait heritability and its genetic correlation with the focal trait both = 0.5), 10 and 50% mortality rates would lead to ancestor-descendant divergences of 0.07 and 0.28 sdu, respectively. Although disturbing, these calculations make the extreme assumption that all seed mortality is selective. Bias falls when this assumption is relaxed. Fortunately, Project Baseline users will know the size of the “invisible fraction”, and will thus be able to compare results of their resurrection experiments to both worst-case and more plausible scenarios, and thereby evaluate the likelihood of adaptive change.
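The figures above can be reproduced with a short calculation. The sketch below assumes truncation selection on a normally distributed seed trait (intensity i = z/p, where p is the surviving fraction and z is the standard normal density at the truncation point) and the correlated-response form of the breeder's equation, bias ≈ i · h(seed) · r_A, where h(seed) is the square root of the seed trait's heritability; the function names are illustrative, not from the original.

```python
import math
from statistics import NormalDist

def selection_intensity(mortality):
    """Intensity of truncation selection when the weakest `mortality`
    fraction of seeds dies: i = z / p, with p the surviving fraction and
    z the standard normal density at the truncation point."""
    nd = NormalDist()
    truncation_point = nd.inv_cdf(mortality)  # seeds below this die
    return nd.pdf(truncation_point) / (1.0 - mortality)

def storage_bias(mortality, h2_seed, r_genetic):
    """Expected ancestor-descendant divergence in the focal trait (in sdu)
    caused by selective storage mortality, via correlated response:
    bias = i * sqrt(h2_seed) * r_genetic."""
    return selection_intensity(mortality) * math.sqrt(h2_seed) * r_genetic

for m in (0.10, 0.50):
    worst = storage_bias(m, h2_seed=1.0, r_genetic=1.0)
    moderate = storage_bias(m, h2_seed=0.5, r_genetic=0.5)
    print(f"{m:.0%} mortality: worst case {worst:.3f} sdu, "
          f"moderate case {moderate:.3f} sdu")
```

Running this recovers the values quoted in the text (0.19 and 0.80 sdu for the worst case, 0.07 and 0.28 sdu for the moderate case, to two decimals).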