From the empirical to the molecular: Understanding biology and … – Open Access Government

The German-American physiologist and experimental biologist Jacques Loeb (1859-1924) was one of the most vigorous promoters of biology as an experimental science in the 19th century. Influenced by the physicist and philosopher Ernst Mach, he at first pursued the goal of engineering life by devising techniques to experimentally control animals' life functions.

In Mach's positivist-empiricist approach to epistemology, understanding life meant controlling life phenomena by physical and chemical means. Loeb became most famous for his success in bringing about artificial parthenogenesis in sea urchins (1899).

Due mainly to new developments in biochemistry and genetics around 1900, which pointed to the crucial role of macromolecules, in particular proteins and DNA (then called nuclein), in biology, Loeb abandoned the empiricist-phenomenological approach and the aim of controlling life by purely empirical means. He increasingly focused on the search for molecular mechanisms and causes, and now promoted the view that life was based on the interaction of specific macromolecules.

The principles of biological specificity residing in protein diversity, and of genetic causality based on the nuclein in chromosomes, were crucial for understanding life. Loeb's vision, and the causal-mechanistic approach to which he significantly contributed at an early stage, became the foundation of molecular biology and are also the basis for research in synthetic biology.

Molecular biology, the search for a molecular understanding of the basic structures and functions of life, such as heredity, development, and biological information, of processes such as evolution, and of problems such as diseases and their cures, was the most successful branch of 20th-century biology.

This search for molecular mechanisms of life was rejected at the time by vitalists, nature philosophers, morphologists, and positivists/empiricists such as Mach. Today, molecular biology is challenged not by philosophical currents but by another empiricist scientific movement: the big data revolution in genomics. In contrast to the 19th-century positivism/empiricism, which was directed against metaphysical speculation and religion and, in the case of Loeb, was a strategy to fight mysticism and superstition in science, the 21st-century empiricism resulted from the development of new technologies in DNA sequencing and computation. What both have in common is the marginalization or rejection of causal-mechanistic examination and explanation of biological phenomena.

In biology, the availability of large amounts of sequencing and gene expression data, together with powerful computational methodology, tremendously facilitates systems approaches and pattern recognition in many fields of research. But data-driven correlation is also being used to replace experimentation and causal analysis.

Todd Golub, the director of the Broad Institute of MIT and Harvard, promotes unbiased genomic surveys that are taking, for example, cancer therapeutics in directions that could never have been predicted by traditional molecular biology. According to him, the large-scale, data-harvesting approach to biological research has significant advantages over conventional, experimental methods (Golub 2010). While many genomics institutes are still pursuing the analysis of molecular mechanisms, the trend toward data mining is rising, especially among young researchers.

This new empiricist tendency disregards the fact that science is more than statistics, correlations, or pattern recognition. For a complex science like biology, knowledge of mechanisms is crucial to answering questions about issues such as the causal role of genes and genetic variation in development, or the effects of perturbations or diseases on an organism. Science aims at causal explanations of normal functions in the cell and also of deviations such as diseases.

Genomicist Edison Liu perceives great danger when experiments and hypotheses are abandoned in favour of big data: It is fallacious to believe, especially in the complexity of the human body and disease, that you can make consistent predictions simply on data. The term big data is relative and too liberally used: how big is big, and when is data big enough to have confidence in the predictions?

In biology, there's usually not enough data.

The other aspect is that we know only what we know. If you had talked to us 25-30 years ago, the argument was: if I knew every single gene element and promoter, I would be able to predict you as a human being. Well, I'm sorry, that doesn't happen. It doesn't happen because what we thought of as the universe of known information is only a small fraction of reality. We now know the complexity of splice variants, the complexity of alternative promoters, the complexity of post-translational modification, the complexity of gene-gene interactions, and the complexity of eQTLs (genomic loci that explain variation in expression levels of mRNAs), where distant enhancer sites affect a gene megabases away. This is all new information.


So, if we were simply to model on old data that we considered to be the totality of the biological universe, our predictions would have been mainly wrong. This is why I think the idea that big data in medicine is going to supplant experimentation is not only unreal, it's absolutely dangerous. In fact, I'm really fearful that we're going to fall into the trap of the Dark Ages. (Liu 2022).

