Why Is Living Soil a Disruptive Innovation?


We know soil is alive.  Why do we treat it like dirt?

The first lesson in the first soils class I ever took covered what soil is made of.  We learned soil contains minerals, air, water, organic matter, and “critters”.  Yes.  That’s right.  Critters.  All those little squirmy, wiggly things that live underground.  Bacteria.  Fungi.  Worms.  Insects. Those living things are actually part of the soil.  They are part of what makes it work.

Curiously, the rise of modern agriculture has been marked by innovations like the mechanical plow in the 1700s and agrochemicals in the 1900s, innovations that have so greatly reduced key soil components (biota, water-holding capacity, organic matter, and more) that agricultural scientists throughout the past century have repeatedly warned of impending agricultural disaster.  In the US, William Albrecht, who built much of his career around the relationship between soil nutrition and both livestock and human health, warned of public health risks.  The Indian scientist Monkombu Swaminathan, who helped Norman Borlaug introduce hybrid genetics to India, warned the Indian Science Congress that exploitive agriculture could usher in an era of agricultural disaster.  You can read this warning, which succinctly forewarned of many of the problems we see today, in the passage quoted below.

“Exploitive agriculture offers great dangers if performed with only an immediate profit or production motive. The emerging exploitive farming community in India should become aware of this. Intensive cultivation of land without conservation of soil fertility and soil structure would lead, ultimately, to the springing up of deserts. Irrigation without arrangements for drainage would result in soils getting alkaline or saline. Indiscriminate use of pesticides, fungicides, and herbicides could cause adverse changes in biological balance as well as lead to an increase in the incidence of cancer and other diseases, through the toxic residues present in the grains or other edible parts. Unscientific tapping of underground water will lead to the rapid exhaustion of this wonderful capital resource left to us through ages of natural farming. The rapid replacement of numerous locally adapted varieties with one or two high-yielding strains in large contiguous areas would result in the spread of serious diseases capable of wiping out entire crops, as happened before with the Irish potato famine of 1854 and the Bengal rice famine in 1942. Therefore, the initiation of exploitive agriculture without a proper understanding of the various consequences of every one of the changes introduced into traditional agriculture, and without first building up a proper scientific and training base to sustain it, may only lead us, in the long run, into an era of agricultural disaster rather than one of agricultural prosperity.”

— Monkombu Swaminathan, Presidential Address to the Agricultural Sciences Section of the Indian Science Congress, 1968

By disregarding the importance of soil as the home of microbial communities that are critical for food webs, we have developed and industrialized agricultural practices that damage soil and decimate the microbial communities it supports.

 

Does Treating Soil Like Dirt Pay Well?

The cast iron moldboard plow was adopted at a time when the path from invention to market was even murkier than it is today, so for the purpose of this discussion, I will grant that the easier planting made possible by turning the soil, combined with the absence of any science illustrating what tillage did to vital soil life, made the moldboard plow seem, up front, like a beneficial innovation that greatly enhanced agriculture.  While I don’t claim to be an 18th-century historian, I am not aware of any established science available at the time that contradicted the use of the plow.  In fairness to the traditions and oral histories shared with me over the years, I will note that the plow was repulsive to those indigenous people who recognized soil as a living entity.  The voices of these ancestors were silenced by the land-grabbing, money-driven leaders of the time.

In more contemporary times, most Baby Boomers will recognize the name of biologist Rachel Carson, whose book Silent Spring catalyzed the banning of the pesticide DDT, but also provided a target for those in industry and government who claimed that pesticides were necessary to meet food demands. The controversy that surrounded Silent Spring, like the controversy surrounding many efforts to protect environmental quality, suggests that our failure to manage soil for good health stems from the money to be made by offering chemical solutions to growers.  Industries that produce and distribute chemicals invest heavily in research and development, patents, and licenses that give them claims to products which are difficult for others to replicate.  This formula for creating unique products is also a recipe for tremendous income generation, monopoly formation, and wealth concentration.  Unfortunately, it is simultaneously a formula that promotes soil and environmental damage while widening the gap between haves and have-nots.

Statista reports a global agrochemical market valued in excess of 215 billion dollars in 2016 and expected to reach 308 billion by 2025. As anyone who understands soil health knows, the growth of this market depends on the decline of basic soil health.  As soil quality deteriorates, the need for agrochemicals increases.  This creates a powerful incentive for the agrochemical industry to promote practices that reduce soil health.
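To put those figures in perspective, here is a minimal Python sketch of the arithmetic implied by the Statista numbers cited above; the start value, end value, and nine-year span come straight from those figures, and the calculation is purely illustrative rather than a market forecast.

```python
# Illustrative arithmetic only: the compound annual growth rate (CAGR)
# implied by the market figures cited above (values in billions of US dollars).
start_value = 215.0   # reported 2016 market value
end_value = 308.0     # projected 2025 market value
years = 2025 - 2016   # nine-year span

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # roughly 4% per year
```

In other words, the projection amounts to steady year-over-year growth of about four percent, sustained for nearly a decade.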

Now, it is natural to ask: are industry professionals really so driven by the incentive to make money at the expense of our future?  After all, don’t most people operate out of basic good intent?  Evidence presented by Carey Gillam suggests that many leaders driving the industry quite skillfully sway government officials to regulate in their favor.  The unspoken truth underlying such information is that when leaders manipulate regulators behind closed doors, and those regulators, whom we trust, issue statements about product safety, it becomes easy for the average, well-meaning employee to accept that information as fact.  After all, why would anyone object to using a safe chemical whose label says it can reduce pests or improve crop production?

Like subjects in Milgram’s classic experiments, most people will simply follow directions once they are provided by a respected authority.  In the case of agriculture, once a critical mass of people was convinced that applying synthetic and -cidal chemicals to plants and soils was a good idea, laws were established to regulate their “safe” use, and universities and technical schools institutionalized programs teaching “safe, acceptable” usage practices.  Those safe and acceptable practices rested on risk analyses in which authorities calculated how many deaths and illnesses could be deemed acceptable, or even preferable, when weighed against the lost income and/or hunger that might result if chemicals were not used. Unfortunately, such analyses rarely compare those outcomes to the outcomes of truly regenerative agricultural practices.

As teachers and mentors began showing their closest contacts the “right” way to use agrochemicals, chemical-based agriculture became conventional.  This institutionalized practices that were profitable for only a few large companies, and from that day forward, only a major disruption could reverse the trend toward increasingly chemically dependent soils.  In recent years, new techniques for analyzing microbial communities are providing information that may catalyze such a disruption.

Scientific Knowledge About Soil Life Has Been Repressed

Fortunately, science has progressed since the 1800s.  Unfortunately, the direction of its progress has too often been biased by funding provided through governments that are subject to corporate influence.  Universities and government research agencies receive more funds to support research that serves corporate special interests than they receive to support research on basic community development, education, or human health and nutrition.  When I worked as a research scientist for a government agency, I was always surprised by the resistance that seemingly bright and scientifically motivated administrators, journal editors, and reviewers expressed when data revealed the elegance of natural systems but failed to offer a route to a marketable technology.

Don’t get me wrong.  I have no objections at all to either money or marketable technologies.  But when I worked as a scientist supported by public funds, I understood it was part of my job to serve the public.  So when I began finding microbial communities in natural habitats that could be transferred to plants to improve their productivity, I thought this deserved more detailed study, because in theory it offered a method that any grower could use in any habitat to improve production without costly chemical inputs whose safety remains a subject of debate.

It would take pages of text to describe the layers of the bureaucratic onion that kept me from advancing this research.  But when a National Program Leader told me, point blank, that the direction I wanted to take my research would never be supported because “this agency will never support an effort that doesn’t benefit the agrochemical companies,” I realized that my agency was serving corporate rather than public interests.  I also realized that advancing ecologically based microbial research would continue to be an uphill battle, largely because the chemical industry saw microbes in agriculture as a disruptive innovation.  As such, the full implications and promise of microbes in agriculture had been, and would continue to be, silenced and censored, because industry was going to use every weapon in its armory to protect its own interests. Sure, agencies will continue to publish data highlighting the benefits of individual (aka patentable) species.  But data highlighting the potential of any healthy soil to recruit naturally occurring microbes that enhance plant productivity will be repressed, and studies designed to more fully understand this potential will remain under-funded.

Soil Biodiversity Promises to Restore Agriculture, Nutrition, and Rural Economic Prosperity

The good news is that as biotechnology has ballooned, so has its ability to detect and identify microbes and genes in the environment.  The complexity of this information is overwhelming to scientists and the general public alike.  However, many scientists and growers are recognizing the need to consider the ecological implications of this complexity and to work with the natural system rather than exploiting isolated, marketable features.   As we do this, farmers and researchers alike are finding that microbial communities offer remarkable potential to enhance agricultural production safely, sustainably, and economically while offering benefits to human nutrition and local economic development.
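As one small illustration of what working with this complexity can look like in practice, the Python sketch below computes a standard community-level summary (the Shannon diversity index) from a table of sequence counts.  The taxa names and counts are hypothetical placeholders, not data from any real soil sample; the point is only to show the kind of ecological summary that modern sequencing data supports.

```python
# A minimal sketch of a community-level summary made possible by environmental
# sequencing. The taxa and counts below are hypothetical placeholders.
import math

# Hypothetical read counts per microbial taxon from one soil sample
sample_counts = {
    "Bradyrhizobium": 420,
    "Streptomyces": 310,
    "Glomus (mycorrhizal fungus)": 95,
    "Nitrospira": 60,
    "Unclassified bacteria": 515,
}

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln(p_i)) over relative abundances."""
    total = sum(counts.values())
    proportions = (n / total for n in counts.values() if n > 0)
    return -sum(p * math.log(p) for p in proportions)

print(f"Shannon diversity (H'): {shannon_diversity(sample_counts):.2f}")
```

Summaries like this one let growers and researchers track whether a management change is building or eroding the diversity of the soil community over time, rather than focusing on any single marketable organism.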

Disruption Can Be a Good Thing

The value of restoring agriculture and rural economic prosperity may not immediately resonate with the fast-money economics of technology-driven urban areas. But the value of restoring human nutrition should.  After all, rising health care costs, and how to deal with them, have dominated political debates across at least the last four presidencies.  It is well known that the advance of leading chronic diseases like heart disease, diabetes, and cancer is influenced by nutrition. As the number of chronically ill adults increases, workforce productivity declines and costs rise for all of us.   Agricultural practices that erode soil, inhibit biological cycles, maximize yield at the expense of soil health, and support sales to middle agents who process foods in ways that reduce nutritional value also degrade human nutrition in ways that drive up health care costs.

Changes that promote more sustainable, biologically based agricultural practices will undoubtedly disrupt the flow of resources that is currently moving capital toward industrialized operations and urban power centers.  As more environmentally sound, biologically driven systems form, resources and capital will migrate toward smaller, more local farms and food outlets, and remain within the communities those entities support.    This is not to say that growth in existing power centers will come to a halt.  After all, some biologicals will still be patented and monopolized by large entities.  However, as individuals come to terms with the tremendous biodiversity their own soil commands, fewer growers will entrust their production to patented amendments monopolized by a few companies.  Furthermore, fewer consumers will accept the risk of eating heavily processed, nutritionally depleted foods.  This diversification of production and consumption choices will cause the net flow of agricultural goods and services to shift in new directions, creating new opportunities for better health, better environments, and more prosperous communities.

 
