It all starts with the question we’ve been told we must answer to make the right career choice, when the time comes to become financially independent: should I choose a career in industry or academia? The problem is that the question itself is rigged, resting on two assumptions: 1) that a person can only be good at one thing and one thing only, and 2) that in order to make a living, one has to put all their efforts toward the advancement of a larger organization. What we’re not usually told is that there are several ways to leverage one’s skills to make a living through independent, self-determined efforts — more so now than ever before.
The result of such a limited way of thinking is that highly skilled or educated people choose one of the two options, thereby concentrating most of society’s innovations within large organizations: corporations or academic institutions (many of which are privately owned, blurring the distinction between the two). And along with those innovations, most of the research funding is concentrated there as well. This phenomenon should not be taken for granted as the natural order of things; it is merely the result of decisions and policies favored by the free-market, unregulated economies of Western nations over the past two centuries — policies that have led to the exploitation of science primarily as a means for monetary gain, rather than as a way to improve people’s standard of living or simply to understand how the natural world works.
But the trend is starting to change. With technology becoming miniaturized, cheaper, and more accessible to the average person, there could soon be a major shift away from the mindset that innovation has to stay within the confines of large, profit-driven organizations. Some of the signs are already apparent: the rise of research crowdfunding sites, the open-access movement, and the growing number of young people choosing the startup route — bringing technological innovation directly to consumers, rather than filtering it through the drawn-out bureaucratic process of technology transfer and R&D in universities and corporations, which leaves a large and unnecessary gap between the time a technology is developed and the time the public actually gets to see the end product.
With such factors on the rise, there are clear reasons why the trend toward decentralized, independent research will be a good thing both for society and for researchers themselves.
Because universities are funded in proportion to their research output, academic departments and laboratories are willing to spend exorbitant fees on journal subscriptions just for the opportunity to publish. First, this skews the incentives for doing research from scientific to monetary motivations; it also distances the ownership of the research from the researcher, transferring it to the university as a whole. Another problem with this economic system is that journal publishers profit disproportionately from the fees institutions must pay in order to get published.
The economics are wildly favorable for the publishers, who essentially take the free labor of academics and researchers (the authors and peer-reviewers of scholarly work are unpaid for their contributions), and sell it back to the universities that employ these authors, often at a very high cost. — Quartz
It follows that a university has to pay to get published, and needs to get published to get paid — a situation wherein the journal publishers practically dictate the success of a lab, or even an entire institution.
As the cycle of publishing and funding continues, costs rise, and the publishers’ revenues far exceed their costs of editing and maintaining the journals.
This corrupts the integrity of the journal business with the incentive for financial gain, and some academics have gone so far as to call for a boycott of one of the foremost academic publishers, Elsevier, over its high subscription prices and other purportedly unethical practices.
This misaligned economic cycle also results in a decrease in the quality of research in exchange for a higher quantity of output, as well as a growing number of fake or highly unreliable publishers coming onto the scene, since publishing is seen as a lucrative business opportunity.
But the publishing dilemma is not the only one. There is also the question of who has the intellectual property rights to — and therefore any future profits from — any technology developed in a university lab, and that answer is usually — yup, you guessed it — big business.
While research laboratories are preoccupied with publishing, they may overlook the intellectual property considerations of their research. Although profiting from an invention is not a primary motivation for many researchers, that’s no excuse not to be informed about the possibilities of gaining financially from one’s efforts. When corporations sponsor academic research, it is in their interest to take the rights to any inventions developed through that research, which they can then commercialize for profit (the primary goal of a corporation being — unless some major change has occurred in the fundamental structure of society that I haven’t heard about — to make a profit). The researcher is satisfied merely getting published, while the corporation gets the rights to their technology, effectively creating an oligopoly of innovation and eliminating any chance of competition from smaller startups in the same industry. Thus another cycle is formed: researchers have little incentive to commercialize their technologies, widening the gap between the financial beneficiaries and those who put in the effort, while corporations grow ever bigger and maintain their grip on new technologies (not to mention the cultural implications of such financially driven innovation).
Research is also big business — at large universities like Stanford, sponsored research often exceeds revenues from tuition. *
To sum it up, researchers are not being compensated in a way that is commensurate with their efforts; the larger organizations — universities and corporations — are the ones benefiting most. A correlation between effort and reward is the basis of a functioning economy, and without it, it’s a wonder the research industry has managed to sustain itself this long.
Nowadays, many people go into a PhD with the sole intention of having a better-looking resume, or of getting a marginally higher salary at their future job. But this raises the question: are four to six years of research necessary — or even relevant — to performing a narrow corporate job better than someone with a bachelor’s or master’s degree? The skills one gains in research are very different from, and arguably less relevant to many positions than, those gained through pure work experience over the same period of time. This should not come as a surprise, considering that the PhD was not designed as a qualification tool, or as a level-up from a bachelor’s or master’s degree, but rather as a period in which a curious person can delve into a field of interest and broaden its body of knowledge.
From this perspective, it is a misalignment of incentives for a student to pursue a PhD solely with employment-related aspirations. While the student gains a degree in a field she may or may not have more than a vague interest in, the university sponsoring that degree reaps the benefits of government and corporate funding to sustain its research operations, and maintains its academic standing through the publication output and reputation boost the degree implies.
As such, universities and corporations both benefit, but the researcher herself gains nothing but a degree. The disheartening result is that many PhD students graduate into an oversaturated pool of qualified individuals, most of whom end up working for a small handful of large corporations — ultimately feeding back into the cycle of oligopolized innovation, and further devaluing higher-education qualifications, including the PhD itself.
Expanding the scope of research outside of a handful of academic institutions would be an incredible step, one that could allow part-time researchers to contribute their knowledge as well.*
The current trend of academic research is not exactly promising. But sooner or later, it will become evident that a degree is not the only means to career success, and more creative ways of sustaining a living will emerge. With crowdfunding platforms and open-access publications, research could be re-envisioned as an independent practice, no longer reliant on corporate or university structures to make it a sustainable occupation. As researchers become aware of the potential commercial applications of their innovations, they may more readily pursue startups as a means not only of sustaining their research operations, but also of bringing their technologies out of the lab and into the real world, making a direct, positive impact on people’s lives. Inquisitive minds will be empowered to experiment freely, without the exhausting pressures of academic life or the restrictive obligations of corporate employment. Corporations will be pressured to compete on a fair playing field with independent researchers and startups, unable to count on their large budgets to buy up new innovations and maintain their market share.
With such a large scale revision of the existing ecosystem, the integrity of research will naturally be restored, and the incentives for pursuing scientific investigation will be realigned — all based on the motivation of an individual to fulfill their innermost potential… and not for the sake of a higher salary or a more competitive resume.
*Quotes from this article by TechCrunch.
For more musings on the intersection of technology, education, and entrepreneurship, follow me on Twitter! 🐦