
Simulations for Science Token Communities (preprint)

Abstract

The curation and dissemination of new knowledge is one of the key pillars of science and plays an integral role in maintaining the scientific method. Despite its significance, knowledge curation is dominated by a handful of players who offer academic prestige to those using their services, leading to suboptimal incentives for stakeholders across the science ecosystem. Rather than optimizing for transparency, collaboration, and overall scientific discourse, researchers act to maximize their chances of receiving funding and of publishing in high-status journals. As a result, resources are allocated towards proposing attractive research projects, while collaboration and data sharing are discouraged because they would erode any competitive advantage. This work presents alternative approaches to scientific funding and publishing that maximize the incentives for transparency while alleviating some of the costs imposed on researchers by the institutions that employ them. Through simulation of the value flows in science, we conclude that many of the problems currently facing academia may be solved by shifting the system towards decentralization and open knowledge markets.

Introduction

Value Flows in Academia

The scientific value flow starts with grant funding. Depending on the grant funding agency, research teams receive funds either by directly winning the grant from the entity that supplied it or by indirectly being allocated a portion of funding available to a research institution, e.g. a university.

When grant funding is curated by an intermediary, it is difficult to comment on the effectiveness of its allocation, but it is clear that some value from the initial grant will get lost in operational costs.

The primary job of a researcher is turning financial value into knowledge value; in other words, using grant funding to conduct research. In a later section, we will discuss the value leakage that takes place at this stage. After a research project is finished, its results are communicated through academic papers, which must pass a process known as peer review before being published in a journal or at a conference. Peer review is conducted on a voluntary basis, so it should act as a positive injection into the science value flow, since it constitutes a free service. Recent work has shed light on problems with the current state of peer review, such as a lack of discourse and a shortage of people willing to conduct it. The final stage of the science value flow is publishing the work in journals, which, depending on the popularity of the journal, may require a monetary fee.
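The linear flow described above can be sketched as a simple simulation. This is a minimal illustration only: the function name, the stage structure, and every rate (intermediary overhead, research multiplier, peer-review injection, publication fee) are assumed placeholders rather than measured values.

```python
# Minimal sketch of the linear science value flow: grant funding passes
# through an intermediary, is converted into knowledge by researchers,
# gains value from voluntary peer review, and loses value to journal fees.
# All rates below are hypothetical placeholders, not empirical estimates.

def simulate_value_flow(grant: float,
                        intermediary_overhead: float = 0.2,  # assumed operational loss
                        research_multiplier: float = 1.5,    # knowledge per unit of funding (assumed)
                        peer_review_bonus: float = 0.05,     # free-service injection (assumed)
                        publication_fee: float = 0.1) -> dict:
    """Trace a grant through funding -> research -> peer review -> publication."""
    funds = grant * (1 - intermediary_overhead)   # value leak at the intermediary
    knowledge = funds * research_multiplier       # researchers convert funds to knowledge
    knowledge *= (1 + peer_review_bonus)          # voluntary peer review adds value for free
    knowledge -= grant * publication_fee          # journal fees siphon value back out
    return {"funds_reaching_research": funds,
            "net_knowledge_value": knowledge}

result = simulate_value_flow(100.0)
```

Even this toy model makes the linearity visible: value enters once at the grant stage and can only shrink or be converted downstream, with no feedback loop returning value to future funding rounds.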

Value Leaks and Siphons

As mentioned in Value Flows in Academia, part of the funding curated by a centralized entity employing researchers is inevitably lost in operational and other hidden costs. While such centralized entities clearly offer a service to the scientific community by attracting and training new talent, such hidden costs constitute a significant value leak, as money that was intended to advance scientific knowledge is instead used to pay for the hours spent deciding how to disburse it between different departments and research teams.

In the current science value flow, an academic's career is wholly dependent on their publications. This has led to an unprecedented increase in the number of research papers submitted to journals annually, accompanied by a marked rise in the use of words like "innovation", which raises the question of whether today's scientists are more productive than their predecessors or whether the system is promoting high-visibility, short-term research projects that yield an abundance of publications for those conducting them. At the same time, these conditions create a hyper-competitive space where collaboration and data sharing are discouraged, on the basis that any intellectual property made freely available cannot serve as a competitive advantage in securing future publications.

Furthermore, as researchers are entirely dependent on centralized institutions such as universities or private research companies to supply the space to conduct research projects and other resources, ranging from equipment to graduate students, these institutions can set significant constraints on researchers. This is a tradeoff between security and intellectual freedom: on one hand, researchers get job security and can engage in other activities, such as teaching, if they do not receive funding for research; on the other, they lose control over their intellectual property, which is usually signed away in the employment contract.

Lastly, as there are limited incentives for data sharing and collaboration, researchers are encouraged to publish only the minimal subset of their results that passes peer review, meaning that a small portion of the total knowledge value is shared with the scientific community. At the same time, as a significant portion of knowledge is locked behind the paywalls of scientific journals, new knowledge assets are less likely to reach those who might benefit from them the most.
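The compounding effect of these two filters can be made concrete with a back-of-the-envelope calculation. The fractions below are illustrative assumptions, not measurements; the point is only that the published fraction and the open-access fraction multiply.

```python
# Hypothetical illustration: only a subset of results is published at all,
# and only a subset of publications is freely readable, so the knowledge
# value reaching the wider community is the product of two small fractions.
# Both fractions are assumed values for illustration, not empirical data.

def accessible_knowledge(total_results: float,
                         published_fraction: float = 0.3,     # passes peer review (assumed)
                         open_access_fraction: float = 0.25): # not paywalled (assumed)
    """Knowledge value freely accessible to the community."""
    return total_results * published_fraction * open_access_fraction

share = accessible_knowledge(100.0)  # a small fraction of 100 units reaches readers freely
```

Under these assumed fractions, fewer than a tenth of the knowledge units produced ever become freely accessible, which is the bottleneck the open-knowledge-market designs below aim to widen.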

Alternative System Designs

The problems outlined above can be summarized as a leaky, linear value flow; the centralization of value; and dependence on centralized agencies for funding, operations, and knowledge dissemination. These problems are the focus of this work and of the movement known as decentralized science (DeSci). By identifying these issues, we were able to design alternative ecosystems that attempt to alleviate them within the emerging space of the decentralized web, colloquially referred to as Web3. While there is a shortage of real-world data on decentralized research communities, we can extrapolate from data available in other domains, such as finance, which has experienced a paradigm shift from financial products being exclusively offered by centralized entities such as banks to being freely available through trustless Web3 protocols. We will show how a similar paradigm shift can take place in the realm of science, and we validate our findings through the practice of token engineering.

Token Engineering

Token engineering is an expanding discipline within the Web3 space focused on the design and verification of tokenized communities, i.e. communities whose incentives are aligned through digital tokens. The tokens can be either fungible or non-fungible and may or may not have monetary value; it is the job of a token engineer to propose the token structure that best incentivizes the desirable behaviors of actors within the ecosystem. This work takes inspiration from the Web3 Sustainability Loop developed by Trent McConaghy for modelling robust open science token communities.
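The key structural difference from the linear flow is the feedback loop: a share of the value created flows back into a community treasury that funds the next round of work. The sketch below is a toy, sustainability-loop-style model, not McConaghy's actual formulation; the function name and all parameter values are illustrative assumptions.

```python
# Toy sketch of a sustainability-loop-style token economy: a community
# treasury funds research, research grows ecosystem value, and a share of
# that value is reinvested into the treasury. Contrast with the one-way
# grant pipeline: here value circulates instead of only draining.
# All parameter values are assumptions for illustration.

def run_loop(treasury: float, rounds: int,
             funding_rate: float = 0.1,     # treasury fraction disbursed per round (assumed)
             growth_per_fund: float = 1.3,  # value created per token spent (assumed)
             reinvest_rate: float = 0.5):   # share of created value returned (assumed)
    """Return (final treasury, cumulative ecosystem value) after `rounds` iterations."""
    ecosystem_value = 0.0
    for _ in range(rounds):
        spent = treasury * funding_rate
        treasury -= spent
        created = spent * growth_per_fund
        ecosystem_value += created
        treasury += created * reinvest_rate  # the loop: created value flows back
    return treasury, ecosystem_value

final_treasury, total_value = run_loop(100.0, rounds=10)
```

Whenever `growth_per_fund * reinvest_rate` exceeds 1, the treasury grows round over round instead of depleting, which is the self-sustaining regime such token designs aim for.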

Knowledge Engineering

As outlined above, token engineering encompasses a wide range of disciplines, but at its core it is about designing and testing systems that prompt certain behaviors. As a result, it is important to carefully consider the behavior of knowledge agents, i.e. scientists, teachers, and any other stakeholders within the scientific ecosystem.