Technological solutions to the problems of scientific publishing

How tokenisation could replace science's unofficial and outdated currency. By Anita Schjøll Brede

Scientists wanted! Job description: observe a slice of the universe, think of a hypothesis about how it works, put your hypothesis to the test and tell the rest of us how it went.
 
Sounds good. What about the pay? In an ideal world, a scientist would trade their ideas for a living. Share a good, tested idea, collect your pay and start daydreaming about the next hypothesis and experiment. In reality, scientists operate on an unofficial currency that stands between their ideas and their pay. It's publication.
 
But the publication process is full of problems and it fails to compensate the people providing the most value. "The fact that peer review doesn't count for almost anything is a problem," says marine biologist Miguel Pais, of the Marine and Environmental Sciences Centre in Lisbon, Portugal.
 
Scientists are smart enough to know the difference between a good incentive and a bad one. So: should journals pay you for reviewing? Should you pay publishers to edit and publish your work under an open access license? How do we track the contributions, large and small, to the entire process?
 
One answer is tokenisation, an emerging technology that makes it more practical than ever to track tiny transactions, such as reviewing a paper or paying for a software licence. Tokens work the same way Bitcoin and other cryptocurrencies do: by recording a small note on a shared, decentralised, encrypted record called a blockchain.
 
Anyone who wants to verify a transaction - who reviewed this paper, who really paid for this piece of art - can do so by checking any of the many copies of the record stored on independent machines around the world. That makes the system robust and trustworthy without the single trusted entity, such as a bank or a publisher, that used to be required.
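To make that concrete, here is a minimal sketch in Python of the general principle behind such a ledger. The names and records are hypothetical illustrations, not the design of any particular platform: each entry is chained to the previous one by a hash, so anyone holding a copy can check that nothing earlier has been quietly altered.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    """One entry in the ledger, e.g. 'reviewer_1 reviewed paper_42'."""
    index: int
    record: dict     # the contribution or transaction being recorded
    prev_hash: str   # hash of the previous block, which chains the ledger together

    def digest(self) -> str:
        payload = json.dumps(
            {"index": self.index, "record": self.record, "prev_hash": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

def append(chain: list[Block], record: dict) -> None:
    """Add a new record, linking it to the hash of the latest block."""
    prev_hash = chain[-1].digest() if chain else "0" * 64
    chain.append(Block(index=len(chain), record=record, prev_hash=prev_hash))

def is_valid(chain: list[Block]) -> bool:
    """Anyone with a copy can check that no earlier record was tampered with."""
    return all(chain[i].prev_hash == chain[i - 1].digest() for i in range(1, len(chain)))

ledger: list[Block] = []
append(ledger, {"action": "peer_review", "who": "reviewer_1", "what": "paper_42"})
append(ledger, {"action": "replication", "who": "researcher_2", "what": "paper_42"})
print(is_valid(ledger))  # True; editing any past record breaks the chain
```

Because each block's hash depends on everything before it, changing one old record invalidates every honest copy's check, which is what lets independent machines agree on the history without a central authority.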
 
Instead of funnelling everything through a choke point at the scientific publisher, who can charge just about whatever they want, tokenisation offers a way to decentralise the power in scientific publishing back to where it belongs: with the scientists.
 
Tokens could recognise many forms of contribution to the literature, such as publishing failed results or doing the thankless task of replicating someone else's results. Things that aren't worth publishing in the current system but do have scientific value, such as mini-reviews or single-experiment studies, could earn recognition in the form of tokens. One startup even argues that scientists should crowdfund their research using tokens.
 
Scientists who have contributed to the publishing process are the natural ‘earners’ of such tokens. But why would they want tokens instead of cash? Scientists, and their host institutions, need swift, targeted tools to navigate the scientific literature, and research institutions already pay top dollar for the software to do it. Instead, they could pay for that access with tokens earned by contributing to the literature. That’s the idea behind a growing number of new AI-enabled literature search tools.
 
The scientific community can also use tokens to manage and reward the publication system better than the traditional scientific publishers do. Tokens can serve as proxies for votes in these new systems. In the Project Aiur model, not even the founding company, Iris.AI, can hold more than 2% of the total tokens in the community. Other token-enabled models, such as Decentralised Research Platform, have their own twists on using a mix of reputation and financial incentives to allow scientists to manage their own peer review and community.
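A holding cap like that is just a governance rule enforced in code. As a rough, hypothetical illustration (not Project Aiur's actual implementation), a token transfer could simply be rejected whenever it would push any single holder above the cap:

```python
# Hypothetical sketch: reject transfers that would push any holder above a cap.
MAX_SHARE = 0.02  # no account may hold more than 2% of the total supply

def transfer(balances: dict[str, int], sender: str, receiver: str, amount: int) -> None:
    total_supply = sum(balances.values())
    if balances.get(sender, 0) < amount:
        raise ValueError("insufficient balance")
    if balances.get(receiver, 0) + amount > MAX_SHARE * total_supply:
        raise ValueError("transfer would exceed the per-holder cap")
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount
```

Because every participant can check the same rule against the shared ledger, no single party, founder included, can quietly accumulate a controlling stake.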
 
Maybe one of these new types of currency, built by the community for the community, will improve the incentives in scientific publishing and research and give scientists more time to do their real jobs: research.

Anita Schjøll Brede is iris.ai’s co-founder and CEO.
