<aside> 🧵 Skepsi is an annotation platform designed for collaborative peer review. Academic gatekeeping is essential for effective science, but in its current form it is slow, unstructured, and kept behind closed doors.
We were given a grant by SoftBank to address these issues, and Skepsi is the result. It provides an annotation and paper-reading layer for collaborative commenting, and it aggregates reviews into paper-level metadata to enable structured academic criticism.
</aside>
[Figure: annotation in Skepsi’s paper viewer]
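The aside above mentions aggregating reviews into paper-level metadata. As a rough illustration of that idea, here is a minimal TypeScript sketch of how per-annotation reviews might roll up into a paper-level record; the types, field names, and the three-verdict tally are hypothetical, not drawn from Skepsi’s actual schema.

```typescript
// Hypothetical sketch of annotation-to-metadata aggregation.
// These shapes are illustrative, not Skepsi's real data model.

interface Annotation {
  paperId: string;
  reviewer: string;
  start: number; // character offset where the highlight begins
  end: number;   // character offset where the highlight ends
  body: string;  // the reviewer's comment
  verdict: "accept" | "revise" | "reject";
}

interface PaperMetadata {
  paperId: string;
  annotationCount: number;
  reviewers: string[];
  verdictTally: Record<"accept" | "revise" | "reject", number>;
}

// Roll a paper's annotations up into a single metadata record.
function aggregate(paperId: string, annotations: Annotation[]): PaperMetadata {
  const relevant = annotations.filter((a) => a.paperId === paperId);
  const tally: Record<"accept" | "revise" | "reject", number> = {
    accept: 0,
    revise: 0,
    reject: 0,
  };
  for (const a of relevant) tally[a.verdict] += 1;
  return {
    paperId,
    annotationCount: relevant.length,
    reviewers: [...new Set(relevant.map((a) => a.reviewer))],
    verdictTally: tally,
  };
}
```

In practice the aggregation could carry richer signals (per-section tallies, confidence scores), but the roll-up shape stays the same.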
<aside> 💻 Website: https://skepsi-app.herokuapp.com/12
</aside>
<aside> 🎨 **Figma: https://www.figma.com/file/1ixuQZ1cMI7TcHlo7SQMhU/Skepsi-2.0?node-id=0%3A1**
</aside>
<aside> ⚙️ Github: https://github.com/team-skepsi/frontend
</aside>
Skepsi is designed to address the flaws in academic peer review, and so a discussion of the existing standards is the best place to start.
Peer review is, without doubt, one of the most important aspects of the scientific method (Kelly, Sadeghieh and Adeli, 2014). It is the sole criterion through which journals filter academic writing, making it the public arbiter of what counts as valid science. It is therefore crucial that our approach to academic review is as effective as possible, because flawed research can disrupt every part of academia.
As it stands, the process is fairly simple: academics send manuscripts to journals, which then ask several other researchers in the same field to review the draft. If those reviewers agree that the paper is valid, relevant and valuable to the scientific community, the journal publishes it.
In general, journals do not impose many standards on academic reviewers (I-DAS, 2011), who are simply asked to place the manuscript into one of three categories: accepted, in need of revision, or rejected (Kelly, Sadeghieh and Adeli, 2014).