Explanation

For about a year now, I’ve been invested in thinking about AI tools, especially how libraries and librarians might leverage AI for work and instruction. My aim has been to help librarians “level up” so that we are informed leaders, able to meaningfully participate in and convene conversations about AI use and policy creation among our campus communities.

I’ve developed a values-oriented approach to assessing AI tools as a flexible framework for thinking and evaluation. Five values guide this engagement with AI: transparency, curiosity, rigor, inclusion, and play.

So that I don’t miss the brief for this evaluation, immediately below I respond to the questions Matt has posed for assessing these tools. For any who are interested, I then employ the five-values rubric to evaluate Scite and Elicit, offering a summary of conclusions and recommendations prior to my assessment of each tool. And for anyone who’s truly keen, at the end I offer a description of each of the five values as they relate to engagement with AI platforms, followed by an explanation of what is meant by analytical, generative, and ensemble AI tools. My own ideation with ChatGPT while compiling this document can be viewed at this link.


Q’s from Matt:

How effectively does the tool streamline the process of finding relevant scholarly articles?

Scite: Scite is killer, but better results are a function of better prompts; we’ll need to offer instruction on effective prompt engineering.

Elicit: As above, Elicit is also outstanding, but skillfully engineering a query is the path to locating relevant results. Elicit also offers more intuitive options for limiting results than what I can discern in Scite.

Does the tool improve the process of understanding and connecting with scholarly articles?

Scite: Scite’s Smart Citations are incredibly helpful here, and the Chrome extension allows Scite to “follow” you as you explore and discover on other platforms as well.

Elicit: This tool excels at improving understanding and connection with scholarly articles. There’s a learning curve, but the summaries offered and the analyses it generates are exceptional.

How valuable is the tool in aiding your research activities?

Scite: I’ve already been using Scite for research, and it’s a tremendous help, mainly because I can ideate with the Assistant throughout the discovery process. Now, it should be noted that the Assistant is a bit clumsy at times, not as “conversant” as ChatGPT or Claude, but incredibly helpful as a guide nonetheless.

Elicit: I’ve not used Elicit as much as Scite, mainly because I typically research in the Humanities and Elicit is unquestionably more geared toward facilitating scientific discovery. However, even when my queries center on literature and philosophy, Elicit locates and summarizes relevant results. It’s a cool tool, for sure.

How valuable is the tool in aiding your teaching activities?

Scite: As a First-Year and Transfer Experience Librarian, I think fostering familiarity with, and the ethical use of, other tools (like Perplexity, Miro, ChatGPT, and Claude) will be more beneficial for the students with whom I work.

Elicit: See my response regarding Scite; at the same time, I do see some possible use cases for instruction with Elicit (like the List of Concepts feature).

Do you believe these tools would be valuable and useful for students?

Scite: These tools are valuable, even invaluable, depending on the level of research students are conducting; I can also see them being beneficial to faculty working with grad assistants to streamline research and discover relevant literature.