As I’ve written before, I am at best an enthusiastic amateur when it comes to AI, LLMs and R. But I’m braver/dumber than most, so for a talk I’m giving to NE-RUG – the Nebraska R Users Group – I’ve put together some resources to share.
R libraries
ellmer
: From the folks who brought you the tidyverse comes ellmer, a library that supports talking to a large number of LLMs. To talk to the big commercial LLMs, you’ll need API keys, and that usually means having an API budget. But what I like about ellmer is that it talks to locally hosted models as well. More about that later. There’s a quick sketch of what using it looks like after this list.
chores
: Built on top of ellmer, chores is a neat way to make tools inside of RStudio that leverage LLMs to accomplish tasks, such as helping with certain kinds of code chores or explaining what an error message means. A setup sketch follows below as well.
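To make that concrete, here’s a minimal sketch of an ellmer session. The model name is just an example, and `chat_openai()` expects an `OPENAI_API_KEY` environment variable (set it in your `.Renviron`), so check the specifics against the ellmer docs.

```r
# A minimal ellmer session. Assumes OPENAI_API_KEY is set in your
# .Renviron; the model name here is just an example.
library(ellmer)

# Create a chat object pointed at a commercial provider.
chat <- chat_openai(model = "gpt-4o-mini")

# Ask a question. The response streams to the console.
chat$chat("Explain what dplyr::across() does, in two sentences.")
```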
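And here’s roughly what wiring chores up looks like, based on my reading of the package’s setup instructions. The `.chores_chat` option name and the available helpers are worth double-checking against the chores README, since the package is young and details may change.

```r
# Sketch of a chores setup, based on my reading of the package docs.
# Put this in your .Rprofile so the helpers are available in every
# session.
library(ellmer)
library(chores)

# chores needs an ellmer chat object to talk to; pointing it at a
# local model via Ollama means you aren't burning API budget.
options(.chores_chat = ellmer::chat_ollama(model = "llama3.2"))

# From here the helpers run as an RStudio addin: highlight some code,
# launch the "Chores" addin (worth binding to a keyboard shortcut),
# and pick a helper, e.g. one that drafts roxygen documentation for
# the selected function.
```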
External resources
Ollama
: A cross-platform system for downloading, managing and running open-source LLMs on your local machine. With Ollama, you can run Meta’s Llama 3 or DeepSeek’s R1 locally, using them to accomplish tasks without incurring costs. A rough rule of thumb is that you can run models with slightly fewer billions of parameters than you have gigabytes of RAM, since Ollama serves quantized versions of the models. For example, my computer has 16GB of RAM, so I can run 14 billion parameter models (albeit somewhat slowly). A sketch of driving a local model from R follows below.
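Putting the pieces together, here’s a sketch of talking to a local model from R. It assumes Ollama is installed and its server is running, and that you’ve pulled a model from a terminal first; the model tag is one that should fit in 16GB of RAM, but check ollama.com for current tags.

```r
# Talking to a locally hosted model. Assumes the Ollama server is
# running and you've already pulled a model from a terminal, e.g.:
#
#   ollama pull deepseek-r1:14b
#
library(ellmer)

# chat_ollama() talks to the local Ollama server (http://localhost:11434
# by default), so there's no API key and no per-token cost.
chat <- chat_ollama(model = "deepseek-r1:14b")

chat$chat("Write an R function that counts the NA values in each column of a data frame.")
```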