localLLM

Running Local LLMs with 'llama.cpp' Backend

v1.2.1 · Feb 25, 2026 · MIT + file LICENSE

Description

Provides R bindings to the 'llama.cpp' library for running large language models. The package uses a lightweight architecture where the C++ backend library is downloaded at runtime rather than bundled with the package. Package features include text generation, reproducible generation, and parallel inference.
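A minimal usage sketch of the workflow the description implies — downloading the backend at runtime, loading a model, and generating text reproducibly. The function names (`install_backend()`, `model_load()`, `generate()`) and arguments are assumptions for illustration; consult the package reference manual for the actual API.

```r
# Hypothetical sketch -- function names and arguments are assumed,
# not confirmed against the package's documentation.
library(localLLM)

# Fetch the pre-compiled 'llama.cpp' backend library at runtime
# (it is not bundled with the package).
install_backend()

# Load a local GGUF model file.
model <- model_load("path/to/model.gguf")

# Reproducible generation: a fixed seed should yield the same output
# across runs on the same backend build.
out <- generate(model, "Explain what a tokenizer does.",
                max_tokens = 64, seed = 42)
cat(out)
```

Because the backend is downloaded rather than compiled at install time, installation stays lightweight and the same package binary can target CPU- or GPU-enabled backends.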

Downloads

Last 30 days: 308 (rank 12,962nd)
Last 90 days: 942
Last year: 1.5K

Trend: -18.5% (30d vs prior 30d)

CRAN Check Status

14 OK
Flavor Status
r-devel-linux-x86_64-debian-clang OK
r-devel-linux-x86_64-debian-gcc OK
r-devel-linux-x86_64-fedora-clang OK
r-devel-linux-x86_64-fedora-gcc OK
r-devel-macos-arm64 OK
r-devel-windows-x86_64 OK
r-oldrel-macos-arm64 OK
r-oldrel-macos-x86_64 OK
r-oldrel-windows-x86_64 OK
r-patched-linux-x86_64 OK
r-release-linux-x86_64 OK
r-release-macos-arm64 OK
r-release-macos-x86_64 OK
r-release-windows-x86_64 OK

Check History

Mar 10, 2026: 14 OK · 0 NOTE · 0 WARNING · 0 ERROR · 0 FAILURE

Dependency Network

Dependencies: Rcpp, jsonlite, digest, curl, R.utils

Version History

1.2.1 (Mar 10, 2026)
1.2.1 ← 1.2.0 (Feb 25, 2026)
1.2.0 ← 1.1.0 (Feb 16, 2026)
1.1.0 ← 1.0.1 (Dec 16, 2025)
1.0.1 (Oct 14, 2025)