Run LLM Models Locally with Llamafile

  1. Install Llamafile
  2. Download the model
  3. Run the model with Llamafile
  4. Use with Sublayer
  5. Basic Demo

Install Llamafile

  1. git clone [email protected]:Mozilla-Ocho/llamafile.git
  2. brew install make (provides GNU make as gmake on macOS)
  3. cd llamafile
    gmake -j8
    sudo gmake install PREFIX=/usr/local

Download the model

Llamafile runs models in GGUF format. Download a .gguf model file (for example, from Hugging Face) and note its path for the next step.

Run the model

  • llamafile -ngl 9999 -m path/to/model.gguf --host 0.0.0.0 -c 2048
  • Recommended settings for Apple M1 users:
    llamafile -ngl 9999 -m path/to/model.gguf --host 0.0.0.0 -c 2048 --gpu APPLE -t 12
  • Visit http://localhost:8080 in your browser to confirm the server is running
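Once the server is up, you can also query llamafile's OpenAI-compatible HTTP API from Ruby instead of the browser. A minimal sketch, assuming the default port 8080 (the model field is illustrative; llamafile serves whichever model it was started with):

```ruby
require "json"
require "net/http"

# llamafile exposes an OpenAI-compatible chat endpoint on port 8080 by default.
uri = URI("http://localhost:8080/v1/chat/completions")

payload = {
  model: "LLaMA_CPP", # llamafile serves the model it was started with
  messages: [{ role: "user", content: "Say hello in five words or fewer." }]
}

request = Net::HTTP::Post.new(uri, "Content-Type" => "application/json")
request.body = JSON.generate(payload)

begin
  response = Net::HTTP.start(uri.host, uri.port, read_timeout: 120) do |http|
    http.request(request)
  end
  # Print the assistant's reply from the OpenAI-style response shape.
  puts JSON.parse(response.body).dig("choices", 0, "message", "content")
rescue Errno::ECONNREFUSED
  warn "No llamafile server found on #{uri.host}:#{uri.port} -- start it first."
end
```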

Use with Sublayer (skip to Basic Demo if you don't have a project)

  1. Add to Gemfile:

    gem 'sublayer', '~>0.0.7'
  2. Run:

    bundle install
  3. Add to your configuration file:

    Sublayer.configuration.ai_provider = Sublayer::Providers::Local
    Sublayer.configuration.ai_model = "LLaMA_CPP"
  4. Build a Sublayer generator (the Basic Demo below walks through an example):

  5. Use it in your code:

Basic Demo

Let's make a Ruby project that finds a past historical event that occurred on today's date.

  • # bash
    mkdir historical_event_finder
    cd historical_event_finder
    touch Gemfile
    touch historical_event_finder.rb
  • # Gemfile
    source 'https://rubygems.org'
    gem 'sublayer', '~>0.0.7'
  • # bash
    bundle install
  • Build a sublayer generator with the following description:
    • "generator that takes a date value and finds a fun historical event from the past that occurred on the same month/day"
  • Paste the result from above into historical_event_generator.rb (rename if needed)
  • Write the following code in historical_event_finder.rb:
    # historical_event_finder.rb
    require 'sublayer'
    require_relative 'historical_event_generator'

    Sublayer.configuration.ai_provider = Sublayer::Providers::Local
    Sublayer.configuration.ai_model = "LLaMA_CPP"

    # Call the generator you built above (adjust the class name and
    # arguments to match your generated code):
    puts HistoricalEventGenerator.new(date: Time.now.strftime("%B %d")).generate
  • Run your code:
    ruby historical_event_finder.rb
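For reference, the generated historical_event_generator.rb will look roughly like this sketch. The class name, prompt, and adapter attributes here are illustrative, not the builder's actual output, and Sublayer's generator DSL may differ between versions; it also needs the sublayer gem and a running llamafile server to execute:

```ruby
# historical_event_generator.rb -- illustrative sketch only
require 'sublayer'

class HistoricalEventGenerator < Sublayer::Generators::Base
  # Ask the model for a single string back.
  llm_output_adapter type: :single_string,
    name: "historical_event",
    description: "A fun historical event that occurred on the given month/day"

  def initialize(date:)
    @date = date
  end

  def generate
    super
  end

  def prompt
    "Tell me about a fun historical event from the past that occurred on #{@date}."
  end
end
```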
Build a TDD Bot