
kalavai-net/kalavai-client




⭐⭐⭐ Kalavai and our LLM pools are open source and free to use for both commercial and non-commercial purposes. If you find it useful, consider supporting us by giving our GitHub project a star, joining our Discord channel, following our Substack and leaving us a review on Product Hunt.

Kalavai: turn your devices into a scalable LLM platform

Taming the adoption of Large Language Models

Kalavai is an open source tool that turns everyday devices into your very own LLM platform. It aggregates resources from multiple machines, including desktops and laptops, and is compatible with most model engines to make LLM deployment and orchestration simple and reliable.


What can Kalavai do?

Kalavai's goal is to make using LLMs in real applications accessible and affordable to all. It's a magic box that integrates all the components required to make LLMs useful in the age of massive computing: sourcing computing power, managing distributed infrastructure and storage, serving models with industry-standard engines and orchestrating LLMs.

Core features

  • Manage the resources of multiple devices as one: a single pool of RAM, CPUs and GPUs
  • Deploy Large Language Models seamlessly across devices, wherever they are (multiple clouds, on premises, personal devices)
  • Auto-discovery: all models are automatically exposed through a single OpenAI-like API and a ChatGPT-like UI playground
  • Compatible with most popular model engines
  • Easy to expand to custom workloads
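To sketch what the auto-discovered, OpenAI-like API means for client code, here is a minimal example built with only the Python standard library. The base URL, model name and port are illustrative assumptions, not values taken from the Kalavai docs; substitute the address of your own pool.

```python
# Build a chat-completion request against a pool's OpenAI-compatible
# endpoint. The URL and model name below are placeholders.
import json
from urllib import request

BASE_URL = "http://localhost:8080/v1"  # assumed pool endpoint
payload = {
    "model": "my-deployed-model",      # an LLM deployed on the pool
    "messages": [{"role": "user", "content": "Hello from Kalavai!"}],
}
req = request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Sending is left commented out to keep the sketch side-effect free:
# with request.urlopen(req) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint follows the OpenAI wire format, any OpenAI-compatible client (for example the official openai Python package with a custom base URL) can be pointed at the same address.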

Video tutorials

Aggregate multiple devices in an LLM pool

every_tree_starts_as_a_seed.mp4

Deploy LLMs across the pool

your_wish_is_my_command.mp4

Single point of entry for all models (GUI + API)

one_api_to_rule_them_all.mp4

Self-hosted LLM pools

all_power_all_privacy.mp4

Latest updates

  • 20 February 2025: New shiny GUI interface to control LLM pools and deploy models
  • 6 February 2025: 🔥🔥🔥 Access DeepSeek R1 model for free when you join our public LLM pool
  • 31 January 2025: kalavai-client is now a PyPI package, easier to install than ever!
More news

Support for LLM engines

We currently support out of the box the following LLM engines:

Coming soon:

Not what you were looking for? Tell us what engines you'd like to see.

Kalavai is at an early stage of its development. We encourage people to use it and give us feedback! Although we are trying to minimise breaking changes, these may occur until we have a stable version (v1.0).

Want to know more?

Getting started

The kalavai-client is the main tool for interacting with the Kalavai platform: creating and managing both local and public pools, and operating on them (e.g. deploying models). Let's go over its installation.

Requirements


For workers sharing resources with the pool:

Support for Windows and macOS workers is experimental: Kalavai workers run in Docker containers that need access to the host's network interfaces, so systems that do not support containers natively (Windows and macOS) may have trouble discovering other nodes.

Any system that runs Python 3.10+ can run the kalavai-client and therefore connect to and operate an LLM pool without sharing resources with it. Your computer won't add its capacity to the pool, but it will be able to deploy jobs and interact with models.

Common issues

If you see the following error:

fatal error: Python.h: No such file or directory | #include <Python.h>

Make sure you also install the python3-dev package. On Ubuntu-based distros:

sudo apt install python3-dev

If you see:

AttributeError: install_layout. Did you mean: 'install_platlib'?

Upgrade your setuptools:

pip install -U setuptools

Install the client

The client is a python package and can be installed with one command:

pip install kalavai-client

Create a local, private LLM pool

Kalavai is free to use, with no caps, for both commercial and non-commercial purposes. All you need to get started is one or more computers that can see each other (i.e. within the same network). If you are interested in joining computers across different locations / networks, contact us or book a demo with the founders.

You can create and manage your pools with the new kalavai GUI, which can be started with:

kalavai gui start

This exposes the GUI and the backend services on localhost. By default, the GUI is accessible at http://localhost:3000. From the UI, users can create and join LLM pools, monitor devices, deploy LLMs and more.


Check out our getting started guide for next steps.

Public LLM pools: crowdsource community resources

This is the easiest and most powerful way to experience Kalavai. It affords users the full resource capabilities of the community and access to all its deployed LLMs, via an OpenAI-compatible endpoint as well as a UI-based playground.

Check out our guide on how to join and start deploying LLMs.

Enough already, let's run stuff!

Check our examples to put your new AI pool to good use! For an end-to-end tour, check out our self-hosted and public LLM pool guides.

Compatibility matrix

If your system is not currently supported, open an issue and request it. We are expanding this list constantly.

Hardware and OS compatibility

OS compatibility

Since worker nodes run inside Docker, any machine that can run Docker should be compatible with Kalavai. Here are instructions for Linux, Windows and macOS.

The kalavai client, which controls and accesses pools, can be installed on any machine that has Python 3.10+.

Hardware compatibility:

Roadmap

  • Kalavai client on Linux
  • [TEMPLATE] Distributed LLM deployment
  • Kalavai client on Windows (with WSL2)
  • Public LLM pools
  • Self-hosted LLM pools
  • Collaborative LLM deployment
  • Ray cluster support
  • Kalavai client on Mac
  • Kalavai pools UI
  • [TEMPLATE] GPUStack support
  • [TEMPLATE] exo support
  • Support for AMD GPUs
  • Docker install path

Anything missing here? Give us a shout in the discussion board

Contribute

Star History


Build from source


Python 3.10+ is required (the commands below set up Python 3.10 on Ubuntu):

sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install -y python3.10 python3.10-dev python3.10-venv python3-virtualenv
virtualenv -p python3.10 env
source env/bin/activate
pip install -U setuptools
pip install -e .[dev]

Build python wheels:

bash publish.sh build

Unit tests

To run the unit tests, use:

python -m unittest
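The command above relies on Python's built-in unittest discovery, which picks up `test_*.py` files containing `unittest.TestCase` subclasses. As an illustrative sketch (the file and test names here are made up, not taken from the repo), such a module looks like:

```python
# Hypothetical test module discovered by `python -m unittest`.
# Discovery runs every `test_*` method of each TestCase subclass.
import unittest

class TestExample(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(1 + 1, 2)
```

Running `python -m unittest` from the repository root will collect and run all such tests.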