Tools for local inference

January 22, 2025 · View on GitHub

Here you can find tools and demos for running SmolLM2 and SmolVLM locally, leveraging libraries such as llama.cpp, MLX, MLC, and Transformers.js.