Using this demo you can run Bonsai language models locally on Mac (Metal) or Linux/Windows (CUDA). Two runtimes are supported:

- llama.cpp (GGUF format) — C/C++; runs on Mac (Metal), Linux/Windows (CUDA), and CPU.
- MLX (MLX format) — Python ...