Bug#1063673: ITP: llama.cpp -- Inference of Meta's LLaMA model (and others) in pure C/C++
[Christian Kastner]
> Repo is here [1].
Very good. Built just fine here. I checked in a few minor fixes.
I noticed llama.cpp depends on llama.cpp-backend without a concrete
dependency listed first. This leads to unpredictable behaviour, and I
suggest depending on, for example, 'llama.cpp-cpu | llama.cpp-backend'
to make sure 'apt install llama.cpp' behaves predictably.
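A minimal sketch of what such an alternative dependency could look
like in debian/control (the llama.cpp-cpu package name is an
assumption taken from the suggestion above, not the actual packaging):

```
Package: llama.cpp
Architecture: any
Depends: llama.cpp-cpu | llama.cpp-backend, ${misc:Depends}
```

With the real package listed first, apt resolves the alternative to
the CPU backend by default, while any package providing the virtual
llama.cpp-backend can still satisfy the dependency.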
I was sad to discover the server example is missing, as it is the
llama.cpp program I use the most. Without it, I will have to continue
using my own build.
> I thought it best to upload now and fix the remaining issues above while
> the package sits in NEW.
Very good.
I hope to get whisper.cpp to the same state, so it can have a fighting
chance to get into testing before the freeze.
--
Happy hacking
Petter Reinholdtsen