Re: Bug#1063673: ITP: llama.cpp -- Inference of Meta's LLaMA model (and others) in pure C/C++
On 2025-02-06 01:33, Petter Reinholdtsen wrote:
> I was sad to discover the server example is missing, as it is the
> llama.cpp program I use the most. Without it, I will have to continue
> using my own build.
On 2025-02-06 02:42, M. Zhou wrote:
> I second this. llama-server is also the service endpoint for DebGPT.
Do you both use the WebUI with this, or just the API endpoint?
IIRC the only troublesome part was the WebUI, since building the web
assets can be complicated.
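For context, the API endpoint in question is llama-server's
OpenAI-compatible HTTP API, which clients like DebGPT can use without
the WebUI. A minimal sketch of talking to it (assuming a server
listening on localhost:8080, the default, with a model already loaded;
the helper names here are illustrative, not from llama.cpp itself):

```python
import json
import urllib.request

# Sketch only: assumes llama-server is running locally on its default
# port 8080; the model is whichever one was loaded at server startup.
SERVER = "http://localhost:8080"

def build_request(prompt):
    """Build the JSON body for the OpenAI-compatible chat endpoint."""
    return {"messages": [{"role": "user", "content": prompt}]}

def chat(prompt):
    """POST to /v1/chat/completions and return the assistant's reply."""
    req = urllib.request.Request(
        SERVER + "/v1/chat/completions",
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Example (needs a running server): chat("Hello")
```

So the server binary alone, without the bundled WebUI assets, would
already cover this kind of use.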
Best,
Christian