Re: Bug#1063673: ITP: llama.cpp -- Inference of Meta's LLaMA model (and others) in pure C/C++



On Thu, 2025-02-06 at 10:40 +0100, Christian Kastner wrote:
> 
> On 2025-02-06 02:42, M. Zhou wrote:
> > I second this. llama-server is also the service endpoint for DebGPT.
> 
> Do you both use the WebUI with this, or just the API endpoint?

I only use the API.
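
For context, llama-server exposes an OpenAI-compatible HTTP API (by default on port 8080), which is what clients like DebGPT talk to instead of the WebUI. A minimal sketch of building such a request, assuming a locally running server (the model path and port are illustrative):

```python
# Minimal sketch of querying llama-server's OpenAI-compatible endpoint.
# Assumes a server started locally, e.g.:
#   llama-server -m model.gguf --port 8080
import json
import urllib.request

def build_chat_request(prompt, base_url="http://localhost:8080"):
    """Build a POST request for llama-server's /v1/chat/completions endpoint."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a server running, send it and read the reply:
# with urllib.request.urlopen(build_chat_request("Hello")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```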
