
Bug#1063673: ITP: llama.cpp -- Inference of Meta's LLaMA model (and others) in pure C/C++



[Cordell Bloor]
> Could we just sidestep this whole question of native instructions by 
> building llama.cpp with the BLAS backend?

I like the idea.  Perhaps something for whisper.cpp too.
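
For reference, a build-configuration sketch of what that could look like. The exact CMake option names are an assumption and vary between llama.cpp versions (recent trees use GGML_BLAS; older ones used LLAMA_BLAS):

```shell
# Configure llama.cpp against a generic BLAS library instead of
# CPU-native instruction paths, so binaries stay portable.
# NOTE: flag names assumed from recent upstream; older trees used
# -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS instead.
cmake -B build -DGGML_BLAS=ON -DGGML_BLAS_VENDOR=OpenBLAS
cmake --build build --config Release
```

The same approach would presumably carry over to whisper.cpp, which shares the ggml build machinery.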

-- 
Happy hacking
Petter Reinholdtsen
