
Re: Bug#1094806: ITP: ollama -- large language model tools



On Fri, 2025-01-31 at 12:31 +0100, Simon Josefsson wrote:
> Package: wnpp
> Severity: wishlist
> Owner: Simon Josefsson <simon@josefsson.org>
> X-Debbugs-CC: debian-devel@lists.debian.org, debian-go@lists.debian.org
> 
> * Package name    : ollama
>   Version         : 0.5.7-1
>   Upstream Author : Ollama
> * URL             : https://github.com/ollama/ollama
> * License         : Expat
>   Programming Lang: Go
>   Description     : large language model tools
> 
>  Ollama: Get up and running with large language models.
> 
> https://salsa.debian.org/go-team/packages/ollama
> https://salsa.debian.org/jas/ollama/-/pipelines


@ckk is planning to package llama.cpp within the Debian Deep Learning
Team (debian-ai@l.d.o). Maybe you want to discuss with the team how to
deal with the embedded copy of llama.cpp inside the ollama source tree?
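
For reference, if the plan is to strip the copy rather than build
against it, the usual route is a repacked +ds orig tarball driven by
Files-Excluded in debian/copyright. A minimal sketch, assuming the
vendored copy sits under llama/llama.cpp (the exact path would need
checking against the 0.5.x tree):

  Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
  Upstream-Name: ollama
  Source: https://github.com/ollama/ollama
  Comment: the Files-Excluded path is an assumption about where the
   vendored llama.cpp copy lives; uscan/mk-origtargz drop it when
   repacking the tarball (with opts=repacksuffix=+ds in debian/watch).
  Files-Excluded: llama/llama.cpp

Building against the team's llama.cpp package instead would of course
also need changes in the build system.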

I did not look into how ollama enables ROCm and CUDA support, but
that's also something the team cares about. Do you want to enable
either of them?

