
Bug#755043: Initial import needs more than 14GB of RAM



Hi,

On Thu, 17 Jul 2014, Vincent Bernat wrote:
> I have tried to setup a local instance to contribute some patches but
> the initial import of the database needs more than 14GB of RAM. This

What command did you use for the initial import?
Was that with the fixture distro_tracker/core/fixtures/debian-repositories.xml
or with a custom set of repositories?
How did you evaluate that memory requirement?

I know it takes several GB, but none of the machines where I did the
initial import had that much memory, so your claim seems surprising
to me.

That said, for the initial import I often only run
./manage.py tracker_update_repositories (and not run_all_tasks).

> Trying to limit to unstable doesn't help either.

Huh. Reducing the number of repositories and packages should really
help... I wonder what you're hitting here.

> Maybe a fixture with only a thousand packages or a limit of 1000
> packages per source would help.

We should certainly try to optimize the memory consumption of the
repository update process.
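For illustration only (a toy analogy, not actual distro-tracker code): a common source of high peak memory in import jobs is materializing every record at once instead of streaming them, e.g.:

```python
# Building the full list keeps every item alive at once (peak memory
# proportional to n), while a generator yields one item at a time.
def all_packages_list(n):
    return ["pkg-%d" % i for i in range(n)]  # peak memory ~ n items

def all_packages_iter(n):
    for i in range(n):
        yield "pkg-%d" % i  # only one item alive at a time

# Consuming the generator never holds more than one name in memory.
total = sum(1 for _ in all_packages_iter(1000))
```

In Django specifically, QuerySet.iterator() serves the same purpose by streaming rows instead of caching the whole result set; whether that applies here depends on where the memory actually goes.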

I have no experience analyzing Python's memory usage, but I believe
there are good tools for this.

Among the packaged tools, I found python-meliae and python-memprof;
python-objgraph might also be useful.

And Python 3 has tracemalloc:
https://docs.python.org/3/library/tracemalloc.html
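The basic usage is quite simple (a minimal sketch; the list comprehension below is a placeholder workload standing in for the repository update):

```python
import tracemalloc

# Start recording allocations, run the memory-heavy code, then take a
# snapshot and list the source lines that allocated the most memory.
tracemalloc.start()

data = [str(i) * 10 for i in range(100000)]  # placeholder workload

snapshot = tracemalloc.take_snapshot()
top_stats = snapshot.statistics('lineno')
for stat in top_stats[:5]:
    print(stat)
```

Pointing that at the repository update task should tell us which lines are responsible for the bulk of the allocations.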

Cheers,
-- 
Raphaël Hertzog ◈ Debian Developer

Discover the Debian Administrator's Handbook:
→ http://debian-handbook.info/get/

