Hello,

It seems MonetDB has a problem handling huge tables (larger than the amount of memory available). I tried to load a large amount of data into a table using the COPY command. I loaded the same data (6 million records, 700+ MB) a few times. Loading succeeded twice; on the third attempt the server crashed.

After the crash I could not start the server again. It printed

  !WARNING: GDKlockHome: ignoring empty or invalid .gdk_lock.
  !FATAL: BBPinit: cannot properly process bat/BACKUP/.

and exited. Removing the database data directory did not help much; the server printed

  !ERROR: mvc_init: unable to create system tables
  !WARNING: BBPincref: range error 410

A couple of minutes later it printed

  !ERROR: BATSIGcrash: Mserver internal error (Segmentation fault), please restart.
  !ERROR: (One potential cause could be that your disk might be full...)

Insufficient disk space could indeed be a cause; before the server crashed I had approximately 2 GB free. But I have since freed up 10 GB and the problem persists.

How can I troubleshoot and fix the server? What are the memory and disk space requirements when bulk loading data?

I tested on a 32-bit system with 512 MB RAM + 1 GB swap, Ubuntu Linux, MonetDB-4.9.3, nightly build dated September 30.

--
Best regards,
Andrei Martsinchyk
mailto:andrei.martsinchyk@gmail.com