
Thanks Stefan,
With loading ~1.5 GB on a 32-bit machine, there is a reasonable chance that the loading process (which needs to be able to address all data concurrently) comes close to or even hits the 32-bit address space limit (i.e., max. 4 GB, possibly only 2 GB).
Does this limitation refer to the size of the data that I am adding per COPY INTO (which was 0.2 GB) or to the final size of the table (which was at 1.9 GB)? Assuming the latter, this would give a hard limit on table size under Win32. If that is the case, a workaround could be partitioning the data into several tables. What is more efficient in MonetDB: chunks of rows at full column width, or chunks of columns at full row length?
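To make the partitioning idea concrete, here is a rough sketch of the row-chunk variant (table and file names are made up, just for illustration):

    -- load each ~0.2 GB chunk into its own table (hypothetical names and paths)
    CREATE TABLE sensor_part1 (id INT, value DOUBLE);
    CREATE TABLE sensor_part2 (id INT, value DOUBLE);
    COPY INTO sensor_part1 FROM '/data/chunk1.csv' USING DELIMITERS ',', '\n';
    COPY INTO sensor_part2 FROM '/data/chunk2.csv' USING DELIMITERS ',', '\n';
    -- query across the row chunks through a view
    CREATE VIEW sensor_all AS
      SELECT * FROM sensor_part1
      UNION ALL
      SELECT * FROM sensor_part2;

The column-chunk variant would instead put groups of columns into separate tables and join them on a key column.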
Would it be possible for you to provide us with your data? That would help us a lot to analyze, locate and hopefully fix the crash...
I will give you access by separate mail; you can also create the test data file using the R script snippet in my previous mail. Have a nice weekend, Jens