
Hi all:

I'm testing MonetDB5 to see whether it's suitable for an OLAP project. I have to load a bit more than 11.5 million rows into one table, and I get the following results (I'm working on a 32-bit Windows 2000 Professional box with 1 GB RAM and MonetDB v5.0.0_beta1_2):

- When I try to load all the data in a single "copy into", mserver crashes at some point.
- When I try to load the data in 100K-row chunks with several "copy into" statements in the same SQL session, the server crashes at 7.8 million rows. When I then try to load the rest of the data, it does get loaded, but the first 7.8 million rows end up corrupt while the remainder is fine.
- When I load the data in 100K-row chunks over 5 separate runs (2.6 million rows each), everything comes out fine. After every 2.6 million rows I disconnect the SQL client and shut down the server, so that when I restart the server and reconnect, the logs become persistent (thanks to Niels Nes for explaining that the logs are made persistent after a server restart and when a new SQL session starts).

So initially I thought MonetDB couldn't load that amount of data on a 32-bit box, but apparently it can, because in the end the data was loaded successfully. Is there a limitation in "copy into"? Is there a way to do this without restarting the server and the SQL sessions? (The chunked approach I used is sketched in the P.S. below.)

I've read some threads about the 32-bit limitation, but I don't know for sure how to calculate the maximum number of rows I can have in a single table. In my case, each row is 147 bytes long, using the following data type sizes:

smallint --> 2 bytes
int --> 4 bytes
char(2) --> 2 bytes

Would the limit then be 2 GB / 147 bytes --> 14,608,732 rows? That's what I've assumed from what I've read, but please let me know if my calculation is wrong.

Thanks in advance,
Franco
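
P.S. For reference, the chunked load I described looks roughly like this. The table name, column layout and file path are just placeholders, and I'm not sure the OFFSET form of COPY INTO is available in 5.0.0_beta1_2; splitting the input file into one file per chunk works the same way.

    -- first 100K-row chunk (placeholder table/file names)
    COPY 100000 RECORDS INTO fact_table
      FROM 'C:\\data\\fact.csv'
      USING DELIMITERS ',', '\n';

    -- next chunk: skip the 100K rows already loaded
    COPY 100000 OFFSET 100001 RECORDS INTO fact_table
      FROM 'C:\\data\\fact.csv'
      USING DELIMITERS ',', '\n';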