[MonetDB-users] Copy records into table failed
Hello,

I am trying to import data from MySQL into MonetDB using the bulk loader. The CSV is ~11GB, with about 163 million rows. When running:

sql>copy 163715240 records into sys.clicks from '/data/csv_monet_dump/clicks.tbl' using delimiters '|','
more>', '"' null as '';

I always get this error message:

SQLException:importTable:failed to import table
ERROR: HEAPextend: failed to extend to 654860960 for 32/3231tail
ERROR: TABLETcreate_bats: Failed to create bat of size 163715240

merovingian.log:

2011-03-10 14:12:31 MSG performance[6281]: #GDKmmap(2383544320) fails, try to free up space [memory in use=41411528,virtual memory in use=12310675456]
2011-03-10 14:12:31 MSG performance[6281]: #GDKmmap(2383544320) result [mem=32087208,vm=12310675456]

I am using 64-bit openSUSE 11.3 with 12GB RAM and 4GB swap.

Does anybody know how I can overcome this error? Any help would be most welcome :)

Cheers, Tibor.
Hi,

Split the data into several pieces and load them sequentially.
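For example, something along these lines (an untested sketch: the chunk size and file names are placeholders, and it assumes that successive COPY statements simply append to the target table):

  # split the 11GB dump into parts of ~20 million lines each
  split -l 20000000 /data/csv_monet_dump/clicks.tbl /data/csv_monet_dump/clicks_part_

and then load the parts one at a time:

  sql>copy 20000000 records into sys.clicks from '/data/csv_monet_dump/clicks_part_aa' using delimiters '|','\n','"' null as '';
  sql>copy 20000000 records into sys.clicks from '/data/csv_monet_dump/clicks_part_ab' using delimiters '|','\n','"' null as '';

That way each load stays well within the available memory, instead of creating BATs for all 163 million records at once.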
Hi dariuszs,

This sounds like a good idea, but I am still interested in what the cause of the problem is. I ran the same script again not long ago, and this time it worked (for the 11GB CSV).

Then came the next problem:

sql>copy 163715212 records into sys.page_urls from '/data/csv_monet_dump/page_urls.tbl' using delimiters '|','
more>', '"' null as '';

MAPI  = (monetdb) /home/wired/MonetDB/var/MonetDB5/dbfarm/mapi_socket
ACTION= read_line
QUERY = copy 163715212 records into sys.page_urls from '/data/csv_monet_dump/page_urls.tbl' using delimiters '|',' ', '"' null as '';
ERROR = !Connection terminated

merovingian.log:

2011-03-10 16:28:51 MSG merovingian[3266]: database 'performance' (17635) was killed by signal SIGSEGV

My database just crashed. I restarted it, ran the command again, and it worked. Strange :) The second CSV was only 2GB.

Cheers, Tibor.
Hi,

When you run out of memory and swap space, that's what happens.
Hi,

I am still evaluating MonetDB, but if this is what happens when memory runs out, I am afraid it is not stable enough for production use. My database could grow to more than 100GB; even if I add more RAM, at some point I will still run out of memory. Please correct me if I am wrong. What would be the optimal setup for MonetDB in my case?

Cheers, Tibor.
On Fri, 11 Mar 2011, Tibor Barna wrote:

> I am still evaluating MonetDB, but if this is what happens when memory
> runs out, I am afraid it is not stable enough for production use.
> My database could grow to more than 100GB; even if I add more RAM, at
> some point I will still run out of memory.

Did you also implement swap space?

Stefan
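On Linux you can check what is currently available with, for example:

  free -m      # RAM and swap totals and usage, in MB
  swapon -s    # list the active swap devices and files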
Hey Stefan,

I have 4GB swap space, 12GB RAM, and a 100GB database. I could extend the swap to 1-2x the RAM, but still, even if it runs out of memory, the database should not crash. Is there a rule for how the partitions should look, i.e. the amounts of RAM and swap?

Tibor.
On Fri, 11 Mar 2011, Tibor Barna wrote:

> I have 4GB swap space, 12GB RAM, and a 100GB database. I could extend
> the swap to 1-2x the RAM, but still, even if it runs out of memory, the
> database should not crash.

I guess it should stop the process... but then it can never finish it...

> Is there a rule for how the partitions should look, i.e. the amounts of
> RAM and swap?

The rule of thumb is swap = 2x RAM.

Stefan
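For example, to add a swap file of 2x your 12GB RAM on a running system (a rough sketch; the path and size are placeholders, adjust to taste):

  dd if=/dev/zero of=/swapfile bs=1M count=24576    # 24GB of zeroes
  chmod 600 /swapfile                               # restrict access
  mkswap /swapfile                                  # format as swap
  swapon /swapfile                                  # enable immediately
  # to enable it at boot, add this line to /etc/fstab:
  # /swapfile  none  swap  sw  0  0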
participants (3)
- dariuszs
- Stefan de Konink
- Tibor Barna