Hi all,

Hannes Mühleisen (cc'd) generously shared his homepages.cwi.nl/~hannes/importer.py Python script with me last week. It automates the creation of tables from text files: Python's csv reader recognizes the dialect of the text file, grabs the header for the column names, and guesses the column types, and then a single COPY %i OFFSET %i RECORDS INTO %s FROM '%s' USING DELIMITERS … statement is issued to MonetDB (see line 404 of the script). I have put a rough sketch of this flow in the first P.S. below.

I had some issues with this over the last three days, but now I see the source of the error I get. There is nothing wrong with Hannes's Python code, but when the COPY INTO statement reads the text file, MonetDB still complains that the last row is an "incomplete record" (it is empty), even though Python was smart enough not to count that row in the number of records beforehand. Is there a way to call COPY INTO that is robust to the text file ending in an extra empty row? This is common with some generated CSV or TSV files.

By the way, I also have a follow-up question about how to merge some of the tables together. I am happy to work on this myself, of course; I don't expect anyone to script it for me. I am just confused about how much I can use SQL variables for this (and how), versus how much I need to use another language to script the repeated calls to MonetDB for each year of each series of tables (I have dozens upon dozens of them); the second P.S. shows the kind of loop I mean. I hope this is a reasonably well-posed question: http://dba.stackexchange.com/questions/65329/dynamic-sql-merge-partitioned-t...

Thanks for any help,
Laszlo
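
P.S. For anyone who has not looked at the script: the sketch below is my own much-simplified reconstruction of the flow I described above, not Hannes's actual code. It assumes the pymonetdb client and a plain delimited text file with a header row; the real importer.py does considerably more.

import csv
import pymonetdb

def guess_sql_type(value):
    # Naive type guess from one sample value: INT, then DOUBLE, else VARCHAR.
    for cast, sql_type in ((int, 'INT'), (float, 'DOUBLE')):
        try:
            cast(value)
            return sql_type
        except ValueError:
            pass
    return 'VARCHAR(255)'

def import_file(path, table, conn):
    with open(path, newline='') as f:
        sample = f.read(4096)
        dialect = csv.Sniffer().sniff(sample)    # recognize the dialect
        f.seek(0)
        reader = csv.reader(f, dialect)
        header = next(reader)                    # column names from the header
        first = next(reader)                     # first data row, used for type guessing
        types = [guess_sql_type(v) for v in first]
        nrecords = 1 + sum(1 for row in reader if row)   # do not count empty rows

    columns = ', '.join('"%s" %s' % (name, typ) for name, typ in zip(header, types))
    cur = conn.cursor()
    cur.execute('CREATE TABLE "%s" (%s)' % (table, columns))
    # A single bulk load, skipping the header line via OFFSET:
    cur.execute("COPY %d OFFSET 2 RECORDS INTO \"%s\" FROM '%s' USING DELIMITERS '%s'"
                % (nrecords, table, path, dialect.delimiter))
    conn.commit()

Here conn would come from pymonetdb.connect(...), and the file path has to be an absolute path that the MonetDB server itself can read.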
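
P.P.S. To make the follow-up more concrete, this is the kind of repeated call that I think I would otherwise have to script from Python rather than express with SQL variables. The series and table names here (trade, trade_2000, ...) are made up for illustration; each series has one table per year.

import pymonetdb

conn = pymonetdb.connect(database='demo', username='monetdb', password='monetdb')
cur = conn.cursor()

series = 'trade'              # one of the dozens of series
years = list(range(2000, 2014))   # one yearly table per series, e.g. trade_2000

# Create the merged table from the first year, then append the remaining years.
cur.execute('CREATE TABLE "%s_all" AS SELECT * FROM "%s_%d" WITH DATA'
            % (series, series, years[0]))
for year in years[1:]:
    cur.execute('INSERT INTO "%s_all" SELECT * FROM "%s_%d"' % (series, series, year))
conn.commit()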