
On 12-11-2010 21:47:43 +0300, Andrey Timofeyev wrote:
Hello Stefan,
Memory consumption grows with every JDBC query.
Here is an example:
public class SimpleTest {
    public static void main(String[] args) throws ClassNotFoundException, SQLException {
        Class.forName("nl.cwi.monetdb.jdbc.MonetDriver");
        Connection con = DriverManager.getConnection("jdbc:monetdb://127.0.0.1:50000/test", "monetdb", "monetdb");
        PreparedStatement st = con.prepareStatement("INSERT into test_data (id, name) values(?, ?)");
        for (int i = 0; i < 1000000; i++) {
            st.setInt(1, i);
            st.setString(2, Integer.toString(i));
            st.addBatch();
            if (i % 10000 == 0) {
                System.out.println("10000 inserted");
            }
        }
        st.executeBatch();
You execute everything in one very large batch; why not execute (and clear) the batch at smaller intervals? You print "10000 inserted", but at that point you have only queued up that many inserts, you have not sent anything to the database yet.
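Flushing at intervals, as suggested, might look like the sketch below. It is based on the original example; the `FLUSH_INTERVAL` constant, the `clearBatch()` call, the final flush, and the `fullBatches` helper are additions for illustration, and the connection details are assumed unchanged.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchedInsert {
    static final int FLUSH_INTERVAL = 10000;

    // How many full batches a given row count produces at this interval
    // (pure helper, only used to illustrate the flushing cadence).
    static int fullBatches(int rows, int interval) {
        return rows / interval;
    }

    public static void main(String[] args) throws Exception {
        Class.forName("nl.cwi.monetdb.jdbc.MonetDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:monetdb://127.0.0.1:50000/test", "monetdb", "monetdb");
        PreparedStatement st = con.prepareStatement(
                "INSERT INTO test_data (id, name) VALUES (?, ?)");
        for (int i = 0; i < 1000000; i++) {
            st.setInt(1, i);
            st.setString(2, Integer.toString(i));
            st.addBatch();
            if ((i + 1) % FLUSH_INTERVAL == 0) {
                st.executeBatch();   // actually send the queued inserts now
                st.clearBatch();     // drop them from client-side memory
                System.out.println((i + 1) + " inserted");
            }
        }
        st.executeBatch();           // flush any remainder
        st.close();
        con.close();
    }
}
```

This way the client never holds more than FLUSH_INTERVAL queued inserts at a time, so memory use stays flat instead of growing with the row count.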
        for (int i = 0; i < 1000000; i++) {
            PreparedStatement select = con.prepareStatement("SELECT * from test_data where id = ?");
Do you really want to create a million prepared statements here? The server has to store them all...
            select.setInt(1, i);
            ResultSet rs = select.executeQuery();
            if (i % 10000 == 0) {
                System.out.println("10000 selected");
            }
            rs.close();
            select.close();
        }
    }
}
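Preparing the statement once, outside the loop, avoids creating a new server-side handle per iteration. A sketch under the same assumed connection details (the result-set consumption is elided, as in the original):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ReusedSelect {
    // The single statement text reused for every lookup.
    static final String QUERY = "SELECT * FROM test_data WHERE id = ?";

    public static void main(String[] args) throws Exception {
        Class.forName("nl.cwi.monetdb.jdbc.MonetDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:monetdb://127.0.0.1:50000/test", "monetdb", "monetdb");
        // Prepare ONCE: the server keeps a single handle instead of a million.
        PreparedStatement select = con.prepareStatement(QUERY);
        for (int i = 0; i < 1000000; i++) {
            select.setInt(1, i);     // only the parameter changes per iteration
            ResultSet rs = select.executeQuery();
            // ... consume rs ...
            rs.close();              // still close each result set promptly
        }
        select.close();              // one handle to release at the end
        con.close();
    }
}
```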
Does it reproduce on your database?
I guess so, since you're leaking prepared-statement handles like mad.