I have been struggling with this problem for a couple of weeks. We
are using Monet 4.6.2 (I know, I know!) to power a web service
written in Java. When Monet is running on OS X, I can handle 200+
requests per second, but when I install it on a (much faster) Linux
machine, the app only gives me 20 requests per second.
The difference shows up on the TCP/IP connection. Here is the simplest
test:
var a:=bat(int,int).insert(1,1);
a.print();
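
To time it from Java I go over a bare socket. Here is a minimal sketch
of the harness; the host and port are assumptions (adjust to wherever
your mserver listens), and the MAPI login handshake is left out for
brevity:

import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

// Minimal timing sketch. Host/port are assumptions, and the MAPI
// login handshake is omitted for brevity.
public class PrintTiming {
    public static void main(String[] args) throws Exception {
        Socket s = new Socket("localhost", 50000);
        OutputStream out = s.getOutputStream();
        InputStream in = s.getInputStream();

        long start = System.currentTimeMillis();
        out.write("var a:=bat(int,int).insert(1,1);\na.print();\n"
                .getBytes("US-ASCII"));
        out.flush();
        byte[] buf = new byte[8192];
        int n = in.read(buf); // first chunk of the reply
        System.out.println("first read: " + n + " bytes after "
                + (System.currentTimeMillis() - start) + " ms");
        s.close();
    }
}
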
This takes 4 ms on OS X, but 40 ms on Linux. I've traced the system
calls, and when the server runs on OS X my client reads back the
results in one or two calls to recv(), whereas when the server runs on
Linux the client gets only a couple of bytes back and then blocks until
the rest arrives.
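
As far as I can tell the client reads the reply with a plain
read()/available() loop. This is my reconstruction, not the actual
client code, but at the JDK level InputStream.read() shows up in strace
as recv() and InputStream.available() as ioctl(FIONREAD):

import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;

// Reconstruction of the client-side read loop (not the actual client
// code). read() maps to recv(), available() to ioctl(FIONREAD).
public class ReplyReader {
    static int readReply(Socket socket, byte[] buf) throws IOException {
        InputStream in = socket.getInputStream();
        int total = in.read(buf);  // first recv(): 50 bytes on OS X, 2 on Linux
        if (total < 0) return -1;  // connection closed
        while (total < buf.length && in.available() > 0) { // ioctl(FIONREAD)
            int n = in.read(buf, total, buf.length - total); // drain what arrived
            if (n < 0) break;
            total += n;
        }
        return total;
    }
}
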
The following is the relevant section of the strace output for the Java client:
$ strace -f -ttt java -cp test.java test
Monet running on OS X:
----------------------
send(5, "a.print();\n", 12, 0) = 12
recv(5, "#------------\n# h ....", 8192, 0) = 50
ioctl(5, FIONREAD, [45]) = 0
recv(5, " # type ....", 8192, 0) = 45
Monet running on Linux:
-----------------------
send(5, "a.print();\n", 12, 0) = 12
recv(5, "#-", 8192, 0)
ioctl(5, FIONREAD, [0]) =
recv(5,
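
One thing I still want to rule out is Nagle's algorithm interacting
with delayed ACKs, since a tiny first segment followed by a ~40 ms
stall is what that combination typically looks like. From Java I can at
least disable it for the client's half of the connection (the port
below is again an assumption):

import java.net.Socket;

// Disable Nagle's algorithm on the client's side of the connection.
// Note this only affects data the client sends; the server's writes
// would need TCP_NODELAY set on its own socket, inside mserver.
public class NoDelayClient {
    public static void main(String[] args) throws Exception {
        Socket s = new Socket("localhost", 50000); // port is an assumption
        s.setTcpNoDelay(true); // sets the TCP_NODELAY socket option
        // ... run the same a.print() test over this socket ...
        s.close();
    }
}

If Nagle turns out to be the culprit, the real fix would have to go on
the server's send path rather than in my client, since it is the
server's small writes that are being held back.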