On 7/18/07, Vishal Bhasin <firstname.lastname@example.org> wrote:
I ran the scenario illustrated in the attachment. Everything SOAP/HTTP.
We are using in-memory processes. I see most of the latency in the ODE service. I don't have a way to determine CPU utilization by service unit.
I added instrumentation in the code to find the time spent in each service unit. The time spent in all but the ODE-SU is less than 40 ms, but the ODE-SU is around 2000 ms.
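For reference, per-SU instrumentation of the kind described above can be a simple wall-clock measurement around the invocation; a minimal sketch, where `doWork()` is a hypothetical stand-in for the actual service-unit call:

```java
// Minimal sketch: time one call with System.nanoTime().
// doWork() is a placeholder, not an actual ODE/JBI API.
public class SuTimer {
    public static void main(String[] args) {
        long start = System.nanoTime();
        doWork(); // stand-in for the real service-unit invocation
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("SU call took " + elapsedMs + " ms");
    }

    static void doWork() {
        // placeholder workload so the sketch runs standalone
        double acc = 0;
        for (int i = 0; i < 1_000_000; i++) { acc += Math.sqrt(i); }
        if (acc < 0) System.out.println(acc); // keep the loop from being optimized away
    }
}
```

`System.nanoTime()` is preferable to `System.currentTimeMillis()` here because it is monotonic and not affected by clock adjustments.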
You might want to try with and without your XSL transform. This is something I have not benchmarked yet. Otherwise, the code paths should be relatively the same.
What test case did you run on your laptop? Also, do you have any optimization tips?
Things to consider:
1) If using -trunk, use JAVA_OPTS with "-
I get better performance with this setting; and there seems to be an Ode memory leak related to async processing in JBI.
2) You might want to experiment with "-
This avoids some XML re-parsing. It was introduced some time ago because we had a couple of StAX-to-DOM conversion issues, but those may no longer apply.
3) As a rule of thumb, I usually recommend starting with 20 threads/CPU
(overall). This is generally enough to saturate CPU under reasonable
network and disk I/O. A smaller number of threads is usually faster if
your I/Os are short but you might need more if you have long HTTP
requests, for example. If your CPUs are already 90%+ busy, then you
have enough threads.
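The rule of thumb above translates directly into a pool-size calculation; a minimal sketch (the pool itself is illustrative, not ODE configuration):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PoolSizing {
    public static void main(String[] args) throws InterruptedException {
        // Rule of thumb from the post: ~20 threads per CPU overall,
        // as a starting point to tune from (fewer for short I/Os,
        // more for long-lived HTTP requests).
        int cpus = Runtime.getRuntime().availableProcessors();
        int threads = cpus * 20;
        System.out.println("Starting pool with " + threads + " threads");

        ExecutorService pool = Executors.newFixedThreadPool(threads);
        // ... submit work here ...
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

From there, watch CPU utilization under load: if the CPUs sit well below 90% with queued work, raise the count; if they are saturated, adding threads only adds context-switch overhead.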
4) For tracking down bottlenecks, I personally like taking thread-dump samples. It's sort of a statistical approach to performance monitoring. By looking at the stack traces, you'll see what the code is up to and you can figure out if there are unexpected things happening, such as the CPU spending disproportionate amounts of time in some area of the code. I use the standard "jstack" and "jconsole" utilities from the JDK for this, although you can also use CTRL-BREAK (Windows) or CTRL-\ (Unix) directly in the console.
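Besides the external jstack/jconsole route, the same sampling idea can be done in-process through the standard `java.lang.management.ThreadMXBean` API; a minimal sketch that prints the top stack frame of every live thread a few times:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class StackSampler {
    public static void main(String[] args) throws InterruptedException {
        ThreadMXBean mx = ManagementFactory.getThreadMXBean();
        for (int sample = 0; sample < 5; sample++) {
            System.out.println("--- sample " + sample + " ---");
            // dumpAllThreads(false, false): skip monitor/synchronizer
            // detail, we only want the stack traces
            for (ThreadInfo info : mx.dumpAllThreads(false, false)) {
                StackTraceElement[] stack = info.getStackTrace();
                if (stack.length > 0) {
                    System.out.println(info.getThreadName() + " @ " + stack[0]);
                }
            }
            Thread.sleep(200); // sampling interval
        }
    }
}
```

Frames that keep showing up across samples are, statistically, where the time is going; that is the same signal you get from repeated `jstack <pid>` dumps, just without leaving the JVM.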
That's about all I can think of now... I believe in-memory processes should perform well out-of-the-box so I'm curious to hear what's different in your scenario.