Asynchronous Servlets: Suspendable Requests.

Most non-trivial web applications need to wait at some stage during the processing of an HTTP request. Examples of waiting request handling include:

  • waiting for a resource (e.g. a thread or JDBC connection) to become available before processing the request.
  • waiting for an application event in an Ajax Comet application (e.g. a chat message, price change, etc.)
  • waiting for a response from a remote service (e.g. a RESTful or SOAP call to a web service).

The current servlet API (<=2.5) supports only a synchronous call style, so any waiting that a servlet needs to do must be done by blocking. Unfortunately this means that the thread allocated to the request must be held during that wait, together with all its resources: the kernel thread, stack memory and often pooled buffers, character converters, EE authentication context, etc. It is wasteful to hold these system resources while simply waiting.

Blocking servlet example
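A minimal sketch of such a blocking servlet, assuming only the standard javax.servlet API; the callRemoteService() helper and its 500ms sleep are hypothetical placeholders for any blocking wait (a lock, a JDBC query, a web service call):

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class BlockingServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // The pool thread is held for the whole duration of this call,
        // even though most of the time is spent waiting rather than computing.
        String result = callRemoteService();

        response.setContentType("text/plain");
        response.getWriter().println(result);
    }

    // Stand-in for a JDBC query, web service call or other blocking wait.
    private String callRemoteService() {
        try {
            Thread.sleep(500); // simulate network/database latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "response from remote service";
    }
}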

Consider a modest web application in which handling a request and generating the response takes 10ms of CPU. If it runs on a 2 CPU machine, then close to 2*1000/10 = 200 requests per second can be handled with only 2 threads.

However, if the handling of the request needs to wait for resources, more threads will be required. If, for example, the average request spends 5ms waiting for synchronized locks and 15ms waiting for IO to write the response, then the total time per request becomes 30ms and the thread pool will need to contain 200*30/1000 = 6 threads.

Typically a web application will also interact with a database of some kind, often on a remote machine, and database operations can easily take 50ms or more, during which time the servlet has to wait for the response. The total request time thus increases to 80ms, and 200*(30+50)/1000 = 16 threads are needed.
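The figures above all follow the same rule of thumb (essentially Little's law): the number of threads needed is roughly the request rate multiplied by the total time each request is held, waiting included. A small illustrative calculation of the three cases:

// Rough thread-count estimate: busy threads = throughput (req/s) * time held per request (s).
public class ThreadEstimate {
    static int threadsNeeded(int requestsPerSecond, int totalMillisPerRequest) {
        return (int) Math.ceil(requestsPerSecond * totalMillisPerRequest / 1000.0);
    }

    public static void main(String[] args) {
        System.out.println(threadsNeeded(200, 10)); // CPU only       -> 2
        System.out.println(threadsNeeded(200, 30)); // + locks and IO -> 6
        System.out.println(threadsNeeded(200, 80)); // + 50ms database -> 16
    }
}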

If that web application is then modified to call a remote web service, a much greater wait is introduced. It is not unusual for a web service request to take many hundreds of milliseconds, so the thread pool must grow larger still to sustain the same 200 requests per second.
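The point of suspendable requests is to avoid paying for these waits with a thread: the servlet suspends the request, returns its thread to the pool, and is re-dispatched when the result is available. Below is a minimal sketch using the Jetty Continuation API (org.eclipse.jetty.continuation), assuming the Jetty 7 continuation jar is on the classpath; the scheduled task is only a stand-in for a slow back-end call that completes 500ms later.

Suspendable servlet example

import java.io.IOException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.eclipse.jetty.continuation.Continuation;
import org.eclipse.jetty.continuation.ContinuationSupport;

public class SuspendingServlet extends HttpServlet {
    // Stand-in for an asynchronous client to a slow remote service.
    private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        final Continuation continuation = ContinuationSupport.getContinuation(request);

        if (continuation.isInitial()) {
            // First dispatch: no result yet, so suspend and free the pool thread.
            continuation.setTimeout(10000);
            continuation.suspend();

            // Simulate the remote call completing 500ms later, then resume.
            scheduler.schedule(new Runnable() {
                public void run() {
                    continuation.setAttribute("result", "response from remote service");
                    continuation.resume(); // re-dispatches the request
                }
            }, 500, TimeUnit.MILLISECONDS);
            return; // the pool thread is released here
        }

        // Re-dispatched after resume() (or after the timeout expired).
        Object result = continuation.getAttribute("result");
        response.setContentType("text/plain");
        response.getWriter().println(result != null ? result : "timeout");
    }

    @Override
    public void destroy() {
        scheduler.shutdown();
    }
}

While the request is suspended no pool thread is consumed; only the continuation and its idle connection are held until resume() (or the timeout) re-dispatches the request, so the thread counts calculated above do not have to grow with the length of the wait.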
