Asynchronous Servlets: Suspendable Requests.

Most non-trivial web applications need to wait at some stage during the processing of an HTTP request. Examples of such waiting include:

  • waiting for a resource (e.g. a thread or JDBC Connection) to become available before processing the request.
  • waiting for an application event in an Ajax Comet application (e.g. a chat message, a price change, etc.).
  • waiting for a response from a remote service (e.g. a RESTful or SOAP call to a web service).

The current servlet API (<=2.5) supports only a synchronous call style, so any waiting that a servlet needs to do must be done by blocking. Unfortunately this means that the thread allocated to the request must be held during that wait, together with all its resources: kernel thread, stack memory and often pooled buffers, character converters, EE authentication context, etc. It is wasteful of system resources to hold these resources while waiting.
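
To make the cost concrete, here is a minimal sketch (not from the original page) of a servlet written against the synchronous API that proxies a remote web service; the URL is a placeholder. The thread serving the request is held, idle, for the entire duration of the remote call:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.URL;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class BlockingProxyServlet extends HttpServlet
    {
        @Override
        protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException
        {
            // The thread allocated to this request (and its stack, buffers,
            // authentication context, etc.) is held for the whole duration
            // of the remote call, even though it is doing no useful work.
            URL remote = new URL("http://example.com/service"); // placeholder URL
            InputStream in = remote.openStream();
            try
            {
                response.setContentType("text/plain");
                byte[] buffer = new byte[4096];
                for (int len = in.read(buffer); len >= 0; len = in.read(buffer))
                    response.getOutputStream().write(buffer, 0, len);
            }
            finally
            {
                in.close();
            }
        }
    }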

Blocking Waiting Examples

JDBC Connection Pool

Consider a web application handling on average 400 requests per second, with each request interacting with the database for 50ms. To handle this average load, 400*50/1000 = 20 JDBC connections are needed on average. However, requests do not arrive at an even rate: there are often bursts and pauses. To protect a database from bursts, a JDBC connection pool is often used to limit the number of simultaneous requests made on the database. So for this application, it would be reasonable to configure a JDBC pool of 30 connections, to provide a 50% margin.
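
As a purely illustrative sketch of such a pool (not part of the original page), the following configures a 30-connection pool with Apache Commons DBCP 1.x; the driver class and URL are placeholders, and setter names differ in later DBCP versions:

    import javax.sql.DataSource;
    import org.apache.commons.dbcp.BasicDataSource;

    public class PoolConfig
    {
        public static DataSource createPool()
        {
            BasicDataSource pool = new BasicDataSource();
            pool.setDriverClassName("org.hsqldb.jdbcDriver"); // placeholder driver
            pool.setUrl("jdbc:hsqldb:mem:example");           // placeholder URL

            // Average load needs 400 * 50 / 1000 = 20 connections;
            // 30 allows a 50% margin to absorb bursts.
            pool.setMaxActive(30);

            // When the pool is exhausted, callers block here for up to 10s,
            // holding their request thread for the whole wait.
            pool.setMaxWait(10000);
            return pool;
        }
    }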

If the request rate momentarily doubled, the 30 connections would only be able to handle 600 requests per second, and 200 requests per second would join those waiting on the JDBC connection pool. If the servlet container had a thread pool of 500 threads, that pool would be entirely consumed by threads waiting for JDBC connections within 2.5 seconds at this request rate. After 2.5s, the web application would be unable to process any requests at all, because no threads would be available. Even requests that do not use the database would be blocked due to thread starvation.

This thread starvation situation can also occur if the database runs slowly or is momentarily unavailable.
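
For reference, the 500-thread limit in the scenario above is simply the container's thread pool size. A rough sketch for a Jetty server of that era might look like the following; package and setter names vary between Jetty versions (Jetty 6, for example, uses org.mortbay packages):

    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.util.thread.QueuedThreadPool;

    public class ThreadPoolConfig
    {
        public static void main(String[] args) throws Exception
        {
            // Every in-progress request consumes one of these threads,
            // including requests that are merely blocked waiting for a
            // JDBC connection.
            QueuedThreadPool threads = new QueuedThreadPool();
            threads.setMaxThreads(500);

            Server server = new Server(8080);
            server.setThreadPool(threads);
            server.start();
            server.join();
        }
    }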

If the web container were able to threadlessly suspend the requests waiting for a JDBC connection, then thread starvation would not occur: only 30 threads would be consumed by requests accessing the database, and the other 470 threads would be available to process the requests that do not access the database.
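
The following is a hedged sketch of what such a threadless wait can look like with the Jetty Continuations API (org.eclipse.jetty.continuation in Jetty 7; Jetty 6 has a similar API under org.mortbay packages). The AsyncConnectionPool and its callback interface are hypothetical stand-ins for a connection pool that notifies a listener instead of blocking:

    import java.io.IOException;
    import java.sql.Connection;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import org.eclipse.jetty.continuation.Continuation;
    import org.eclipse.jetty.continuation.ContinuationSupport;

    public class SuspendingServlet extends HttpServlet
    {
        // Hypothetical pool that hands out connections via a callback
        // instead of blocking the calling thread.
        public interface AsyncConnectionPool
        {
            void lease(Callback callback);

            interface Callback
            {
                void onConnection(Connection connection);
            }
        }

        private AsyncConnectionPool pool; // hypothetical, configured elsewhere

        @Override
        protected void doGet(final HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException
        {
            Connection jdbc = (Connection)request.getAttribute("jdbc");
            if (jdbc == null)
            {
                // No connection yet: suspend the request instead of blocking,
                // releasing this thread back to the container's pool.
                final Continuation continuation = ContinuationSupport.getContinuation(request);
                continuation.setTimeout(10000);
                continuation.suspend();

                pool.lease(new AsyncConnectionPool.Callback()
                {
                    public void onConnection(Connection connection)
                    {
                        request.setAttribute("jdbc", connection);
                        continuation.resume(); // re-dispatches the request on a fresh thread
                    }
                });
                return;
            }

            // Resumed dispatch: a connection is now available, so do the
            // 50ms of database work and generate the response.
            response.setContentType("text/plain");
            response.getWriter().println("handled with " + jdbc);
        }
    }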

Remote Web Service

Comet Chat application
