M6.8 Seattle Event: Web Traffic Report


The M6.8 earthquake centered near Seattle, WA provided a good test of the USGS Earthquake Hazards Program's web service capabilities. This is a look at how the main program page and the Pasadena office web site fared after this event.

The earthquake occurred at 10:54 PST on Wednesday, February 28, 2001. It was reported felt over a wide area of the Pacific Northwest, and as far away as Salt Lake City. Traffic on all the USGS earthquake web servers began increasing almost immediately. Most servers were quickly crushed by the avalanche of traffic and were rendered unresponsive for several hours following the event. The two exceptions were the main Earthquake Hazards Program page at http://earthquake.usgs.gov and the USGS Pasadena web site at http://pasadena.wr.usgs.gov.

The Pasadena web site serves as the main center for the Community Internet Intensity Map (CIIM) web pages, also known as the 'Report an Earthquake' pages. This site allows people to submit an online questionnaire reporting the intensity of an earthquake at their location. Data from the submitted questionnaires is used to build a map showing the reported earthquake intensity.

Both of these sites use Squid reverse-proxy servers as HTTP accelerators. This effectively splits the load on the web server and improves web service capacity by about an order of magnitude, which proved decisive in the aftermath of the Seattle earthquake. The story of how we arrived at this configuration is told in Web Servers, Earthquakes, and the Slashdot Effect, an article describing how our web server was flooded after the Hector Mine earthquake and how we responded.
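
For readers unfamiliar with the technique, here is the accelerator idea in miniature: a lightweight front end answers repeat requests from an in-memory cache and passes only cache misses through to the origin server. This is a toy Python sketch, not our actual Squid configuration; the origin hostname and port are placeholders, and real Squid adds cache expiration, validation, and far better concurrency.

    # Toy reverse-proxy ("HTTP accelerator"): serve repeat requests from an
    # in-memory cache; fetch only cache misses from the origin web server.
    # ORIGIN is a placeholder, and there is no error handling or expiry.
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    ORIGIN = "http://origin.example.gov"   # hypothetical back-end server
    CACHE = {}                             # request path -> response body

    class AcceleratorHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = CACHE.get(self.path)
            if body is None:
                # Cache miss: fetch once from the origin, keep it in memory.
                with urllib.request.urlopen(ORIGIN + self.path) as resp:
                    body = resp.read()
                CACHE[self.path] = body
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8080), AcceleratorHandler).serve_forever()

Because nearly everyone asks for the same few pages after an earthquake, even a cache this naive absorbs the vast majority of requests; that is the source of the order-of-magnitude capacity gain.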


[Figure 1: earthquake.usgs.gov traffic, by server]
Traffic on earthquake.usgs.gov peaked at 710 hits/sec and a data rate of 12.8 Mb/sec. The curve shows the characteristic shape of an earthquake-driven traffic spike: a nearly vertical rise followed by an exponential decay.
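
To make that shape concrete, here is a toy model of the spike. The 710 hits/sec peak comes from Figure 1, but the two-hour decay constant is an assumed value for illustration, not a fit to our data.

    # Toy model of an earthquake-driven traffic spike: an effectively
    # instantaneous jump to the peak rate, then exponential decay.
    import math

    PEAK_HITS_PER_SEC = 710.0   # observed peak (Figure 1)
    DECAY_CONSTANT_HRS = 2.0    # assumed for illustration, not measured

    def hits_per_sec(hours_after_event: float) -> float:
        """Modeled request rate some hours after the earthquake."""
        if hours_after_event < 0:
            return 0.0  # background traffic before the event is ignored
        return PEAK_HITS_PER_SEC * math.exp(-hours_after_event / DECAY_CONSTANT_HRS)

    for t in (0.0, 1.0, 2.0, 4.0, 8.0):
        print(f"t = {t:3.0f} h  ->  {hits_per_sec(t):6.1f} hits/sec")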

The earthquake.usgs.gov site is served from a pair of Sun Solaris servers located in Menlo Park and Reston, with three Squid servers acting as a front end for the site. Figure 1 shows the traffic on each of the three servers. Load sharing is done with standard round-robin DNS, so the load balance is not perfect, but it is good enough for this application. The highest traffic on a single Squid server was 248 hits/sec, with a data rate of 4.5 Mb/sec. Testing indicates that a single Squid server of this type can handle about 400 hits/sec and a data rate of about 60 Mb/sec, so we did not exceed the limits of the system this time. Figure 2 shows the aggregate traffic on all three servers.

[Figure 2: earthquake.usgs.gov - total traffic]
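
The imperfect balance of round-robin DNS is easy to see in a simulation: the name server hands out the three front-end addresses in strict rotation, but each answer is then reused for many requests by caching resolvers and browsers. The server names and the reuse range below are illustrative assumptions, not measurements from our logs.

    # Why round-robin DNS balances only approximately: addresses are handed
    # out in rotation, but each DNS answer is cached and reused for a random
    # number of requests, which skews the realized per-server load.
    import itertools
    import random
    from collections import Counter

    SERVERS = ["squid1", "squid2", "squid3"]    # hypothetical front ends
    rotation = itertools.cycle(SERVERS)         # the round-robin rotation

    loads = Counter()
    for _ in range(10_000):                     # 10,000 DNS lookups
        server = next(rotation)
        loads[server] += random.randint(1, 50)  # requests reusing this answer

    total = sum(loads.values())
    for server, hits in sorted(loads.items()):
        print(f"{server}: {hits} requests ({hits / total:.1%})")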

[Figure 3: USGS Pasadena - total traffic]
Ordinarily, earthquakes outside Southern California do not have a significant impact on the Pasadena office web server. That changed with the national rollout of the Community Internet Intensity Map, the online facility described above through which people report their experiences of an earthquake, with the submitted data used to build a map of the reported shaking. Figure 3 shows the total traffic seen on the Pasadena web server.
[Figure 4: Traffic detail - Pasadena web site]
Most of the people coming to the Pasadena site were referred from earthquake.usgs.gov, which is why the peak is not as sharp as the one seen on that server. The peak in Pasadena was 124 hits/sec and 21.8 Mb/sec. Note that the data rate in Pasadena was actually higher than that of earthquake.usgs.gov. This is because the CIIM map page for an event grows with the number of submitted reports, and this event generated an unprecedented volume of responses, so the map page became quite large.
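
A quick back-of-envelope check makes the difference concrete: dividing each site's peak data rate by its peak hit rate gives a rough average response size (treating the two peaks as simultaneous, which is a simplification).

    # Rough average response size at peak: data rate divided by hit rate.
    PEAKS = {
        "earthquake.usgs.gov":  {"hits_per_sec": 710, "mbit_per_sec": 12.8},
        "pasadena.wr.usgs.gov": {"hits_per_sec": 124, "mbit_per_sec": 21.8},
    }

    for site, p in PEAKS.items():
        kbytes = p["mbit_per_sec"] * 1000 / 8 / p["hits_per_sec"]
        print(f"{site}: ~{kbytes:.1f} KB per response")

    # earthquake.usgs.gov:  ~2.3 KB per response
    # pasadena.wr.usgs.gov: ~22.0 KB per response, the large CIIM map pages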

Figure 4 shows detailed traffic information for the Pasadena web site. Note that the CIIM pages account for almost all of the traffic. Also note the purple line on the graph, which represents completed questionnaires submitted for processing. The rate of incoming questionnaires never rose above 1/sec, and the Pasadena server is capable of processing 20/sec, so we had a comfortable safety margin. This shows that the improvements we made to our web server after the September 3, 2000 Yountville earthquake were worthwhile.

[Figure 5: USGS Pasadena Squid Server - CPU Usage]
The only dark spot for the Pasadena server is shown in Figure 5, the CPU usage on the Squid server. Note that from about 12:30 to 15:00, the server was 100% busy. Traffic was still being served and the site remained accessible during this time, but the server was noticeably sluggish. The red line indicates CPU time spent on system functions, typically disk I/O; so much time spent in the kernel suggests that Squid's cache no longer fits in memory and is spilling to disk, which indicates that the machine needs more memory. There may also be some kernel tuning that could improve performance under heavy loads. We are currently exploring options for upgrading this machine and improving its performance.
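
As a sketch of the kind of measurement behind Figure 5, the snippet below samples how CPU time splits between user and system work. It uses the third-party psutil package, and the sampling interval and alert thresholds are illustrative choices; it is not the monitoring tool we actually ran.

    # Sample the user/system/idle CPU split once a minute and flag the
    # saturation signature: no idle headroom with kernel time dominating.
    import psutil  # third-party package

    def sample_forever() -> None:
        while True:
            cpu = psutil.cpu_times_percent(interval=60)  # 1-minute average
            print(f"user {cpu.user:5.1f}%  system {cpu.system:5.1f}%  "
                  f"idle {cpu.idle:5.1f}%")
            if cpu.idle < 5.0 and cpu.system > cpu.user:
                # Matches the 12:30-15:00 pattern in Figure 5: the machine is
                # fully busy and most of that work is kernel/disk I/O.
                print("warning: CPU saturated; system time dominating")

    if __name__ == "__main__":
        sample_forever()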

Still, all things considered, the Pasadena and Earthquake Hazards Program web servers performed quite well under the onslaught.


Stan Schwarz
Honeywell Technical Services
Southern California Seismic Network Contract
Pasadena, California
01 Mar, 2001 09:42 PST