Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software (OPAL)

This paper provides a comparison between Earthquake Loss Estimation (ELE) software packages and their application using an "Open Source Procedure for Assessment of Loss using Global Earthquake Modelling software" (OPAL). The OPAL procedure was created to provide a framework for optimising a global earthquake modelling process through the selection, assessment, and combined application of available software packages, as set out below.


Introduction
The OPAL procedure (Fig. 1) has been developed to provide a framework for optimisation of a global earthquake modelling process, and to provide a state-of-the-art look at what open-source software tools are available globally.
It is up to the user to select those software packages deemed appropriate, and then to review them critically using both the user manual (Daniell, 2009b) and the references in order to test their applicability. A logic-tree approach is subsequently applied between the software packages in order to achieve an objective combined result, since no single system will be correct given the uncertainties in each of the four steps of the Earthquake Loss Estimation (ELE) procedure, as discussed below. The weighting is based on the quality of each ELE software package and serves to minimise outlier results. For insurance purposes, the software package results should be critically reviewed and the variance of the separate models used. The procedure begins with the following steps:
-Selection and understanding of exposure (remote sensing or otherwise), vulnerability (empirical vs. analytical vs. multi-tier), hazard (probabilistic or deterministic) and loss (complexity of social and economic results required) for the test location.
-Acquisition of information, documentation and software of all available software packages relating to the test region, and of those that can be modified to suit the location.
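The logic-tree weighting described above can be sketched in a few lines. This is an illustrative Python sketch, not code from any reviewed package; the package names, death-toll estimates, and quality weights are taken from the Zeytinburnu worked example later in the paper and would, in practice, come from the user's own quality assessment.

```python
def combine_logic_tree(estimates, weights):
    """Quality-weighted combination of loss estimates from several ELE packages."""
    if set(estimates) != set(weights):
        raise ValueError("estimates and weights must cover the same packages")
    total_weight = sum(weights.values())
    return sum(estimates[pkg] * weights[pkg] for pkg in estimates) / total_weight

# Death-toll estimates and quality weights from the Zeytinburnu study:
death_estimates = {"MDBELA": 4400, "SELENA": 3800, "MHAZUS": 5800}
quality_weights = {"MDBELA": 0.6, "SELENA": 0.3, "MHAZUS": 0.1}

combined = combine_logic_tree(death_estimates, quality_weights)  # 4360.0
```

Normalising by the total weight means the weights need not sum to exactly one; the spread of the individual estimates around the combined value is the variance that should be reported for insurance purposes.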

Assessment of software packages
The software packages should then be assessed in terms of the four component parts described above, and a final selection procedure created.

Loss assessment at test location for various packages
Use of the assessed software packages, and coding of the particular methods desired for hazard, vulnerability, exposure and socio-economic loss.
Results are presented separately with their scatter, and confidence values can be assigned. A logic-tree approach across the software packages can then be undertaken, along with extended damage and socio-economic analysis.

Overview of Earthquake Loss Assessment
Earthquake Loss Assessments are produced in order to estimate possible economic, infrastructure and social losses due to an earthquake. In order to produce an effective ELE, four components that define seismic loss must be taken into account: 1. exposure, defined as the amount of human activity located in the zones of seismic hazard, characterised by the stock of infrastructure in that location (usually defined by geocell); 2. vulnerability, defined as the susceptibility of the infrastructure stock; 3. hazard, defined as the probability of a certain ground motion occurring at a location, which can be determined by scenario modelling via stochastic catalogues, PSHA (Probabilistic Seismic Hazard Assessment) or other such methods, and can include different types of earthquake effects; and 4. damage-loss conversion, defined either economically as the mean damage ratio (the ratio of repair and restoration cost to replacement and demolition cost) or as the social cost (i.e. number of injuries, homeless and deaths).
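As a minimal illustration of component 4, the mean damage ratio can be computed as a probability-weighted average over damage states. The damage-state probabilities and repair-cost ratios below are hypothetical, chosen only to show the shape of the calculation; the actual Turkish values used later in the paper come from Bal et al. (2008a).

```python
# Hypothetical damage distribution for one geocell (probabilities sum to 1)
damage_state_probs = {"none": 0.40, "slight": 0.25, "moderate": 0.20,
                      "extensive": 0.10, "complete": 0.05}
# Hypothetical repair-to-replacement cost ratios per damage state
repair_cost_ratio = {"none": 0.00, "slight": 0.05, "moderate": 0.25,
                     "extensive": 0.70, "complete": 1.00}

# Mean damage ratio = sum over states of P(state) * cost ratio(state)
mdr = sum(damage_state_probs[s] * repair_cost_ratio[s] for s in damage_state_probs)
# mdr = 0.25*0.05 + 0.20*0.25 + 0.10*0.70 + 0.05*1.00 = 0.1825
```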
Because of the myriad of ways in which each of these components (see Fig. 2) can be determined, a large range of earthquake loss estimation methods is available. For some regions one particular method may be more applicable than others, because the epistemic uncertainty (lack of knowledge) arising from data collection and from the scientific assumptions of an ELE method differs from location to location. In addition, probabilistic regional uncertainties in source, path, and site occur, quantified as aleatory variability. ELEs should quantify both the epistemic and aleatory uncertainties for a particular earthquake scenario. Unfortunately, the scenario is almost never realised in an actual earthquake, as seen in the recent 2011 Tohoku earthquake, which exceeded estimates of the maximum magnitude thought possible on the fault, and the 2011 Christchurch earthquake, which occurred on previously unrecognised faults.
It is necessary to define an area of interest in which the seismic hazard is determined at every location. For this paper, the Zeytinburnu district in Istanbul, Turkey, discretised into 50 geocells of 0.005° × 0.005°, was defined as the location where a full earthquake loss estimation would be undertaken. The vulnerability of the exposed infrastructure stock can be convolved with this hazard, so that a damage distribution can be established across the various classes of infrastructure damage. From this damage distribution, economic and social losses can be derived. All of these components constitute an ELE. Calculation of losses can be done either proactively (pre-earthquake scenario modelling) or reactively (post-earthquake fixed scenario modelling).
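The hazard-vulnerability convolution described above is commonly implemented with fragility curves: lognormal curves give the probability of reaching or exceeding each damage state at a given ground-motion level, and differencing successive curves yields the damage distribution. The sketch below is illustrative; the medians and dispersions are hypothetical, not values from any of the packages reviewed here.

```python
import math

def lognormal_cdf(x, median, beta):
    """CDF of a lognormal distribution with given median and log-std beta."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

# Hypothetical fragility parameters per damage state: (median PGA in g, beta)
fragility = {"slight": (0.10, 0.6), "moderate": (0.25, 0.6),
             "extensive": (0.50, 0.6), "complete": (0.90, 0.6)}

def damage_distribution(pga):
    """Convolve one ground-motion level with the fragility curves."""
    p_exceed = {ds: lognormal_cdf(pga, m, b) for ds, (m, b) in fragility.items()}
    states = ["slight", "moderate", "extensive", "complete"]
    dist = {"none": 1.0 - p_exceed["slight"]}
    for ds, nxt in zip(states, states[1:]):
        dist[ds] = p_exceed[ds] - p_exceed[nxt]
    dist["complete"] = p_exceed["complete"]
    return dist

dist = damage_distribution(0.35)  # damage distribution at PGA = 0.35 g
```

Economic and social losses then follow by weighting this distribution with cost or casualty ratios per damage state.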
Nat. Hazards Earth Syst. Sci., 11, 1885-1900, 2011. www.nat-hazards-earth-syst-sci.net/11/1885/2011/

A review of all recent literature available on these four components can be seen in Daniell (2009b); only the difference between capacity spectrum (Applied Technology Council (ATC), 2005) and displacement-based methods is shown in Fig. 3 as part of the overview for use in the loss assessment. Displacement-based models are examined here, as these types of models have been seen to provide a significant reduction in error when calculating structural and non-structural damage (Calvi, 1999; Priestley et al., 2007). The capacity spectrum method is computationally quicker than the displacement-based method and also requires fewer building parameters to create the final loss estimate (i.e. building column and beam lengths, depths, etc. are not required). However, it is subject to greater uncertainties than the displacement-based method in locations where all these details are available.

Preliminary acquisition and assessment of ELE software
Considerable research has been done to provide adequate earthquake loss estimation (ELE) models for region-specific scenarios and other studies. Many different software packages have been produced around the world in order to provide accurate loss estimates; these can be used simultaneously in order to reduce uncertainty in the result.
With the wealth of software packages available for these risk assessment studies and for economic, social and infrastructure loss estimation, a synopsis of many available packages has been undertaken; the full documentation can be viewed in Daniell (2009b).

The two methods compared in Fig. 3 are as follows.

Displacement-based method, for each building: 1. Material and mechanical properties of the buildings are required for Monte Carlo simulation. 2. The Multiple Degree-of-Freedom system is converted to an equivalent Single Degree-of-Freedom (SDOF) system to produce a random pushover curve (F-Δ). 3. The displacement capacities (Δ) of the SDOF systems are related to the demand (η) of the displacement response spectrum at the effective response periods of vibration, i.e. yield (Ty) and limit states 2 and 3 (T2 and T3), to check the limit-state position, using formulae. The full method can be seen in Crowley et al. (2006) or Daniell (2009b).

Capacity spectrum method: this looks at the crossover of capacity (via a pushover curve) with the acceleration-displacement response spectrum (ADRS) demand to define a performance point, relying on the same step 2 as in the displacement-based method. An area method (area under the curve/full hysteresis loop) is then used to define equivalent non-linear damping (ξ) and ductility, iterating to obtain the exact percentage of ξ. The corresponding displacement capacity (Sd) can then be compared with vulnerability curves and placed in damage states, as done in HAZUS.

Thus, although documentation and reproduction of every software package is available, the actual versions are not available in most cases, as seen in the modifiable (mod.) column. Many of these procedures can be changed by the user to add complexity to the social and economic loss outputs.
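The displacement-based limit-state check (step 3 above) can be sketched as a comparison between displacement capacity and spectral demand at the effective period. The bilinear displacement spectrum, corner period, and capacity values below are hypothetical, for illustration only; they are not the formulae of Crowley et al. (2006).

```python
def displacement_demand(T, corner_period=2.0, sd_corner=0.20):
    """Simplified displacement response spectrum (m): linear up to a corner
    period, then flat. All parameter values are hypothetical."""
    if T >= corner_period:
        return sd_corner
    return sd_corner * T / corner_period

def exceeds_limit_state(capacity_m, effective_period_s):
    """True if the spectral displacement demand exceeds the limit-state capacity."""
    return displacement_demand(effective_period_s) > capacity_m

# e.g. limit-state capacity of 0.08 m checked at an effective period of 1.2 s:
# demand = 0.20 * 1.2 / 2.0 = 0.12 m, so the limit state is exceeded.
exceeded = exceeds_limit_state(0.08, 1.2)
```

Repeating this check for each sampled building across the limit states, within a Monte Carlo loop over material and mechanical properties, yields the damage distribution.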
Exposure is a function of the population, remote sensing data, building use and other building inventory data for the test region. Some software has been hardwired for only a district or city, whereas other packages can also include regional (R) and full country-level analysis. With further coding, some city/district-style procedures can be extended to country-level analysis.
An earthquake may have no ground-shaking losses, some ground-shaking losses, or losses due entirely to ground shaking, when compared to secondary effects (tsunami, landslide, fire, liquefaction, etc.). Aggregating the losses from past earthquakes leads to the following conclusions. In terms of demand or hazard, ground shaking, as demonstrated by Bird and Bommer (2004) in 50 earthquakes reviewed from 1980-2003, contributes most (approx. 90 %) to the social and economic losses in earthquakes, and therefore only ELE software packages which consider ground shaking have been tabulated. Marano et al. (2010) used the PAGER-CAT catalogue from 1968-2008 for 749 fatal earthquakes, showing that the expanded data attribute approximately 21.5 % of social losses to secondary effects of earthquakes. Through work looking at around 1950 fatal earthquakes from 1900-2010 using the CATDAT catalogue, Daniell (2010) found that only 75 % of these social losses and approx. 85 % of the economic losses were due to shaking, with a much lower proportion due to building collapse. In the Asia-Pacific region this value reduces to 63 % (Daniell et al., 2010). Secondary effects such as liquefaction, fault rupture, landslides and slope instability, tsunami, and standing waves can cause much damage. However, due to their complexity, these have not been included in most of the ELE software packages.
Table 1 considers the various demand (hazard) possibilities between analysis modes that can be undertaken for earthquake loss estimation. The difference between probabilistic (multiple scenario) and deterministic (scenario-based) SHA is important, and thus a desirable software package should allow for both methods, including the use of real-time, historical and user-specified data to provide a pre- and post-earthquake analysis tool.
The temporal distribution of earthquakes in probabilistic methods is generally treated in two ways: as a Poisson process, in which earthquake probability is independent of the time since the last earthquake (earthquakes as a random process, as shown by the Parkfield prediction exercise - Bakun, 1985); or with time-dependent methods, which assume that earthquake events are linked temporally. Considering the difficulty of interseismic Coulomb stress modelling, a Poisson process is a reasonable assumption. Moreover, the 2011 Christchurch earthquake has shown that temporal models are not necessarily better.
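The time-independent (Poisson) occurrence model above reduces to a one-line formula: the probability of at least one event in t years, given an annual rate λ, is 1 - exp(-λt), regardless of the time elapsed since the last earthquake.

```python
import math

def prob_at_least_one(annual_rate, t_years):
    """Poisson model: P(at least one event in t_years) = 1 - exp(-lambda * t)."""
    return 1.0 - math.exp(-annual_rate * t_years)

# e.g. an event with a 475-year return period over a 50-year exposure window:
p50 = prob_at_least_one(1.0 / 475.0, 50.0)  # approx. 0.10
```

This recovers the familiar "10 % probability of exceedance in 50 years" design level associated with the 475-year return period.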
For the single-scenario deterministic-predicted method, the software can be run for an earthquake chosen by the user. PAGER and QLARM are the only methods which do not allow this, due to their real-time nature. A user-defined event for the ground motion can sometimes be applied, allowing the user to apply a complex theoretical model or any model desired. In contrast, deterministic-observed values are also used in various packages, utilising either historical ground motions or ShakeMap ground motions from an automated near real-time network (i.e. strong-motion networks). This can usually be applied for only a few locations in the world, but the new methodologies of PAGER and QLARM make it possible to employ ground-motion maps.
The intensity and response spectrum measures are generally linked with the vulnerability component, i.e. intensity with empirical methods and response spectrum with analytical methods. Regional and Next Generation Attenuation (NGA) GMPEs are used in many methods. HAZUS uses a response spectrum based on PGA and Sa at 0.3 s, 1 s and 3 s, and many packages are based on similar theories. Most of the software packages also allow for observed, theoretical, or empirical ground motions. Observed spatial ground motion distributions generally use past earthquake catalogues or real-time ground motions. Theoretical ground motions derived from seismological models for various earthquake scenarios are also allowed through a user-defined setting in a few ELE software packages (DBELA, EQSIM, OPENRISK, REDARS and SP-BELA, and most likely in CAPRA, QL2 and SAFER). However, these are time-consuming. Site effects are generally taken into account via geotechnical site classification, i.e. NEHRP site classes (1997), and the relative change of the bedrock frequency spectrum as a function of shear wave velocity. Geological classification is also used in a number of city-specific software packages, and a few use borehole-based classification.
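Most of the GMPEs mentioned above share a common functional form: log spectral acceleration as a linear function of magnitude, minus a distance-attenuation term, plus a site term, with an aleatory standard deviation. The sketch below uses entirely hypothetical coefficients for illustration; real applications must use published, peer-reviewed coefficient sets.

```python
import math

def gmpe_median_sa(magnitude, distance_km, coeffs=(-3.0, 0.9, 1.2, 10.0),
                   site_term=0.0):
    """Generic GMPE sketch: ln(Sa) = c1 + c2*M - c3*ln(R + c4) + site term.
    All coefficients here are hypothetical placeholders."""
    c1, c2, c3, c4 = coeffs
    ln_sa = c1 + c2 * magnitude - c3 * math.log(distance_km + c4) + site_term
    return math.exp(ln_sa)  # median spectral acceleration (g)

# M 7.0 at 15 km on a reference site condition:
sa_near = gmpe_median_sa(7.0, 15.0)
sa_far = gmpe_median_sa(7.0, 50.0)  # attenuation: smaller than sa_near
```

A soil site would be handled by a positive site term (or a Vs30-dependent one), which is how the NEHRP-class site effects described above enter the ground-motion estimate.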
The vulnerability module can be empirical (damage probability matrices, vulnerability indices, functions and curves, or screening methods), analytical (analytical vulnerability curves, capacity spectrum, collapse-based and displacement-based methods) or hybrid (a combination). Occupancy criteria generally include use (residential, etc.) and sometimes occupancy rate (day/night). Structural criteria include basic properties such as the number of floors, material properties, and member dimensions. SP-BELA and DBELA use complex failure mechanisms, i.e. simplified pushover-based and displacement-based, respectively. Quality criteria include the age of buildings (generally 4 categories) and the relative quality of construction; in complex cases, such as DBELA, SP-BELA, QLARM, and EQSIM, variability in construction materials and type is also examined.
Social and economic losses are generally a function of damage. Simple social (Ss) losses usually include only deaths, but sometimes also levels of injuries and homelessness. More complex social (Sc) losses include indirect losses, commuting disruption, dislocation and shelter analysis, as well as social vulnerability. Simple economic (Es) losses involve simple damage-based multiplication of floor areas and housing prices, whereas complex economic (Ec) losses include economic vulnerability analysis, indirect economic loss, flow-on market effects and ripple effects.
By applying the user's test case to Table 1 and setting the desired complexities, software packages and/or a coding system can be chosen.

Multicriteria analysis using OPAL to decide optimum software package
For Zeytinburnu, Turkey, as a test case, a multicriteria analysis (MCA) tool was produced to aid decision analysis for the 30 reviewed software packages. The tool uses criteria grouped into five modules and has been implemented in an easy-to-use GUI (Graphical User Interface), allowing users to apply it to whatever test region they want to code. Depending on the complexity within each of the modules, the ranking will change based on the components and information available. In some cases, a vulnerability method may be too complex to apply within a certain software package; it may also be that the test regions previously undertaken limit the software, or that hardwiring in the software code means that certain parameters cannot be changed. These five modules include: 1. Technical Aspects and Software Detail Module (the contributing components are summarised in Fig. 5). In each module there is a decision engine for the various contributing components, detailed below and based on the work of Stafford et al. (2007) with respect to choosing criteria to analyse the models. A series of questions is asked of the user in order to rank the ELE software packages. A range of 110 qualitative and quantitative criteria is calculated and used to rank the packages; a summary is shown in Fig. 5.
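The decision engine can be sketched as a weighted scoring scheme: each package is scored against the criteria within each module, module scores are weighted and summed, and the packages are ranked by total score. The packages, module names, scores, and weights below are invented for illustration (the real tool evaluates 110 criteria across the five modules).

```python
def rank_packages(scores, module_weights):
    """Rank ELE packages by weighted sum of normalised module scores."""
    totals = {
        pkg: sum(module_scores[m] * module_weights[m] for m in module_weights)
        for pkg, module_scores in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical module weights (must cover the same modules as the scores)
module_weights = {"technical": 0.3, "hazard": 0.2, "vulnerability": 0.2,
                  "exposure": 0.15, "socio_economic": 0.15}
# Hypothetical normalised (0-1) module scores for two packages
scores = {
    "SELENA": {"technical": 0.9, "hazard": 0.8, "vulnerability": 0.7,
               "exposure": 0.8, "socio_economic": 0.7},
    "MAEviz": {"technical": 0.8, "hazard": 0.7, "vulnerability": 0.8,
               "exposure": 0.7, "socio_economic": 0.6},
}
ranking = rank_packages(scores, module_weights)
```

Changing the weights to reflect the user's test region and desired complexity changes the ranking, which is exactly the behaviour described above.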
The following criteria have been selected as a test case for the MCA tool.The codes are shown in Fig. 6.
-Exposure available: site class, buildings, seismological information, and cost data are all present at geocell level.
-Hazard wanted: Use of response-spectrum.
-Coding wanted: major - to achieve the best result (complex); allows complete changing of functions.
-Socio-economic analysis wanted: both complex (3SE).
At the end of the input, the MCA software tool produces a top-10 ranking. From this, it was decided that SELENA (HAZUS-based), a modified HAZUS (MHAZUS) and a modified DBELA (MDBELA) would be used, as they could all be applied at district level, analytical methods could be used (since the given exposure data was of high quality), the software was open-source based, and the socio-economic functions and algorithms could be changed. MAEviz was not used, as it has already produced a Zeytinburnu case study.

Loss assessment for Zeytinburnu District, Istanbul, Turkey
MHAZUS and MDBELA were coded and produced in MATLAB™; the source code is available in part in Daniell (2009b).

Hazard is contained within the sphere at every level using module recommendations. Spatial concepts and complexity increase with level, i.e. level 1 is the simplest and has the largest spatial scale. With more data, a higher level of accuracy can be used for an ELE software package, although computation time also increases. Temporal changes are intrinsically linked in the sphere. The surface of each sphere defines the seismic risk result and the output required for the ELE software type.

Review of existing studies
A number of other studies have undertaken comparisons for Turkish conditions, including analyses of certain parameters as well as of software packages. As part of the EU LESSLOSS project, a case study was undertaken for Istanbul and its Zeytinburnu district looking at retrofitting strategies as well as exposure and vulnerability function details (Spence et al., 2007). Much of this was based on previous work in the BU-ARC project (BU-ARC, 2002). Strasser et al. (2008) undertook a review of 5 different software packages with respect to Istanbul (KOERILOSS, SIGE, ESCENARIS, SELENA, and DBELA). Erduran et al. (2010) showed that the choice of GMPE produced only minor differences in damage states, and that virtually no difference resulted from the choice of global (PAGER) or local (KOERI) exposure building stocks; the most important parameter is the choice of vulnerability function (dependent on code choice).

Exposure
Zeytinburnu district consists mainly of commercial buildings in the north and primarily residential buildings in the south. It contains 37 building types (4 masonry types, 33 RC types), 1 to 9 stories high, with 11 250 buildings in 50 geocells of 0.005° × 0.005°.
As seen in Table 2, this follows the definitions and work of Bal et al. (2008a) for the Turkish building stock, using numerals for the number of stories, a or b for low-yield (220 MPa) or high-yield (420 MPa) steel, and RC Frame (RC) or RC Frame Soft-storey (SRC). The 8 HAZUS codes used include p = pre-code (pre-1979 stock) and m = low code (post-1980 stock).
The number of buildings in each geocell is shown in Fig. 8. From aerial photos on a Turkish government website, most of the buildings in the Zeytinburnu district were built from 1966 onwards, but the Turkish seismic code, although defined from 1940 onwards, was not well enforced (H. Sucuoglu, personal communication, 2008). From 1975, a better seismic code was in place, but again not well enforced. Revisions in 1944, 1947, 1949, 1953, 1961, 1968, 1975, 1981, 1985, 1997, and 2006 have occurred, but they did not always affect Istanbul; other minor revisions have also occurred. The pre-1979 stock was defined as pre-code, whereas the post-1980 stock was defined as low-code.

Hazard
The ground motions used were 100 spatially correlated ground-motion (GM) fields and 100 spatially uncorrelated GM fields for MHAZUS and MDBELA, and 1 median GM field with variability for SELENA, MHAZUS, and MDBELA. Temporal correlation was not taken into account in this study; however, it has been discussed in Daniell (2009a).
The distance from the closest fault source to each geocell ranges from 11-16 km, as shown in Fig. 8. Both aleatory variability (σ) and epistemic uncertainty (ε) were accounted for in the randomised ground motions, up to ±3 standard deviations.
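The randomised ground-motion sampling described above can be sketched as follows: aleatory variability is drawn from a standard normal truncated at ±3 standard deviations and applied to the median (lognormal) motion in each geocell. Spatial correlation between geocells, which the study does model, is omitted here for brevity; the median PGA and log-standard deviation are hypothetical.

```python
import math
import random

def truncated_normal(sigma_trunc=3.0):
    """Standard normal deviate, rejected outside +/- sigma_trunc."""
    while True:
        z = random.gauss(0.0, 1.0)
        if abs(z) <= sigma_trunc:
            return z

def sample_gm_field(median_pga, sigma_ln, n_cells):
    """One (spatially uncorrelated) lognormal ground-motion field in g."""
    return [median_pga * math.exp(truncated_normal() * sigma_ln)
            for _ in range(n_cells)]

random.seed(42)  # reproducibility for the sketch
field = sample_gm_field(median_pga=0.35, sigma_ln=0.6, n_cells=50)
```

Repeating this 100 times and pushing each field through the vulnerability module yields the distribution of losses whose median and standard deviation are reported below.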

Vulnerability
The Capacity Spectrum Method (CSM) was used for SELENA and MHAZUS, with a modified iteration method (MADRS) also utilised for SELENA. Erduran et al. (2010) showed, however, that virtually no difference in damage states is seen between CSM and MADRS. Displacement-based design was used for MDBELA. The flowchart in Fig. 7 shows the process used to develop a damage matrix based on limit states. A pre-code assumption was used for the Zeytinburnu district for MHAZUS and SELENA, based on the aerial photos and the seismic code enforcement assumption. The material and mechanical properties for MDBELA were contributed by Bal et al. (2008b).

Socio-economic loss
Using the 37 building classes of MDBELA and the 8 HAZUS building classes for MHAZUS and SELENA, together with the number of buildings in each damage limit state, the economic cost of repair could be calculated. The repair cost per damage limit state is a convolution of the floor area, an economic cost of €187 to €225 per m² (from approximate unit construction costs for new buildings in Turkey found in Bal et al., 2007) depending on the size of building and number of storeys, the damage-class repair percentage as defined below, and the number of buildings in that damage limit state. The mean damage ratio (the ratio of repair to replacement cost) in each limit state for Turkish conditions was taken from Bal et al. (2008a), where any building which is extensively or completely damaged must be demolished, as seen in Table 3. Social losses were calculated separately for day and night populations, with deaths and injuries calculated as a function of building damage using the BU-ARC (2002) equations. More research is required into the role of seismic intensity versus trapped people, damage state, and casualty classes, given the variability of such equations. The casualty ratio values used are given in Table 4.
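The repair-cost convolution described above can be sketched as follows. The unit cost is the midpoint of the €187-225 per m² range cited above; the repair ratios per damage state are hypothetical stand-ins for the Bal et al. (2008a) Turkish values in Table 3 (ratios above 1 reflect mandatory demolition plus replacement), and the building counts and floor area are invented for illustration.

```python
UNIT_COST_EUR_M2 = 206.0  # midpoint of the 187-225 EUR/m2 range cited above

# Hypothetical repair-to-replacement ratios per damage state (see Table 3 for
# the actual Turkish values); > 1 means demolition and rebuilding is required.
repair_ratio = {"slight": 0.16, "moderate": 0.33, "extensive": 1.05,
                "complete": 1.04}

def repair_cost(buildings_per_state, floor_area_m2):
    """Total repair cost (EUR) for one building class in one geocell:
    sum over states of n_buildings * floor area * unit cost * repair ratio."""
    return sum(n * floor_area_m2 * UNIT_COST_EUR_M2 * repair_ratio[state]
               for state, n in buildings_per_state.items())

# Illustrative geocell: counts per damage state, 400 m2 average floor area
cost = repair_cost({"slight": 120, "moderate": 60, "extensive": 25,
                    "complete": 10}, floor_area_m2=400.0)
```

Summing such costs over all building classes and geocells gives the district-level economic loss; the social losses follow analogously by replacing cost ratios with casualty ratios per damage state.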
MDBELA and SELENA showed approximately the same number of buildings within the HAZUS-based damage classes, whereas MHAZUS showed a high percentage in the complete bracket. Presented in Table 5 are the total damage percentages for the median of the 100 runs with spatially correlated ground motions, and for the median ground motion.
The geocell mean damage ratio values are reasonably similar between all methods. As expected, as the site class moves from E to C (i.e. from a shear wave velocity of around 200 m s−1 to approx. 600 m s−1), and as distance increases (attenuation effects), the mean damage ratio decreases due to the lower relative ground motions for the median case. For the randomised ground motions this is not the case, due to spatial correlation. Both the MHAZUS and MDBELA methods produce the same spatial distribution of social losses despite having considerably different estimates (thus only MDBELA is presented in Fig. 8). However, MHAZUS gives higher and more variable social and economic loss values (Figs. 9 and 10). The economic and social losses for a daytime scenario can be seen in Table 6.
This type of socio-economic information, shown in Figs. 8, 9, and 10, can be very useful for emergency response planning, and it is encouraging that all methods show consistent patterns for both day-time and night-time events. Similar analyses can be undertaken by the user for the test case. The mean death toll is 4400 from MDBELA and 5800 from MHAZUS for correlated ground motions, and 3800 from SELENA for the median ground motion. It was concluded that the MHAZUS version had an unrealistic bias due to the pre-code assumption employed (it was decided that more of Zeytinburnu was built to pre-1975 quality, even if designed under post-1975 seismic coding). Thus, the complete damage ratio was greater, and more deaths and injuries were calculated than with SELENA. Since SELENA and MHAZUS are based on the same principles, the only differences were the ability to model correlated ground motions and the change in code assumption for the building stock. The SELENA building stock was based on the exact construction years of the buildings and a subsequent low-code assumption for that proportion of the building stock. However, the variance when calculating the correlated version was still larger than for MDBELA. SELENA did not give such a high level of completely damaged buildings, but more in the extensive damage range, thus reducing the casualty numbers, which are calculated based on completely damaged buildings, again using BU-ARC (2002). This correlates extremely well with the findings of Griffiths et al. (2007) that 99 % of the buildings in Zeytinburnu are extremely vulnerable. They relate this to the fact that Düzce had building collapse and extreme damage in around 40 % of buildings, and that this would be expected in Istanbul. They note that Zeytinburnu has a higher prevalence of 3-storey housing, which has been shown to be more vulnerable, and thus the numbers could be higher; these collapse figures are therefore supported. The casualty estimation seems reasonable when the values are compared to those of Gölcük in the Izmit earthquake (4428 deaths, despite being subjected to 0.82 g). Gölcük had a fatality ratio of 3.33 %; in this study, the fatality ratio for Zeytinburnu is 1.26 % (about 3 times less than Gölcük). Another study, by the LESSLOSS project, had a different scenario earthquake with 1.03 % completely damaged buildings, killing 484 people in the Zeytinburnu district, thus also agreeing in principle with this analysis (Spence et al., 2007). In an expert-panel weighting of the software packages, more weight should therefore be given to MDBELA. It uses exact building statistics for Turkish conditions based on Bal et al. (2008a, b), and thus reflects the building stock better; the extra detail captures the failure dynamics of buildings better than HAZUS. That being said, allowing for building stock variability and using more than one vulnerability method is preferred. The CSM is a previously tested method, and therefore significant weight should also be assigned to SELENA and MHAZUS. Based on participatory modelling of the quality of the ELE software package results, weights of 0.6 for MDBELA, 0.3 for SELENA and 0.1 for MHAZUS were assigned, based on popularity, use of Turkish conditions and vulnerability method details. MHAZUS was given a lesser weighting simply due to the code assumptions made. More work is required to build a methodology for the logic-tree calculation of risk results.
A reasonable median estimate of 4400 (1.3 %) deaths during the day was found, with a standard deviation of 3800 deaths depending on random variability within the ground motions. This reduced to 3400 (1.6 %) deaths during the night, given the lower night-time population of Zeytinburnu. It should be noted that this methodology uses only the BU-ARC (2002) casualty functions; more work must be done to create appropriate Turkey-specific casualty functions to obtain a more accurate result that does not rely on building typologies alone.

Conclusions
For the Zeytinburnu district, an earthquake of significant magnitude would be catastrophic; by using the information provided on the locations of social effects such as deaths and injuries, as well as the locations of infrastructure and lifeline damage, disaster response planning can be put in place to greatly reduce the number of casualties. ELER, MAEviz, and most other major software packages have attempted to model the Istanbul scenario earthquake. Policy is currently in place within Zeytinburnu to retrofit buildings within the district to seismic standards in order to reduce the approximately 4400 median deaths (MDBELA-based), with a possible range of 3766 deaths (by SELENA) to 5762 deaths (by MHAZUS) for a daytime scenario. Studies into mitigation strategies, such as those undertaken in the LESSLOSS project, will aid the selection of the building stock to focus on (Spence et al., 2007); this can be done at the district or geocell level. The Zeytinburnu district (building value €2.4 billion) will have repair costs for a mean disaster of approx. €1.6-1.8 billion, which is substantial; repair costs are higher in Turkey than in HAZUS for the USA due to Turkish post-earthquake rebuilding laws. Standard deviations over the 100 ground motions also provide a good prediction of the uncertainty of these figures for insurance and reinsurance. The authors believe that the 5762 deaths figure is too high, a direct result of the seismic code assumption; the values from MDBELA and SELENA are preferred, with the final value of 4400 median deaths preferred.
Using the OPAL procedure, enough knowledge can be gained to undertake an ELE for a desired test case anywhere in the world. Many ELE software packages have been produced globally, allowing reasonably accurate damage, social, and economic loss estimates of scenario earthquakes to be made. Displacement-based methods have been found to give less variability in results, but require a reasonable sample of building data to be useful. In areas where such a detailed building-stock dataset is not available to run displacement-based methods, capacity spectrum methods should be used. No single method can fully mimic losses, as has been seen in the recent Christchurch and Tohoku earthquakes. Thus, a combination of two or more software packages in a multi-tier approach is desirable for greater accuracy. Unfortunately, a large disclaimer must be placed: while all popularly used loss modelling routines are based on similar, consistent concepts, it is for the larger, less frequent earthquakes that data are required so that better modelling can be undertaken in the future.
This paper shows the process to obtain an earthquake loss estimate from first principles using open-source software. Additional work should be undertaken to correctly determine seismic code use and enforcement within Zeytinburnu by better defining which parts of the district should fall into which seismic code level of HAZUS (pre-, low-, moderate- or high-code).
OPAL is an ongoing open-source project, with further software production and data tools to be generated as part of a second phase, including further development of MDBELA and MHAZUS in open-source coding software. The use of the MCA for selecting suitable software packages has already been taken up by different programmers of earthquake loss estimation routines.

Figure 5. The components contributing to the MCA Decision Tool.


Figure 8. Left: geocell NEHRP site class (C, D or E), distance from the closest source (in km), and number of buildings in each geocell. Right: mean damage ratio per geocell for MDBELA, given the Bal et al. (2008a) damage ratios for Turkish settings.

Table 1 .
A synopsis of the components of 30 mostly open source worldwide ELE software packages.
1. Availability of Software Packages - closed or open source, availability and contact, method and documentation.
2. Update and Development Status - updated since 2007?, age, development status as of 2010.
3. Hardware and Software Needed - hardware required, source code software, licensed software needed.
4. Regional Applicability - applicable regions in the world, spatial level, test regions used for the software.

Table 2 .
Definition of the Turkish building stock typologies and corresponding HAZUS-based codes, following Bal et al. (2008a).

Table 4 .
Casualty rates for reinforced concrete buildings and masonry buildings, as adapted from BU-ARC (2002).

Table 5 .
Building damage % in HAZUS-based classes for the 3 ELE packages.

Table 6 .
Economic and social losses for daytime scenarios (population = 353 657) for median and correlated ground motions for 3 different ELE software packages.