Where to find more information

Here is a list of readings that I’ve compiled for my classes and students over the years on topics relating to the intersection of technology and disaster, crisis, and risk management.  This list is kept current as older material becomes obsolete and new material becomes available.  Let me know what you think are solid reference materials on these topics.

Multimodal Learning

Educators are constantly searching for more efficient and effective ways to advance student learning.  Thus it is no surprise that they have been interested in the often-quoted saying that:

We remember…

  • 10% of what we read
  • 20% of what we hear
  • 30% of what we see
  • 50% of what we see and hear
  • 70% of what we say
  • 90% of what we say and do

Unfortunately, these oft-quoted statistics are unsubstantiated.  This article, Multimodal-Learning-Through-Media, breaks the myth of the “cone of learning,” which claims that people remember only 10% of what they read.

The Myths created by the Mercator Projection

The Mercator Projection is the biggest myth about the Earth that we pass on (often unknowingly) to our children.  OK, I don’t know if it is the biggest, but it certainly builds the wrong perception of the globe.

The Mercator projection was originally designed in the mid-1500s.  It was highly useful because it kept course lines constant: a ship’s navigator could plot a course as a straight line from one port to another.  No map projection can keep all features accurate, so the Mercator projection distorts the size and shape of large objects.  Land masses near the Equator appear at their true relative size, while land masses near the poles are magnified significantly.
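The distortion is easy to quantify: on a Mercator map the linear scale at latitude φ is sec(φ), so apparent area grows by sec²(φ).  A short Python sketch (the function name is my own) shows why a landmass at high latitude, such as Greenland at roughly 70°N, looks many times larger than it should:

```python
import math

def mercator_area_inflation(lat_deg):
    """Factor by which the Mercator projection inflates apparent area
    at a given latitude, relative to the Equator: sec^2(latitude)."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

for lat in (0, 30, 60, 70, 80):
    print(f"{lat:2d}N: area shown {mercator_area_inflation(lat):5.1f}x too large")
```

At 60° the inflation is already a factor of four; near the poles it grows without bound, which is why Antarctica appears as an enormous band along the bottom of the map.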

Image Manipulation

Truth in advertising should extend to photo manipulation too. Here are a few videos that highlight the point:

A great example of the power of Photoshop in the right hands.


Here’s an interesting (and honest) video from McDonald’s on their advertising: http://youtu.be/oSd0keSj2W8

Another one called “The Photoshop Effect”: http://youtu.be/YP31r70_QNM

And finally this one: http://youtu.be/17j5QzF3kqE

But then, not everything is fake: http://youtu.be/2Pd_ZHw9xH0


[Update: I’ve added more at Image Manipulation, part 2]

A little more on usability

About two years ago, I wrote a blog post regarding usability. This video adds to that, including my thoughts on BYOD (Bring Your Own Device) and its impact on disaster technology. Regardless of how the future rolls out, advances in technology should not make things more complex for users. In fact, the additional computing power needs to be used to make work easier for them.

Historic Information Breakdowns

Risk managers study causes of tragedies to identify control measures in order to prevent future tragedies.  “There are no new ways to get in trouble, but many new ways to stay out of trouble.” — Gordon Graham

Nearly every After Action Report (AAR) that I’ve read has cited a breakdown in communications.  The right information didn’t get to the right place at the right time.  After hearing Gordon Graham at the IAEM convention, I recognized that the failures stretch back beyond just communications.  Gordon sets forth 10 families of risk that can all be figured out ahead of an incident and used to prevent or mitigate it.  These categories of risk make sense to me and seemed to resonate with the rest of the audience too.

Here are a few common areas of breakdowns:

Standards: Did building codes exist?  Were they the right codes?  Were they enforced?  Were system backups and COOP testing done according to the standard?

Predict: Did the models provide accurate information?  Were public warnings based on these models?

External influences: How were the media, the public, and social media managed?  Did they add positively or negatively to the response?

Command and politics: Did the government structure help or hurt?  Was the Incident Command System used?  Was situational awareness maintained?  Was information shared effectively?

Tactical: How was information shared with and gathered from the first responders and front-line workers?  Did these workers suffer from information overload?


“Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained, as among savages, infancy is perpetual. Those who cannot remember the past are condemned to repeat it.”  — George Santayana

I include the full quotation since few people actually know the source or quote it accurately.  Experience is a great teacher.  Most importantly, remembering the past helps shape the future in the right direction.

Below is a list of significant disasters that altered the direction of Emergency Management.  Think about what should be remembered from each of these incidents, and then how these events would have unfolded with today’s technology, including the internet and social media.

Seveso, Italy (1976).  An industrial accident at a small chemical manufacturing plant.  It resulted in the highest known exposure of a residential population to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD).  The local community was unaware of the risk.  It was a week before the public was notified of the release and another week before evacuations began.

Bhopal Methyl Isocyanate Release (1984).  An industrial accident that released 40 tonnes of MIC.  There was no public warning.  The exact mixture of the gas was not shared, so the first responders did not know how to treat the public.

Chernobyl Nuclear Disaster (1986).  An explosion at the plant and subsequent radioactive contamination of the surrounding area. Large parts of Europe and even North America were contaminated.  The Communist regime hid the initial information and did not share it until another country detected the radiation.

Hurricane Hugo (1989).  At the time, this was the costliest hurricane disaster.  An insufficient damage assessment led to wrong resource allocation.  Survivors in rural communities were not located or reached for many days.  Much of the response depended on manual systems.

Loma Prieta (1989).  A M7 earthquake that injured around 3,800 people in 15 seconds.  Extensive damage occurred in San Francisco’s Marina District, where many expensive homes built on filled ground collapsed and/or caught fire; major roads and bridges were also damaged.  The initial response focused on areas covered by the media.  Responding agencies had incompatible software and couldn’t share information.

Exxon Valdez (1989).  The American oil tanker Exxon Valdez struck Bligh Reef, causing a major oil spill.  The tanker did not turn rapidly enough at one point, causing the collision with the reef. The spill of between 41,000 and 132,000 cubic meters of oil polluted 1,900 km of coastline.  Mobilization of the response was slow due to “paper resources” that never existed in reality.  The computer systems in various agencies were incompatible, and there was no baseline data for comparison.

Hurricane Andrew (1992).  Andrew was the first named storm and only major hurricane of the otherwise inactive 1992 Atlantic hurricane season. It was the final and third most powerful of three Category 5 hurricanes to make landfall in the United States during the 20th century, after the Labor Day Hurricane of 1935 and Hurricane Camille in 1969.  The initial response was slowed by poor damage assessment and incompatible systems.

Northridge Earthquake (1994).  This M6.7 earthquake lasted 20 seconds.  Major damage occurred at 11 area hospitals.  The extent of the damage left FEMA unable to assess it before distributing assistance.  Seventy-two deaths were attributed to the earthquake, with over 9,000 injured. In addition, the earthquake caused an estimated $20 billion in damage, making it one of the costliest natural disasters in U.S. history.

Izmit, Turkey Earthquake (1999).  This M7.6 earthquake struck in the overnight hours and lasted 37 seconds.  It killed around 17,000 people and left half a million homeless.  The mayor did not receive a damage report until 34 hours after the earthquake.  Some 70 percent of buildings in Turkey are unlicensed, meaning they were never approved under the building codes.  In this case, the governmental unit that established the codes was separate from the unit that enforced them, and the politics between the two caused the codes to go unenforced.

Sept 11 attacks (2001).  The numerous intelligence failures and response challenges during these three events are well documented.

Florida hurricanes (2004).  The season was notable as one of the deadliest and costliest Atlantic hurricane seasons on record in the last decade, with at least 3,132 deaths and roughly $50 billion (2004 US dollars) in damage. The most notable storms of the season were the five named storms that made landfall in the U.S. state of Florida, three of them with at least 115 mph (185 km/h) sustained winds: Tropical Storm Bonnie and Hurricanes Charley, Frances, Ivan, and Jeanne. This is the only time in recorded history that four hurricanes affected Florida.

Indian Ocean Tsunami (2004). With a magnitude between 9.1 and 9.3, it is the second largest earthquake ever recorded on a seismograph. The earthquake had the longest duration of faulting ever observed, between 8.3 and 10 minutes. It caused the entire planet to vibrate as much as 1 cm (0.4 inches) and triggered other earthquakes as far away as Alaska.  There was no warning system in the Indian Ocean, a problem compounded by an inability to communicate with the population at risk.

Hurricanes Katrina and Rita (2005).  At least 1,836 people lost their lives in the hurricane itself and in the subsequent floods, making Katrina the deadliest U.S. hurricane since the 1928 Okeechobee hurricane.  There were many evacuation failures due to inadequate consideration of the demographics.  Massive communication failures occurred, with no alternatives considered.


Additional resources


Data in standard uniforms

Data standards

Standards are a common language for discussing and sharing data; they can be formally approved or ad hoc.  A standard is defined by the people who use it.  That is key.  In the end, it doesn’t matter whether the standard is approved by a governing body.  What matters is that the people who use it agree to it.  When used properly, standards save time and money and ensure quality and completeness.

In a meeting about missing persons’ data standards, it was stated that if the Red Cross, Facebook, and Google agreed on a standard for sharing data, then everyone else would follow: not because the three organizations form a governing committee, but because they would be the three largest players in the space.

Data standards make it possible for you to share data within and between organizations.  They make it possible to compare different sets of data for improved analysis.  They form the basis of data infrastructure (framework for collecting, storing and retrieving data).
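As a toy illustration of that mechanic (the field names below are invented for the example, not taken from any real standard), here is a Python sketch of why agreement matters: organizations can only pool their records when each record conforms to the agreed-upon fields.

```python
# Hypothetical agreed-upon standard: every shared record must carry
# these fields with these types.  The names are illustrative only.
REQUIRED_FIELDS = {"name": str, "last_seen_date": str, "status": str}

def conforms(record):
    """True if a record satisfies the agreed field standard."""
    return all(
        field in record and isinstance(record[field], ftype)
        for field, ftype in REQUIRED_FIELDS.items()
    )

def merge_feeds(*feeds):
    """Pool records from several organizations, keeping only those
    that speak the common language of the standard."""
    return [rec for feed in feeds for rec in feed if conforms(rec)]

feed_a = [{"name": "J. Doe", "last_seen_date": "2010-01-13", "status": "missing"}]
feed_b = [{"person": "R. Roe"}]  # nonstandard field name: cannot be pooled

print(len(merge_feeds(feed_a, feed_b)))  # only the conforming record survives
```

In practice a real standard also pins down formats and vocabularies (date formats, allowed status values), but the principle is the same: shared structure is what makes the merge and the comparison possible at all.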

Here are a few examples of data standards: