When the going gets tough, hams get going

Reprinted from Urgent Communications at http://urgentcomm.com/disaster-response/when-going-gets-tough-hams-get-going

Mar. 19, 2013
Merrill Douglas | Urgent Communications

A handheld radio, portable antennas, extra batteries and cables, a soldering iron, clean clothes, snack bars and a length of rope.

That’s some of what you’ll find in a “go-bag.” And if you’re one of the many amateur-radio operators who volunteer during local emergencies, you always keep a go-bag packed. When disaster strikes, you grab it and rush to a Red Cross shelter, an emergency operations center (EOC) or some other activity hub to do what you do best — get messages through, despite all sorts of obstacles.

They don’t often get a lot of publicity, but amateur-radio operators — or “hams” — play an important role in emergency response.

“They’re a prime example of a grassroots effort,” said Keith Robertory, manager of disaster response emergency communications at the American Red Cross in Washington, D.C. “They live where the disaster occurs, and they already have the equipment, the knowledge of the location and knowledge of how the disaster would impact that location. So they’re immediately there and can start doing work.”

Hams often swing into action well before a storm or other event causes havoc on the ground. During hurricane season in the Caribbean, for instance, hams in that region keep their eyes on the weather out their windows, said David Sumner, chief executive officer (CEO) of the American Radio Relay League (ARRL) in Newington, Conn. They use their radios to call in observations to the National Hurricane Center in Miami.

As the storm passes, it might knock out power and damage antennas, “so they rig another antenna, start up the generator, and they’re back in business,” Sumner said.

When hurricanes, blizzards, ice storms, earthquakes, tornadoes or other forces of nature cause widespread damage, hams get to work wherever they’re needed. In some cases, they transmit messages to take the place of two-way radio or phone systems that have been rendered inoperable in the aftermath of a disaster.

For instance, as Superstorm Sandy overwhelmed parts of the northeastern U.S. last October, some hams assisted regional hospital systems that had lost the ability to communicate among their buildings, Robertory said.

“Somebody would go to them and say, ‘We need this message passed to this building,'” he said. “They would get on the radio, call the amateur-radio operator in that other building, and give them the message.” The second operator then carried the message to the recipient.

Amateur-radio operators also help individuals contact family members, help the Red Cross conduct damage assessments and help get shelters established, Robertory said. For instance, people in a shelter might want to register on the Red Cross’s “Safe and Well” system to let family and friends know that they’re okay, but the shelter might not have power or Internet access at the time.

“An amateur-radio operator can call an amateur-radio operator somewhere else who has Internet access and relay information to put into a missing-persons database,” Robertory said.

Even when other networks are operating, ham operators take some of the load off those communications systems when traffic gets heavy.

Quick response

In the aftermath of Sandy, volunteers with the Greater Bridgeport Amateur Radio Club in Connecticut handled messages for three evacuation centers housing about 800 local residents.

“They were ready to take calls and dispatch people,” said Dana Borgman, press information officer for Region 2 of Connecticut Amateur Radio Emergency Service (ARES), a volunteer organization. “The messages could be about supplies, logistics — any kind of reports.”

Public-safety communications networks in Bridgeport were operating at the time, Borgman said. Ham radios supplemented those channels. But, if the phone system in a shelter stopped working, hams could step into the void.

“If someone in a shelter needed to make a request, they could call someone at a different point, such as the EOC,” Borgman said. “They’d establish communication and say, ‘I have a request from the shelter manager. We need 200 cots and more fresh water.'” An operator at the other end would relay the request to the appropriate person.

Members of ARRL’s New York City-Long Island section provided similar aid after Sandy. At the time, Jim Mezey — now manager of that section — held the emergency coordinator’s post. Because he lives in Nassau County on Long Island, he focused most of his attention there.

“I did a lot of traveling,” he said. “I was without power for a while, so I used my mobile station to do most of my work. I also moved to the county EOC and worked with the Radio Amateur Civil Emergency Services (RACES)” — another volunteer group. For the most part, however, section members provided services to the Red Cross.

Finding enough manpower during the emergency became a bit tricky, because many of the radio volunteers from Long Island live on the hard-hit South Shore, Mezey said.

“They had their own problems with floods and losing power,” he said. “Their batteries lasted only so long, and that was it. No gasoline, no way to get around.”

Of course, for volunteers whose homes were flooded, taking care of their own families took top priority, he said.

Amateur clubs can swing into action quickly because they maintain ongoing partnerships with myriad emergency-response organizations. The ARRL has developed memoranda of understanding with 13 national organizations, such as the American Red Cross, the Association of Public-Safety Communications Officials (APCO), the Salvation Army and the Federal Emergency Management Agency (FEMA). Many operators also take advantage of training opportunities.

“A lot of the amateur-radio operators are now becoming CERT (Community Emergency Response Team) members,” said Borgman. “Also, we encourage our members to take all of the ICS (Incident Command System) training.”

ICS training teaches operators about the structure of incident command and how to use standard terminology, rather than terms specific to police, firefighters, radio operators or other specialists.

Beyond delivering messages, hams offer a lot of miscellaneous technical assistance, some of which is quite ingenious, Robertory said.

“They like to ‘MacGyver’ things,” he said. “You’ll hear a lot of amateur-radio people say, ‘Give me a car battery, an antenna and a radio and I can communicate from anywhere.'”

In times of disaster, hams tend to be extremely flexible, Robertory said.

“In the morning, they’ll set up an antenna and start communicating,” he said. “They’ll set up a satellite dish for us, and then they’ll set up a computer. They’ll troubleshoot a printer, and then they’ll teach someone how to use the fax machine.”

Clearly, when the going gets tough, it’s great to have someone on hand with a go-bag, a radio — and the attitude of a ham.

Fairfax County Roundup

Here are my speaker notes from the Fairfax Roundup meeting.  The meeting is a great local event for building community relationships between faith- and community-based organizations and the local government entities.  There were five breakout sessions.  My session was about technology in disasters.

For those who attended, additional details on the topics I spoke about can be found in the following blog entries.

PACE: http://keith.robertory.com/?p=664

Social Media: http://keith.robertory.com/?p=802

Public Notifications: http://keith.robertory.com/?p=732

Radio Types and Bands: http://keith.robertory.com/?p=674

Cellular Communications: http://keith.robertory.com/?p=676

Feel free to email me if you have any questions.


Historic Information Breakdowns

Risk managers study causes of tragedies to identify control measures in order to prevent future tragedies.  “There are no new ways to get in trouble, but many new ways to stay out of trouble.” — Gordon Graham

Nearly every After Action Report (AAR) that I’ve read has cited a breakdown in communications.  The right information didn’t get to the right place at the right time.  After hearing Gordon Graham at the IAEM convention, I recognized that the failures stretch back beyond just communications.  Gordon sets forth 10 families of risk that can all be figured out ahead of an incident and used to prevent or mitigate the incident.  These categories of risk make sense to me and seemed to resonate with the rest of the audience too.

Here are a few common areas of breakdowns:

Standards: Did building codes exist?  Were they the right codes?  Were they enforced?  Were system backups and COOP testing done according to the standard?

Predict: Did the models provide accurate information?  Were public warnings based on these models?

External influences: How were the media, the public and social media managed?  Did they add positively or negatively to the response?

Command and politics: Does the government structure help or hurt?  Was the Incident Command System used?  Was situational awareness complete?  Was information shared effectively?

Tactical: How was information shared to and from the first responders and front line workers?  Did these workers suffer from information overload?


“Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained, as among savages, infancy is perpetual. Those who cannot remember the past are condemned to repeat it.”  — George Santayana

I add that in since few people actually know the source and quote it accurately.  Experience is a great teacher.  Most importantly, remembering the past helps shape the future in the right direction.

Below is a list of significant disasters that altered the direction of Emergency Management.  Think about what should be remembered for each of these incidents, and then how these events would have unfolded with today’s technology – including the internet and social media.

Seveso, Italy (1976).  An industrial accident in a small chemical manufacturing plant.  It resulted in the highest known exposure to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) in a residential population.  The local community was unaware of the risk.  It was a week before public notification of the release and another week before evacuations.

Bhopal Methyl Isocyanate Release (1984).  An industrial accident that released 40 tons of MIC.  There was no public warning.  The exact mixture of the gas was not shared, so the first responders did not know how to treat the public.

Chernobyl Nuclear Disaster (1986).  An explosion at the plant and subsequent radioactive contamination of the surrounding geographic area. Large parts of Europe and even North America were contaminated.  The Communist regime hid the initial information and did not share it until another country detected the contamination.

Hurricane Hugo (1989).  At the time, this was the costliest hurricane disaster.  There was an insufficient damage assessment that led to wrong resource allocation.  The survivors in rural communities were not located or reached for many days.  Much of the response was dependent on manual systems.

Loma Prieta (1989).  An M7 earthquake that injured around 3,800 people in 15 seconds.  Extensive damage also occurred in San Francisco’s Marina District, where many expensive homes built on filled ground collapsed and/or caught fire, and major roads and bridges were damaged.  The initial response focused on areas covered by the media.  Responding agencies had incompatible software and couldn’t share information.

Exxon Valdez (1989).  The American oil tanker Exxon Valdez struck Bligh Reef, causing a major oil spill.  The tanker did not turn rapidly enough at one point, causing the collision with the reef. This caused an oil spill of between 41,000 and 132,000 cubic meters, polluting 1,900 km of coastline.  Mobilization of response was slow due to “paper resources” that never existed in reality.  The computer systems in various agencies were incompatible and there was no baseline data for comparison.

Hurricane Andrew (1992).  Andrew was the first named storm and only major hurricane of the otherwise inactive 1992 Atlantic hurricane season. Hurricane Andrew was the final and third most powerful of the three Category 5 hurricanes to make landfall in the United States during the 20th century, after the Labor Day Hurricane of 1935 and Hurricane Camille in 1969.  The initial response was slowed due to poor damage assessment and incompatible systems.

Northridge Earthquake (1994).  This M6.7 earthquake lasted 20 seconds.  Major damage occurred to 11 area hospitals.  The damage left FEMA unable to complete assessments prior to distributing assistance.  Seventy-two deaths were attributed to the earthquake, with over 9,000 injured. In addition, the earthquake caused an estimated $20 billion in damage, making it one of the costliest natural disasters in U.S. history.

Izmit, Turkey Earthquake (1999).  This M7.6 earthquake struck in the overnight hours and lasted 37 seconds.  It killed around 17,000 people and left half a million people homeless.  The Mayor did not receive a damage report until 34 hours after the earthquake.  Some 70 percent of buildings in Turkey are unlicensed, meaning they were never approved under the building codes.  In this situation, the governmental unit that established the codes was separate from the unit that enforced the codes.  The politics between the two units caused the codes to not be enforced.

Sept 11 attacks (2001).  The numerous intelligence failures and response challenges during these events are well documented.

Florida hurricanes (2004).  The season was notable as one of the deadliest and most costly Atlantic hurricane seasons on record in the last decade, with at least 3,132 deaths and roughly $50 billion (2004 US dollars) in damage. The most notable storms for the season were the five named storms that made landfall in the U.S. state of Florida, three of them with at least 115 mph (185 km/h) sustained winds: Tropical Storm Bonnie and Hurricanes Charley, Frances, Ivan, and Jeanne. This is the only time in recorded history that four hurricanes affected Florida.

Indian Ocean Tsunami (2004). With a magnitude of between 9.1 and 9.3, it is the second largest earthquake ever recorded on a seismograph. This earthquake had the longest duration of faulting ever observed, between 8.3 and 10 minutes. It caused the entire planet to vibrate as much as 1 cm (0.4 inches) and triggered other earthquakes as far away as Alaska.  There were no warning systems in the Indian Ocean, a problem compounded by an inability to communicate with the population at risk.

Hurricanes Katrina and Rita (2005).  At least 1,836 people lost their lives in the actual hurricane and in the subsequent floods, making it the deadliest U.S. hurricane since the 1928 Okeechobee hurricane.  There were many evacuation failures due to inadequate consideration of demographics.  Massive communication failures occurred with no alternatives considered.


Satellite Comms and Antennas

Satellite Communications

Satellites provide a valuable link during disasters since they require no local terrestrial infrastructure beyond where you are setting up.  Cell phones require cell towers within a few miles to be working and not overloaded.  Wireline services require a connection through the disaster to where you are.  Satellite systems do require a power source.  Depending on the size, it can be a vehicle’s 12 volt power outlet, a portable generator or a vehicle-mounted generator.

A satellite is in an orbit around the Earth.  There are many different ways to position a satellite in orbit depending on the need.  A common orbit for communication satellites is a geostationary orbit 22,236 miles above the Earth.  Precision is needed when working with satellites at that distance.  One degree off and the satellite will be missed by 388 miles.  That is like aiming to land in Washington DC and really ending up in Detroit or Boston.
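That 388-mile figure can be sanity-checked in a couple of lines: the miss distance is simply the pointing error, converted to radians, multiplied by the distance to the satellite.

```python
import math

# Distance to a geostationary satellite from the Earth's surface (miles)
GEO_DISTANCE_MILES = 22_236

def pointing_error_miles(error_degrees: float,
                         distance_miles: float = GEO_DISTANCE_MILES) -> float:
    """Arc length swept by a small pointing error at a given distance."""
    return distance_miles * math.radians(error_degrees)

# One degree off at geostationary distance:
print(round(pointing_error_miles(1.0)))  # → 388
```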


The antenna used makes a big difference.  Let’s start by looking at a two-way radio antenna.  Most handheld two-way radios have an omni-directional antenna.  That means it doesn’t favor any specific direction so orientation doesn’t matter as much.  This gain in flexibility is matched with a loss in “punch” or sending power.  Imagine a basic light bulb in a lamp with no shade.  It spreads light everywhere.  That’s how an omni-directional antenna works.

What if you want that light to be focused to only project sideways?  Like an all around white light on a boat.  The bulb and the lens are constructed to direct the light in a specific pattern.  This is similar to a high gain antenna.  The main punch of the radio signal is increased perpendicular to the antenna by reducing the energy projected parallel out the top and bottom of the antenna.

Now you want to project light in a single focused direction such as a spotlight or flashlight.  The bulb is constructed with reflectors and other features to direct the light.  The same is true with radio antennas.  A directional antenna is also called a beam or Yagi-Uda antenna.  Most of your neighborhood roof-mount TV antennas take this form.  A series of metal rods directs the radio waves to a tighter focus than a single rod can achieve.

Wait, a TV antenna?  I thought those were receiving only?  The neat thing about antennas is that they receive with the same characteristics as they transmit.  A highly directional antenna is more sensitive and will better pick up a signal from its pointed direction than an omni-directional antenna.  However, if the same signal comes from a different direction than where the directional antenna is pointed, then the omni-directional antenna will receive it better.

So why don’t we always use directional antennas?  Think back to two-way radio repeaters.  The repeater’s antenna is an example of when you want to broadcast the signal widely.  Directional antennas are good for communications between two known locations.  Omni-directional antennas are good when you don’t know where the other location is, or it keeps changing and moving the antenna continually is impractical.

Going back to our analogy of light to describe radio waves, now imagine that you need a highly focused light.  A laser pointer is designed to send out a highly focused beam of light that can be seen for long distances.  The radio version of this is the satellite dish.  The transmitter bounces the signal off a parabolic reflector, which theoretically sends all the energy in the same direction in a narrow beam.  These very narrow-focus antennas are called “very small aperture terminals” (VSAT).

This series of examples is just discussing the shape of the antenna relative to the direction and focus of the radio waves.  It is possible to use practically any frequency with any shape of antenna so long as the antenna is properly tuned.  Different radio bands are naturally more efficient for communication modes when combined with certain types of antennas.

The important thing to remember here is just because you can do something doesn’t mean you want to do it.  This is where you need to rely on your radio technicians to design the most effective system using the right frequencies and modes to get the message to the final destination.


Getting a little more technical on satellites

You’ll probably want to start with the entry on satellite communication and antennas before this one.


Orbits are put into four levels based on altitude.  Low Earth Orbit is about 100 miles to 1240 miles.  The short distance allows low-power devices with omni-directional antennas to be used.  The most common example is satellite phones, like Iridium.  These satellites circle the Earth in 90 minutes.  From any spot on the Earth, the satellite will only be visible overhead for 10 minutes.

Being visible isn’t referring to seeing it with your naked eye.  Visible means a direct line-of-sight view of a spot so communications can occur.  It also assumes a large sky, such as being in Montana or the open ocean.  Clutter blocking the sky reduces satellite visibility.  This includes hills, mountains, trees, buildings and other obstructions, or being in a low spot like a valley.  It is nearly impossible to use satellite equipment in downtown New York City at ground level due to all the buildings.

Medium Earth Orbits are 1240 miles to under 22,236 miles.  Depending on altitude, satellites at this level orbit the Earth in anywhere from about two hours to nearly a full day, which extends the visibility overhead to hours at a time.  GPS satellites, which circle the Earth roughly every 12 hours, operate in this orbit.

Geosynchronous Orbits are satellites at 22,236 miles.  It takes a full day to orbit the Earth so the satellite will appear in the same spot of the sky once a day.

Geostationary Orbits are satellites at 22,236 miles and parallel to the equator.  Since the satellite is moving at the same speed the Earth is rotating and in the same plane as the rotation, the satellite is in the same spot of the sky all the time.  This is the most popular orbit to park a satellite in.

High Earth Orbits are satellites above 22,236 miles.  They are not commonly used for our purposes.
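The periods quoted for these orbit levels follow from Kepler’s third law.  A small sketch (using Earth’s standard gravitational parameter) reproduces the roughly 90-minute low orbit and the one-day geostationary orbit:

```python
import math

MU_EARTH = 398_600.4418      # km^3/s^2, Earth's gravitational parameter
EARTH_RADIUS_KM = 6_378.137  # equatorial radius
MILES_TO_KM = 1.609344

def orbital_period_hours(altitude_miles: float) -> float:
    """Period of a circular orbit at the given altitude (Kepler's third law)."""
    a = EARTH_RADIUS_KM + altitude_miles * MILES_TO_KM  # semi-major axis, km
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 3600

print(round(orbital_period_hours(100) * 60))  # LEO at 100 miles: ~88 minutes
print(round(orbital_period_hours(22_236)))    # geostationary: ~24 hours
```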

Footprints, beams and look angles

A satellite’s footprint is the circular area on the surface of the Earth that is visible to the satellite.  This is the potential area of coverage that the satellite can communicate with.  Areas directly under the satellite will receive a stronger signal than those in the fringe areas due to the increased distance and atmospheric interference.

Satellite operators want efficient use of their equipment so they use beams, or specifically focused transceivers, to cover areas within the footprint.  Imagine a satellite positioned above the equator roughly centered on the United States.  The satellite operator’s intended audience is maritime users.  They would focus the beams toward the waters of the Atlantic, Pacific, Gulf and Great Lakes; and away from inland areas knowing that there are few oceangoing freighters in Colorado.  No energy is wasted trying to fill a part of the satellite’s footprint where it will never be used.  When evaluating a satellite for use, you need to consider the beams and not the total footprint.

The more transceivers a satellite has, the more traffic it can handle simultaneously.  Satellite operators rate their satellites by the total cumulative traffic they can handle simultaneously through the entire satellite.  The other factor when evaluating service is how much traffic a specific beam can handle.  In normal daily use, it is hard to overload a single spot beam as the resources are geographically dispersed.  A catastrophic disaster will bring many of these resources to a single geographic location, all trying to use the same beam.  That is when the beam will be overloaded.

Outages and overloads on satellite services are common during major hurricanes such as Katrina, Rita, Gustav and Ike.  This is most common on shared satellite services.  Consider the satellite user density that occurs with the convergence of local, state and Federal responders; media and observers; utility companies restoring service; private companies COOPing; and NGOs, CBOs and FBOs responding as well.  Many of these rely on some form of satellite service.  Paying for dedicated satellite airtime is quite costly, especially when it is only used occasionally.  Now that a disaster has occurred, they all want to use it at the same time.  In many ways, a satellite in orbit is similar to a cell tower: both are designed to maximize revenue efficiently for normal use, and extreme circumstances quickly exceed the designed capacities.

Finding a satellite in the sky is done through look angles.  These measurements are unique based on the observer’s location.  With geostationary satellites, the look angles will remain constant so long as the observer’s location remains constant.  A look angle is made up of three parts: the azimuth, elevation and polarization.  The azimuth is the compass direction (0-360°).  The elevation is how high to look up (0-90°).  Polarization is rotating the transmitter to align the radio waves with the satellite.

Here’s an exercise.  Imagine that we are using Intelsat’s G-18 satellite located at 123° West.  This is a geostationary satellite, so we know that it will be above the equator and 22,236 miles up.  123° West is near the California coast.  If we are in San Francisco, the azimuth would be 180° and elevation 46°.  The higher the elevation, the easier it is to clear trees and other obstacles.  Move to St. Thomas, USVI; the azimuth is 258° and elevation is 22°.  St. Thomas is an island with hilly peaks in the middle, so a satellite shot is unlikely from the NE side of the island due to the low look angle.  Change our location again to Boston; the azimuth becomes 242° and elevation drops to 19°.  The same situation occurs in Maine, where the elevation is very close to the horizon.  We were lucky during an operation in Maine and set up headquarters at a military airbase that had a runway near the same angle we needed.
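The look angles in this exercise can be computed from the observer’s latitude and the longitude difference to the satellite.  The sketch below is a simplified spherical-Earth model (it ignores observer altitude and polarization, and assumes a northern-hemisphere observer); the coordinates are approximate, but it matches the exercise to within a degree.

```python
import math

GEO_RADIUS_KM = 42_164.0    # orbital radius of a geostationary satellite
EARTH_RADIUS_KM = 6_378.0

def look_angles(obs_lat: float, obs_lon: float, sat_lon: float):
    """Azimuth and elevation (degrees) to a geostationary satellite.

    Longitudes are signed degrees (West negative). Northern hemisphere only.
    """
    dlon = math.radians(abs(sat_lon - obs_lon))
    lat = math.radians(obs_lat)
    # Central angle between the observer and the sub-satellite point
    cos_beta = math.cos(lat) * math.cos(dlon)
    sin_beta = math.sqrt(1 - cos_beta**2)
    elevation = math.degrees(
        math.atan((cos_beta - EARTH_RADIUS_KM / GEO_RADIUS_KM) / sin_beta))
    offset = math.degrees(math.atan(math.tan(dlon) / math.sin(lat)))
    # Satellite west of the observer: look south-west; east: south-east
    azimuth = 180 + offset if sat_lon < obs_lon else 180 - offset
    return azimuth, elevation

# Intelsat G-18 at 123° West, as in the exercise:
print(look_angles(37.77, -122.42, -123.0))  # San Francisco
print(look_angles(42.36, -71.06, -123.0))   # Boston
```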

Bands and frequencies

Just as two-way radios have a number of different bands with different characteristics, so do satellites.  Satellites operate at higher frequencies than two-way radios.  A number of these frequencies are shared with terrestrial services.  For example, the S Band (2-4 GHz) includes satellite radio (Sirius) as well as Wi-Fi and Bluetooth signals.

Inmarsat BGANs operate on the L-band (1-2 GHz).  These terminals are small, easy to point and offer decent global coverage.  The major issue with BGANs is the low data throughput (32 to 256 kbps) and high cost.

C-Band (3.7-8 GHz) is well known for the “direct to home” TV signals using larger dishes of 2 – 3½ meters in diameter.  The downside is that C-band has power restrictions and receives interference from microwave services.

Ku-Band (12-18 GHz) doesn’t have the power restrictions of the C-band and is used by the DirecTV system.  The main challenge of Ku-band is its nearness to the resonant frequency of water.  This means that water absorbs the radio waves, reducing the strength of the signal.  This is commonly called rain fade.  If you have DirecTV, you’ve experienced this when your signal goes out during heavy rain storms.  The wavelength absorption peaks at 22.2 GHz.  For non-technical purposes, think of it this way: the subscript u represents being under this peak, and the subscript a of the Ka-band represents being at or above this peak.

These characteristics are important considerations depending on how satellite service will be used.  A Ku-band service will not help you for communications during the storm, but it will have the fastest speeds before and after the storm.  The C-band could work in the storm, but the size of the dish makes portability unlikely and temporary setups risky in high winds.  L-band will get through nearly all the time, but only at relatively slow speeds and high cost.
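These trade-offs can be boiled down to a rough decision sketch.  This only restates the rules of thumb above; real link planning would also weigh cost, throughput and dish logistics in detail.

```python
def pick_band(during_storm: bool, needs_portability: bool) -> str:
    """Rough satellite-band choice per the trade-offs described above."""
    if during_storm:
        # Ku-band suffers rain fade; L-band gets through nearly all the time,
        # and C-band can work if a large fixed dish is practical.
        return "L-band" if needs_portability else "C-band"
    # Before/after the storm, Ku-band offers the fastest speeds.
    return "Ku-band"

print(pick_band(during_storm=True, needs_portability=True))   # L-band
print(pick_band(during_storm=False, needs_portability=True))  # Ku-band
```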


If radio waves were visible light

There is a lot more in common between radios and cell phones than most people expect.  It can be hard to see similarities when the user interfaces are designed so differently.  Fundamentally, they both have a power source that drives the device to generate a signal across an antenna.  In turn, the antenna generates radio waves that run through the atmosphere until they hit another antenna attached to a receiver.

If you could see radio waves, they’d appear as if we had hundreds of lights turned on all around us.  We’d see the waves coming off our cell phones, wifi-enabled devices, Bluetooth devices, wireless phones, cellular-enabled tablets and hot spots.  Also visible are the radio waves from your neighbors’ equipment coming right through your walls as if the walls weren’t even there.  The wireless baby monitor would probably appear just as annoying as the tantruming child.  Larger sources of radio waves would emanate from cell towers.  Way off in the distance, AM and FM towers would glow like a sun.  Even the fast food drive-through isn’t immune, due to the wireless headsets and speakers.  Look to the sky and you’ll see the satellites sending their signals to the earth.  Right above the equator, the concentration of transmitting satellites would resemble the Milky Way.  Add in all the natural sources and unintended sources from poorly designed electrical systems to really complete the image.  No lie.  Radio waves are everywhere.

In the US, the National Telecommunications and Information Administration sets the broad allocation of the spectrum and how it can be used.  They publish the US Frequency Allocations: The Radio Spectrum chart.  It is very finely divided, yet you’ll still see major sections allocated to broadcasting.  Spectrum is a finite resource.  We cannot create any more, and all of it is allocated to something.  That is why spectrum management is so important.  Broadcasting has had to make better and more efficient use of the spectrum to keep it.  Hence the evolution of HD Radio, which by the way is hybrid digital, not high definition.  It also led to the use of Digital TV to include more information and resolution in the TV station’s broadcast.

At the bottom of this chart is the full spectrum.  Near the left end are the audible wavelengths; the middle contains a very narrow band of the visible spectrum; and the far right is cosmic rays.  The continuous range of frequencies (and then some) is called “DC to daylight”.  DC refers to direct current, or 0 Hertz.  Daylight refers to the band of visible light, starting about 405 THz.  THz is terahertz, or 10^12 Hertz.  If you’re used to the metric system, Tera comes after Giga.  A radio that does “DC to daylight” isn’t literal; it refers to a radio that will continuously cover all possible radio bands.  Keep in mind that the more bands (frequency ranges) a radio covers, the less impressively it can master a single band.  Think of it this way: a Swiss army knife provides a lot of tools which are better than nothing, but far less handy than having the actual tool needed.
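The conversion behind these figures is just wavelength = c / frequency.  A one-liner confirms that 405 THz sits at roughly 740 nanometers, the red edge of visible light:

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in meters for a given frequency in hertz."""
    return SPEED_OF_LIGHT / freq_hz

# 405 THz, the start of visible light, works out to ~740 nanometers:
print(round(wavelength_m(405e12) * 1e9))  # → 740
```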


Additional reading

National Telecommunications and Information Administration. (2003). U.S. Frequency Allocation Chart.  Retrieved from http://www.ntia.doc.gov/osmhome/allochrt.html



PACE and Interoperable Communications

Primary, Alternate, Contingency, Emergency

PACE is a structure to build a communications plan.  The key to a good communications plan is that everyone has a basic idea of what will be attempted and when.  It makes no sense to be standing by a fax machine while someone is trying to call you on the radio.

Primary is the day-to-day communication system used.  This could be a desk phone or cell phone for most businesses.  It can be the two-way radio system used in public safety.  Primary is the first way that you attempt to reach someone during routine times.

Alternate is the next-used system.  If you normally call a person at their desk and they are not there, the next step might be to call their cell, use a radio, page them, or call their home.  It could even be email, text messaging, whatever; there is no single right answer.  It will all depend on what systems your organization uses.  The right answer will be the consistent one that everyone knows.  This avoids the “Oh, I was listening for you on the radio; I didn’t think to check email” confusion.

Contingency is the system that you fall back to when the main methods of communicating are not working or not able to reach the person.  Normally, when this point is reached, it is obvious that something not routine is going on.  When including radios in your plan, make certain that predefined frequencies and modes have been agreed on and shared.  Saying generically that you’ll use amateur-band or business-band two-way radio is like telling someone that you’ll meet in Virginia without being any more specific about the address.

Emergency is the system of last resort.  When nothing else is working, expect to pull this out.  There are two important things to note here.  Anything kept behind glass that says “break in case of emergency” will not work.  The equipment will not be tested and the users not trained.  Make certain that your emergency communications systems are regularly tested and used for highest impact during a disaster.  Second, sneaker-net is a valid communication system.  Setting up runners, shuttles or other ways to manually carry messages is fine.  Sometimes the best technology to use is none at all.
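A PACE plan is essentially an ordered fallback list, tried top to bottom.  Here is a minimal sketch; the channel names and success/failure conditions are hypothetical, and the last tier is the sneaker-net described above:

```python
from typing import Callable, List, Tuple

def send_via_pace(message: str,
                  plan: List[Tuple[str, Callable[[str], bool]]]) -> str:
    """Try each communication method in PACE order; return the one that worked."""
    for name, send in plan:
        if send(message):
            return name
    raise RuntimeError("All PACE tiers failed")

# Hypothetical plan: the primary (desk phone) is down, the alternate works.
plan = [
    ("primary: desk phone", lambda msg: False),   # phone lines are out
    ("alternate: cell phone", lambda msg: True),  # cell network still up
    ("contingency: VHF radio", lambda msg: True),
    ("emergency: runner", lambda msg: True),      # sneaker-net, last resort
]
print(send_via_pace("Need 200 cots", plan))  # → alternate: cell phone
```

The value is less in the code than in the agreement it encodes: everyone tries the same methods in the same order, so no one is standing by the fax machine while a call comes in on the radio.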

Interoperable Communications

Interoperable communications is not a technical problem; it is a political problem.  The technology exists today (and is widely used) to interconnect any number of systems to each other.  The political problem comes in when teams from different organizations are in direct contact in a way that bypasses the “normal” chain of command.  Regardless of how much interoperability exists, the Police Chief wants authority over all the police units and the Fire Chief wants authority over all the fire units.

Everyone wants interoperable communications, but who do they really want to talk with?  Do they want the ability to broadcast information across many channels, or is it for two-way exchanges between anyone?

Interoperability requires pre-disaster decisions to be made.  Who is authorized to activate or start using the interoperability channels?  Who has the authority to control radio traffic on the shared channel?  When units are engaged on the interoperability channels, do they have an expectation to monitor or check in on their normal primary channel?  The Incident Command System appears to resolve these problems, but only within the scope of the incident itself.  These political pitfalls exist outside ICS.  Major incidents can be divided across a number of channels, so interoperability isn’t just one channel but a whole suite depending on local plans.

Ten-digit interoperability: The phone system is a communication system, and totally interoperable.  You give me your phone number and I’ll give you mine.  It is simple and works everywhere in the US.  The basic telephone number is still the foundation of voice communication regardless of whether it is land-line, cellular, or satellite.

I like to tell people that I do INTRA-operable communication.  If I can get my organization to talk to itself, then most of my work is done.