2008 Flood

The GreatAmerica Building is buzzing with talk this week about an event 10 years ago. Actually, pretty much everyone in Cedar Rapids is remembering where they were and what they were doing during the 2008 Flood.

It was unprecedented. A “500 Year Flood” that impacted Cedar Rapids and dozens of other communities in Iowa. Cedar Rapids is the home of GreatAmerica, and we were unfortunate enough to be on the Cedar River when it spilled over its banks.

We were, however, fortunate to be located in the Midwest, where a can-do work ethic is built into the culture. So it is no wonder our team members rallied when disaster struck, and when the unpredictable happened, our teams adapted and did what was necessary to ensure our customers didn’t feel any disruption.

We learned a lot of lessons from the 2008 Flood, the most important being that work on a business continuity plan is never over.

Any business can face disaster, and nobody ever expects it to happen to them. To learn the steps we have taken to prepare for these situations, and our recommendations for other businesses that are interested in creating a business continuity plan, check out this blog: 9 Tips for Creating a Business Continuity Plan.

When we tell the story of the flood it makes sense to go back to the events leading up to the crest on June 13th, 2008.

Setting the Stage

Heavier than normal snowfall from late 2007 through early 2008 fell in much of Iowa. Cedar Rapids had 60 inches of snow, and cold temps kept the snow on the ground through winter.

This bank of stored water conspired with record levels of rainfall in early spring to bring significant flooding throughout the state in April of 2008.

The Days Leading to the Flood

Friday, June 6th

Flood warnings began on June 6th, with a notification from Iowa Homeland Security and Emergency Management that flooding at the levels seen in 1993 (a previous record flooding event) could occur. In the following days, rivers across the state began to flood.

Tuesday, June 10th

At 8:00 am on June 10th the river was at 14.26 feet with a projected crest of 21 feet on Thursday, June 12. Based on the possibility of water coming up through the storm drains and lapping the GreatAmerica Building, sandbagging began. That evening, the river reached 17 feet.

Wednesday, June 11th

Mid-morning on the 11th, the forecasted crest was adjusted to 24 feet. At the same time the GreatAmerica Executive Business Continuity Planning (EBCP) team met, formally declared an emergency, and put our business continuity plan in action. The City of Cedar Rapids also issued mandatory evacuation orders for businesses in the flood zone, including the GreatAmerica building.

Putting Business Continuity Plan in Action

During that initial meeting, the EBCP team enacted our plan, which identifies core teams to work from multiple recovery workspaces. The GreatAmerica IT teams began removing the systems they could without impacting customer activity. Employees began evacuating, with a few volunteers staying to respond to customer requests.

By the end of the day, water from the Cedar River was coming out of the storm drains, our toll-free numbers had been re-routed to our recovery workspace, and critical systems were being moved as well.


As night fell, Cedar Rapids was mostly a ghost town. Electricity to downtown was cut off by 10 pm. Sixty miles away, the IT teams worked furiously through the night to bring critical systems online.

Business as Usual

By the time core team members began arriving at the recovery workspace on the 12th, most production servers had been moved to the backup facility and brought online so data could be accessed. Our customers were mostly unaware of the chaos happening in Cedar Rapids, and when they asked, we let them know it was business as usual for them.

Backup Facility, Cedar Falls, June 12, 2008
Emergency Operations Center, Cedar Rapids, June 12, 2008

Meanwhile, leaders were communicating the plan to team members via individual phone calls, as well as through Emergency Notification “blasts.” Those who weren’t assigned to work from the recovery workspace were pitching in where they could while the systems teams worked to get more employees online.

A “500 Year Flood”

While the EBCP team was working around the clock to keep GreatAmerica operations serving customers, the flood waters kept rising, and the people who live and work in Cedar Rapids began to watch in horror and awe as their city went under water.

On Friday the 13th, the Cedar River crested at 31.2 feet, more than 11 feet higher than the previous record flood in 1929.

Getting Back Into the Building

While customers were still getting the “business as usual” experience from GreatAmerica, the flood waters receded, and on Tuesday, June 17th we were granted access to our building to assess the damage and estimate how long before we could move back. As we suspected, the first floor was greatly impacted by the flood waters. We had 8-1/2 feet of water in the building, submerging the critical building infrastructure and mechanicals (electrical, data, water pumps, etc.).

Working Through Adversity

The GreatAmerica Building was unoccupied for 68 days while repairs were made and the critical mechanical systems were replaced. Most employees were able to work from the field, including several recovery workspaces, home offices, and shared work sites in coffee shops and hotels.

Additional Backup Facility, Home Office

We used four remote locations, requiring shuttles running multiple shifts.

Measurement of Success

During the entire disaster, and even afterwards, we asked ourselves whether our response to this disaster was a success. Ultimately, the measure came from how well we were able to deliver services to our customers. We posted record numbers during those months, and some of our most loyal customers were unaware anything had happened because they saw no interruption in service.

What We Did Right & What We Learned

If hindsight is 20/20, the past 10 years have provided ample perspective on what went well and where we could improve.

A formal disaster recovery plan of the traditional (if this, then that) variety only makes sense if you can count on being able to control both the timing and the nature of the disaster – and doesn’t if you can’t. In other words, the only things that are really predictable about data center recovery are that the plan won’t apply to what actually happens, the recovery process will take longer and cost more than expected, and the whole thing will be far more chaotic and ad hoc than anyone ever wants to admit afterward.

Paul Murphy (Continuity Central Posting)

What Went Right

Good Planning

GreatAmerica had a plan. The plan wasn’t perfect and it wasn’t designed specifically for a flood. But talking about business continuity and disaster recovery over the years helped us understand the key pieces of a business continuity plan. 

Good Leadership

The EBCP Team went through annual exercises with the objective of extracting decisive behavior from what could be an analytical group. Additionally, our senior leaders represent the best of our culture and have broad, deep knowledge of our company and the industries we serve. That allowed our leaders to make major decisions quickly, knowing what the outcomes would be.

Team Members Embrace Our Principles

The GreatAmerica hiring process puts a heavy emphasis on hiring for culture. This disaster gave employees the opportunity to demonstrate character and tenacity. They rallied to ensure our customers never felt a single hiccup. Some worked long hours moving our systems to the recovery center and back. Others drove long distances to remote worksites, and still others offered their dining rooms as temporary work centers.

Good Luck

We also had luck on our side. There was no loss of life anywhere in the Cedar Rapids community as a result of the flood. The disaster declaration was timely, and our plan ensured critical resources were removed before evacuation. We were also lucky to have gained access within four days of the flood crest. During that first trip into the building, we were able to recover additional supplies, equipment and files that had originally been left behind. We were also fortunate to avoid any structural damage to the GreatAmerica Building.

What We Learned

Nothing tests a business continuity plan like a real disaster. There is no question we learned a lot about the places our plan needed some attention. Here are four things we learned as a result of the 2008 Flood.

More Employees Online is Better

We needed more employees online in a shorter period of time than we originally anticipated. While we had enough to resume our operations, there was pressure to add seats, computers and phones throughout the entire event.

Location, Location, Location

The location of your recovery work space is important. Too close to your business, and it could be impacted by the same disaster (flood, tornado, or hurricane). Too far away, and productivity is impacted by the travel time and the stress it puts on your employees. By the end of the flood event, GreatAmerica had several small recovery work space sites in operation, as far away as 70 miles.

Recovery Takes Longer Than Expected

We also learned that building restoration and re-entry took much longer than we anticipated.

The Plan is Never Complete

Since the 2008 flood we’ve changed our approach to recovery workspace, and we can now fit our entire staff in the backup facility. The most important lesson we learned is that business continuity planning is never done, and there are always improvements to be made.

WATCH NOW

We talked with Jim Burns, Vice President of IT, about business continuity and the three things he recommends when creating a plan. We hope you never have to use a business continuity plan, but if you do, the investment of time, talent, and resources is well worth it.