Data Security | March 2013 Digital Edition

In the Wake of Superstorms, Keeping School Data Safe

Keeping school data safe requires an evolving strategy that's equal parts paranoia, smart use of newer technologies, and taking the pain and turning it into gain.

Memorial School in Union Beach, NJ, was flooded by Superstorm Sandy, but the school's data stayed safe. (photo courtesy of Dennis Duarte)

This article, with an exclusive video interview, originally appeared in T.H.E. Journal's March 2013 digital edition.

When Superstorm Sandy engulfed Union Beach, NJ, the district's lone school took the brunt. Memorial School, the highest point in town, had always served as the evacuation center in times of trouble; this time it was doused in a slushy cocktail of street runoff, sewage, and salt water. Touring the building the next day, Superintendent Joe Annibale knew his students "were not returning to this school any time soon." Fortunately, student data could follow them and the school's staff wherever they went (in this case, to three other locations), because Dennis Duarte, the district's lone technology administrator, had taken some basic precautions to ensure that the data was protected.

Duarte was in the habit of backing up his local servers every day, but before the storm hit he took the extra step of performing an off-site backup too--something he usually did every couple of weeks. The off-site backup entailed copying all of the district's data onto 1-terabyte portable drives. "Since we're such a small district, it wasn't a lot to back up," he explains. "I copied everything over and threw it in my book bag when I went home on Friday. And I thought, 'This is silly. I'm wasting a lot of time doing this. I'm not even going to need to use this.'" As simple as Duarte's approach sounds, it turned out to be just the right decision for this situation.
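A routine like Duarte's is simple enough to script. The sketch below is a minimal illustration, not the district's actual procedure: the paths are hypothetical, and the SHA-256 verification step is an embellishment the article doesn't describe, added so a silent copy error to the portable drive can't go unnoticed.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_and_verify(src: Path, dst: Path) -> list:
    """Copy every file under src to dst, then re-hash both sides.

    Returns the files whose copies did not match -- an empty list
    means the portable drive holds a faithful copy.
    """
    mismatches = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)          # copy2 preserves timestamps
        if sha256(f) != sha256(target):
            mismatches.append(f)
    return mismatches
```

For a district this size, running such a script against a mounted 1-terabyte drive before heading home is the whole off-site strategy.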

Securing your data and making it recoverable in case something bad happens isn't a "set-it-and-forget-it" kind of operation. Data protection is actually an evolving formula that's equal parts paranoia, smart use of newer technologies, and taking the pain and turning it into gain.

Batten Down the Hatches
Securing your data against physical threats requires thinking about obvious potential problems, such as not locating your servers in the basement if your data center resides in a flood zone. It also calls for considering less common perils, such as a vendor capturing smartphone photographs of a monitor displaying student contact information while you've stepped away to take a call.

Joplin Schools understands better than most the need for physical protection of data. The Missouri district is still constructing three school facilities to replace 260 classrooms destroyed when parts of the city were razed by a deadly, mile-wide tornado in May 2011. As one small aspect of that rebuilding effort, the district is focusing on tightening physical security so that operations can continue in its schools no matter what arrives at its doors. For example, the IT organization maintains two server rooms. The first is in the original location: below ground, in a cinderblock room with dual air conditioners and doors that remain locked at all times. Only certain people in the district can access that room. The second server room is in a different location and is secured by magnetic locks. Staffers must have a security badge to gain access there.

Similarly, the data center at Westbury Schools on New York's Long Island is locked at all times, with entry limited to staff people who carry the appropriate access on their keycards. The room is also monitored by video camera. Jay Marcucci, the CIO and director of technology, communication, and information services for the district, is the author of that policy. He previously worked in New York City's financial services sector, a job that taught him the importance of absolute physical security. Of course, even the most secure door can't keep data completely safe, which is why off-site storage in the cloud is becoming increasingly popular for schools.  

Cloud and Storage Management
The move to the cloud is well under way in K-12. In CDW's 2013 State of the Cloud Report, 42 percent of districts reported that they're implementing or maintaining cloud computing; that's up from 27 percent in 2011. The biggest area of adoption: 40 percent of respondents said they've moved or are in the process of moving their storage burden to a cloud environment.

Recently, the Fort Worth Independent School District (TX) purchased a new vendor-hosted student information system from Focus School Software. "It's interesting when you think about all the problems that were solved immediately for those schools," says Glynn Ligon, president and CEO of ESP Solutions Group, an advisory company for state and local education agencies. Chief among the benefits of cloud storage is that backup is "off-loaded" to the vendor.

President Andrew Schmadeke says that Focus actually maintains its servers at SoftLayer, a company that provides leased infrastructure with locations in Dallas and Washington, DC. "If there's a big tornado that wipes the data center in Dallas," he says, "we can have our customers that are hosted in the Dallas center up and live in the Washington, DC, data center in 24 hours. So far we've never had to do that."

For school customers that host Focus in their own data centers, the company offers redundant backup to a different physical location. "Every few months we do a little drill, to make sure we can do a whole bunch of our customers within 24 hours [and] get them live somewhere."

As reassuring as that may sound, Ligon cautions that caveats come with a reliance on the cloud. Because school systems maintain private data protected under any number of compliance regulations, before signing with a cloud provider, educators have to scour contracts to be sure that their data won't be hosted outside of the country. "Sometimes these hosting services aren't very straightforward about where the servers are located [or] where their backup systems are," he notes.

Jarrett Potts, director of marketing for storage management company STORServer, points out another potential problem: bandwidth logjams. "If I'm sending data from my school district to the cloud, what is the service-level agreement I have to get my data back? If I need my 1 terabyte of production data back, can I recover that over the internet?" The answer is no, he says, unless you're paying a "tremendous amount of money for bandwidth."

His recommendation is to maintain a combination of private and public clouds. Retain a copy of all data locally on some form of private cloud--all of those servers you've virtualized in the data center--and use public cloud-based storage for strategic pieces of production data. "Not all data is created equally," he notes. "My résumé is not as important as payroll."

Hosting a district's critical applications in the cloud is a fairly new practice that Duarte will be picking up this year for Union Beach. "Our database vendors have been very helpful and generous and have given us their cloud services for free for the remainder of the year," he says. Still, he's exercising caution, negotiating to keep the data on-site and mirror it online. "There's something about being able to hold something in your hand. I don't know if it's antiquated. It can't hurt."

Bob Burwell, CTO for storage and data-management vendor NetApp, believes that many districts just need to get smarter about how they use their existing storage assets. "Storage is typically about 25 percent of a data center's costs," he says. By tapping into storage-management features such as data compression and de-duplication, districts can reduce the amount of storage they require. That, in turn, eases the pressure to keep buying new hardware as data grows and multiplies, which cuts expenses and leaves IT staff with fewer resources to manage.
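Burwell's de-duplication point can be made concrete with a toy calculation: split data into fixed-size blocks, hash each one, and count how many blocks are exact copies of one another. Real arrays do this down at the block layer in the storage controller; the sketch below only illustrates the arithmetic behind the savings.

```python
import hashlib

def dedup_ratio(blobs, block_size=4096):
    """Estimate block-level de-duplication savings.

    Splits each blob into fixed-size blocks, hashes them, and compares
    the count of unique blocks to the total. A ratio of 0.25 means a
    quarter of the stored blocks are redundant copies that a
    de-duplicating array would store only once.
    """
    total = 0
    unique = set()
    for blob in blobs:
        for i in range(0, len(blob), block_size):
            block = blob[i:i + block_size]
            total += 1
            unique.add(hashlib.sha256(block).digest())
    if total == 0:
        return 0.0
    return 1 - len(unique) / total
```

Two identical 4 KB files, for example, yield a ratio of 0.5: half the stored blocks are duplicates, so half the space can be reclaimed.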

Burwell also points to the company's Snapshot capability, which captures a point-in-time copy of data even while applications continue running, making for faster recovery of individual files or complete storage volumes. Snapshot enabled the Broome-Tioga Board of Cooperative Educational Services (NY) to speed up backup of its mail server data, which could previously take days to finish. The BOCES creates snapshots of Exchange and SQL Server data twice daily and replicates them to its disaster recovery site.
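NetApp's actual Snapshot technology operates at the block level inside the array, so the following is only a file-level sketch of the same idea -- point-in-time copies that consume new space only for what changed, similar in spirit to rsync's --link-dest mode. Nothing here reflects the vendor's implementation.

```python
import os
import shutil
from pathlib import Path

def take_snapshot(live: Path, snap: Path, prev=None) -> None:
    """Create a point-in-time copy of `live` under `snap`.

    Files unchanged since the previous snapshot (`prev`) are hard-linked
    rather than copied, so each snapshot pays only for changed data --
    the space-efficiency trick behind array snapshots, in miniature.
    """
    for f in live.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(live)
        target = snap / rel
        target.parent.mkdir(parents=True, exist_ok=True)
        old = prev / rel if prev is not None else None
        if (old is not None and old.exists()
                and old.stat().st_mtime_ns == f.stat().st_mtime_ns
                and old.stat().st_size == f.stat().st_size):
            os.link(old, target)      # unchanged: share the on-disk data
        else:
            shutil.copy2(f, target)   # changed or new: store a full copy
```

Because unchanged files cost almost nothing, snapshots can be taken frequently -- twice daily, as the BOCES does -- and replicated off-site without days-long backup windows.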

Recovering Your Bases
About three or four years ago, Westbury's Marcucci got his wake-up call about data recovery when a backhoe took out the district's fiber connection to the high school, where the main network operations center (NOC) is located. "That essentially crippled us, and that's what made me realize if anything like that happens again, we need a backup source." He began working on a plan to develop a second data center at a middle school about a mile away. Once that's online in June, if anything should happen to the systems maintained at the high school, Marcucci anticipates that they'll fail over to the backups running at the middle school within 30 seconds. He and the facilities director will test the system this summer by pulling the plug and seeing what happens when the generator they've placed at the high school doesn't switch on immediately.

"We're a high-needs district," Marcucci says. "We do a lot of programs for credit recovery. Most of them are online, and we really can't afford to lose the high school NOC because the kids will not be able to continue with the program. We're trying to give them seamless access to data. We don't want the whole district to go down because of one school."

NetApp's Burwell suggests what could be a cheaper alternative to building a redundant data center: sharing storage space with another district. "You can replicate data between sites over cost-effective media. I basically go to a screen that says, 'Tell me what you want to replicate, where you want to replicate, and how often,' and that will be sent over standard Ethernet connections."

Both approaches bring up a good point: Nobody really cares about backup per se. Says Potts, "What they really care about is recovery. It doesn't matter how they structure it, how much money they spend, how fast their disks are--if they can't recover their data, it's all for naught."

This is where virtualization comes into play. Potts recommends running the traditional tiered system: a combination of high-speed disks for data that's being accessed currently; low-speed disks for data that hasn't been touched in the last seven days; and tape for data that hasn't been touched in six months. By creating a consolidated environment with storage virtualization, you gain several advantages. Pieces of the storage infrastructure can be replaced without downtime. You can mix and match storage hardware from multiple vendors. And where storage resides in multiple locations, the virtualized view presents it as a unified resource. "The consolidation makes things a lot easier and a lot more cost-effective," Potts says.
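Potts' tiering rule is simple enough to express directly. The sketch below assigns data to a tier by how long ago it was last touched, treating "six months" as 180 days -- an assumption made for the example, since the article doesn't specify an exact cutoff.

```python
from datetime import datetime, timedelta

def storage_tier(last_access, now=None):
    """Pick a storage tier by last-access age, following the
    thresholds Potts describes: high-speed disk for current data,
    low-speed disk after seven days, tape after six months
    (approximated here as 180 days).
    """
    now = now or datetime.now()
    idle = now - last_access
    if idle <= timedelta(days=7):
        return "high-speed disk"
    if idle <= timedelta(days=180):
        return "low-speed disk"
    return "tape"
```

A nightly job applying a rule like this is what keeps the expensive fast disks reserved for the data users actually need today.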

Of course, the most high-tech solutions aren't always the best suited for the situation. At Union Beach, as soon as power was restored to the town--a week after Sandy struck--Duarte set up several workstations in his office running LogMeIn, which gave his scattered users remote access to vital applications they needed: the student information system, the special services application, and the business software. But even now, if a user requires access to data from some other location and can't run the specific application through LogMeIn, Duarte uses another form of remote access: He burns data to a CD and delivers it in person.

Beware Malware
Another danger to data is malicious software, aka malware. Viruses, worms, Trojan horses, mobile code, and blended attacks can steal data, slow users' computers, consume processing power, change files and configurations, introduce bizarre verbiage or images to desktops, turn off antivirus or other protective programs, hijack e-mail, and exhibit other behaviors that frustrate users and eat up precious IT time chasing down the source of problems.

Just as with encryption and complex passwords, if you're not already using the following forms of data protection, you're putting students, staff, and teachers at risk:

  • Antivirus software for computers to freeze and eradicate malware before it grows legs
  • Intrusion prevention systems to sniff packets and inspect network traffic for suspicious activity
  • Firewalls to restrict the traffic that can pass from one network to another
  • Content filtering to monitor what students are exposed to and to make sure nobody on staff is doing something that could put the district into the headlines
  • Application white-listing to provide a structure for what software is allowed to run on district computers
  • Network access control to manage who gets to do what on the network

Don't forget about securing the data on those smartphones either. If users are wondering why their phone batteries run down so quickly or their apps don't work as expected, it might be a case of a mobile device infection.

And now, according to a recent report by security vendor Kaspersky Lab, mobile apps are being used to install malware on older PCs. When a hapless victim connects a phone to a PC to perform an update, the attack launches. Experts recommend the same kind of protection guidelines for phones as they do for PCs: Update operating systems on the device; shop for apps in big-name app marketplaces (though this isn't fail-safe, as Google has taught us); and use a mobile security program to monitor for malfeasance.

