Proton Data Security

Proton Data is the world's leading manufacturer of data security products

Considering data sanitization as edge computing rises


Proper hard disk destruction and solid state drive shredding may not come across as a hot-button issue in the tech world, but the rise of edge computing could put a renewed focus on how companies safeguard physical data assets. As organizations move more of their systems out to the network edge, they must also create formalized plans to properly destroy data and manage the device decommissioning process. Edge computing makes all of these tasks more complex.

The changing data center world
To understand edge computing, it is important to first put the movement into the larger data center context in the world today. With that in mind, let’s consider what has happened in the past decade:

Virtualization’s rise: The emergence of virtualization empowered organizations to partition systems, increase server utilization and reduce the amount of hardware in their data centers. Suddenly, one server could do the work of 10.

Broadband’s takeover: For a long time, companies maintained small data centers and server closets in branch offices to support local computing requirements. However, widespread access to broadband services, including relatively high-performance WAN services, empowered IT teams to move more data between locations, making users less reliant on local server access through the LAN.

Consolidation’s move: With companies being able to use fewer machines and move data over long distances, they no longer needed to manage large numbers of small data centers. Instead, organizations could consolidate their server rooms into centralized facilities, possibly using colocation or similar hosting methods along the way.

Cloud’s emergence: While companies were consolidating their data center assets, service providers came to realize they could automate and orchestrate virtual machines over multiple servers to create interconnected clouds that share resources. As such, businesses could subscribe to services and pay only for what they needed.

In a fairly short time, the IT world shifted from a heavily distributed environment in which companies had to manage hardware assets across a variety of closed-off locations to a consolidated ecosystem in which organizations relied on mega data centers. As such, the organizations that bear the brunt of the responsibility for data destruction are the service providers managing the countless systems hosted in the cloud. Edge computing is turning this model on its head, returning the data center to its distributed past.

Abstract image of a cloud overlaid on a neon-blue circuit board.
The cloud redefined the modern data center.

What is edge computing?
In its simplest form, edge computing is what many people have been doing all along – having the processing work of a compute task completed by a local device instead of in the cloud. A local server running an analytics workload is similar to an edge system, for example. There is a key distinction, however. Edge computing typically involves systems close to the actual location where data is generated, but that are also still connected to a cloud or data center service backend.

Think of it this way: In a typical cloud configuration, data is sent to the cloud, processed by solutions in the backend and sent back out to the endpoint for user action or for an automated workflow to commence. With edge computing, all of that work can be done locally, but it can also be informed by data residing in the cloud when applicable.
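To make that local-first pattern concrete, here is a minimal Python sketch. Everything in it, from the stand-in model classes to the 0.9 confidence threshold, is a hypothetical illustration of the idea, not a real API:

```python
from dataclasses import dataclass


@dataclass
class Result:
    label: str
    confidence: float


class LocalModel:
    """Stands in for an on-site inference engine (hypothetical)."""
    def infer(self, reading):
        # Trivial rule: readings above 80 are confidently "hot".
        if reading > 80:
            return Result("hot", 0.95)
        return Result("uncertain", 0.6)


class CloudClient:
    """Stands in for a cloud backend with richer context (hypothetical)."""
    def infer(self, reading):
        return Result("hot" if reading > 50 else "normal", 0.99)


def process_reading(reading, local_model, cloud_client=None):
    """Process a reading at the edge; fall back to the cloud only when the
    local result is low-confidence, trading latency for context."""
    result = local_model.infer(reading)       # low-latency, on-site
    if result.confidence >= 0.9 or cloud_client is None:
        return result                         # act immediately
    return cloud_client.infer(reading)        # accept the round trip


edge_only = process_reading(95, LocalModel())                  # handled locally
escalated = process_reading(60, LocalModel(), CloudClient())   # sent to cloud
```

The design choice mirrors the paragraph above: the edge device acts on its own whenever it can, and the cloud backend informs only the cases the local system cannot resolve.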

“Edge computing is rising in light of IoT solutions that require real-time data processing.”

Edge computing is rising in light of IoT solutions that require real-time data processing, such as autonomous vehicles or robotic manufacturing lines. The latency of sending data out to the cloud and waiting for commands to return is prohibitive for these workloads, so organizations need mini data centers deployed on site. In some cases, this means deploying ruggedized micro-servers that can operate in harsh environments. In others, it involves using strategically located data centers with low-latency connections.
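Some rough arithmetic shows why that round trip matters. The latency figures below are illustrative assumptions for the sake of the comparison, not measurements:

```python
# How far a vehicle travels while waiting on a decision.
# Both round-trip times are assumed values for illustration.
speed_mps = 100 * 1000 / 3600          # 100 km/h in meters per second
cloud_rtt_s = 0.100                    # assumed cloud round trip: 100 ms
edge_rtt_s = 0.005                     # assumed on-site round trip: 5 ms

cloud_drift = speed_mps * cloud_rtt_s  # distance covered awaiting the cloud
edge_drift = speed_mps * edge_rtt_s    # distance covered awaiting the edge
```

Under these assumptions, a car moving at highway speed covers nearly three meters before a cloud response arrives, versus roughly 14 centimeters with on-site processing, which is the gap that makes local mini data centers necessary.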

Whatever the deployment model, the edge computing market is set to expand at a compound annual growth rate of 35.4 percent between 2017 and 2022, according to MarketsandMarkets.
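Compounded over that five-year window, the stated rate implies the market roughly quadrupling. The quick calculation below simply applies the reported CAGR; the starting market size is arbitrary, so only the multiple matters:

```python
# Implied growth of a market expanding at a 35.4% CAGR over 5 years
# (2017 to 2022), per the rate reported above.
cagr = 0.354
years = 5
multiple = (1 + cagr) ** years
print(round(multiple, 2))  # prints 4.55
```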

As organizations move more computing resources to the edges of their cloud networks and away from monolithic data centers, they must consider how they are going to manage hardware life cycles in these distributed environments. This is where physical hardware access and data sanitization come into play.

Edge computing and physical data security
One consequence of the move toward consolidated data centers has been an emphasis on network security. With physical access controlled by a handful of hardened facilities, most businesses focus on keeping hackers out of the network rather than intruders out of the building. As edge computing rises, the network risks remain, since edge systems still sit on the same cloud network. The physical data access challenges, however, escalate.

Chris Brown, CTO for the Uptime Institute, told Data Center Knowledge that physical security is often neglected in edge computing settings because the systems themselves are so easy to deploy.

“Often they’re self-contained racks, and all you need is to provide reliable power and AC and put them in some sort of structure that keeps the weather off them,” Brown told the news source. “So, they may not be in a standard data center; they can be deployed in a warehouse.”

Conceptual image of two futuristic cars surrounded by icons representing digital services.
Autonomous vehicles are among the technologies driving edge computing.

Dealing with data sanitization on the network edge
Imagine you have an edge data center and one day a system fails. A busy, slightly panicked engineer drives over, notices that a hard drive failed, goes out to get a replacement part, comes in, pulls out the old system and, in a hurry to get to the next task, rushes out without grabbing the failed hard drive. It may not seem like a priority, but if your edge data center isn’t in a controlled area, somebody can grab that drive and try to recover the data, which is often possible.

As if that situation isn’t problematic enough, organizations must also ask how they will handle decommissioning on an ongoing basis. Will old systems be shipped to a central data center to be destroyed? Will they be taken apart on site with only parts containing sensitive data shipped to another location to be taken care of? Who is in charge of documenting shipping, receipt and confirmation of destruction? Can you trust a third party with all that?

Ultimately, businesses must not only destroy hard disks and solid state drives thoroughly enough to sanitize data, they must also maintain a clear chain of custody. This becomes more complex in an edge computing setup.
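One way to keep those questions answerable is a per-drive custody log that is only considered closed once destruction is confirmed. The sketch below is a minimal, hypothetical schema for illustration, not an industry-standard format:

```python
# Minimal sketch of a chain-of-custody log for a decommissioned drive.
# Field names and event labels are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class CustodyEvent:
    event: str      # e.g. "pulled", "shipped", "received", "destroyed"
    actor: str      # who handled the drive at this step
    location: str   # where the handoff happened


@dataclass
class DriveRecord:
    serial: str
    events: list = field(default_factory=list)

    def log(self, event, actor, location):
        self.events.append(CustodyEvent(event, actor, location))

    def is_closed(self):
        """A record is complete only once destruction is confirmed."""
        return any(e.event == "destroyed" for e in self.events)


record = DriveRecord(serial="WD-1138-X")
record.log("pulled", "field engineer", "edge site A")
record.log("destroyed", "degausser operator", "edge site A")
```

Note that in this example the drive never leaves the edge site: on-site destruction closes the record in two events, whereas shipping to a central facility would add handoffs, and with them, opportunities for a gap in the chain.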

The simplest solution may be to purchase a hard disk degausser and general media shredder for edge locations – or invest in portable options that engineers can bring with them when they service machines. With data sanitization options available on-site, organizations can simplify the chain of custody and prevent unauthorized access to data housed in decommissioned hardware. Proton Data offers a full line of NSA-approved storage device destruction systems. Contact us today to learn more.
