How companies protect data centers against the threat of physical intruders

The threats to business data are diverse, and while substantial attention is paid to protecting systems from hackers, the physical infrastructure that houses sensitive information can be an attack vector as well. Companies have grown increasingly aware of the danger posed by a physical intruder in the data center, and best practices have emerged around physical security as a result. Leading enterprise data centers and colocation facilities protect themselves with measures such as surveillance, security checks, hardened exteriors and mantraps.

"Companies spend multi-millions of dollars on network security," Enterprise Storage Forum contributor Christine Taylor wrote in a recent article. "Yet if an attacker, disaster, or energy shortage takes down your data center then what was it all for? Don't leave your data center gaping open, and make very sure that your data center provider isn't either."

Limiting outsiders' physical access to the data center is key, as it is a sensitive environment that can easily be damaged, whether deliberately or accidentally. One initial protection many data centers use is a hardened exterior with extra-thick walls and windows (and no windows at all in the server room), Taylor wrote. This precaution helps protect against both physical attacks, such as explosives, and natural disasters. Similar protections include crash barriers or landscaping features around the data center that help hide it and shield it from an event such as a car crash.

Security checks and mantraps
Another basic security practice is 24/7 surveillance with cameras that pan to cover the entire premises, ideally backed by an on-site security guard. During business hours, security guards can also perform checks on visitors. In a recent column for TechRepublic, contributor Michael Kassner described a visit to an enterprise data center for which he was required to show two forms of ID and turn over his electronic devices to prevent him from taking pictures.

He then faced internal physical barriers in the form of a turnstile and mantraps, which are essentially airlocks designed to prevent more than one person from passing through a door at once. The ones Kassner encountered had sensitive weight scales that could detect whether more than one person was coming through, as well as whether a person had carried something in and not out, or vice versa. Mantraps and turnstiles prevent tailgating, the practice of following an approved employee through a secure entrance.
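
The interlock logic behind such a system is simple to sketch. The Python snippet below is a hypothetical illustration of the checks Kassner describes, not any facility's actual control code; all thresholds are assumptions.

```python
# Hypothetical mantrap interlock logic. All thresholds are illustrative
# assumptions, not a real vendor's firmware.

SINGLE_OCCUPANT_KG = (35.0, 150.0)   # plausible one-person weight range
CARRY_TOLERANCE_KG = 2.0             # allowed entry/exit weight difference

def may_unlock_inner_door(badge_ok: bool, weight_kg: float) -> bool:
    """Admit one badged person; a reading outside the single-occupant
    range suggests a tailgater or smuggled equipment."""
    low, high = SINGLE_OCCUPANT_KG
    return badge_ok and low <= weight_kg <= high

def carried_item_alert(entry_kg: float, exit_kg: float) -> bool:
    """Flag someone who carried something in and not out, or vice versa."""
    return abs(entry_kg - exit_kg) > CARRY_TOLERANCE_KG

# A lone 82 kg visitor is admitted; two people totaling 160 kg are not,
# and leaving 6 kg lighter than you arrived raises an alert.
assert may_unlock_inner_door(True, 82.0)
assert not may_unlock_inner_door(True, 160.0)
assert carried_item_alert(entry_kg=82.0, exit_kg=76.0)
```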

As companies make data center decisions, choosing a provider that can offer these robust solutions for protecting physical infrastructure is essential. Just as businesses need to secure their digital perimeter, they should look to achieve best practices for locking down their physical perimeter as well.

Developing the customized cloud in the data center

The past few years have seen a wholesale embrace of cloud storage and application hosting approaches, and companies are continuing to look for solutions that meet their evolving computing needs. Despite the rapid growth of cloud services, however, the majority of cloud deployments are still private, occurring in the on-premise or managed data center, according to VMware CEO Pat Gelsinger. While the move to public cloud services will continue as companies look for cheaper delivery models for certain services, that balance is expected to hold for the foreseeable future.

"On-premise cloud is a $2 trillion market … 92 percent of cloud is on-premise," Gelsinger said at the Cloud Factory conference in Banff, Alberta, according to VentureBeat. "And Gartner says that by 2020 it will still be 77 percent."

Companies are keeping their clouds on-premise for reasons tied to security, cost, government regulation and availability, Gelsinger added. However, there's another reason for the ongoing use of on-premise cloud: There are many ways to use the cloud, and not all of them are best suited to public deployments. Companies can leverage virtual servers in a variety of ways, and the dominant model for new IT deployments is moving toward custom implementations on a per-use basis, ITBusinessEdge's Arthur Cole wrote in a recent column. Rather than following industry trends, companies are looking for the right way to meet their needs for individual applications, whether those are scale requirements that necessitate the public cloud or regulatory demands that make it easier to keep data in a single colocation facility.

"If ever there was an example of a technology being all things to all people … it is the cloud," Cole wrote. He later added, "Cloud infrastructure, then, is likely to become as diverse as today's legacy environments, but with the added twist that it can be made and remade according to the needs of the moment."

As companies look for the best balance of cloud deployments, they can benefit from working with a managed services provider or IT consulting firm to determine the optimal data center infrastructure and virtual server architecture solutions to meet their needs.

Federal big data initiatives make data management paramount concern

Effective data management will be a critical concern as the United States federal government ramps up its exploration of big data. While information-driven initiatives have the potential to transform a variety of civil and infrastructure projects, as well as contribute to a meaningful cybersecurity plan, a lack of data oversight could make these projects ineffective and put people at risk. 

Federal agencies have already put some big data initiatives in motion, and industry analysts tout the potential benefits of information analysis. Recent research found that organizations including the Department of Homeland Security and the Government Accountability Office believe big data tools can help them combat cyberthreats on a national scale, according to InformationWeek. Efforts to combat climate change, establish "smart" utilities and improve national healthcare can also capitalize on the insights big data provides.

However, data management, already a thorn in the side of many federal agencies, will become more difficult as data storage demands skyrocket. The Federal Data Center Consolidation Initiative, a project to close 40 percent of federal data centers – saving $5 billion by 2015 in the process – may be losing steam amid cost concerns and facilities closures that don't align with best practices, according to FCW. Out of the more than 7,000 government data centers, only 640 have been shut down. Although 470 are slated to shut down by September 2014, 2,400 would have to close within the next year and a half to reach the stated goal of 40 percent.
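
The gap is easier to see with the arithmetic written out. The quick sketch below assumes a baseline of roughly 7,600 facilities, a figure implied by FCW's numbers (640 already closed plus 2,400 still needed equals 40 percent of 7,600), not an official count.

```python
# Working through the FCW consolidation math. The 7,600-facility baseline
# is an assumption implied by the cited figures, not an official count.

total_centers = 7_600                  # "more than 7,000" federal data centers
goal_closures = 0.40 * total_centers   # 3,040 closures to hit 40 percent
already_closed = 640
slated_by_sept_2014 = 470

still_needed = goal_closures - already_closed
print(still_needed)                        # 2400.0 more closures required
print(still_needed - slated_by_sept_2014)  # 1930.0 beyond those already slated
```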

The government's struggles are a reminder that data management cannot take a backseat to cost or facilities considerations.

Getting cloud storage services at the right price

Cloud storage services offer organizations peace of mind by providing a secure location to store and back up data. But what happens when a company needs to recover that data on a short timetable? Some cloud providers offer an easy road to retrieving data and resetting environments after an incident, but others hide those helpful services behind a dense thicket of extra fees. It’s important to know how to spot cloud storage pricing models that aren’t as cost-effective as they look on paper.

Although it’s not clear whether cloud storage price cutting is an effective means of netting clients, many cloud providers do it anyway, wrote TechTarget contributor Sonia Lelii. When selecting a vendor, it’s important to discern whether it is slashing services along with prices. Don’t make a decision based on price alone: Network infrastructure, security and the cloud interface are important features that should not take a backseat to a price quote. And be wary that a low upfront cost could be masking higher expenses during a time of need.

Looking for flexibility is another way to maximize the value of a cloud storage services investment, wrote Enterprise Storage Forum contributor Drew Robb. A fee structure that locks in prices for regular, predictable needs and lets additional services be purchased on demand when they’re needed beats one that fixes usage prices by service tier regardless of what is actually consumed.
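
A simplified cost model shows why. The rates below are hypothetical assumptions, not any provider’s price list; the point is that a blended structure (locked-in baseline plus on-demand bursts) can beat a flat plan whose low headline rate hides steep retrieval fees during a recovery.

```python
# Hypothetical monthly cost comparison for two cloud storage fee structures.
# All per-GB rates are illustrative assumptions, not real provider pricing.

def blended_cost(baseline_gb, burst_gb, retrieved_gb):
    """Locked-in rate for predictable baseline, on-demand rate for bursts."""
    return baseline_gb * 0.03 + burst_gb * 0.05 + retrieved_gb * 0.01

def flat_cost(total_gb, retrieved_gb):
    """Lower headline storage rate, but steep per-GB retrieval fees."""
    return total_gb * 0.02 + retrieved_gb * 0.12

# A quiet month: the flat plan looks cheaper on paper.
print(blended_cost(1000, 0, 10))     # 30.1
print(flat_cost(1000, 10))           # 21.2

# A recovery month with 800 GB retrieved: the hidden fees dominate.
print(blended_cost(1000, 200, 800))  # 48.0
print(flat_cost(1200, 800))          # 120.0
```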

Why it's time to invest in a third-party data center

Rising data center complexity means that for many companies, continuing to support an in-house data center is rapidly becoming unsustainable. Spending on data center infrastructure management services and software is expected to top $4.5 billion by 2020, according to Navigant Research, and many organizations are already feeling the pinch on a local level.

Organizations that want to maintain an on-premises facility will likely follow one of two paths: The first is to enter a cycle of massive spending on everything – new equipment, management tools, software upgrades and IT staff – to handle an otherwise bloated IT environment. The other is to keep costs down by restricting expansion, which can create significant problems for continuity, security and the company's competitive advantages. Neither situation is desirable, as both carry a fairly high potential for confusion, mismanagement and far-reaching operability issues.

Outsourcing data center needs, either through colocation or managed servers, enables an organization to gain control of its data center spending while retaining access to top storage, network and security technologies. It also offers companies scalability, which is incredibly difficult or highly expensive to come by in the on-premises facility. As an organization requires additional server space, network bandwidth or software support, it can work with the data center provider to increase available capacity. This way, it doesn't pay for what it doesn't use, and can depend on complete IT support for every aspect of its investment.

Data Center Journal contributor Yuri Rabover compared continued on-site data center development to reorganizing one's garage: Although space is at a premium, the realities of equipment size and non-expert human planning likely mean that it won't be used as effectively. Instead of playing a never-ending game of data center Tetris, hemorrhaging resources at every turn, it makes sense to eliminate an otherwise ongoing problem with a decisive, conclusive act. Outsourcing data center infrastructure and its management helps enterprises avoid the spending sprees that can curtail their competitive advantage.

Optimizing data center strategies for financial services firms

Data center investment strategies are vital to financial services organizations. While finance firms have long used proprietary or third-party data centers for information storage and business continuity, big data has given rise to a new set of complications and considerations, not least a variety of regulatory and compliance measures that restrict information storage and archival practices. New technologies, rising costs and data management issues are exposing the limits of traditional data center models, and financial services firms need to adapt.

Data management in finance is a problem with several moving parts that impact each other. Accumulating and storing data is a relatively straightforward issue, albeit a resource-intensive one. Under the traditional model, a firm would procure additional servers for its onsite facility or enlarge its third-party data center investment, either through colocation or leasing the provider’s equipment.

The deluge of data can make this approach prohibitively costly, forcing organizations to rethink their infrastructure, Wall Street & Technology editorial director Greg MacSweeney wrote. Firms with proprietary data centers now stand to save significantly by outsourcing their storage, architecture and management demands. A third-party data center can provide state-of-the-art server hardware, but more importantly it has the infrastructure to deploy next-generation solutions such as virtualization, which drastically reduces the amount of physical equipment needed to contain rising petabytes of data and information-crunching applications.

Working with a third-party data center provider also helps businesses tackle faster-moving targets – data integrity and compliance. Data quality and validation are “small data” issues that grow more problematic as firms accumulate more information from a wider source pool, said software developer Oleg Komissarov, according to a recent FierceFinanceIT article.

Keeping data clean, complete and consistent is a tough task that requires powerful tools and a dedicated team. A managed data center services provider can help offer this level of attention. It can also help in compliance efforts, as any blind spots or inconsistency in information or reporting leave the door open for compliance issues to crop up. As big data expands and accelerates, financial services firms need their data centers to stay one step ahead.

Managed services can help organizations avoid top 10 business hazards

Managed services enable businesses to more successfully navigate a threat-laden enterprise landscape. Although an organization’s biggest IT, operations and security anxieties vary by region, industry and company size, the outcomes organizations fear most are generally the same across the board – lost profitability, client churn and a tarnished reputation.

In the Twitter age, no confirmed threat goes unpublished or unanalyzed, and it’s difficult for an organization to escape blame even if it’s only affected as a byproduct of another incident. The woes of retailer Target, which reported a 22 percent decrease in its client base in January following a massive data breach during the 2013 holiday season, serve to underscore consumer response to an enterprise that demonstrates less-than-exemplary information security, data management and business continuity.

According to a recent Business Continuity Institute study of nearly 700 enterprise respondents in 82 different countries, the top 10 most common perceived threats to disaster recovery and business continuity are:

  1. Unplanned IT outages
  2. Cyberattacks
  3. Data breaches
  4. Adverse weather effects
  5. Utility supply interruptions
  6. Fires
  7. Security compromises
  8. Health or safety incidents
  9. Acts of terrorism
  10. New laws or regulations

How managed services assuage anxiety
Managed services offer companies vast potential to mitigate problems in many areas because a provider’s solutions are customized to the needs of the company. The list above spans incidents stemming from a company’s location, industry, employee behavior and general security management. Overseeing prevention and contingency plans that effectively respond to all of these potential hazards is time-consuming, resource-intensive and costly. And while it’s impossible to prevent adverse weather or control regulatory measures, it is possible to keep these threats from doing any real damage.

Managed services are scalable, so the amount of a provider’s involvement can correspond exactly to a company’s anxieties and potential hazards. One organization may simply require online backup services via an offsite server to strengthen its data loss prevention. Another may want to virtualize nearly all of its infrastructure so its employees can stay connected and productive during a wave of bad weather. And as a company’s needs change over time, it doesn’t have to rearrange its entire back-end infrastructure to keep danger at bay.

Who really cares about BYOD?

The bring-your-own-device movement is well on its way to fundamentally reshaping enterprise communications. So why do so few organizations seem to care about device management? A fairly wide gap formed almost immediately between BYOD user excitement and enterprise policy engagement, and it's only going to expand.

Entrenched employee attitudes absolving them of responsibility create problems for IT, and many organizations let worker preferences overwhelm clear-cut business priorities. The central problem with BYOD is a company's capacity to show that it cares – not only about the ways BYOD can be hazardous, but about creating strategies that cater to worker preferences while keeping security at the forefront.

BYOD is clearly important to employees. One recent LANDESK survey found that the average European worker spends more on BYOD every year than he or she does on tea or coffee. Workers care about having the devices, but not about protecting the data stored on them.

"It's not my problem" was a common refrain in a recent survey by Absolute Software about data security in the mobile enterprise. More than a quarter of those surveyed said they felt there should be no penalties for leaked or lost corporate data. Additionally, more than one third of respondents who lost their mobile devices said that they didn't make any changes to their security habits, while 59 percent estimated that their corporate data was barely worth more than the cost of replacing a phone.

Who is to blame for BYOD problems?
It's up to companies to exhibit the same passion for data security that employees have for using their own smartphones. Of those who acknowledged a penalty for data loss might be in order, most received nothing more than a slap on the wrist from employers, and often much less – 21 percent had a talking-to, 30 percent had to replace their lost device themselves and 34 percent reported that "nothing" happened when they lost company information. This reflects poorly on companies, observed Absolute mobile enterprise data expert Tim Williams, and will continue unless companies get proactive about BYOD management.

"If firms don't set clear policies that reflect the priority of corporate data security, they can't expect employees to make it a priority on their own," Williams said.

Establishing and enforcing BYOD practices is a good first step. Policies have to acknowledge the ways personnel use their devices and avoid limiting productivity wherever possible. Several technological tools can also help a company secure mobile devices behind the scenes, and investing in managed infrastructure and IT support services provides a scalable, adaptable and continuous resource for effective network monitoring and data management.
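
One example of such a behind-the-scenes tool is an automated compliance gate that checks a device's security posture before granting network access. The sketch below is a generic illustration under assumed policy rules (a minimum OS version, required encryption, a set passcode), not any particular MDM product's API.

```python
# Hypothetical BYOD compliance gate: a device must satisfy every policy
# rule before it is allowed onto the corporate network. The rules and
# fields here are illustrative assumptions, not a real MDM schema.

from dataclasses import dataclass

@dataclass
class Device:
    os_version: tuple   # e.g. (14, 2)
    encrypted: bool
    passcode_set: bool

MIN_OS_VERSION = (13, 0)  # assumed policy floor

def is_compliant(device: Device) -> bool:
    return (
        device.os_version >= MIN_OS_VERSION
        and device.encrypted
        and device.passcode_set
    )

# An encrypted, passcode-protected phone on a current OS passes;
# an unencrypted one is kept off the network.
assert is_compliant(Device((14, 2), True, True))
assert not is_compliant(Device((14, 2), False, True))
```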

You've got mail, and it's a virus: Why organizations need cloud storage services for email

Security researchers recently discovered a cache of personal records for sale on the Internet's black market, including 1.25 billion email addresses, according to the Independent. Finding roughly one email address for every six people in the world in the hands of hackers is alarming. Email is the central repository for the digital transmission and storage of confidential information, and it remains one of cybercriminals' prime targets. Cloud storage services are a must for organizations struggling to take control of email security and management.

Keeping on top of email storage and archival is challenging for organizations of any size. Smaller organizations lack the IT resources of their larger peers, making it difficult to process email and ensure that all files are stored safely. Bigger companies have dedicated IT departments, but they also have massive email systems generated by larger user bases and more diverse device profiles, and the expertise and resources required to maintain in-house email storage are usually too costly. Either way, upholding airtight protection and system management at all times is beyond the in-house capacity of virtually every organization.

Adhering to traditional models of email storage simply won't suffice in the face of today's threat landscape. Moving email to cloud storage services, on the other hand, allows organizations to outsource the hardware and storage support to a trusted third party provider, wrote Nashville Business Journal contributor Richard Pinson. 

"Hosting your own email requires constant upgrading, patching, backing up and monitoring," Pinson wrote. "Once email transitions to the cloud, the service provider is responsible for all storage maintenance tasks and provides the most-recent version of their product."

Cloud storage services are scalable, meaning an organization won't pay for capacity it doesn't use. Over the long term, this is a much more cost-effective option than updating legacy in-house environments every few years to respond to new security and productivity challenges. It takes only one malicious email in a user's inbox to let hackers in, and in this landscape, organizations need the help of a dedicated cloud provider to keep their confidential information safe.
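
A rough model illustrates the long-term economics. Every figure below is an illustrative assumption, not real pricing: a pay-as-you-grow cloud rate is compared against an in-house environment that must be refreshed every few years regardless of how much capacity is actually used.

```python
# Hypothetical five-year cost comparison: pay-as-you-grow cloud email
# storage versus an in-house system refreshed every few years.
# Every figure is an illustrative assumption.

CLOUD_RATE = 0.04        # $ per GB per month, assumed
REFRESH_COST = 60_000    # $ per in-house hardware refresh, assumed
REFRESH_INTERVAL = 36    # months between refreshes, assumed

def cloud_total(months, start_gb, growth_gb_per_month):
    gb = start_gb
    total = 0.0
    for _ in range(months):
        total += gb * CLOUD_RATE  # pay only for capacity in use
        gb += growth_gb_per_month
    return total

def in_house_total(months):
    refreshes = 1 + months // REFRESH_INTERVAL  # initial build plus refreshes
    return refreshes * REFRESH_COST

# 60 months, starting at 5 TB and growing 100 GB per month:
print(cloud_total(60, 5_000, 100))  # ~19,080: tracks actual usage
print(in_house_total(60))           # 120,000: paid regardless of usage
```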