How companies protect data centers against the threat of physical intruders

Businesses face a diverse range of threats to their data, and while substantial attention gets paid to protecting systems from hackers, the physical infrastructure that houses sensitive information can be an attack vector as well. Companies have grown increasingly aware of the threats posed by a physical intruder in the data center, and certain best practices have emerged around physical security as a result. Leading enterprise data centers and colocation facilities use solutions such as surveillance, security checks, hardened exteriors and mantraps to protect themselves from these threats.

"Companies spend multi-millions of dollars on network security," Enterprise Storage Forum contributor Christine Taylor wrote in a recent article. "Yet if an attacker, disaster, or energy shortage takes down your data center then what was it all for? Don't leave your data center gaping open, and make very sure that your data center provider isn't either."

Limiting outsiders' physical access to the data center is key, as it is a sensitive environment that can easily be damaged, whether deliberately or inadvertently. One initial protection many data centers use is a hardened exterior with extra-thick walls and reinforced windows (and no windows at all in the server room), Taylor wrote. This precaution helps protect against both physical attacks, such as explosives, and natural disasters. Similar protections might include crash barriers or landscaping features that help hide the data center and shield it from an event like a car crash.

Security checks and mantraps
Another basic security practice is to use 24/7 surveillance with cameras that move and cover the entire premises, ideally backed by an on-site security guard. During business hours, security guards can also perform security checks on visitors. In a recent column for TechRepublic, contributor Michael Kassner described a visit to an enterprise data center for which he was required to show two forms of ID and turn over his electronic devices to prevent him from taking pictures.

He then faced internal physical barriers in the form of a turnstile and mantraps, which are essentially airlocks designed to prevent more than one person from passing through a door at once. The mantraps Kassner encountered had sensitive weight scales to detect whether more than one person was coming through, as well as whether someone had carried something in and not brought it out, or vice versa. Mantraps and turnstiles prevent tailgating, the practice of following an approved employee through a secure entrance.
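
To make the weight-scale check concrete, here is a minimal sketch of the kind of logic such a mantrap controller might apply. Every threshold, function name and reading below is an illustrative assumption, not a description of any vendor's actual system.

```python
# Hypothetical mantrap weight-check logic; tolerances and data model are
# invented for illustration, not taken from any real access-control system.

PERSON_TOLERANCE_KG = 2.0   # allowed drift from the enrolled weight
CARRY_TOLERANCE_KG = 0.5    # allowed difference between entry and exit loads

def check_entry(enrolled_weight_kg: float, measured_kg: float) -> str:
    """Compare the scale reading against the badge holder's enrolled weight."""
    delta = measured_kg - enrolled_weight_kg
    if delta > PERSON_TOLERANCE_KG * 2:
        return "alarm: possible tailgater (second person on the scale)"
    if abs(delta) > PERSON_TOLERANCE_KG:
        return "hold: unexpected load, flag for guard inspection"
    return "release: weight consistent with a single enrolled person"

def check_exit(entry_kg: float, exit_kg: float) -> str:
    """Flag items carried in but not out, or vice versa."""
    if abs(exit_kg - entry_kg) > CARRY_TOLERANCE_KG:
        return "hold: carried load changed between entry and exit"
    return "release: entry and exit loads match"

if __name__ == "__main__":
    print(check_entry(enrolled_weight_kg=80.0, measured_kg=166.5))  # alarm
    print(check_exit(entry_kg=81.2, exit_kg=79.0))                  # hold
```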

As companies make data center decisions, choosing a provider that can offer these robust solutions for protecting physical infrastructure is essential. Just as businesses need to secure their digital perimeter, they should look to achieve best practices for locking down their physical perimeter as well.

Developing the customized cloud in the data center

The past few years have seen a wholesale embrace of cloud storage and application hosting approaches, and companies are continuing to look for solutions that meet their evolving computing needs. Despite the rapid growth of cloud services, however, the majority of cloud deployments are still private, occurring in the on-premise or managed data center, according to VMware CEO Pat Gelsinger. While the move to public cloud services is set to continue as companies look for cheaper delivery models for certain services, this general reality is expected to stay consistent for the foreseeable future.

"On-premise cloud is a $2 trillion market … 92 percent of cloud is on-premise," Gelsinger said at the Cloud Factory conference in Banff, Alberta, according to VentureBeat. "And Gartner says that by 2020 it will still be 77 percent."

Companies are keeping their clouds on-premise for reasons tied to security, cost, government regulations and availability, Gelsinger added. However, there's another reason for the ongoing use of on-premise cloud: There are many ways to use the cloud, and not all of them are best suited to public cloud deployments. Companies can leverage virtual servers in a variety of ways, and the dominant model for new IT deployments is moving toward custom implementations on a per-use basis, ITBusinessEdge's Arthur Cole wrote in a recent column. Rather than following industry trends, companies are looking for the right way to meet their needs for individual applications, whether those are scale requirements that necessitate the public cloud or regulatory demands that make it easier to keep data in a single colocation facility.

"If ever there was an example of a technology being all things to all people … it is the cloud," Cole wrote. He later added, "Cloud infrastructure, then, is likely to become as diverse as today's legacy environments, but with the added twist that it can be made and remade according to the needs of the moment."

As companies look for the best balance of cloud deployments, they can benefit from working with a managed services provider or IT consulting firm to determine the optimal data center infrastructure and virtual server architecture solutions to meet their needs.

Managed services equip companies to deal with changing cybersecurity landscape

Each year seems to bring a broader and more complex array of cyber threats to businesses, and many companies are struggling to keep up with the rapid pace of change. According to a recent survey from security software firm KnowBe4, more than half of IT managers – 51 percent – find security harder to maintain now than a year ago. Preventing cyberthreats and responding quickly to security issues are some of the biggest challenges for companies, which is why many are turning to managed services providers for a more secure infrastructure, as well as functions like malware removal and application support.

"Cybercriminals are constantly devising cunning new ways to trick users into clicking their phishing links or opening infected attachments," KnowBe4 CEO Stu Sjouwerman stated, adding that companies need to respond with thorough cybersecurity procedures, policies and training.

Another recent study from Solutionary and the NTT Group found that 54 percent of new malware goes undetected by antivirus software. As a result, companies need to make sure they are protected at the application level by using secure software and applying updates, ITBusinessEdge contributor Sue Poremba wrote in a recent column. Leveraging managed services for application support can help ensure software is kept updated and secured against threats, while external expertise can also be valuable in implementing state-of-the-art perimeter solutions and secure data center infrastructure.

Additionally, a managed services provider that offers malware removal can be a valuable partner in responding to and limiting the damage of an incident like an SQL injection attack, which the Solutionary study noted can easily cost a business $200,000 or more. Such protection might be unaffordable for a small business to implement in-house, but, by outsourcing certain IT management functions, companies can access state-of-the-art security solutions and industry-leading expertise. With the right portfolio of tools protecting it, a small business can avoid these ever-expanding threats.
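
The SQL injection attacks the study mentions exploit applications that splice untrusted input directly into query strings; the standard defense is parameterized queries. The snippet below is a minimal illustration using Python's built-in sqlite3 module, with an invented users table and payload.

```python
import sqlite3

# An in-memory database with an invented schema, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")
conn.execute("INSERT INTO users VALUES ('bob', 'user')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: splicing input into the SQL string lets the payload rewrite
# the query's logic and return every row in the table.
vulnerable = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()
print(vulnerable)  # [('alice', 'admin'), ('bob', 'user')]

# Safe: a parameterized query treats the payload as a literal string value.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # [] -- no user is literally named "alice' OR '1'='1"
```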

Federal big data initiatives make data management paramount concern

Effective data management will be a critical concern as the United States federal government ramps up its exploration of big data. While information-driven initiatives have the potential to transform a variety of civil and infrastructure projects, as well as contribute to a meaningful cybersecurity plan, a lack of data oversight could make these projects ineffective and put people at risk. 

Federal agencies have already put some big data initiatives in motion, and industry analysts tout the potential benefits of information analysis. Recent research found that organizations including the Department of Homeland Security and the Government Accountability Office believe big data tools can help them combat cyberthreats on a national scale, according to InformationWeek. Efforts to combat climate change, establish "smart" utilities and improve national healthcare can also capitalize on the insights big data provides.

However, data management, already a thorn in the side of many federal agencies, will become more difficult as data storage demands skyrocket. The Federal Data Center Consolidation Initiative, a project to close 40 percent of federal data centers – saving $5 billion by 2015 in the process – may be losing steam amid cost concerns and facility closures that don't align with best practices, according to FCW. Of the more than 7,000 government data centers, only 640 have been shut down. Hitting the 40 percent target means closing roughly 2,800 or more facilities in total, so although 470 are slated to shut down by September 2014, about 2,400 would have to close within the next year and a half to reach the stated goal.

The government's struggles are a reminder that data management cannot take a backseat to cost or facilities considerations.

Disaster recovery services, cybersecurity critical to protecting electric grid from attacks

Over the past few years, the utilities industry has made a concerted effort to make key infrastructure "smarter." The integration of data-capturing devices and automated, software-based management systems has the potential to create smart electric grids that can more effectively use and distribute power, reducing energy costs and environmental impact in the process.

However, turning power grids into connected devices has potentially harrowing implications – a concentrated cyberattack could cause lengthy and widespread outages, not only withholding electricity from businesses and residences, but disrupting communications, healthcare systems and the economy. According to many cybersecurity researchers, such an attack is less a question of "if" than of "when."

Ramping up disaster recovery services and cybersecurity protocols is key to shielding the smart electric grid from a devastating attack. While the federal government tries to increase the efficacy and stringency of its own security measures, it's important that utility companies – from national generators to local distributors – build up their own prevention and backup systems, according to a recent white paper by the three co-chairs of the Bipartisan Policy Center's Electric Grid Cybersecurity Initiative. This effort will require a hybrid system that responds to both physical and cybersecurity threats. 

"Managing cybersecurity risks on the electric grid raises challenges unlike those in more traditional business IT networks and systems," the report stated. "[I]t will be necessary to resolve differences that remain between the frameworks that govern cyber attack response and traditional disaster response."

Disaster recovery efforts need to include backup digital systems as robust as their physical counterparts. Electric grids require faultless failover technology that can fall back on a secondary network if the primary one is taken offline for any reason. As the Baker Institute pointed out in a recent Forbes article, the measure of a disaster recovery system's effectiveness is whether the grid can be restarted following a major breach, disruption or cyberattack. Without a system that can effectively monitor, prevent and immediately respond to such threats, the smart electric grid could put many key infrastructure systems in danger.
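
As a rough illustration of the failover pattern described above, the sketch below switches traffic to a secondary endpoint after several consecutive failed health checks. The endpoints, thresholds and check method are all invented for this example; a real grid control system would be far more involved.

```python
# Illustrative primary/secondary failover loop; hostnames, ports and
# thresholds are hypothetical, not drawn from any actual control network.
import socket
import time

PRIMARY = ("primary.scada.example", 443)
SECONDARY = ("secondary.scada.example", 443)
FAILURES_BEFORE_FAILOVER = 3

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """A minimal health check: can we open a TCP connection?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def monitor() -> None:
    active, standby = PRIMARY, SECONDARY
    consecutive_failures = 0
    while True:
        if is_reachable(*active):
            consecutive_failures = 0
        else:
            consecutive_failures += 1
            if consecutive_failures >= FAILURES_BEFORE_FAILOVER:
                # Require several consecutive failures so a transient blip
                # doesn't trigger an unnecessary switchover.
                active, standby = standby, active
                consecutive_failures = 0
                print(f"failing over to {active[0]}")
        time.sleep(5)

if __name__ == "__main__":
    monitor()
```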

Disaster-recovery-as-a-service market emphasizes changing priorities

Disaster recovery, once a relative afterthought or nonentity in business planning, is now a central consideration. Advanced threats and high-profile data breaches have helped to convince organizations that it's time to stop dragging their feet and start taking disaster recovery more seriously. The rapid rise of the market for disaster-recovery-as-a-service highlights an important shift in priorities.

According to TechNavio, the global market for DRaaS is expected to grow at a compound annual growth rate of 54.6 percent between 2014 and 2018. Demand for hybrid and cloud-based disaster recovery has driven investment, especially among small and medium-sized businesses, which have found that "flying under the radar" by virtue of their size no longer shields them from the consequences of information security compromises.
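
For a sense of what that growth rate implies, compounding 54.6 percent annually over the four years from 2014 to 2018 multiplies the market roughly 5.7 times over:

```python
# Compound annual growth: a 54.6 percent CAGR sustained for four years.
cagr = 0.546
years = 4
growth_multiple = (1 + cagr) ** years
print(f"{growth_multiple:.1f}x the 2014 market size")  # about 5.7x
```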

Larger organizations have also realized that IT departments are generally unable to maintain complete oversight and disaster recovery protection amid data deluges and rapid network expansion. To cite one sector, the banking industry has begun to invest heavily in the cloud to relieve the amount of resources it has to spend on application updates, software patches and IT infrastructure, according to Bank Systems & Technology.

The report did note that relying too much on a generic cloud solution or paying insufficient attention to backup data could diminish the effectiveness of DRaaS investment. A company is better served by using a multi-service provider that focuses on customization, specificity and addressing pain points. This way, it can avoid any data integrity or governance issues stemming from a lackluster vendor. 

Getting cloud storage services at the right price

Cloud storage services offer organizations peace of mind by providing a secure location to store and back up data. But what happens when the company needs to recover it on a short timetable? Some cloud providers offer an easy road to retrieving data and resetting environments following an incident, but others may have hidden their helpful services behind a dense thicket of extra fees. It’s important to know how to avoid cloud storage pricing models that aren’t as cost-effective as they look on paper.

Although it’s not clear whether cloud storage price cutting is an effective means of netting clients, many cloud providers do it anyway, wrote TechTarget contributor Sonia Lelii. When selecting a vendor, it’s important to discern whether they’re slashing services along with prices. Don’t make a decision based on price alone. Network infrastructure, security and the cloud interface are important features that should not take a backseat to a price quote. Be wary that a low upfront cost could be masking higher expenses during a time of need.

Looking for flexibility is another way to maximize value in a cloud storage services investment, wrote Enterprise Storage Forum contributor Drew Robb. A fee structure that offers locked-in prices for regular or predictable needs, combined with an outline for services that can be purchased on-demand when they’re needed, is better than relying on one that determines fixed usage prices based on service offerings.
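
As a toy comparison of the two fee structures Robb describes, the sketch below contrasts paying for a fixed usage tier with a locked-in base rate plus on-demand overage. Every price and usage figure here is invented for illustration.

```python
# Hypothetical cost comparison of the two pricing models described above;
# all rates and usage numbers are made up for the sake of the example.

def fixed_tier_cost(tier_tb: int, tier_price: float) -> float:
    """Fixed usage pricing: pay for the whole tier whether or not it's used."""
    return tier_price

def base_plus_on_demand_cost(base_tb: int, base_price: float,
                             used_tb: float, on_demand_per_tb: float) -> float:
    """Locked-in price for predictable needs, on-demand rates for bursts."""
    overage = max(0.0, used_tb - base_tb)
    return base_price + overage * on_demand_per_tb

# A month with a short burst above the predictable baseline:
used = 12.0  # TB actually consumed
print(fixed_tier_cost(tier_tb=20, tier_price=2000.0))                  # 2000.0
print(base_plus_on_demand_cost(base_tb=10, base_price=900.0,
                               used_tb=used, on_demand_per_tb=150.0))  # 1200.0
```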

Why it's time to invest in a third-party data center

Rising data center complexity means that for many companies, continuing to support an in-house data center is rapidly becoming unsustainable. Spending on data center infrastructure management services and software is expected to top $4.5 billion by 2020, according to Navigant Research, and many organizations are already feeling the pinch on a local level.

Organizations that want to maintain an on-premises facility will likely follow one of two paths: The first is to enter a cycle of massive spending on everything – new equipment, management tools, software upgrades and IT staff – to handle an otherwise bloated IT environment. The other is to keep costs down by restricting expansion, which can create significant problems for continuity, security and the company's competitive advantages. Neither path is desirable, as both carry a fairly high potential for confusion, mismanagement and far-reaching operability issues.

Outsourcing data center needs, either through colocation or managed servers, enables an organization to gain control of its data center spending while retaining access to top storage, network and security technologies. It also offers companies scalability, which is incredibly difficult or highly expensive to come by in the on-premises facility. As an organization requires additional server space, network bandwidth or software support, it can work with the data center provider to increase available capacity. This way, it doesn't pay for what it doesn't use, and can depend on complete IT support for every aspect of its investment.

Data Center Journal contributor Yuri Rabover compared continued on-site data center development to reorganizing one's garage: Although space is at a premium, the realities of equipment size and non-expert planning likely mean the space won't be used effectively. Instead of playing a never-ending game of data center Tetris, hemorrhaging resources at every turn, it makes sense to eliminate an otherwise ongoing problem with a decisive, conclusive act. Outsourcing data center infrastructure and its management helps enterprises avoid the spending sprees that can curtail their competitive advantage.

Optimizing data center strategies for financial services firms

Data center investment strategies are vital to financial services organizations. While finance firms have long used proprietary or third-party data centers for information storage and business continuity, big data has given rise to a new set of complications and considerations. Not the least of these are a variety of regulatory and compliance measures that place restrictions on information storage and archival practices. New technologies, rising costs and data management challenges are straining traditional data center models, and financial services firms need to adapt.

Data management in finance is a problem with several moving parts that impact each other. Accumulating and storing data is a relatively straightforward issue, albeit a resource-intensive one. Under the traditional model, a firm would procure additional servers for its onsite facility or enlarge its third-party data center investment, either through colocation or leasing the provider’s equipment.

The deluge of data can make this approach prohibitively costly, forcing organizations to rethink their infrastructure approach, Wall Street & Technology editorial director Greg MacSweeney wrote. Firms with proprietary data centers now stand to save significantly by outsourcing their storage, architecture and management demands. A third-party data center can provide state-of-the-art server hardware, but more importantly, it has the infrastructure to deploy next-gen network solutions such as virtualization, which drastically reduces the amount of physical equipment needed to house rising petabytes of data and information-crunching applications.
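
The hardware savings from virtualization come down to simple consolidation arithmetic. The figures below are purely illustrative assumptions, not benchmarks:

```python
# Back-of-the-envelope consolidation math with assumed, illustrative figures:
# if one virtualization host can safely run 15 virtual servers, a fleet of
# 300 standalone servers collapses onto 20 physical machines.
standalone_servers = 300
vms_per_host = 15  # assumed consolidation ratio
hosts_needed = -(-standalone_servers // vms_per_host)  # ceiling division
print(f"{standalone_servers} servers -> {hosts_needed} hosts")  # 300 -> 20
```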

Working with a third-party data center provider also helps businesses tackle more rapidly moving targets – data integrity and compliance. Data quality and validation are some “small data” issues that grow more problematic as firms accumulate more information from a wider source pool, said software developer Oleg Komissarov, according to a recent FierceFinanceIT article.

Keeping data clean, complete and consistent is a tough task that requires powerful tools and a dedicated team, and a managed data center services provider can supply that level of attention. It can also help with compliance efforts, as any blind spots or inconsistencies in information or reporting leave the door open for compliance issues to crop up. As big data expands and accelerates, financial services firms need their data centers to stay one step ahead.
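
As a minimal illustration of the "small data" checks Komissarov alludes to, the sketch below validates a single record for completeness and basic consistency. The schema and rules are invented for this example.

```python
# A minimal sketch of record-level validation; the field names and rules
# are hypothetical, not drawn from any actual financial data standard.
from datetime import datetime

REQUIRED_FIELDS = ("trade_id", "amount", "currency", "timestamp")

def validate(record: dict) -> list[str]:
    """Return a list of problems found in a single record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if field not in record or record[field] in (None, ""):
            problems.append(f"missing field: {field}")
    if "amount" in record and not isinstance(record.get("amount"), (int, float)):
        problems.append("amount is not numeric")
    ts = record.get("timestamp")
    if isinstance(ts, str):
        try:
            datetime.fromisoformat(ts)
        except ValueError:
            problems.append("timestamp is not ISO 8601")
    return problems

print(validate({"trade_id": "T-1001", "amount": "12,500", "currency": "USD",
                "timestamp": "2014-06-30T15:04:05"}))
# ['amount is not numeric']
```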