Moving toward the virtual data center

Virtualization – the process of abstracting hardware functions into a software layer – is one of the signature advancements of modern computing, allowing companies to consolidate their server footprints and increase the flexibility of their infrastructure. With virtualization, businesses can quickly create new virtual servers and move workloads from one physical location to another entirely in software. As server virtualization becomes increasingly standard in the data center, companies are beginning to look at other forms of virtualization that can also increase flexibility, such as storage virtualization and network virtualization. With virtualization in all its forms becoming more important for managing a data center, companies are turning to managed services partners for help.
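To make the idea of moving workloads in software concrete, the sketch below uses the open-source libvirt Python bindings to live-migrate a running virtual machine between two physical hosts. It assumes a QEMU/KVM environment, and the host and guest names are hypothetical placeholders rather than a prescribed setup.

```python
# Minimal sketch: live-migrating a virtual machine between two physical
# hosts with the libvirt Python bindings (QEMU/KVM assumed). Host and
# domain names are hypothetical placeholders.
import libvirt

SOURCE_URI = "qemu+ssh://host-a.example.com/system"  # hypothetical source host
DEST_URI = "qemu+ssh://host-b.example.com/system"    # hypothetical destination host

src = libvirt.open(SOURCE_URI)
dst = libvirt.open(DEST_URI)

# Look up the running guest by name on the source hypervisor.
domain = src.lookupByName("app-server-01")

# VIR_MIGRATE_LIVE keeps the guest running while its memory is copied,
# so the workload changes physical machines entirely in software.
domain.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

src.close()
dst.close()
```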

InformationWeek’s 2013 Virtualization Management Survey found that 72 percent of companies reported extensive use of server virtualization, and just 4 percent had no plans for use. In comparison, only 22 percent reported extensive use of storage virtualization, with 28 percent saying they had no plans for use, and a mere 11 percent reported extensive network virtualization, with 44 percent saying they had no plans for use. The main drivers for virtualization included operational flexibility and agility (56 percent) and business continuity (55 percent).

“Undoubtedly, a fully virtualized data operation offers many advantages,” ITBusinessEdge’s Arthur Cole wrote in a recent column. “Aside from the lower capital and operating costs, it will be much easier to support mobile communications, collaboration, social networking and many of the other trends that are driving the knowledge workforce to new levels of productivity.”

The evolving virtual data center
At the same time, Cole cautioned, much of the virtual technology that extends beyond server virtualization is still in its early phases. As a result, companies may encounter challenges as they look to enjoy the management benefits of abstracting elements of their data centers. A trusted data center partner can help businesses evaluate and implement emerging technologies, and even oversee transitions such as server virtualization and consolidation.

The standard for what counts as a virtualized data center is set to evolve in the coming years as more physical components are virtualized, and businesses will want to be at the cutting edge of whatever emerges. By outsourcing some infrastructure management tasks to a trusted third-party provider, they can ensure they are adopting these innovations even if they do not have the in-house technical expertise or capital to make the changes. To keep close tabs on the move toward the virtual data center, a managed services and IT consulting partnership is essential.

Data center construction increases, driven by demand for colocation services

As more companies shift to an increasingly digital business model, the demand for colocation services is growing. In turn, data center construction is set to increase at a steady rate in the coming years, according to a recent study from Research and Markets. With new, state-of-the-art infrastructure coming online and the industry gravitating toward large-scale data center deployments, companies may want to consider how these trends can simplify their own IT strategies.

According to the Research and Markets study, the global data center construction market is set to grow at a compound annual rate of 21 percent through 2018. This trend is largely being driven by the increasing challenge of managing a data center as new demands in terms of efficient energy use, alternative power sources and industry regulations complicate the logistics of building and running an enterprise facility. Additionally, the growing complexity of network infrastructure is proving a challenge for many companies to handle internally, prompting them to look for outsourced solutions.
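For a sense of what a 21 percent compound annual growth rate implies, the back-of-envelope calculation below projects growth over a five-year horizon from a hypothetical base index of 100; the starting value is illustrative, not a market-size figure from the study.

```python
# Back-of-envelope illustration of a 21 percent compound annual growth
# rate over five years. The base value is a hypothetical index of 100,
# not a figure from the Research and Markets study.
CAGR = 0.21
base_value = 100.0  # hypothetical starting index

for year in range(1, 6):
    projected = base_value * (1 + CAGR) ** year
    print(f"Year {year}: {projected:.1f}")

# The output ends near 259.4: at 21 percent a year, the market grows to
# roughly 2.6 times its starting size in five years (100 * 1.21**5).
```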

In addition to new construction, many existing facilities are being retrofitted with new server, power and cooling equipment to meet the demands of the contemporary tech landscape, ITBusinessEdge's Arthur Cole noted in a recent column. The result is a change in the profile of the average data center.

"Going forward, infrastructure will be leaner and meaner, but the individual pieces will be more powerful and flexible," Cole wrote. "And the [data centers themselves] will be fewer in number, but much, much bigger."

Rather than try to weather these changes themselves, companies may find it expedient to embrace the trend toward colocation and instead look for a trusted third-party data center provider. With the right partner, companies can position themselves to transition smoothly into the future.

Recognize the business advantages of data colocation

For many companies, it can be tempting to approach data storage in a fairly insular manner, keeping files on-premise so they can be easily accessed and IT can maintain absolute control. But businesses are increasingly jettisoning expensive storage and server infrastructure in favor of an outsourced colocation model. By making IT a fixed operational expenditure rather than a massive capital expenditure, companies can remain more flexible. Colocation data centers also provide numerous IT benefits in terms of disaster resilience and collaboration.

"Hosting your own infrastructure can require significant capital investment in real estate," IT executive James Carnie told ComputerWeekly in a recent feature about the merits of different spending models.

In addition to real estate costs, companies investing in their own infrastructure face massive hardware expenses, and they must accurately anticipate future expansion to know how much equipment to buy during purchase cycles. Ownership also means paying for ongoing maintenance, which leads to unexpected costs as issues arise. In contrast, a colocation model shifts businesses to a planning approach built around fixed monthly costs, IT executive Akshay Kalle wrote in a recent column for the Globe & Mail.

"Managed services models reduce the considerable costs of storage, upgrades, data recovery, converting big capital outlays and unpredictable maintenance costs in time and materials, into predictable monthly fees with clear expectations and guarantees," Kalle explained.

Colocation also simplifies the challenge of dealing with disasters by moving data off-site to a resilient facility, and, by centralizing business information, it enables easier audits, Kalle added. Centralization and virtualization also foster collaboration: By moving data to shared resources in a data center rather than letting it languish on desktops, companies can simplify file sharing and other collaborative processes among their employees.

Make sure disaster recovery is done the right way

The threat of natural disasters or other business interruptions such as power outages and viruses means that companies need robust backup and disaster recovery solutions for their data environment. Often, however, backup and disaster recovery services are conflated, and businesses end up with solutions that don't necessarily offer all the functionality they actually need. To ensure the enterprise IT environment is fully recoverable in the wake of a disaster, companies can benefit from working with a managed services provider to develop a customized plan that fits their needs.

One common misconception about disaster recovery is that it offers nothing appreciably different from a backup or cloud storage solution, a recent MSP Mentor article explained. Most companies already have some form of backup solution, perhaps hosted in the cloud, which may make a separate recovery service seem superfluous.

However, simply relying on backup storage doesn't account for the need to get key applications running again, and it can quickly become expensive or difficult to manage as data volumes grow, Sundar Raman, CTO of Perpetuuiti Technosoft Services, noted in a recent interview with CIOL. This complexity can make shortcuts even more tempting.

"CIOs tasked with addressing business continuity (BC) and disaster recovery issues are keen to achieve quick wins, and the 'tick box' audit approach, which tries to copy successful strategies used elsewhere, is often adopted without consideration of the suitability," Raman explained.

To combat this problem, companies can benefit from working with a dedicated managed service provider to craft a customized solution that fits their specific needs. By determining the best plan to meet recovery time objectives for various applications and data while also working within a manageable budget, companies can establish a disaster recovery plan that gives them more than basic backup without overextending themselves.
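As a rough illustration of that planning exercise, the sketch below maps applications to recovery strategies by recovery time objective (RTO). The application names, RTO targets and strategy tiers are hypothetical examples, not a prescribed framework.

```python
# Minimal sketch of matching applications to recovery strategies by
# recovery time objective (RTO). Application names, RTO targets and
# tier definitions are hypothetical examples.
from datetime import timedelta

# Maximum tolerable downtime for each application, set with the business.
rto_targets = {
    "order-processing": timedelta(minutes=15),  # revenue-critical
    "email": timedelta(hours=4),
    "reporting": timedelta(hours=24),
}

def recovery_strategy(rto: timedelta) -> str:
    """Map an RTO to a recovery approach (illustrative tiers only)."""
    if rto <= timedelta(hours=1):
        return "hot standby with continuous replication"
    if rto <= timedelta(hours=8):
        return "warm standby with frequent snapshots"
    return "restore from off-site backup"

for app, rto in rto_targets.items():
    print(f"{app}: RTO {rto} -> {recovery_strategy(rto)}")
```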

Achieve IT savings with better data center management

IT departments face a wide variety of budgetary pressures, which means that finding more efficient ways to deliver the same services is a constant goal for technology staff. One of the biggest sources of inefficiency for many companies is the corporate data center, whose power and cooling needs can create substantial costs that have nothing to do with actual IT. Companies are increasingly looking for ways to make these operations more efficient, turning to data center infrastructure management (DCIM) solutions as a result. Additionally, many businesses have found that by switching to a managed services provider for their data center, they can realize the gains of such technology without the upfront costs and complexity.

A recent Navigant Research study found that the market for data center infrastructure management technology is expected to grow more than sixfold in the next six years as data center operators take advantage of new solutions that offer visibility into both key facilities metrics and server management variables. A separate study of one DCIM solution conducted by Forrester found that the return on investment in terms of power and space planning was 93 percent, while the ROI in terms of energy monitoring was 216 percent.
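Those percentages follow the standard return-on-investment formula, ROI = (benefits - costs) / costs. The dollar figures in the sketch below are hypothetical, chosen only to reproduce the cited percentages.

```python
# ROI percentages of the kind Forrester reports follow the standard
# formula ROI = (benefits - costs) / costs. The dollar figures here are
# hypothetical, chosen only to reproduce the cited percentages.
def roi(benefits: float, costs: float) -> float:
    return (benefits - costs) / costs

# A 93 percent ROI means roughly $1.93 of benefit per $1.00 spent:
print(f"{roi(193_000, 100_000):.0%}")   # 93%

# A 216 percent ROI means roughly $3.16 of benefit per $1.00 spent:
print(f"{roi(316_000, 100_000):.0%}")   # 216%
```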

“DCIM – the software, systems, and services that monitor, measure, and help control data centers’ IT and facilities infrastructure – is quickly becoming a must-have technology for managers of modern data centers,” said Eric Woods, research director at Navigant Research.

Given the substantial savings companies can achieve by using state-of-the-art monitoring and management tools, they should look to leverage data center solutions that incorporate these technologies. Managed services providers should have granular insight into their facilities that enables them to create tangible operational savings and, in turn, pass those savings along to clients.

How companies protect data centers against the threat of physical intruders

The range of threats impacting business data is diverse, but while substantial attention gets paid to protecting systems from hackers, the actual infrastructure that houses sensitive information can be an attack vector as well. Companies have grown increasingly aware of the threats posed by a physical intruder in the data center, and certain best practices have emerged around physical security as a result. Leading enterprise data centers and colocation facilities use solutions such as surveillance, security checks, hardened exteriors and mantraps to protect themselves from these threats.

"Companies spend multi-millions of dollars on network security," Enterprise Storage Forum contributor Christine Taylor wrote in a recent article. "Yet if an attacker, disaster, or energy shortage takes down your data center then what was it all for? Don't leave your data center gaping open, and make very sure that your data center provider isn't either."

Limiting outsiders' physical access to the data center is key, as it is a sensitive environment that can be easily damaged – whether deliberately or inadvertently. One initial protection many data centers use is a hardened exterior with extra-thick walls and windows (and no windows at all in the server room), Taylor wrote. This precaution helps protect against both physical attacks, such as explosives, and natural disasters. Similar protections might include crash barriers or landscaping features that help hide the data center and shield it from events such as a vehicle crash.

Security checks and mantraps
Another basic security practice is to use 24/7 surveillance with cameras that move and cover the entire premises, ideally backed by an on-site security guard. During business hours, security guards can also perform security checks on visitors. In a recent column for TechRepublic, contributor Michael Kassner described a visit to an enterprise data center for which he was required to show two forms of ID and turn over his electronic devices to prevent him from taking pictures.

He then faced internal physical barriers in the form of a turnstile and mantraps, which are essentially airlocks designed to prevent more than one person from passing through a door at once. The ones Kassner encountered had sensitive weight scales that could detect whether more than one person was coming through, as well as whether someone had carried something in and not out, or vice versa. Mantraps and turnstiles prevent tailgating, the practice of following an approved employee through a secure entrance.
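The logic behind such a weight check can be illustrated with a short sketch. The thresholds and readings below are hypothetical; real mantrap controllers are proprietary systems, and this is only a conceptual model of the comparison Kassner describes.

```python
# Illustrative sketch of a mantrap weight check: compare the measured
# weight against a single registered occupant and flag anomalies.
# Thresholds and readings are hypothetical.
REGISTERED_WEIGHT_KG = 82.0  # from the visitor's entry record (hypothetical)
TOLERANCE_KG = 5.0           # allowance for clothing and cleared items

def mantrap_check(measured_kg: float) -> str:
    delta = measured_kg - REGISTERED_WEIGHT_KG
    if delta > TOLERANCE_KG:
        # Heavier than one registered person: possible tailgater or
        # unauthorized equipment being carried in.
        return "ALERT: lock doors, notify guard"
    if delta < -TOLERANCE_KG:
        # Lighter than expected: something registered on entry may have
        # been left behind inside the facility.
        return "ALERT: item discrepancy, notify guard"
    return "OK: release inner door"

print(mantrap_check(83.5))   # OK: release inner door
print(mantrap_check(160.0))  # ALERT: lock doors, notify guard
```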

As companies make data center decisions, choosing a provider that can offer these robust solutions for protecting physical infrastructure is essential. Just as businesses need to secure their digital perimeter, they should look to achieve best practices for locking down their physical perimeter as well.

Developing the customized cloud in the data center

The past few years have seen a wholesale embrace of cloud storage and application hosting approaches, and companies are continuing to look for solutions that meet their evolving computing needs. Despite the rapid growth of cloud services, however, the majority of cloud deployments are still private, occurring in the on-premise or managed data center, according to VMware CEO Pat Gelsinger. While the move to public cloud services is set to continue as companies look for cheaper delivery models for certain services, this general reality is expected to stay consistent for the foreseeable future.

"On-premise cloud is a $2 trillion market … 92 percent of cloud is on-premise," Gelsinger said at the Cloud Factory conference in Banff, Alberta, according to VentureBeat. "And Gartner says that by 2020 it will still be 77 percent."

Companies are keeping their clouds on-premise for reasons tied to security, cost, government regulations and availability, Gelsinger added. However, there's another reason for the ongoing use of on-premise cloud: There are many ways to use the cloud, and not all of them are best suited to public deployments. Companies can leverage virtual servers in a variety of ways, and the dominant model for new IT deployments is moving toward custom implementations on a per-use basis, ITBusinessEdge's Arthur Cole wrote in a recent column. Rather than following industry trends, companies are looking for the right way to meet their needs for each application, whether they have scale requirements that necessitate the public cloud or regulatory demands that make it easier to keep data in a single colocation facility.

"If ever there was an example of a technology being all things to all people … it is the cloud," Cole wrote. He later added, "Cloud infrastructure, then, is likely to become as diverse as today's legacy environments, but with the added twist that it can be made and remade according to the needs of the moment."

As companies look for the best balance of cloud deployments, they can benefit from working with a managed services provider or IT consulting firm to determine the optimal data center infrastructure and virtual server architecture solutions to meet their needs.

Managed services equip companies to deal with changing cybersecurity landscape

Each year seems to bring a broader and more complex array of cyber threats to businesses, and many companies are struggling to keep up with the rapid pace of change. According to a recent survey from security software firm KnowBe4, more than half of IT managers – 51 percent – find security harder to maintain now than a year ago. Preventing cyberthreats and responding quickly to security issues are some of the biggest challenges for companies, which is why many are turning to managed services providers for a more secure infrastructure, as well as functions like malware removal and application support.

"Cybercriminals are constantly devising cunning new ways to trick users into clicking their phishing links or opening infected attachments," KnowBe4 CEO Stu Sjouwerman stated, adding that companies need to respond with thorough cybersecurity procedures, policies and training.

Another recent study from Solutionary and the NTT Group found that 54 percent of new malware goes undetected by antivirus software. As a result, companies need to make sure they are protected at the application level by using secure software and applying updates, ITBusinessEdge contributor Sue Poremba wrote in a recent column. Leveraging managed services for application support can help ensure software is kept updated and secured against threats, while external expertise can also be valuable in implementing state-of-the-art perimeter solutions and secure data center infrastructure.
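One piece of that discipline is simply knowing which software components fall below a minimum patched version. The sketch below illustrates such an audit using the third-party Python packaging library; the package names, version numbers and policy are hypothetical examples.

```python
# Minimal sketch of an application-level patch audit: compare installed
# component versions against a minimum-version policy. Package names,
# versions and the policy itself are hypothetical examples.
from packaging.version import Version

minimum_versions = {  # versions below these are considered vulnerable
    "openssl": Version("3.0.13"),
    "web-framework": Version("4.2.11"),
}

installed = {
    "openssl": Version("3.0.8"),
    "web-framework": Version("4.2.11"),
}

for package, required in minimum_versions.items():
    current = installed[package]
    status = "OK" if current >= required else "NEEDS UPDATE"
    print(f"{package}: installed {current}, required {required} -> {status}")
```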

Additionally, a managed services provider that offers malware removal can be a valuable partner in responding to and limiting the damage of an incident like an SQL injection attack, which the Solutionary study noted can easily cost a business $200,000 or more. Such protection might be unaffordable for a small business to implement in-house, but, by outsourcing certain IT management functions, companies can access state-of-the-art security solutions and industry-leading expertise. With the right portfolio of tools protecting it, a small business can avoid these ever-expanding threats.
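For context on why SQL injection succeeds, the sketch below contrasts the vulnerable pattern, where user input is concatenated into query text, with the standard parameterized-query defense, shown here with Python's built-in sqlite3 module and a hypothetical schema.

```python
# SQL injection succeeds when untrusted input is concatenated directly
# into query text. Parameterized queries, shown with Python's built-in
# sqlite3 module, keep input as data. The schema is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable pattern (commented out): the payload rewrites the query's logic.
# query = f"SELECT role FROM users WHERE name = '{user_input}'"

# Safe pattern: the driver binds the input as a literal value.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the payload matches no real user

rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", ("alice",)
).fetchall()
print(rows)  # [('admin',)] -- legitimate lookups still work
conn.close()
```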

Federal big data initiatives make data management paramount concern

Effective data management will be a critical concern as the United States federal government ramps up its exploration of big data. While information-driven initiatives have the potential to transform a variety of civil and infrastructure projects, as well as contribute to a meaningful cybersecurity plan, a lack of data oversight could make these projects ineffective and put people at risk. 

Federal agencies have already put some big data initiatives in motion, while other industry analysts tout the potential benefits of information analysis. Recent research found that organizations including the Department of Homeland Security and the Government Accountability Office think that big data tools can help them combat cyberthreats on a country-wide scale, according to InformationWeek. Efforts to combat climate change, establish "smart" utilities and improve national healthcare can also capitalize on the insights big data provides.

However, data management, already a thorn in the side of many federal agencies, will become more difficult as data storage demands skyrocket. The Federal Data Center Consolidation Initiative, a project to close 40 percent of federal data centers – saving $5 billion by 2015 in the process – may be losing steam amid cost concerns and facilities closures that don't align with best practices, according to FCW. Out of the more than 7,000 government data centers, only 640 have been shut down. Although 470 are slated to shut down by September 2014, 2,400 would have to close within the next year and a half to reach the stated goal of 40 percent.

The government's struggles are a reminder that data management cannot take a backseat to cost or facilities considerations.

Disaster recovery services, cybersecurity critical to protecting electric grid from attacks

Over the past few years, the utilities industry has made a concentrated effort to make key infrastructure "smarter." The integration of data-capturing devices and automated, software-based management systems has the potential to create smart electric grids that can more effectively use and distribute power, reducing energy costs and environmental impact in the process.

However, turning power grids into connected devices has potentially harrowing implications – a concentrated cyberattack could cause lengthy and widespread outages, not only withholding electricity from businesses and residences, but disrupting communications, healthcare systems and the economy. According to many cybersecurity researchers, such an attack is less a matter of "if" than "when."

Ramping up disaster recovery services and cybersecurity protocols is key to shielding the smart electric grid from a devastating attack. While the federal government tries to increase the efficacy and stringency of its own security measures, it's important that utility companies – from national generators to local distributors – build up their own prevention and backup systems, according to a recent white paper by the three co-chairs of the Bipartisan Policy Center's Electric Grid Cybersecurity Initiative. This effort will require a hybrid system that responds to both physical and cybersecurity threats. 

"Managing cybersecurity risks on the electric grid raises challenges unlike those in more traditional business IT networks and systems," the report stated. "[I]t will be necessary to resolve differences that remain between the frameworks that govern cyber attack response and traditional disaster response."

Disaster recovery efforts need to include digital backup systems that rival physical ones. Electric grids require faultless failover technology that can fall back on a secondary network if the primary one is taken offline for any reason. As the Baker Institute pointed out in a recent Forbes article, the measure of a disaster recovery system's effectiveness is whether the grid can be restarted following a major breach, disruption or cyberattack. Without systems that can monitor, prevent and immediately respond to such threats, the smart electric grid could put many key infrastructure systems in danger.
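Conceptually, that failover behavior looks something like the sketch below: a control loop monitors the primary network and cuts over to the secondary when health checks fail. The probe, intervals and simulated readings are hypothetical placeholders, not a real grid control protocol.

```python
# Conceptual sketch of primary/secondary network failover: monitor the
# primary and cut over when health checks fail. The probe and intervals
# are hypothetical placeholders, simulated here for the sketch.
import random
import time

def is_healthy(network: str) -> bool:
    # Placeholder probe; a real system would check link state, heartbeat
    # telemetry and control-message round trips. Simulated here.
    return random.random() > 0.1  # 10 percent simulated failure rate

active = "primary"
for _ in range(10):  # bounded loop for the sketch
    if active == "primary" and not is_healthy("primary"):
        active = "secondary"  # primary down: keep grid control online
        print("Failover: control traffic moved to secondary network")
    elif active == "secondary" and is_healthy("primary"):
        active = "primary"    # primary restored and stable
        print("Failback: control traffic returned to primary network")
    time.sleep(0.5)           # poll interval (hypothetical)
```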