Using colocation in the era of the cloud

Colocation facilities have long been vital resources for organizations that require high-performing data centers but prefer to entrust infrastructure management to a third-party provider. In addition to sparing IT departments the headaches of maintaining servers, switches and other equipment, colocation produces tangible benefits such as:

  • Redundant power supplies: Individual endpoint failures or even natural disasters won’t compromise uptime.
  • Streamlined IT costs: Colocation eliminates much of IT’s considerable expenditure on equipment and personnel.
  • Cutting-edge performance: A colo facility typically has access to best-of-breed IP services and equipment, often delivering better speed and reliability than the client could achieve strictly in-house.

Accordingly, in North America, the colocation and managed hosting services market is primed for strong expansion. TechNavio recently projected that it would increase at a 13.6 percent compound annual growth rate from 2013 to 2018.

Reduction of capital and operating expenditures is expected to be a key driver of colocation uptake. But what is colocation’s place in an IT landscape increasingly dominated by cloud services?

Finding the right colocation provider in the era of cloud computing
Cloud computing has fundamentally changed IT by giving developers, testers and operations teams access to unprecedented amounts of on-demand resources. Organizations have more options than ever for scaling their businesses, and the cloud has already enabled the success of blockbuster services such as Netflix and Instagram.

Colocation can play an important part as companies modernize their infrastructure and take advantage of remote resources. Many IT departments are in the midst of migrating some on-premises systems to the cloud, creating mixed environments known as hybrid clouds. Colocation providers can step to the plate and supply the security, flexibility and know-how needed to evolve IT for the cloud age.

To that end, buyers should look for experienced managed services providers adept at handling a variety of infrastructure. Although colocation has been around since before the cloud entered the mainstream, cutting-edge offerings can deliver a level of usability on par with the public cloud, via top-flight service management.

“[C]olocation providers need to offer more than just remote hands,” wrote Keao Caindec, chief marketing officer at 365 Data Centers, for Data Center Knowledge. “They need to offer basic managed services such as firewall management, server management, backup and recovery services as well as other managed IT operations services for the dedicated infrastructure of each client.”

Determining bandwidth requirements in the data center

How much bandwidth does a data center really need? It depends on how many workloads and virtual machines are in regular operation, as well as what the facility is designed to support. For example, a data center providing resources to a public cloud requires much more bandwidth than one that is simply powering internal systems and operations shielded by the company firewall. The increasing uptake of remote data centers and colocation arrangements, in tandem with server virtualization, has added to organizations' bandwidth considerations.
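
As a back-of-the-envelope illustration (not a substitute for real traffic measurement), the relationship between workload counts and bandwidth can be sketched in a few lines of Python. Every figure below is a hypothetical placeholder:

```python
# Back-of-the-envelope bandwidth estimate for a facility.
# All figures are hypothetical placeholders; real sizing should
# start from measured per-workload traffic.

def estimate_bandwidth_gbps(vm_count, avg_mbps_per_vm,
                            peak_factor=3.0, headroom=1.25):
    """Approximate bandwidth to provision, in Gbps.

    vm_count:        virtual machines in regular operation
    avg_mbps_per_vm: average sustained traffic per VM
    peak_factor:     multiplier for busy-hour spikes
    headroom:        margin for growth and failover
    """
    sustained_mbps = vm_count * avg_mbps_per_vm
    return sustained_mbps * peak_factor * headroom / 1000

# Same VM count, very different profiles: a public cloud facility
# versus one powering internal systems behind the firewall.
print(estimate_bandwidth_gbps(2000, avg_mbps_per_vm=5))  # 37.5 Gbps
print(estimate_bandwidth_gbps(2000, avg_mbps_per_vm=1))  # 7.5 Gbps
```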

How virtualization complicates bandwidth requirements
Server and desktop virtualization have made companies less reliant on physical infrastructure and the specific sites that house it. Here's how they work:

  • With desktop virtualization, or VDI, desktop operating systems are hosted centrally and delivered to endpoints (even aging ones), simplifying management of both software and hardware while reducing costs.
  • Server virtualization turns a single physical server into multiple virtual devices. Each instance is isolated, and the end user usually cannot see the technical details of the underlying infrastructure (see the sketch after this list).
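
For a concrete view of the server side, a hypervisor's management API can enumerate the virtual instances sharing one physical host. Here is a minimal sketch using the libvirt Python bindings, assuming a Linux host running QEMU/KVM with libvirt installed:

```python
# Enumerate the virtual machines sharing one physical host.
# Assumes the libvirt-python package and a local QEMU/KVM
# hypervisor; adjust the connection URI for other setups.
import libvirt

conn = libvirt.open("qemu:///system")
try:
    for dom in conn.listAllDomains():
        state = "running" if dom.isActive() else "stopped"
        print(f"{dom.name():<20} {state}")
finally:
    conn.close()
```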

By getting more out of IT assets via virtualization, companies have reshaped IT operations. More specifically, they have spread out their infrastructure across multiple sites and put themselves in position to move toward cloud computing.

With increased reliance on virtualization, organizations have looked to ensure that remote facilities receive the bandwidth needed to provide software, instances and data to users. However, liabilities still go overlooked, jeopardizing reliability – especially when data centers are too far apart.

Ensuring low latency is just one piece of the data center optimization puzzle, though. Sufficient bandwidth must also be supplied to support the organization's particular workloads. Microsoft, for instance, has advised Exchange users to think beyond round-trip latency.

"[R]ound trip latency requirements may not be the most stringent network bandwidth and latency requirement for a multi-data center configuration," advised Microsoft's Exchange Server 2013 documentation. "You must evaluate the total network load, which includes client access, Active Directory, transport, continuous replication and other application traffic, to determine the necessary network requirements for your environment."

Knowing how much bandwidth is needed
Figuring out bandwidth requirements is a unique exercise for each enterprise. In a blog post, data center networking expert Ivan Pepelnjak looked at the nitty-gritty of assessing bandwidth-related needs, homing in on some of the problems that reveal a need to rethink how bandwidth is allocated and utilized.
These issues include:

  • Over-reliance on slow legacy equipment
  • Oversubscription of network links
  • Miscalculation of how much traffic each virtual machine generates 

In addition, data center operators sometimes overlook bottlenecks such as slow interactions between virtual machines and storage. If VMs must frequently access data stored on an HDD, for example, quality of service may degrade, and networks may require extra bandwidth to avoid data transfer hiccups.
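
Oversubscription, in particular, is straightforward to quantify: compare the aggregate bandwidth of a switch's server-facing ports against its uplink capacity. A minimal sketch using a hypothetical top-of-rack configuration:

```python
# Oversubscription ratio of a top-of-rack switch: aggregate
# server-facing bandwidth divided by aggregate uplink bandwidth.

def oversubscription_ratio(server_ports, server_port_gbps,
                           uplink_ports, uplink_port_gbps):
    downstream = server_ports * server_port_gbps
    upstream = uplink_ports * uplink_port_gbps
    return downstream / upstream

# Hypothetical rack: 48 x 10 GbE to servers, 4 x 40 GbE uplinks.
ratio = oversubscription_ratio(48, 10, 4, 40)
print(f"Oversubscription: {ratio:.1f}:1")  # 3.0:1
```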

Virtualization, open source switches changing the face of data centers

Data center technology moves quickly. With the emergence of wide-scale cloud computing over the past decade, enterprises have constructed new facilities and adopted cutting-edge equipment to keep up with demand and/or worked with managed services providers to receive capacity through colocation sites.

Virtualization drives strong growth of data center networking market
Last year, MarketsandMarkets estimated that the data center networking market alone could top $21 billion by 2018 as virtualization and specific technologies such as 40 Gigabit Ethernet continue to gain traction. Rather than rely on legacy physical switches that are challenging to upgrade and scale, enterprises are turning to virtual alternatives.

Virtualizing the network makes equipment and services much easier to modify. Since the fundamental advantage of cloud computing is the ability to get resources on demand, such extensibility is critical for helping companies keep pace with changing requirements.

"Virtualization being a disruptive technology is one of the major driving factors in [the] data center networking market," MarketsandMarkets analyst Neha Sinha told Network Computing. "The adoption of high-performance virtual switches is critical to support increasing number of virtual machines used in multi-tenant data centers. The virtual switches include programmatically managed and extensible capabilities to connect the virtual machines to both physical and virtual networks."

Down the road, such interest in mixing and matching legacy, physical and virtual assets may lead organizations to take up software-defined networking. This practice entails managing network services in a more intelligent, software-centric way, with control decisions made centrally rather than on each device.
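
Conceptually, SDN moves forwarding decisions out of individual boxes and into a central program that pushes rules down to the switches. The toy model below illustrates that control/forwarding split only; it does not reflect any real controller's API:

```python
# Toy model of the SDN control/forwarding split: a central
# controller computes flow rules and installs them on switches.
# Real controllers (OpenDaylight, ONOS, etc.) have far richer APIs.

class Switch:
    def __init__(self, name):
        self.name = name
        self.flow_table = {}  # match -> action

    def install_rule(self, match, action):
        self.flow_table[match] = action

class Controller:
    def __init__(self):
        self.switches = []

    def connect(self, switch):
        self.switches.append(switch)

    def push_policy(self, match, action):
        # One decision, applied network-wide in software.
        for sw in self.switches:
            sw.install_rule(match, action)

ctrl = Controller()
ctrl.connect(Switch("tor-1"))
ctrl.connect(Switch("tor-2"))
ctrl.push_policy(match="dst=10.0.0.0/24", action="forward:uplink")
```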

However, SDN is still over the horizon for many companies right now. Neither the use cases nor the underlying technology is widely understood. Plus, enterprises are still trying to build enough in-house expertise in areas such as server virtualization to give them a solid foundation for future modifications of their networks and data centers.

Facebook announces open source data center switch
The demand for higher data center efficiency is unabating, and tech giants such as Facebook are looking to get in on the action. PCWorld reported that the social network has confirmed an open source switch, released through the Open Compute Project, that could challenge longstanding incumbents such as Cisco.

Facebook's switch is a top-of-rack device that connects servers to other data center infrastructure. It has 16 individual 40 Gigabit Ethernet ports. The switch is designed for maximum flexibility for developers and data center operators, and it may contribute to broader efforts to make infrastructure more adaptable.

Banks, other organizations use UC to improve client service and user experience

The unified communications market is changing. Feature-rich Internet messaging and voice-over-IP telephony were once mostly the domain of CIOs and IT departments, but these services are entering the mainstream, driven by employees’ uptake of mobile hardware through BYOD initiatives and easy-to-use applications, as well as the subsequent entry of these endpoints into the workplace. Costs have declined and the underlying technology has been simplified, making UC, whether delivered through the cloud or on-premises infrastructure, an increasingly attractive option.

“[F]ocus has shifted to the end-user experience, including ease of use, as well as the business value of UC,” observed COMMfusion president Blair Pleasant in an article for No Jitter. “There’s a growing realization that the user experience must be intuitive, relevant to the user’s work and tools, and competitive with the experiences delivered by consumer devices and apps. It’s no longer about getting the ‘latest and greatest’ – it’s delivering intuitive and contextual UC solutions, and the business results that are achieved by simplifying collaboration and meetings and enhancing the mobile experience.”

Unified communications market reshaped by consumer focus
The shift toward an intuitive UC user experience comes at just the right time, as UC begins displacing legacy systems. In the past, communications infrastructure was too limited, costly and complex to cater to the end user. Much of IT’s time was devoted to simply maintaining the status quo, with little left over for improving usability or refining the user interface.

With the emergence of cloud computing as well as flexible, highly capable on-premises solutions, all of that has changed. Third-party hosting companies now steward UC technology, optimizing it for day-to-day use by their clients. At the same time, organizations with large call volumes increasingly utilize on-site UC – with installation help from managed services providers – for maximum stability and cost-effectiveness. Either way, businesses and their clients now benefit from amenities such as:

  • Contextual services: Relevant call histories, emails, texts and documents can be retrieved for each conversation.
  • Embedded technologies: Computer telephony integration is built into most contact center solutions, and UC is moving in the same direction. It is no longer a standalone service so much as fundamental communications infrastructure.
  • Video meeting rooms: Video conferencing enables better remote collaboration, and with VMRs it is possible for users to connect using a client of their choice, whether they are inside or outside the company firewall.

All of these features add up to a rich UC experience for users and tangible benefits for the organization. Banks, for instance, have deployed wide-area networks and contact centers to better support UC and improve interactions with clients. According to AllAfrica, Comnavig ICT Advisers CEO Olufemi Adeagbo recently identified a well-designed, technologically sound contact center – with features such as UC and video conferencing – as the only way to ensure that business opportunities are realized and brand reputation maintained.

“Imagine a car sale opportunity that is lost because the advertised mobile number is off, unavailable or cannot be answered,” stated Adeagbo. “Imagine the dormant account the bank does not proactively place a call about to understand the issue and reactivate.”

How BYOD can be made easier through desktop virtualization

Bring your own device policies, already buoyed by rapid uptake of smartphones and tablets, may gather additional momentum as prominent technology vendors devote attention to making mobile hardware valuable in the workplace. Dropbox for Business has made several big acquisitions related to BYOD, with the aim of helping businesses transition to multi-device, highly consumerized IT environments. Meanwhile, Apple has included advanced support for email, device enrollment and calendar collaboration in iOS 8, making the mobile OS more amenable to BYOD than ever.

It’s clear that BYOD isn’t going away. However, organizations are still adjusting to the new pressures that the phenomenon places on data control, security and compliance. While major firms continue to work on BYOD-centric solutions, enterprises have to assess their mobility needs and decide whether to implement measures such as desktop virtualization to enable BYOD.

Virtualization makes BYOD more secure for leading steel producer
The central issue with any BYOD policy is the transfer of control – over hardware, software and data – from the IT department to employees, who may be less scrupulous in terms of what applications they use. For example, files that should remain behind the company firewall may be shared with consumer-facing cloud services. Mobile devices enable such habits, even as they hold potential to enhance collaboration and remote work.

Fortunately, desktop virtualization facilitates a middle ground between BYOD adoption and enterprise security. Rather than let each endpoint have its own OS and applications, IT departments distribute a single desktop experience via a virtual machine. Devices connect to the VM securely and gain access to approved software. Data is not retained on user hardware after a session ends.
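
That flow can be modeled in a few lines: a broker authenticates each device, hands it a session on the centrally managed desktop image, and discards all session state at disconnect. The class and image names below are hypothetical:

```python
# Simplified model of VDI session handling: endpoints receive a
# session on a centrally managed desktop image, and no state
# survives on the device after disconnect. Names are hypothetical.
import uuid

GOLDEN_IMAGE = "corp-desktop-v42"  # the single, IT-managed desktop

class SessionBroker:
    def __init__(self):
        self.sessions = {}

    def connect(self, device_id, credentials_ok):
        if not credentials_ok:
            raise PermissionError("authentication failed")
        token = str(uuid.uuid4())
        self.sessions[token] = {"device": device_id, "image": GOLDEN_IMAGE}
        return token  # device renders the remote desktop; no local data

    def disconnect(self, token):
        # Session state lives only on the server and is discarded here.
        self.sessions.pop(token, None)

broker = SessionBroker()
t = broker.connect("byod-tablet-17", credentials_ok=True)
broker.disconnect(t)
```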

Essar Group, a conglomerate involved in steel, oil and telecom services, turned to desktop virtualization to standardize and secure its employees’ mobile experience when working with company assets. Ultimately, it moved 5,000 users to its new virtualized platform.

“Security of data was the primary point of scope for looking for [a] desktop virtualization solution,” Jayantha Prabhu, CTO at the Essar Group, told Dataquest. “We had a good experience of the ability to control the data at the disposal of the employee when we deployed the same for some of our teams which handled data which was very critical both from a confidentiality and a brand perspective. We had around 3,000 BlackBerry users and more than 2,000 people with tablets, and with all the applications being accessed on the tablets, it was tough to ensure security of critical information.”

Desktop virtualization is a powerful tool for securing data and controlling mobile devices, but its benefits don’t stop there. Other perks include:

  • Reduced power consumption through the use of thin clients (machines that depend on a server for most or all of their software).
  • Centralized management of software and devices, with much more efficient patch distribution and application upgrades.
  • Support for remote collaboration since users can get the same experience from any Internet-enabled device.

With a broad set of advantages for organizations in finance, healthcare, education and other sectors, desktop virtualization is a practical, versatile way to incorporate BYOD while maintaining the integrity of company data.

Getting better device and data security through desktop virtualization

Desktop virtualization is an increasingly popular way to get more out of old IT systems while enabling access to company applications from virtually any device. By hosting an operating system on a centralized virtual machine, organizations can avoid the hassle of installing and managing extra software on every last piece of equipment. Under ideal circumstances, virtualization contributes to high levels of security and convenience.

Virtual desktop infrastructure and mobile security
The influx of mobile endpoints into the workplace, fueled by bring-your-own-device policies, has made such virtual desktop infrastructure appealing. IDC recently estimated that 155 million smartphones would be used for BYOD in the Asia-Pacific region in 2014.

But what about security? Employees who use their own hardware may be prone to mingling personal habits and data with corporate assets. A classic example is managing sensitive work documents through consumer applications such as Dropbox.

Enter VDI. Important data can be kept in cloud storage services and accessed exclusively via secure connections. Information is usually not retained locally, and access is gated by authentication mechanisms. VDI thus provides a catch-all solution for managing application access in the context of BYOD.

"The move to BYOD was a wakeup call for mobile security because information security is a key IT responsibility – regardless of whether the mobile device in question is company-provided or user-owned," observed Michael Finneran for TechTarget. "Unless an organization opts for a solution that avoids storing corporate data on a mobile device, systems will be needed to protect that information."

Virtualization vendors target health care, financial industries
VDI's potential for securing applications and data has caught the attention of organizations in health care, finance and other regulated sectors. At the same time, major technology vendors have worked on thin client solutions for these markets, crafting products and services that enable desktop virtualization through minimal infrastructure.

Still, as virtualization becomes more popular, there have been concerns about balancing performance and security. Network Computing's Jim O'Reilly dug into the dilemma, noting that many providers have added instance storage – usually solid-state drives that provide the speed and muscle to overcome common bottlenecks such as VDI boot storms (i.e., when everyone logs in at around the same time).

Instance storage enables outstanding performance, but it also results in data states being preserved and, in theory, exposed to surveillance and theft. Persistent data could be an issue for health care organizations obligated to comply with legislation such as the Health Insurance Portability and Accountability Act. Organizations should understand the ins and outs of any virtualization solution before entrusting data to it.

Colocation increasingly popular among banks and other enterprises

Many companies are deciding to forgo building and maintaining their own data centers in favor of colocation options offered by third-party providers. There are numerous advantages to this approach, especially now that organizations are crafting their own applications, many of them mission-critical, designed to scale for many users.

What can be accomplished through colocation? Key benefits include:

  • Physical and data security: Colocation providers typically manage highly secure facilities that feature closed-circuit cameras, biometric scanners and alarm systems. Furthermore, actual server cabinets may be locked and full-time security staff may be on site. With an in-house data center, the costs of such measures would be exorbitant, but they are included in many colocation plans.
  • Scalability: Keeping up with evolving requirements and user demand can be tricky with an in-house data center. When a change is needed, extra equipment may need to be purchased and additional staff hired. With colocation, upgrading is a much simpler matter of changing the service plan.
  • Redundancy and reliability: Colocation facilities are often decked out with environmental monitoring to ensure that conditions are optimal, plus they usually feature redundant power supplies. Similarly, top-flight equipment enables high network reliability. Clients can worry less about the fallout from natural disasters, power outages or downtime that would more acutely affect a self-run data center than a colo site.
  • Performance: Building on the last point, colocation data centers have strong, consistent performance as a result of redundancies, as well as optimal power and networking arrangements.
  • Support: Colocation packages feature around-the-clock client support. Rather than having to troubleshoot an issue in the middle of the night on their own, IT personnel can contact a representative for technical assistance.

Taken together, these perks facilitate economical and reliable utilization of data centers. Colocation helps enterprises keep pace with the emergence of cloud computing and new requirements for application development and data processing, all while controlling costs. Unsurprisingly, colocation providers have been ramping up their budgets to serve the growing number of organizations interested in their services.

Colocation spending surges as enterprises spend less on in-house data centers
Rising interest in colocation has forced providers to expand capacity and services. An Uptime Institute study found that almost 90 percent of surveyed colocation companies had increased their budgets year-over-year.

Similar growth hasn’t occurred within enterprises, with only half of them (excepting financial institutions) reporting larger budgets than the year before. However, more than 60 percent of banks and other financial services providers saw gains.

Across the board, though, more resources are being moved off premises and into colocation data centers or cloud computing environments. The study found that:

  • One quarter of respondents’ capacity was running in colocation facilities, while seven percent was tied to public cloud.
  • Almost 40 percent of companies using at least 5,000 servers – most of them in finance – relied on more than five providers.
  • Availability, geographic scalability and capital cost reduction were primary drivers of colocation adoption.

These findings give a good cross-section of where enterprises stand as they adapt to the expanding roles of software and cloud computing. Colocation gives institutions in verticals such as finance a leg up in controlling costs and improving reliability.

How data center backup helps with business continuity

Backing up data should be standard operating procedure for everyone by now. It’s not worth the risk of losing important documents, photos and videos, especially now that there are so many options for stashing these assets in redundant cloud storage services or on an external hard drive.

But while consumers may have only a few gigabytes of data to back up, organizations have a much more extensive undertaking when it comes to ensuring the safety of mission-critical information. With the ongoing shift to data centers, colocation and cloud computing, companies have to be conscious of how and where their massive data stores are kept.

Back up data center assets to guard against loss
Data center backup plans are increasingly important hedges against disaster. A Unisonius Consulting study of Turkish data centers noted that many sites are still vulnerable to threats such as:

  • Flooding: Some data centers are located in low-elevation locales or in cellars.
  • Power outage: Having only a single electricity supplier means that a facility cannot easily accommodate redundancy.
  • Earthquakes: Depending on their locations, data centers may be subject to seismic activity.

While the report examined data centers in Turkey, similar conditions exist around the world, endangering data that is not properly housed or protected by backup. Frequent outages underscore what could happen to data and business continuity in a worst-case scenario.

“There is always a data center outage in the U.S. It’s whether you have invested in backup services or not that will determine whether your services are affected,” Jordan Lowe, CEO at ServerCentral, told DatacenterDynamics. “As a business you make an active decision whether to invest into backup sites. It is a lot of time, a lot of effort and a lot of money but, if you don’t want a service affected that’s a bullet you have to bite.”

Features to look for in data center backup
For organizations in sectors such as finance, it is worth having peace of mind when it comes to data backup and restoration. Here are some specific features that make for a good backup solution:

  • Secure, automatic multi-site backup: Having just one backup isn’t enough. It is better to use a service that automatically and securely (using encryption) relays backup data from one site to another, as in the sketch following this list.
  • Disaster recovery: It may be beneficial to migrate assets to a secure off-site location to mitigate risk from flooding, hurricanes and fires.
  • Easy administration: A Web-based, graphical interface protected by a virtual private network is valuable for managing backup policies and workflows from anywhere.
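
To make the first point concrete, here is a minimal encrypt-then-replicate sketch using the Python cryptography package's Fernet primitive; the destination paths are hypothetical stand-ins for genuinely separate sites:

```python
# Encrypt a backup once, then relay the same ciphertext to two
# sites. The paths are hypothetical stand-ins for real off-site
# targets. Requires the third-party 'cryptography' package.
import shutil
from pathlib import Path
from cryptography.fernet import Fernet

SITES = [Path("/mnt/site-a/backups"), Path("/mnt/site-b/backups")]

def backup(source: Path, key: bytes) -> None:
    ciphertext = Fernet(key).encrypt(source.read_bytes())
    encrypted = source.with_name(source.name + ".enc")
    encrypted.write_bytes(ciphertext)
    for site in SITES:  # multi-site replication
        site.mkdir(parents=True, exist_ok=True)
        shutil.copy2(encrypted, site / encrypted.name)

key = Fernet.generate_key()  # keep this in a key vault, not on disk
backup(Path("/var/data/ledger.db"), key)
```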

These capabilities contribute to the ongoing safety and integrity of data. Even in the event of a natural disaster or outage, key assets are preserved and business continuity is sustained.

BYOD could evolve into bring your own everything

Bring your own device has certainly had its fans in recent years, with a growing number of organizations drawn to the collaborative and productive capabilities of smartphones and tablets. Gartner has estimated that by 2017 half of firms will not only allow but encourage employees to take advantage of company BYOD policies.

BYOD and the path toward bringing in anything and everything
Although BYOD is commonly associated with someone simply bringing in a personal device and connecting it to various company applications (as well as attendant security mechanisms), it has the potential to become, or at least influence, something much bigger. Some firms believe that the same ethos behind BYOD could lead to employee-supplied software, storage and services – basically a bring your own anything/everything state of affairs.

The Australian government document "Victorian Government ICT Strategy 2014 to 2015" chronicled the shift to BYOD by public sector agencies, citing core benefits such as:

  • Efficient working arrangements: Employees can work from anywhere, potentially driving down organizational costs for office space and power.
  • Higher productivity: Workers may feel more comfortable operating out of their homes and using their own devices.
  • Easier hardware management: Organizations don't have to buy new endpoints as often, if at all.

The implications of BYOD adoption are wide-reaching. With enterprises growing less and less reliant on hardware, software and even facilities that they have paid for upfront, there's the allure of supporting more operations through cloud computing services that can be delivered to any device, anywhere. It's possible that workers could supplement their personal smartphones and tablets with productivity tools and online storage of their choosing.

"Underpinning BYOD, a range of policies and standards are required to ensure that security, interoperability and performance are not compromised. BYOD is a first step in a broader approach to employee [information and communications technology] productivity, leading to bringing your own productivity software and some storage – i.e. BYOE ('bring your own everything')," wrote the report authors.

The range of use cases for BYOD is certainly impressive. Eight years ago, Seton Academy in South Holland, Illinois, introduced student laptops preloaded with 70 percent of the required textbooks, and now it is transitioning to BYOD. More specifically, its educators plan to use cutting-edge hardware to support school-wide initiatives such as delivering books through the cloud and moving to electronic-only submission of papers.

Unified communications brings real benefits to education

As its name suggests, a unified communications suite gives the user a variety of business services, from instant messaging and email to voice calling and video conferencing, in a single convenient platform. On top of that, UC can serve as a natural path into cloud computing. Many of its key functionalities do not even require on-premises physical equipment since they can be run from remotely hosted servers.

UC solutions are ideal for organizations looking to consolidate their communications processes and save money while doing so. Let's look at some of the general benefits of UC, as well as how it has worked in practice for institutions in higher education.

Why organizations should consider replacing legacy systems with UC
The well-known limitations of legacy hardware and software – inflexibility, difficult maintenance and total cost – can hold back businesses that are in the midst of rapid growth. Rather than deal with arduous upgrades of traditional phone systems, for instance, companies can adopt a UC platform that provides a broader set of communications services with a lower price tag.

Some of the most notable perks of UC include:

  • Better overall user experience: Thanks to the rise of mobile computing, individuals now expect immersive, intuitive interactions with all devices and applications. UC offerings may feature rich interfaces, plus they're usually compatible with smartphones and tablets, making it possible to work from anywhere.
  • Reduced equipment- and support-related costs: Investing in a UC system may not even require purchasing new hardware upfront. Services are delivered through an IP network, an appealing arrangement for cost-conscious small and midsize businesses. The UC provider may also handle support issues, freeing up the IT department to attend to other matters.
  • Flexibility and scalability: Backed by cloud storage services, UC solutions can be easily modified and extended as business requirements evolve and the user base grows.

University puts UC to work in modernizing practices
How does UC look in the real world? EdTech chronicled the Florida State University College of Medicine's adoption of a UC suite that included video conferencing and was supported by server virtualization.

Five years ago, the institution upgraded its network to support the added bandwidth requirements of video conferencing, and more recently it virtualized its servers to save rack space. The result has been a UC video conferencing platform that enables remote work and easy video viewing by students and guests.

"We now have more than 2,000 recordings that take up in excess of 2 terabytes of data," college media specialist Patrick Sparkman told EdTech. "While virtualizing our UC servers was part of the college's effort to modernize its server infrastructure, adding the [storage area network] gave us the storage capabilities we needed. And, through the server virtualization, we now have the redundancy we didn't have when we started."

The college is still moving toward full implementation of UC. Over the next few years, it hopes to continue making use of platforms such as Microsoft Lync and also integrate voicemail with email.