Colocation increasingly popular among banks and other enterprises

Many companies are deciding to forgo building and maintaining their own data centers in favor of colocation options offered by third-party providers. There are numerous advantages to this approach, especially now that organizations are crafting their own applications, many of them mission-critical, designed to scale for many users.

What can be accomplished through colocation? Key benefits include:

  • Physical and data security: Colocation providers typically manage highly secure facilities that feature closed-circuit cameras, biometric scanners and alarm systems. Furthermore, actual server cabinets may be locked and full-time security staff may be on site. With an in-house data center, the costs of such measures would be exorbitant, but they are included in many colocation plans.
  • Scalability: Keeping up with evolving requirements and user demand can be tricky with an in-house data center. When a change is needed, extra equipment may need to be purchased and additional staff hired. With colocation, upgrading is a much simpler matter of changing the service plan.
  • Redundancy and reliability: Colocation facilities are often decked out with environmental monitoring to ensure that conditions are optimal, plus they usually feature redundant power supplies. Similarly, top-flight equipment enables high network reliability. Clients can worry less about the fallout from natural disasters, power outages or downtime that would more acutely affect a self-run data center than a colo site.
  • Performance: Building on the last point, colocation data centers have strong, consistent performance as a result of redundancies, as well as optimal power and networking arrangements.
  • Support: Colocation packages feature around-the-clock client support. Rather than having to troubleshoot an issue in the middle of the night on their own, IT personnel can contact a representative for technical assistance.

Taken together, these perks facilitate economical and reliable utilization of data centers. Colocation helps enterprises keep pace with the emergence of cloud computing and new requirements for application development and data processing, all while controlling costs. Unsurprisingly, colocation providers have been ramping up their budgets to serve the growing number of organizations interested in their services.

Colocation spending surges as enterprises spend less on in-house data centers
Rising interest in colocation has forced providers to expand capacity and services. An Uptime Institute study found that almost 90 percent of surveyed colocation companies had increased their budgets year-over-year.

Similar growth hasn’t occurred within enterprises, with only half of them (excepting financial institutions) reporting larger budgets than the year before. However, more than 60 percent of banks and other financial services providers saw gains.

Across the board, though, more resources are being moved off premises and into colocation data centers or cloud computing environments. The study found that:

  • One quarter of respondents’ capacity was running in colocation facilities, while seven percent was tied to public cloud.
  • Almost 40 percent of companies using at least 5,000 servers – most of them in finance – relied on more than five providers.
  • Availability, geographic scalability and capital cost reduction were primary drivers of colocation adoption.

These findings give a good cross-section of where enterprises stand as they adapt to the expanding roles of software and cloud computing. Colocation gives institutions in verticals such as finance a leg up in controlling costs and improving reliability.

How data center backup helps with business continuity

Backing up data should be standard operating procedure for everyone by now. It’s not worth the risk of losing important documents, photos and videos, especially now that there are so many options for stashing these assets in redundant cloud storage services or on an external hard drive.

But while consumers may have only a few gigabytes of data to back up, organizations have a much more extensive undertaking when it comes to ensuring the safety of mission-critical information. With the ongoing shift to data centers, colocation and cloud computing, companies have to be conscious of how and where their massive data stores are kept.

Back up data center assets to guard against loss
Data center backup plans are increasingly important hedges against disaster. A Unisonius Consulting study of Turkish data centers noted that many sites are still vulnerable to threats such as:

  • Flooding: Some data centers are located in low-elevation locales or in cellars.
  • Power outage: Having only a single electricity supplier means that a facility cannot easily accommodate redundancy.
  • Earthquakes: Depending on their locations, data centers may be subject to seismic activity.

While the report examined data centers in Turkey, similar conditions exist around the world, endangering data that is not properly housed or protected by backup. Frequent outages underscore what could happen to data and business continuity in a worst-case scenario.

“There is always a data center outage in the U.S. It’s whether you have invested in backup services or not that will determine whether your services are affected,” Jordan Lowe, CEO at ServerCentral told DatacenterDynamics. “As a business you make an active decision whether to invest into backup sites. It is a lot of time, a lot of effort and a lot of money but, if you don’t want a service affected that’s a bullet you have to bite.”

Features to look for in data center backup
For organizations in sectors such as finance, it is worth having peace of mind when it comes to data backup and restoration. Here are some specific features that make for a good backup solution:

  • Secure, automatic multi-site backup: Having just one backup isn’t enough. It is better to use a service that automatically and securely (using encryption) relays backup data from one site to another.
  • Disaster recovery: It may be beneficial to migrate assets to a secure off-site location to mitigate risk from flooding, hurricanes and fires.
  • Easy administration: A Web-based, graphical interface protected by a virtual private network is valuable for managing backup policies and workflows from anywhere.

These capabilities contribute to the ongoing safety and integrity of data. Even in the event of a natural disaster or outage, key assets are preserved and business continuity is sustained.
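
To make the first feature above concrete, here is a minimal sketch, assuming Python with the cryptography package installed, of how an automated job might encrypt a backup archive and relay copies to more than one site. The source path, site paths and key handling are placeholders for illustration, not a production design.

```python
# Minimal sketch: archive a directory, encrypt it, and relay copies to
# multiple backup sites. Paths, key handling and scheduling are illustrative
# placeholders, not a production design.
import shutil
import tarfile
from datetime import datetime, timezone
from pathlib import Path

from cryptography.fernet import Fernet  # pip install cryptography

SOURCE_DIR = Path("/var/app/data")                          # data to protect (assumed)
BACKUP_SITES = [Path("/mnt/site-a"), Path("/mnt/site-b")]   # hypothetical backup sites
KEY_FILE = Path("/etc/backup/backup.key")                   # key kept apart from the backups


def load_or_create_key() -> bytes:
    """Reuse an existing key so older backups remain decryptable."""
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()
    KEY_FILE.parent.mkdir(parents=True, exist_ok=True)
    KEY_FILE.write_bytes(key)
    return key


def run_backup() -> None:
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    archive = Path(f"/tmp/backup-{stamp}.tar.gz")

    # 1. Archive and compress the source directory.
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SOURCE_DIR, arcname=SOURCE_DIR.name)

    # 2. Encrypt the archive so it is unreadable in transit and at rest.
    encrypted = archive.with_name(archive.name + ".enc")
    encrypted.write_bytes(Fernet(load_or_create_key()).encrypt(archive.read_bytes()))

    # 3. Relay the encrypted copy to every backup site for redundancy.
    for site in BACKUP_SITES:
        site.mkdir(parents=True, exist_ok=True)
        shutil.copy2(encrypted, site / encrypted.name)

    archive.unlink()  # drop the unencrypted intermediate archive


if __name__ == "__main__":
    run_backup()
```

In practice the relay step would target genuinely separate facilities (for example over SFTP or an object store) and the job would run on a schedule, but the encrypt-then-copy-everywhere pattern is the core of the multi-site backup feature described above.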

BYOD could evolve into bring your own everything

Bring your own device has certainly had its fans in recent years, with a growing number of organizations drawn to the collaborative and productive capabilities of smartphones and tablets. Gartner has estimated that by 2017 half of firms will not only allow but encourage employees to take advantage of company BYOD policies.

BYOD and the path toward bringing in anything and everything
Although BYOD is commonly associated with someone simply bringing in a personal device and connecting it to various company applications (as well as attendant security mechanisms), it has the potential to become, or at least influence, something much bigger. Some firms believe that the same ethos behind BYOD could lead to employee-supplied software, storage and services – basically a bring your own anything/everything state of affairs.

The Australian government document "Victorian Government ICT Strategy 2014 to 2015" chronicled the shift to BYOD by public sector agencies, citing core benefits such as:

  • Efficient working arrangements: Employees can work from anywhere, potentially driving down organizational costs for office space and power.
  • Higher productivity: Workers may feel more comfortable operating out of their homes and using their own devices.
  • Easier hardware management: Organizations don't have to buy new endpoints as often, if at all.

The implications of BYOD adoption are wide-reaching. With enterprises becoming increasingly less reliant on hardware, software and even facilities that they have paid for upfront, there's the allure of supporting more operations through cloud computing services that can be delivered to any device, anywhere. It's possible that workers could supplement their personal smartphones and tablets with productivity tools and online storage of their choosing.

"Underpinning BYOD, a range of policies and standards are required to ensure that security, interoperability and performance are not compromised. BYOD is a first step in a broader approach to employee [information and communications technology] productivity, leading to bringing your own productivity software and some storage – i.e. BYOE ('bring your own everything')," wrote the report authors.

The range of use cases for BYOD is certainly impressive. Eight years ago, Seton Academy in South Holland, Illinois, introduced student laptops preloaded with 70 percent of the required textbooks, and now it is transitioning to BYOD. More specifically, its educators plan to use cutting-edge hardware to support school-wide initiatives such as delivering books through the cloud and moving to electronic-only submission of papers.

Unified communications brings real benefits to education

As its name suggests, a unified communications suite gives the user a variety of business services, from instant messaging and email to voice calling and video conferencing, in a single convenient platform. On top of that, UC can serve as a natural path into cloud computing. Many of its key functionalities do not even require on-premises physical equipment since they can be run from remotely hosted servers.

UC solutions are ideal for organizations looking to consolidate their communications processes and save money while doing so. Let's look at some of the general benefits of UC, as well as how it has worked in practice for institutions in higher education.

Why organizations should consider replacing legacy systems with UC
The well-known limitations of legacy hardware and software – inflexibility, difficult maintenance and high total cost of ownership – can hold back businesses that are in the midst of rapid growth. Rather than deal with arduous upgrades of traditional phone systems, for instance, companies can adopt a UC platform that provides a broader set of communications services with a lower price tag.

Some of the most notable perks of UC include:

  • Better overall user experience: Thanks to the rise of mobile computing, individuals now expect immersive, intuitive interactions with all devices and applications. UC offerings may feature rich interfaces, plus they're usually compatible with smartphones and tablets, making it possible to work from anywhere.
  • Reduced equipment- and support-related costs: Investing in a UC system may not even require purchasing new hardware upfront. Services are delivered through an IP network, an appealing arrangement for cost-conscious small and midsize businesses. The UC provider may also handle support issues, freeing up the IT department to attend to other matters.
  • Flexibility and scalability: Backed by cloud storage services, UC solutions can be easily modified and extended as business requirements evolve and the user base grows.

University puts UC to work in modernizing practices
How does UC look in the real world? EdTech chronicled the Florida State University College of Medicine's adoption of a UC suite that included video conferencing and was supported by server virtualization.

Five years ago, the institution upgraded its network to support the added bandwidth requirements of video conferencing, and more recently it virtualized its servers to save rack space. The result has been a UC video conferencing platform that enables remote work and easy video viewing by students and guests.

"We now have more than 2,000 recordings that take up in excess of 2 terabytes of data," college media specialist Patrick Sparkman told EdTech. "While virtualizing our UC servers was part of the college's effort to modernize its server infrastructure, adding the [storage area network] gave us the storage capabilities we needed. And, through the server virtualization, we now have the redundancy we didn't have when we started."

The college is still moving toward full implementation of UC. Over the next few years, it hopes to continue making use of platforms such as Microsoft Lync and also integrate voicemail with email.

Clear policies and user education enable effective BYOD

Bring your own device has been all the rage in recent years, ever since iOS and Android smartphones and tablets entered the mainstream. Although modestly powered compared to a modern desktop PC, these devices have many built-in advantages over older hardware, including high-resolution, pixel-dense displays, 3G and 4G LTE cellular connectivity, and excellent portability.

Still, there are some key considerations to make when adopting a BYOD strategy. Are devices properly secured? How much will it cost to support each new endpoint? There are plenty of options out there for organizations seeking to make the most of BYOD and overcome common obstacles related to security and cost.

BYOD has solid momentum, raising the stakes for user education and sound implementation
One of the biggest benefits of BYOD is that it potentially frees the organization from having to shoulder the costs of additional hardware upgrades, since each user supplies his or her own device. On top of that, the freedom and flexibility conferred by BYOD can translate into new business opportunities, such as ones for sales teams that need to make presentations or access corporate data while on the road.

“BYOD strategies are the most radical change to the economics and the culture of client computing in business in decades,” stated Gartner analyst David Willis. “The benefits of BYOD include creating new mobile workforce opportunities, increasing employee satisfaction, and reducing or avoiding costs.”

Gartner predicts that by 2017, half of all businesses will require employees to adopt BYOD, in hopes of achieving these benefits and others. Setting up and enforcing BYOD policies could save companies a lot of money that would have otherwise gone toward building dedicated networks and procuring compatible hardware.

Solutions such as desktop virtualization have come to the fore alongside BYOD, making it increasingly possible to provide a consistent operating system experience to every device within the organization. Vendors such as Samsung have also created device-specific security suites designed to ease BYOD management.

Ensuring security and productivity with a BYOD strategy
It is important to have an actionable plan in place before implementing BYOD. As ZDNet’s Adrian Kingsley-Hughes pointed out, a seat-of-the-pants approach usually does not work since companies can run into trouble trying to ensure that, for instance, assets are not moved from the internal network to public-facing cloud storage services.

Instead, companies have to train employees on using BYOD-enabled hardware responsibly and regularly reinforce guidelines. More specifically, some considerations for a sensible BYOD policy might include:

  • Guidance on how to handle lost or stolen devices
  • Procedures for when a BYOD user leaves the organization
  • A listing of what company data, if any, is governed by regulations

Ultimately, BYOD is an exciting opportunity for organizations, but one that must be approached with care. Common sense and technical know-how can transform employee devices into valuable company assets.

“We are now entering a period of transformation,” Samsung Telecommunications America vice president David Lowe told FierceMobileIT. “It started out with clients being very reluctant to support mobility in their enterprise, trying to figure a way to keep it out. We are now in the transformation stage where enterprises are finally embracing it. That’s where the real innovation is going to come.”

Developing a strategic view is essential in the cloud storage era

The business world is moving toward what may soon be an all-cloud computing deployment model, a recent IT World Canada article noted. But as companies are increasingly prompted to shift their resources to colocation hosting and cloud storage solutions, they also must make sure to do so with a strategic vision. Although the cloud can offer valuable solutions to existing problems, IT departments will still want to rely on a range of specific technologies based on application needs.

"[P]ublic procurement processes will almost inevitably lead to more than one type of cloud being used which, for many reasons, is probably preferable but also more complex," IT World Canada contributor Don Sheppard wrote.

Getting into the cloud is often seen as such a key imperative for businesses that there is less consideration about the best way to do so, ITBusinessEdge contributor Arthur Cole explained. A lack of planning can hold back a cloud implementation from offering maximum benefit. In many cases, companies have multiple departments attempting separate moves to the cloud, which can limit the technology's inherent agility and integration benefits.

For some organizations, the public cloud may not be the ideal solution, and a custom virtualization deployment may make more sense, the article noted. To determine the best form of cloud storage for an individual business, it can be advantageous to work with a managed services provider to develop a custom plan. More often than not, a hybrid cloud offering that gives organizations the security and control advantages of on-premises solutions and the scalability and flexibility of the cloud will be the ideal route to take.

Government agencies show power, potential of VDI

Why use virtual desktop infrastructure? Many government agencies have found plenty of reasons to do so, including supporting their increasingly diverse device fleets and reducing overall power consumption.

What are the main benefits of VDI?
With VDI, the computing power needed to deliver a desktop environment moves from on-site PCs to servers housed in a data center. For public and private sector organizations alike, there are several key benefits to this arrangement, including:

  • Heightened device and data security – hardware running virtual desktops via VDI is connected to servers through an encrypted channel. That means it is safe to grant these devices access to core applications, such as enterprise resource planning and customer relationship management.
  • Streamlined system administration – IT personnel can worry less about having to implement complex mobile device management for smartphones and tablets.
  • Lower costs – VDI can be a viable alternative to building entirely new applications and services tailored specifically for mobile screens. The use of zero/thin clients – minimal hardware with little to no installed software – also drives down electricity consumption compared to desktop PCs (see the rough comparison after this list).
  • Support for mobility – teams working off-site can still access important assets by connecting to VDI.

U.S. government sees success with VDI implementations
Implementing virtualization and VDI has already produced real gains for the U.S. Department of Energy, as well as the Defense Intelligence Agency and the Navy. For instance, FCW reported that the DOE conducted a 500-seat VDI pilot program that delivered an excellent user experience and proved that VDI could also help trim expenses.

Going forward, VDI may evolve, moving off-premises and into cloud computing environments. More specifically, desktop-as-a-service may provide similar amenities to VDI, with the exception that infrastructure is managed by a cloud services provider rather than the organization itself.

Freeing IT of this responsibility could potentially streamline costs even further. However, there are still the core issues of ensuring that data is kept safe in the cloud and that an organization’s particular needs, especially for bandwidth, are being met.

VDI and bandwidth requirements
For organizations that adopt VDI, it’s critical to figure out right away what is expected from the VDI implementation. That way, they reduce the risk of setting up something that doesn’t align with their goals and ends up running over budget. In many cases, these issues manifest themselves as poor end-user experiences or insufficient bandwidth as a result of “boot storms” (many users connecting to VDI simultaneously).

“You also need to bear in mind that VDI almost always results in a change in usage patterns,” explained The Register’s Trevor Pott. “Whatever your usage patterns are today, expect that VDI deployments will ultimately see more people working remotely, be that telecommuting from home or pulling down their desktop at a hotel or business meeting. You need enough [wide area network] bandwidth to meet not just today’s needs, but tomorrow’s.”

Handling changes in bandwidth usage requires careful consideration of VDI storage and networking equipment such as switches and ports. Managers also have to learn more about what types of applications teams will be using via VDI. While word processors won’t really push server CPUs to their limits, any software that works with graphics and/or video will significantly alter calculations of what kinds of resources will be required to ensure an optimal VDI experience.
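
As a rough illustration of that sizing exercise, the sketch below estimates peak WAN bandwidth for a VDI deployment from a few assumptions: concurrent sessions by workload type, per-session bandwidth, a boot-storm surge multiplier and some headroom. All of the figures are placeholders meant to show the arithmetic, not vendor numbers.

```python
# Rough WAN bandwidth estimate for a VDI deployment. Per-session figures and
# the boot-storm multiplier are illustrative assumptions, not vendor numbers.

# Assumed concurrent remote sessions by workload type and their average
# bandwidth needs in kilobits per second.
WORKLOADS = {
    "office/word processing": {"sessions": 300, "kbps": 150},
    "graphics/video":         {"sessions": 50,  "kbps": 1_500},
}

BOOT_STORM_MULTIPLIER = 3.0   # assumed surge when many users log on at once
HEADROOM = 1.25               # assumed 25% headroom for growth and protocol overhead


def required_wan_mbps() -> float:
    steady_kbps = sum(w["sessions"] * w["kbps"] for w in WORKLOADS.values())
    peak_kbps = steady_kbps * BOOT_STORM_MULTIPLIER * HEADROOM
    return peak_kbps / 1_000  # convert to megabits per second


if __name__ == "__main__":
    print(f"Estimated peak WAN requirement: {required_wan_mbps():,.0f} Mbps")
```

Swapping in real session counts and measured per-session figures turns the same arithmetic into a defensible capacity plan.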

With the help of a managed services provider, companies can set up VDI that works for them. When VDI first became a hot topic several years ago, organizations were eager to use it as a catch-all solution, which led to many underperforming implementations. If aligned to specific goals, though, VDI is an effective, economical way to use the same applications anywhere.

How desktop virtualization enables better BYOD management

Virtual desktop infrastructure has gained traction recently as more organizations adopt and support mobility initiatives. Employees, empowered by bring-your-own-device policies, are increasingly capable of working from anywhere. Companies that are phasing out Windows XP PCs may even choose to replace these aging machines with mobile devices that provide more streamlined user experiences and offer a slew of modern applications.

VDI’s role in enabling mobility
Where does VDI fit into this picture? With device fleets becoming more fragmented, VDI can be a cost-effective means of providing critical access to core company assets such as enterprise resource planning, customer relationship management and line-of-business applications. It’s economical because it requires less investment, both in time and money, than crafting mobile experiences from scratch.

“Applications can also be enhanced for mobile access on the server end, without building a mobile development capability within your organization or hiring expensive outside help,” explained TechRepublic contributor Patrick Gray. “You could use your existing ERP developers to create a dozen screens and reports that have a limited number of fields, and space them more appropriately for mobile use, without writing a single line of mobile code.”

In practical terms, VDI can give sales teams access to full desktops so that they can make use of important tools such as CRM and PowerPoint while on the road. Even if the organization has a BYOD initiative in place, VDI simplifies common BYOD issues such as hardware management and security enforcement – each device communicates securely with the VDI servers via an encrypted session.

VDI and the growing uses of desktop virtualization
Moreover, VDI fits into many organizations’ growing interest in virtualization. Forrester Research’s David Johnson told InfoWorld that more than half of IT decision-makers cite desktop virtualization as a top priority for 2014. Although the market for PCs may be stagnant right now, there is still enough demand for virtual desktops that many companies have turned to VDI to deliver secure computing environments and access to applications on any device.

Certainly, there can be technical and financial challenges in implementing effective VDI, but these obstacles can be overcome with the expertise of an IT solutions and managed services provider. Organizations can also optimize VDI through the use of thin-client software to connect to VDI systems. Eventually, VDI implementations can pay for themselves by making workers more capable, regardless of where they are, while also streamlining mobile device management.

How can companies improve the disaster resilience of their data center infrastructure?

According to a recent benchmark survey by the Disaster Recovery Preparedness Council, nearly three quarters of companies worldwide are failing in terms of disaster readiness, struggling with downtime for specific critical applications or even entire data center environments. Close to 20 percent of companies reported losses of over $50,000 stemming from outages. Companies can protect themselves against this possibility by investing in resilient data center solutions from a colocation provider focused on business continuity.

"Reliability starts with high industry standards in a checklist of requirements: climate-controlled environments, intelligent security structure and state-of-the-art equipment, technologies and design," BizTimes.com contributor Kevin Knuese wrote in a recent article.

He noted that companies should look for data center solutions with redundant networking and power supplies, as well as redundant cooling systems and all-around state-of-the-art technology. Additional data center features such as 24/7 monitoring and physical security safeguards meant to withstand both break-ins and natural disasters such as floods and earthquakes are important as well. A hosting provider based in the Midwest can be particularly reliable due to the reduced likelihood of certain natural disasters like earthquakes, hurricanes and mudslides that are more common on the coasts.

A provider that offers backup and business continuity services is also important, Knuese wrote. Executives can sometimes be skeptical of "disaster recovery," seeing it as an alarmist term and frustrating cost driver, according to industry expert Steve Kahan. However, the argument for a reliable data center and backup solution is more clear-cut, as such technology eliminates many IT headaches. As a result, a colocation provider with business continuity services can be key for maintaining brand credibility from an IT standpoint.

"Some audiences are more responsive when the conversation is focused on the crucial role that IT plays in ensuring 'business continuity' or the operational costs triggered by an 'extended outage,'" Kahan wrote for DRBenchmark.org. "Here's one more suggestion: think of disaster preparedness as 'an investment in brand security,' a way to protect your company's reputation."

Moving toward the virtual data center

Virtualization – the process of abstracting hardware functions to a software level – is one of the signature advancements of modern computing, allowing companies to consolidate their server footprints and increase the flexibility of their infrastructure. With virtualization, businesses can quickly create new virtual servers and move workloads from one physical location to another on a software level. As server virtualization becomes increasingly standard in the data center, companies are beginning to look at other forms of virtualization that can also be applied to increase flexibility, such as storage virtualization and network virtualization. With virtualization in all its forms becoming more important for managing a data center, companies are turning to managed services partners to help.
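
As a small illustration of what moving workloads "on a software level" looks like in practice, the sketch below uses the libvirt Python bindings (assuming a KVM/libvirt environment with the libvirt-python package installed) to list the virtual machines on one host and live-migrate one of them to a second host. The host URIs and domain name are hypothetical, and a real migration also depends on compatible storage and networking on both hosts.

```python
# Sketch: list virtual machines on one host and live-migrate one of them to
# another host using the libvirt Python bindings. URIs and the domain name
# are hypothetical placeholders.
import libvirt

SOURCE_URI = "qemu:///system"                            # local KVM host (assumed)
DEST_URI = "qemu+ssh://backup-host.example.com/system"   # hypothetical second host
DOMAIN_NAME = "app-server-01"                            # hypothetical VM name


def main() -> None:
    src = libvirt.open(SOURCE_URI)
    try:
        # Inventory of virtual machines defined on this host.
        for dom in src.listAllDomains():
            state = "running" if dom.isActive() else "stopped"
            print(f"{dom.name():<20} {state}")

        # Live-migrate one workload to a different physical host.
        dom = src.lookupByName(DOMAIN_NAME)
        dest = libvirt.open(DEST_URI)
        try:
            dom.migrate(dest, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
            print(f"{DOMAIN_NAME} migrated to {DEST_URI}")
        finally:
            dest.close()
    finally:
        src.close()


if __name__ == "__main__":
    main()
```

The same pattern, wrapped in orchestration and management tooling, is what lets a virtualized data center rebalance workloads without anyone touching physical hardware.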

InformationWeek’s 2013 Virtualization Management Survey found that 72 percent of companies reported extensive use of server virtualization, and just 4 percent had no plans for use. In comparison, only 22 percent reported extensive use of storage virtualization, with 28 percent saying they had no plans for use, and a mere 11 percent reported extensive network virtualization, with 44 percent saying they had no plans for use. The main drivers for virtualization included operational flexibility and agility (56 percent) and business continuity (55 percent).

“Undoubtedly, a fully virtualized data operation offers many advantages,” ITBusinessEdge’s Arthur Cole wrote in a recent column. “Aside from the lower capital and operating costs, it will be much easier to support mobile communications, collaboration, social networking and many of the other trends that are driving the knowledge workforce to new levels of productivity.”

The evolving virtual data center
At the same time, Cole cautioned, much of the virtual technology that extends beyond server virtualization is still in its early phases. As a result, companies may encounter challenges as they look to enjoy the management benefits of abstracting elements of their data centers. A trusted data center partner can help businesses evaluate and implement emerging technologies, and even oversee transitions such as server virtualization and consolidation.

The standard for what counts as a virtualized data center is set to evolve in the coming years as more physical components are virtualized, and businesses will want to be at the cutting edge of whatever emerges. By outsourcing some infrastructure management tasks to a trusted third-party provider, they can ensure they are adopting these innovations even if they do not have the in-house technical expertise or capital to make the changes. To keep close tabs on the move toward the virtual data center, a managed services and IT consulting partnership is essential.