Adopting the right cloud services portfolio

Attracted by the promise of leaner operations, reduced costs and simplified management, companies are switching from on-premises data centers to colocation providers and cloud storage solutions. As this transition occurs, however, it is important for businesses to ensure they are adopting third-party solutions that actually fit their specific needs and advance their stated goals of leaner operations.

In a recent interview with Network World, pharmaceutical IT executive Nathan McBride outlined the ways his company has shifted toward a more efficient IT model by abandoning its own data centers in favor of outsourced solutions. One of the factors he pointed to in the decision-making process for creating a leaner business was diversifying the set of cloud service and managed services vendors his company used.

“[F]rom a cloud vendor approach, we’re constantly looking at the market and saying, ‘Here’s our business need, and who is the best in the industry in this particular niche?’” he told Network World. “Our loyalty only extends to those who innovate the fastest.”

With a managed services model, companies gain the ability to tie their IT to solutions providers who are implementing state-of-the-art technologies rather than waiting for the slow refresh cycles endemic to on-premises deployments. At the same time, a cloud provider can be an inhibitor as well, industry expert Bill Kleyman noted in a recent post for Data Center Knowledge. Many cloud services vendors simply offer companies standardized, pre-packaged products and show little interest in clients beyond signing them up.

To actually ensure they’re not only taking advantage of the latest technology but also the technology that’s best suited for their specific needs, businesses can benefit from adopting a cloud services portfolio that includes smaller vendors dedicated to deploying custom solutions. Moving away from the on-premises data center carries many advantages, but businesses should be thoughtful about doing it in the right way with managed solutions. In particular, choosing a managed solutions provider that is technology agnostic is key, as this breadth of support can help organizations more seamlessly transition to the cloud at their own pace and address unique business challenges.

Overcoming common obstacles to VDI implementation

Implementing virtual desktop infrastructure is a big change for any organization. It almost always leads to significant shifts in how the network is used, and VDI can strain the storage and bandwidth resources in company data centers. If an enterprise is unprepared, its VDI efforts could get off to a rocky start.

The VDI 'boot storm' and other issues to keep in mind
Last year, ZDNet's Steven Vaughan-Nichols examined some of the common obstacles to successfully setting up VDI. These include:

  • Insufficient bandwidth: The key advantage of VDI is that it enables everyone to work more easily from anywhere, through the delivery of a consistent desktop. But once employees are working outside the office, there's no guarantee that they'll have Internet connection speeds that are suitable for an optimal VDI experience.
  • Bring-your-own-device security: In many scenarios, it makes sense for VDI users to use a virtual private network, which is not always easily accomplished if they're connecting from, say, a public Wi-Fi hotspot.
  • License and storage management: VDI licensing can become really complex on Microsoft Windows. On top of that, accommodating user habits can require large amounts of storage, while "boot storms" (everyone connecting to VDI within a short timeframe) push servers to capacity.

The latter phenomenon is particularly noteworthy, since it not only compromises the end user's ability to be productive through VDI, but also reveals which parts of the IT infrastructure are insufficient, or at least unsuited, to VDI. With VDI now a popular method for facilitating corporate mobile device usage, the boot storm can make it seem out of place alongside native mobile applications.

"The issue with virtual desktops is the so-called 'boot storm" when everyone fires up their computers at 8 AM. As any PC user knows, a hard drive running flat out at 150 IOPS takes a couple of minutes to complete boot," wrote Jim O'Reilly for Network Computing. "A quicker boot time will be important for VDI, especially with most users having instant-on experience with tablets and mobile phones. These are becoming the endpoint for the virtual desktops to be displayed, and a long period to boot up every time the desktop is accessed isn't going to fly."

These problems are not intractable, though. Managed services providers can assist with desktop virtualization and ensure that organizations get the levels of storage, licensing and bandwidth that they need to make VDI work for them.

Data storage and VDI
O'Reilly also looked at some data storage considerations for VDI deployments. For example, replacing some or all traditional hard disk drives with solid-state drives can provide the performance boost required for first-rate VDI. While SSDs are more expensive than HDDs on a per-GB basis, they can support many more VDI instances.
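The trade-off becomes clearer when the comparison is framed per desktop rather than per gigabyte. The drive prices, capacities and per-desktop I/O figure below are hypothetical, chosen only to illustrate why $/GB is the wrong metric for VDI storage.

```python
# Hypothetical drive figures: the point is the shape of the
# comparison, not the specific numbers.
hdd = {"price": 100.0, "capacity_gb": 2000, "iops": 150}
ssd = {"price": 200.0, "capacity_gb": 1000, "iops": 50_000}
IOPS_PER_DESKTOP = 25  # assumed steady-state I/O load per VDI instance

for name, d in (("HDD", hdd), ("SSD", ssd)):
    print(f"{name}: ${d['price'] / d['capacity_gb']:.2f}/GB, "
          f"supports ~{d['iops'] // IOPS_PER_DESKTOP} desktops per drive")
```

Under these assumptions the SSD costs four times as much per gigabyte but supports hundreds of times more concurrent desktops, which is the dynamic O'Reilly's observation points to.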

Organizations such as the Bank of Stockton in California have shifted their storage strategies to respond to surging VDI traffic. The bank used a combination of DRAM, SSDs and flash memory, as well as virtualization and decompression, to ensure that its appliances could keep pace with VDI usage. Implementing VDI requires new approaches to hardware, security and device management, but it is possible to get it right with help from vendors and IT services providers.

Government agencies show power, potential of VDI

Why use virtual desktop infrastructure? Many government agencies have found plenty of reasons to do so, including supporting their increasingly diverse device fleets and reducing overall power consumption.

What are the main benefits of VDI?
With VDI, the computing power needed to deliver a desktop environment moves from on-site PCs to servers housed in a data center. For public and private sector organizations alike, there are several key benefits to this arrangement, including:

  • Heightened device and data security – hardware running virtual desktops via VDI is connected to servers through an encrypted channel. That means it is safe to grant these devices access to core applications, such as enterprise resource planning and client relationship management.
  • Streamlined system administration – IT personnel can worry less about having to implement complex mobile device management for smartphones and tablets.
  • Lower costs – VDI can be a viable alternative to building entirely new applications and services tailored specifically for mobile screens. The use of zero/thin clients – minimal hardware with little to no installed software – also drives down electricity consumption compared to desktop PCs.
  • Support for mobility – teams working off-site can still access important assets by connecting to VDI.
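The electricity claim in the "Lower costs" point above can be sanity-checked with simple per-seat arithmetic. The wattages, usage hours and electricity rate below are assumptions for illustration, not measured figures.

```python
# Illustrative annual electricity comparison: desktop PC vs. thin client.
PC_WATTS, THIN_WATTS = 150, 15   # assumed average power draw
HOURS_PER_YEAR = 2000            # ~8 hours/day, 250 workdays
RATE_PER_KWH = 0.12              # assumed electricity rate, $/kWh

def annual_cost(watts):
    """Annual electricity cost for a device with the given average draw."""
    return watts * HOURS_PER_YEAR / 1000 * RATE_PER_KWH

savings = annual_cost(PC_WATTS) - annual_cost(THIN_WATTS)
print(f"Per-seat annual electricity savings: ${savings:.2f}")
```

Small per-seat savings compound quickly: under these assumptions, a 1,000-seat agency saves tens of thousands of dollars a year on endpoint power alone.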

U.S. government sees success with VDI implementations
Implementing virtualization and VDI has already produced real gains for the U.S. Department of Energy, as well as the Defense Intelligence Agency and the Navy. For instance, FCW reported that the DOE conducted a 500-seat VDI pilot program that delivered an excellent user experience and proved that VDI could also help trim expenses.

Going forward, VDI may evolve, moving off-premises and into cloud computing environments. More specifically, desktop-as-a-service may provide similar amenities to VDI, with the exception that infrastructure is managed by a cloud services provider rather than the organization itself.

Freeing IT of this responsibility could streamline costs even further. However, there are still the core issues of ensuring that data is kept safe in the cloud and that an organization’s particular needs, especially for bandwidth, are being met.

VDI and bandwidth requirements
For organizations that adopt VDI, it’s critical to figure out right away what is expected from the VDI implementation. That way, they reduce the risk of setting up something that doesn’t align with their goals and ends up running over budget. In many cases, these issues manifest themselves as poor end-user experiences or insufficient bandwidth as a result of “boot storms” (many users connecting to VDI simultaneously).

“You also need to bear in mind that VDI almost always results in a change in usage patterns,” explained The Register’s Trevor Pott. “Whatever your usage patterns are today, expect that VDI deployments will ultimately see more people working remotely, be that telecommuting from home or pulling down their desktop at a hotel or business meeting. You need enough [wide area network] bandwidth to meet not just today’s needs, but tomorrow’s.”

Handling changes in bandwidth usage requires careful consideration of VDI storage and networking equipment such as switches and ports. Managers also have to learn more about what types of applications teams will be using via VDI. While word processors won’t push server CPUs to their limits, any software that works with graphics and/or video will significantly alter calculations of what resources are required to ensure an optimal VDI experience.
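Pott's advice to size for tomorrow's remote usage, not today's, can be turned into a rough planning formula. The per-session bitrates, peak concurrency ratio and growth headroom below are all assumed values; real figures depend on the display protocol and workload mix.

```python
# Rough WAN sizing for remote VDI sessions (all figures are assumptions).
SESSION_KBPS = {"office": 150, "multimedia": 1000}  # assumed per-session averages
CONCURRENCY = 0.6   # assumed fraction of remote users connected at peak
HEADROOM = 1.3      # growth buffer for "tomorrow's needs"

def wan_mbps(office_users, multimedia_users):
    """Estimated WAN bandwidth (Mbps) needed at peak, with headroom."""
    peak_kbps = CONCURRENCY * (office_users * SESSION_KBPS["office"]
                               + multimedia_users * SESSION_KBPS["multimedia"])
    return peak_kbps * HEADROOM / 1000

print(f"Estimated WAN requirement: {wan_mbps(200, 40):.1f} Mbps")
```

The takeaway matches the point in the paragraph above: a handful of graphics-heavy sessions can consume as much bandwidth as an entire floor of office-productivity users, so the application mix matters more than the headcount.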

With the help of a managed services provider, companies can set up VDI that works for them. When VDI first became a hot topic several years ago, organizations were eager to use it as a catch-all solution, which led to many underperforming implementations. If aligned to specific goals, though, VDI is an effective, economical way to use the same applications anywhere.

Approach BYOD with a realistic mindset

Companies are increasingly embracing bring your own device programs, but BYOD is also introducing new security risks into the business. As a result, having a clear plan for BYOD deployment that acknowledges the realities of the way users behave is essential for avoiding a data breach or other security incident. To smoothly manage a BYOD rollout, companies can benefit from working with a managed services provider and adopting sanctioned hosted collaboration solutions.

At the recent CITE Conference in San Francisco, Cisco executive Brett Belding and Sanofi executive Brian Katz explained that the security problem of BYOD is a simple one: No matter what restrictions are placed on them, users will find a way to access the services they want for cloud storage, collaboration and email on any device with a screen. They said that users are going to be drawn to the services they are familiar with, such as Evernote or Apple's iCloud, CITEworld reported.

Short-term benefits but long-term risk
Using ad hoc or consumer solutions to store and share data gives employees short-term benefits but can create long-term exposure risks, Alex Gorbansky, CEO of document management company Docurated, told Business News Daily. In many cases, employees are bypassing IT and adopting consumer solutions, which then can linger in the cloud without corporate knowledge after those employees have left. The solution to these management issues is for IT to provide sanctioned solutions.

"Employees need to work with IT to adopt a consumer-grade experience with enterprise-grade security," tech executive David Lavenda told Business News Daily. "Without IT buy-in, end users will continue to choose between engaging in risky file sharing behavior with consumer-centric alternatives, or taking a productivity hit through clunky legacy enterprise file sharing systems."

Working with a managed services provider, companies can craft a custom BYOD deployment plan that leverages sanctioned cloud storage and collaboration tools, avoiding the risk inherent to BYOD that employees will head off on their own and deploy risky consumer solutions. A third-party vendor experienced in BYOD strategy and cloud systems can help businesses of any size navigate this type of rollout and ensure employees buy into it. With guidance for employees, achieving BYOD success is more likely, Katz said, according to CITEworld.

"Nobody follows a standard, but everybody follows a recommendation," he explained.

Heading into a BYOD deployment with a realistic mindset and an understanding of how employees will behave is essential, and a managed services partner can help.

Creating BYOD value while minimizing risk

Bring Your Own Device programs are growing in popularity, and, as they evolve, the techniques for managing them are evolving as well. Introducing BYOD programs into the workplace comes with obvious security risks, as more connected devices present more vectors for malware or network breaches, but there's no avoiding the reality that smartphones and tablets are here to stay. Nonetheless, companies need to be deliberate in the way they deploy BYOD.

In many cases, employees are either unaware of the security risks their device use can introduce, or they simply don't care. According to a recent survey by identity management software firm Centrify, 15 percent of employees believe they have minimal to no responsibility to protect data stored on their personal devices. Additionally, 43 percent said they have accessed sensitive corporate data while connected to an unsecured public network.

Traditionally, the response to this type of threat has been to limit employees' device use with restrictive policies and enterprise mobility management tools, a recent TechTarget article noted. However, such limitations can easily restrict the benefits BYOD offers in the first place. As a result, the preferred approach is trending toward implementing better controls on the network and storage levels, giving users more choice of device while taking precautions like protecting their data via hosting it remotely in a secure cloud environment. The ideal security approach will vary by organization, making it useful to work with a managed services provider specializing in BYOD to develop a custom solution.

Embracing virtual desktop infrastructure through managed services

Virtual desktop infrastructure was recently named as the No. 3 highest "low-risk/high-reward" technology in Computer Economics' "Technology Trends 2014" study. Given the predictable cost structure of the technology, as well as its maturity, companies have a strong incentive to embrace it. And the incentive is even stronger when VDI is delivered through a managed services cloud provider, cutting out the capital investments that can otherwise be an impediment.

"VDI can ease desktop support and shrink energy consumption, but the advantages come at a cost," FCW contributor John Moore wrote in a recent article. "Organizations might need to invest in data center infrastructure – servers, storage, software and networking – to make the technology work. They will also need to train or hire employees to maintain the virtual environment."

Given these up-front costs, many organizations are moving to a cloud-based VDI deployment model, Moore noted. By working with a third-party managed services partner, companies can not only outsource capital investments, they can simplify management and access state-of-the-art infrastructure subject to constant refresh cycles. A VDI solution delivered through a managed service provider's data center can dramatically improve the effectiveness and cost efficiency of the technology – already a remarkably effective tool.

Actually meet enterprise security needs with managed services solutions

Today’s businesses face a wide range of cybersecurity threats. While many are confident in their approach to protecting sensitive information, the reality is that security solutions remain largely inadequate. According to a recent study from the Ponemon Institute, managing security investments and policies is a C-suite concern at 66 percent of companies. However, the amount of information that is actually passed to the C-suite to make informed decisions is “disturbingly incomplete,” with IT staff actively omitting negatives in more than half of cases.

“What is most concerning is that it would seem security in many organizations is based on perception and ‘gut feel,’ versus hard data,” said study author Larry Ponemon. “The stakeholders with the highest responsibility seem to be the least informed: a view that is amplified externally.”

For businesses, this may mean working with managed service providers that actually have an interest in meeting security needs rather than simply attracting as many clients as possible. A recent TechRadar article noted that concerns over staying on top of security needs appear to be driving many companies to avoid large cloud providers in favor of smaller managed services and colocation firms, where businesses can be aware of where specifically their data resides even as they leverage the benefits of virtualization and cloud infrastructure.

Moving toward the virtual data center

Virtualization – the process of abstracting hardware functions to a software level – is one of the signature advancements of modern computing, allowing companies to consolidate their server footprints and increase the flexibility of their infrastructure. With virtualization, businesses can quickly create new virtual servers and move workloads from one physical location to another on a software level. As server virtualization becomes increasingly standard in the data center, companies are beginning to look at other forms of virtualization that can also be applied to increase flexibility, such as storage virtualization and network virtualization. With virtualization in all its forms becoming more important for managing a data center, companies are turning to managed services partners to help.

InformationWeek’s 2013 Virtualization Management Survey found that 72 percent of companies reported extensive use of server virtualization, and just 4 percent had no plans for use. In comparison, only 22 percent reported extensive use of storage virtualization, with 28 percent saying they had no plans for use, and a mere 11 percent reported extensive network virtualization, with 44 percent saying they had no plans for use. The main drivers for virtualization included operational flexibility and agility (56 percent) and business continuity (55 percent).

“Undoubtedly, a fully virtualized data operation offers many advantages,” ITBusinessEdge’s Arthur Cole wrote in a recent column. “Aside from the lower capital and operating costs, it will be much easier to support mobile communications, collaboration, social networking and many of the other trends that are driving the knowledge workforce to new levels of productivity.”

The evolving virtual data center
At the same time, Cole cautioned, much of the virtual technology that extends beyond server virtualization is still in its early phases. As a result, companies may encounter challenges as they look to enjoy the management benefits of abstracting elements of their data centers. A trusted data center partner can help businesses evaluate and implement emerging technologies, and even oversee transitions such as server virtualization and consolidation.

The standard for what counts as a virtualized data center is set to evolve in the coming years as more physical components are virtualized, and businesses will want to be at the cutting edge of whatever emerges. By outsourcing some infrastructure management tasks to a trusted third-party provider, they can ensure they are adopting these innovations even if they do not have the in-house technical expertise or capital to make the changes. To keep close tabs on the move toward the virtual data center, a managed services and IT consulting partnership is essential.

Recognize the business advantages of data colocation

For many companies, it can be tempting to approach data storage in a fairly insular manner, keeping files on-premises so they can be easily accessed and IT can maintain absolute control. But businesses are increasingly jettisoning their expensive storage and server infrastructure in favor of switching to an outsourced colocation model. By making IT a fixed operational expenditure rather than a massive capital expenditure, companies can remain more flexible. Colocation data centers also provide numerous IT benefits in terms of disaster resilience and collaboration.

"Hosting your own infrastructure can require significant capital investment in real estate," IT executive James Carnie told ComputerWeekly in a recent feature about the merits of different spending models.

In addition to real estate costs, companies investing in their own infrastructure face massive hardware expenses, and they must accurately anticipate future expansion to know how much equipment to buy during purchase cycles. Ownership also means paying for ongoing maintenance, which can lead to unexpected costs as issues arise. In contrast, a colocation model shifts businesses to a planning approach built around fixed monthly costs, IT executive Akshay Kalle wrote in a recent column for the Globe & Mail.

"Managed services models reduce the considerable costs of storage, upgrades, data recovery, converting big capital outlays and unpredictable maintenance costs in time and materials, into predictable monthly fees with clear expectations and guarantees," Kalle explained.

Colocation also simplifies the challenge of dealing with disasters by moving data off-site to a resilient facility, and, by centralizing business information, it enables easier audits, Kalle added. Centralization and virtualization also foster collaboration: By moving data to shared resources in a data center rather than letting it languish on desktops, companies can simplify file sharing and other collaborative processes among their employees.

Make sure disaster recovery is done the right way

The threat of natural disasters or other business interruptions such as power outages and viruses means that companies need robust backup and disaster recovery solutions for their data environment. Often, however, backup and disaster recovery services are conflated, and businesses end up with solutions that don't necessarily offer all the functionality they actually need. To ensure the enterprise IT environment is fully recoverable in the wake of a disaster, companies can benefit from working with a managed services provider to develop a customized plan that fits their needs.

One common misconception about disaster recovery is that it offers nothing appreciably different from a backup or cloud storage solution, a recent MSP Mentor article explained. Most companies already have some form of backup solution, perhaps hosted in the cloud, which may make a separate recovery service seem superfluous.

However, relying on backup storage alone doesn't account for the need to get key applications running again, and backups can quickly become expensive or difficult to manage as the volume of data increases, Sundar Raman, CTO of Perpetuuiti Technosoft Services, noted in a recent interview with CIOL. This complexity can make shortcuts even more tempting.

"CIOs tasked with addressing business continuity (BC) and disaster recovery issues are keen to achieve quick wins, and the 'tick box' audit approach, which tries to copy successful strategies used elsewhere, is often adopted without consideration of the suitability," Raman explained.

To combat this problem, companies can benefit from working with a dedicated managed service provider to craft a customized solution that fits their specific needs. By determining the best plan to meet recovery time objectives for various applications and data while also working within a manageable budget, companies can establish a disaster recovery plan that gives them more than basic backup without overextending themselves.
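The matching of recovery time objectives to budget described above can be sketched as a simple tiering exercise: assign each application the cheapest recovery strategy that still meets its RTO. The strategy names, recovery times and relative costs below are hypothetical placeholders, not a real provider's catalog.

```python
# Hypothetical RTO tiering: pick the cheapest strategy that
# still recovers within each application's RTO.
STRATEGIES = [
    # (name, recovery time in hours, relative monthly cost)
    ("hot standby",      0.25, 100),
    ("warm replication", 4,     40),
    ("backup restore",   24,    10),
]

def cheapest_meeting_rto(rto_hours):
    """Return the lowest-cost strategy whose recovery time fits the RTO."""
    viable = [s for s in STRATEGIES if s[1] <= rto_hours]
    # Fall back to the fastest option if nothing meets a very tight RTO.
    return min(viable, key=lambda s: s[2])[0] if viable else STRATEGIES[0][0]

for app, rto in {"ERP": 2, "email": 8, "archive": 48}.items():
    print(f"{app} (RTO {rto}h): {cheapest_meeting_rto(rto)}")
```

Tiering like this is what keeps a recovery plan "more than basic backup without overextending": only the applications that genuinely need rapid failover pay for it.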