Cloud or on-premises? Finding the right UC solution

"Cloud or on-premises?" has become a defining question for IT departments. With the rise of high-speed Internet and managed services, many of the routine chores of IT – everything from storing application data to deploying new servers – can now be easily performed at a colocation facility or via a cloud service provider's resources.

Leading consumer services such as Netflix and Airbnb have gone exclusively cloud to reduce infrastructure management and reach users at scale. Still, despite the cloud's rapid ascent, on-premises IT is alive and well, and in many scenarios it's the right choice.

Unified communications: cloud or on-premises?
Take unified communications, which can be deployed either way. Having the entire UC system hosted by a third party confers many advantages, including:

  • Lower cost of ownership: The platform is managed by the provider and requires no capital expenditures.
  • Compatibility with multiple devices: Hosted solutions can be accessed by smartphones and tablets in addition to in-office PCs.
  • Intuitive operating systems: It's easy to customize settings for each device and tweak the OS to company requirements.

These benefits can ease a small or mid-size business's transition from legacy telephony to comprehensive UC, with data and voice efficiently flowing over the same network. However, companies with high call volumes may need something more. To that end, on-premises UC provides the performance and reliability needed to deal with numerous concurrent conversations.

Never miss a call
Such a UC system can be designed and installed by an experienced provider to ensure its integrity. When put to the test, it can send calls to different devices, adequately distributing the incoming volume and relieving pressure on agents. For the business at large, this means quicker, yet higher-quality, responses to sales queries and support requests.
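
The load-distribution idea described above can be sketched as a simple round-robin router. This is purely illustrative, assuming a toy `CallRouter` class of our own invention rather than any real UC vendor's API:

```python
from collections import deque

class CallRouter:
    """Toy round-robin call distributor (illustrative sketch, not a real UC API)."""

    def __init__(self, agents):
        self.agents = deque(agents)  # rotating queue spreads load evenly

    def route(self, call_id):
        agent = self.agents[0]
        self.agents.rotate(-1)       # the next call goes to the next agent
        return f"call {call_id} -> {agent}"

router = CallRouter(["desk-phone-1", "mobile-app-2", "twinned-device-3"])
print(router.route(101))  # call 101 -> desk-phone-1
print(router.route(102))  # call 102 -> mobile-app-2
```

Real systems layer skills-based routing, queue timeouts and failover on top of this, but the core principle is the same: no single agent or device absorbs the full incoming volume.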

"The implementation of UC also ensures that employees need never miss a call again, as incoming calls can be routed to desk phones, twinned devices or apps," stated Jon Nowell, head of product management at TalkTalk Business, according to Information Age. "Reliability and dependability are major aspects of a business's reputation and are equally important to companies of all sizes."

Accordingly, the CAPEX of an on-premises system more than pays for itself through a robust communications infrastructure. As always, though, the "right" choice of cloud or on-premises depends on the organization. Some will want UC installed on-site, but, as indicated by a recent MarketsandMarkets report predicting that the UC-as-a-service market will top $23 billion by 2019, cloud solutions are also on the upswing.

Determining bandwidth requirements in the data center

How much bandwidth does a data center really need? It depends on how many workloads and virtual machines are in regular operation, as well as what the facility is designed to support. For example, a data center providing resources to a public cloud requires much more bandwidth than one that is simply powering internal systems and operations shielded by the company firewall. The increasing uptake of remote data centers and colocation arrangements, in tandem with server virtualization, has added to organizations' bandwidth considerations.

How virtualization complicates bandwidth requirements
Server and desktop virtualization have made companies less reliant on physical infrastructure and the specific sites that house it. Here's how they work:

  • With desktop virtualization, or VDI, desktop operating systems are hosted centrally and delivered to endpoints (even aging ones), simplifying management of both software and hardware while reducing costs.
  • Server virtualization involves a single physical server being turned into multiple virtual devices. Each instance is isolated and the end user cannot usually see the technical details of the underlying infrastructure.

By getting more out of IT assets via virtualization, companies have reshaped IT operations. More specifically, they have spread out their infrastructure across multiple sites and put themselves in position to move toward cloud computing.

With increased reliance on virtualization, organizations have looked to ensure that remote facilities receive the bandwidth needed to provide software, instances and data to users. However, liabilities still go overlooked, jeopardizing reliability – especially when data centers are too far apart from each other.

Ensuring low latency is just one piece of the data center optimization puzzle, though. Sufficient bandwidth must also be supplied to support the organization's particular workloads. In the past, Microsoft has advised Exchange users to think beyond round trip latency.

"[R]ound trip latency requirements may not be the most stringent network bandwidth and latency requirement for a multi-data center configuration," advised Microsoft's Exchange Server 2013 documentation. "You must evaluate the total network load, which includes client access, Active Directory, transport, continuous replication and other application traffic, to determine the necessary network requirements for your environment."

Knowing how much bandwidth is needed
Figuring out bandwidth requirements is a unique exercise for each enterprise. In a blog post, data center networking expert Ivan Pepelnjak looked at the nitty-gritty of assessing bandwidth-related needs, homing in on some of the problems that reveal a need to rethink how bandwidth is allocated and utilized.
These issues include:

  • Over-reliance on slow legacy equipment
  • Oversubscription to services
  • Miscalculation of how much traffic each virtual machine generates 

In addition, data center operators sometimes overlook bottlenecks such as virtual machines interacting slowly with storage. If they have to frequently access data stored on an HDD, for example, quality of service may degrade. Networks may require extra bandwidth in order to avoid data transfer hiccups.
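
A back-of-the-envelope estimate can catch the per-VM traffic miscalculation mentioned above before it bites. The sketch below is a minimal model with made-up figures; the peak factor and headroom values are assumptions you would replace with your own measurements:

```python
def required_bandwidth_gbps(vm_count, avg_mbps_per_vm, peak_factor=3.0, headroom=1.25):
    """Rough aggregate bandwidth estimate for a virtualized data center link.

    avg_mbps_per_vm: measured average traffic per VM (assumes you have this data)
    peak_factor:     how much busier the peak hour is than the average
    headroom:        margin for replication, storage and management traffic
    """
    peak_mbps = vm_count * avg_mbps_per_vm * peak_factor
    return peak_mbps * headroom / 1000.0  # convert Mbps to Gbps

# 400 VMs averaging 5 Mbps each, 3x peak, 25% headroom -> 7.5 Gbps
print(round(required_bandwidth_gbps(400, 5), 1))  # 7.5
```

The point is not the exact numbers but the shape of the exercise: average per-VM traffic alone understates what the link must carry once peaks and ancillary traffic are factored in.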

HealthKit, healthcare and managing BYOD

As smartphones become faster and increasingly capable of running sophisticated applications and services, health care organizations are faced with a dilemma. Do they allow doctors, nurses and staff to participate in bring-your-own-device policies and potentially unlock productivity gains that enable higher-quality care? Or do they hold back out of legitimate concerns about data security and compliance with regulations?

The growing interest of technology firms in health care tracking only complicates the situation. Individuals may now use devices such as wristbands, in addition to smartphones, to record and share health information, making it critical for providers to keep tabs on BYOD activity to ensure compliance.

HealthKit and the larger issue of sharing health information
At this year's Worldwide Developers Conference, Apple announced HealthKit, a platform built into iOS that underscores how healthcare on mobile devices is rapidly evolving and sparking questions about how sensitive data is handled. HealthKit isn't a discrete solution but a system of APIs that would allow, say, an application that tracks steps to share its information with medical software that could provide actionable advice.

Major health care organizations are already on board. The Mayo Clinic created an application that monitors vital signs and then relays anomalous readings to a physician. Given the already considerable presence of mobile applications in health care, HealthKit could give hospital and clinic staff additional tools for providing efficient care.

At the same time, HealthKit turns any iOS device into a potential compliance pain point. Data that is stored on an iPhone, for example, would not fall under the purview of the Health Insurance Portability and Accountability Act, but if shared with a provider or one of their business associates, HIPAA would likely apply. Stakeholders will need time to adjust to the nuances of how healthcare applications interact with each other in the HealthKit ecosystem.

"The question would be whether the app is being used by a doctor or other health care provider. For example, is it on their tablet or smartphone?" asked Adam Greene of Davis Wright Tremaine LLP, according to Network World. "Where the app is used by a patient, even to share information with a doctor, it generally will not fall under HIPAA. Where the app is used on behalf of a healthcare provider or health plan, it generally would fall under HIPAA."

Tracking and securing privileged health information
HealthKit is just one platform on a single OS, but it is part of a broader shift in data control, away from centralized IT departments and organizations and toward end users. For healthcare, this change is particularly challenging since providers have to ensure that the same compliance measures are enforced, even as BYOD and cloud storage services become fixtures of everyday operation.

A recent Ponemon Institute survey of more than 1,500 IT security practitioners found that almost 60 percent of respondents were most concerned about where sensitive data was located. BYOD complicates compliance, and healthcare organizations will have to ensure that they have well-defined policies in place for governing security responsibilities.

"People trained in security also view IT as accountable for the security domain," Larry Ponemon, chair of the Ponemon Institute, stated in a Q&A session on Informatica's website. "But in today's world of cloud and BYOD, it's really a shared responsibility with IT serving as an advisor, but not necessarily having sole accountability and responsibility for many of these information assets."

It's no longer enough to rely on IT alone to enforce measures. Security teams and IT must work together and implement BYOD security as well as network monitoring to ensure that only authorized devices can connect to the system, and that data is safely shared.

Virtualization, open source switches changing the face of data centers

Data center technology moves quickly. With the emergence of wide-scale cloud computing over the past decade, enterprises have constructed new facilities and adopted cutting-edge equipment to keep up with demand and/or worked with managed services providers to receive capacity through colocation sites.

Virtualization drives strong growth of data center networking market
Last year, MarketsandMarkets estimated that the data center networking market alone could top $21 billion by 2018 as virtualization and specific technologies such as 40 Gigabit Ethernet continue to gain traction. Rather than rely on legacy physical switches that are challenging to upgrade and scale, enterprises are turning to virtual alternatives.

Virtualizing the network makes equipment and services much easier to modify. Since the fundamental advantage of cloud computing is the ability to get resources on demand, such extensibility is critical for helping companies keep pace with changing requirements.

"Virtualization being a disruptive technology is one of the major driving factors in [the] data center networking market," MarketsandMarkets analyst Neha Sinha told Network Computing. "The adoption of high-performance virtual switches is critical to support increasing number of virtual machines used in multi-tenant data centers. The virtual switches include programmatically managed and extensible capabilities to connect the virtual machines to both physical and virtual networks."

Down the road, such interest in mixing and matching legacy, physical and virtual assets may lead organizations to take up software-defined networking. This practice entails managing network services in software, shifting control from dedicated hardware to centralized, programmable controllers.

However, SDN is still over the horizon for many companies right now. Both the use case and the underlying technology are not widely understood. Plus, enterprises are still trying to accrue enough personnel expertise in areas such as server virtualization to give them a solid foundation for future modifications of their networks and data centers.

Facebook announces open source data center switch
The demand for higher data center efficiency is unabating, and tech giants such as Facebook are looking to get in on the action. PCWorld reported that the social network has confirmed an open source switch, released through the Open Compute Project, that could challenge longstanding incumbents such as Cisco.

Facebook's switch is a top-of-rack appliance that connects servers to other data center infrastructure. It has 16 individual 40 Gigabit Ethernet ports. The endpoint is designed for maximum flexibility for developers and data center operators, and it may contribute to broader efforts to make infrastructure more flexible.

UC market continues to grow as IT becomes more consumerized

Most enterprises are probably familiar with bring your own device, the practice of employees supplying their own hardware, typically smartphones and tablets, to supplement or replace traditional office PCs. Recently, the BYOD buzzword has given way to discussion of "shadow IT," a similar phenomenon that nevertheless is usually cast in a more negative light. Whereas BYOD is regularly construed as a potential boon to productivity, shadow IT is framed as a threat to the IT department's control, especially as organizations increasingly migrate from on-premises to cloud-based software.

Unified communications' place as BYOD, shadow IT come to the fore
Unified communications solutions are in a unique position as BYOD and shadow IT infiltrate the enterprise:

  • UC may be hosted on-premises or provided through cloud resources, making it both a traditional and cutting-edge technology, depending on the implementation.
  • The widespread use of OTT voice, messaging and chat solutions – Apple, for instance, has pegged iMessage as the single most used iOS app – is changing how companies approach communications infrastructure. Circuit-switched telephony and email alone no longer suffice.
  • With such consumerization spreading across enterprise messaging, technologies such as Wi-Fi are being advanced to make voice calls and Internet access more seamless.

Overall, UC has so far benefited from the widespread shift of IT toward the cloud and mobile devices. In a 2014 report, Infonetics Research estimated that the voice-over-IP market alone reached $68 billion in 2013, up 8 percent from 2012. Revenues could rise another $20 billion by 2018.

UC and Wi-Fi-enabled VoIP
Employees are now accustomed to seamless connectivity and high-quality, feature-rich software on mobile devices. For example, apps such as Skype and LINE are much more versatile than standard SMS and voice dialers.

A big part of achieving a better user experience with enterprise UC is getting the installation right. Firms that handle high daily call volumes may choose to host UC on-premises for maximum reliability. If VoIP is a major part of the solution, it is important to ensure that it is supported by sufficient bandwidth and Wi-Fi access points.

"Believe it or not, not all antennas are created equal. [K]eep an eye out for the following details: the size of the antenna, the quality of the construction, the choice of metal used, corrosion prevention, the bracket, and other characteristics such as focus and radiation patterns. The more stable your pole, the more stable your connection will be."

Banks, other organizations use UC to improve client service and user experience

The unified communications market is changing. Feature-rich Internet messaging and voice-over-IP telephony were once mostly the domain of CIOs and IT departments, but these services are entering the mainstream, driven by employees’ uptake of mobile hardware through BYOD initiatives and easy-to-use applications, as well as the subsequent entry of these endpoints into the workplace. Costs have declined and the underlying technology has been simplified, making UC, whether delivered through the cloud or on-premises infrastructure, an increasingly attractive option.

“[F]ocus has shifted to the end-user experience, including ease of use, as well as the business value of UC,” observed COMMfusion president Blair Pleasant in an article for No Jitter. “There’s a growing realization that the user experience must be intuitive, relevant to the user’s work and tools, and competitive with the experiences delivered by consumer devices and apps. It’s no longer about getting the ‘latest and greatest’ – it’s delivering intuitive and contextual UC solutions, and the business results that are achieved by simplifying collaboration and meetings and enhancing the mobile experience.”

Unified communications market reshaped by consumer focus
The shift toward an intuitive UC user experience comes at just the right time, as UC begins displacing legacy systems. In the past, communications infrastructure was too limited, costly and complex to cater to the end user. Much of IT’s time was devoted to simply maintaining the status quo, with little left over for improving usability or refining the user interface.

With the emergence of cloud computing as well as flexible, highly capable on-premises solutions, all of that has changed. Third-party hosting companies now steward UC technology, optimizing it for day-to-day use by their clients. At the same time, organizations with large call volumes increasingly utilize on-site UC – with installation help from managed services providers – for maximum stability and cost-effectiveness. Either way, businesses and their clients now benefit from amenities such as:

  • Contextual services: Relevant call histories, emails, texts and documents can be retrieved for each conversation.
  • Embedded technologies: Computer telephony integration is built into most contact center solutions, and UC is moving in the same direction. It is no longer a standalone service so much as fundamental communications infrastructure.
  • Video meeting rooms: Video conferencing enables better remote collaboration, and with VMRs it is possible for users to connect using a client of their choice, whether they are inside or outside the company firewall.

All of these features add up to a rich UC experience for users and tangible benefits for the organization. Banks, for instance, have deployed wide-area networks and contact centers to better support UC and improve interactions with clients. According to AllAfrica, Comnavig ICT Advisers CEO Olufemi Adeagbo recently identified a well-designed, technologically sound contact center – with features such as UC and video conferencing – as the only way to ensure that business opportunities are realized and brand reputation maintained.

“Imagine a car sale opportunity that is lost because the advertised mobile number is off, unavailable or cannot be answered,” stated Adeagbo. “Imagine the dormant account the bank does not proactively place a call about to understand the issue and reactivate.”

How BYOD can be made easier through desktop virtualization

Bring your own device policies, already buoyed by rapid uptake of smartphones and tablets, may gather additional momentum as prominent technology vendors devote attention to making mobile hardware valuable in the workplace. Dropbox for Business has made several big acquisitions related to BYOD, with the aim of helping businesses transition to multi-device, highly consumerized IT environments. Meanwhile, Apple has included advanced support for email, device enrollment and calendar collaboration in iOS 8, making the mobile OS more amenable to BYOD than ever.

It’s clear that BYOD isn’t going away. However, organizations are still adjusting to the new pressures that the phenomenon places on data control, security and compliance. While major firms continue to work on BYOD-centric solutions, enterprises have to assess their mobility needs and decide whether to implement measures such as desktop virtualization to enable BYOD.

Virtualization makes BYOD more secure for leading steel producer
The central issue with any BYOD policy is the transfer of control – over hardware, software and data – from the IT department to employees, who may be less scrupulous in terms of what applications they use. For example, files that should remain behind the company firewall may be shared with consumer-facing cloud services. Mobile devices enable such habits, even as they hold potential to enhance collaboration and remote work.

Fortunately, desktop virtualization facilitates a middle ground between BYOD adoption and enterprise security. Rather than let each endpoint have its own OS and applications, IT departments distribute a single desktop experience via a virtual machine. Devices connect to the VM securely and gain access to approved software. Data is not retained on user hardware after a session ends.
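
The session model described above can be sketched in a few lines. This is a toy illustration only; the `VirtualDesktopSession` class and its methods are invented for the example, not drawn from any VDI product:

```python
class VirtualDesktopSession:
    """Toy model of a VDI session: state lives on the server, not the device."""

    def __init__(self, user):
        self.user = user
        self.server_state = {}   # persisted in the data center
        self.local_cache = {}    # rendered on the BYOD endpoint only

    def open_document(self, name, contents):
        self.server_state[name] = contents   # stored centrally
        self.local_cache[name] = contents    # displayed locally for this session

    def end_session(self):
        self.local_cache.clear()             # nothing retained on the device
        return dict(self.server_state)       # server copy survives

session = VirtualDesktopSession("jane")
session.open_document("q3-report", "draft")
saved = session.end_session()
print(saved)                # {'q3-report': 'draft'}
print(session.local_cache)  # {}
```

The security property the article describes falls out of the structure: because the authoritative copy never leaves the server, wiping the endpoint's cache at session end leaves no corporate data on the personal device.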

Essar Group, a conglomerate involved in steel, oil and telecom services, turned to desktop virtualization to standardize and secure its employees’ mobile experience when working with company assets. Ultimately, it moved 5,000 users to its new virtualized platform.

“Security of data was the primary point of scope for looking for [a] desktop virtualization solution,” Jayantha Prabhu, CTO at the Essar Group, told Dataquest. “We had a good experience of the ability to control the data at the disposal of the employee when we deployed the same for some of our teams which handled data which was very critical both from a confidentiality and a brand perspective. We had around 3,000 BlackBerry users and more than 2,000 people with tablets, and with all the applications being accessed on the tablets, it was tough to ensure security of critical information.”

Desktop virtualization is a powerful tool for securing data and controlling mobile devices, but its benefits don’t stop there. Other perks include:

  • Reduced power consumption through the use of thin clients (machines that depend on a server for most or all of their software).
  • Centralized management of software and devices, with much more efficient patch distribution and application upgrades.
  • Support for remote collaboration since users can get the same experience from any Internet-enabled device.

With a broad set of advantages for organizations in finance, healthcare, education and other sectors, desktop virtualization is a practical, versatile way to incorporate BYOD while maintaining the integrity of company data.

Unified communications solutions rapidly replacing legacy phone systems

Unified communications solutions are displacing legacy technologies with such speed that some industry observers have begun thinking about the end of the phone number. Facebook’s landmark $19 billion acquisition of WhatsApp – an ad-free mobile messaging service that relies on the user’s screen name rather than a phone number – underscored the rapid ascent of alternatives to the aging SMS/circuit-switched telephony infrastructure.

Chat apps overtake SMS, showing changing face of consumer and business communications
Last year, E.U. Commission vice president Neelie Kroes announced that OTT chat apps had overtaken SMS for worldwide messaging volume. While SMS likely isn’t going away just yet, much of the value in communications has certainly moved from basic services to richer platforms that make the most of high-speed data connections and provide amenities beyond text messaging and voice calling.

Consumer options such as Skype and LINE have become famous for video conferencing and stickers, respectively. Similarly, business-grade offerings often distinguish themselves by including data sharing and email services in addition to text and voice. Organizations across many verticals, including healthcare and education, have put UC to work as they modernize their IT operations. A managed services provider can help navigate the common obstacles that companies face as they make the transition.

“Many of the benefits of unified communications center on internal productivity improvements, or the facilitation of collaborative working,” Liam Ward-Proud wrote for City A.M. “But [small and midsize businesses] also face the challenge of managing numerous client contact points, and a total communications strategy can help ensure a consistent client experience is delivered.”

University of Washington remakes IT department with UC
Universities have been at the forefront of UC adoption as they adjust to the rapidly evolving communications habits of students. Take the University of Washington, which began planning its move from a legacy phone system to UC as far back as 2010, according to EdTech.

The institution began by overhauling its infrastructure, installing fresh switches to support a network that could handle converged voice and data. Since its systems served more than 22,000 users, upgrades were made in phases over the course of a few years, with medical call centers and campus public safety among the first to receive access to the new platforms.

The university’s UC systems grew to encompass voice, voicemail, chat and video conferencing. Down the road, it has its eye on video-as-a-service and additional cloud-based functionality. Indeed, one of the underlying value propositions of UC is that it creates a clear pathway toward cloud computing. Many UC components can be hosted and managed by a third party, freeing users from having to tend to their own infrastructure. Functionality can also be changed and scaled depending on demand.

In the University of Washington’s case, the rollout of UC is facilitating creation of hybrid cloud. Hybrid setups typically involve:

  • Some infrastructure that is managed in-house, often for reasons such as security, compliance, control and performance.
  • Other applications and resources – for computing, networking and storage – that are piped in from an external provider.
  • APIs and mechanisms for determining what gets run where and when. For example, a workload that is running internally but requires more capacity can be shifted to public cloud infrastructure.
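
The placement logic in the last bullet can be sketched as a simple decision function. This is a hypothetical illustration of the general idea; real hybrid-cloud schedulers weigh far more factors (cost, latency, data gravity) than this toy policy does:

```python
def place_workload(required_cpu, sensitive, on_prem_free_cpu):
    """Toy hybrid-cloud placement policy.

    Compliance-sensitive jobs stay on-premises; everything else bursts
    to the public cloud when local capacity runs out.
    """
    if sensitive:
        return "on-premises"      # security/compliance pins it locally
    if required_cpu <= on_prem_free_cpu:
        return "on-premises"      # enough local capacity available
    return "public-cloud"         # burst out for extra capacity

print(place_workload(8, sensitive=True, on_prem_free_cpu=4))   # on-premises
print(place_workload(8, sensitive=False, on_prem_free_cpu=4))  # public-cloud
print(place_workload(2, sensitive=False, on_prem_free_cpu=4))  # on-premises
```

Even this crude version captures the hybrid bargain: sensitive assets stay behind the firewall while elastic capacity is rented on demand.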

For the university, some assets will be kept on-premises while software is increasingly shifted to the cloud for greater availability.

“Right now, people can use the features in Microsoft Lync, such as chat, voice, video and conference sharing, on a peer-to-peer basis,” Roland Rivera, director of network strategy and telecommunications for the university, told EdTech. “Our goal is to provide these capabilities campus-wide. As the technology evolves, we plan to keep the [session initiation protocol] core in-house, but migrate applications to software-as-a-service cloud solutions as those become available.”

Getting better device and data security through desktop virtualization

Desktop virtualization is an increasingly popular way to get more out of old IT systems while enabling access to company applications from virtually any device. By hosting an operating system on a centralized virtual machine, organizations can avoid the hassle of installing and managing extra software on every last piece of equipment. Under ideal circumstances, virtualization contributes to high levels of security and convenience.

Virtual desktop infrastructure and mobile security
The influx of mobile endpoints into the workplace, fueled by bring-your-own-device policies, has made such virtual desktop infrastructure appealing. IDC recently estimated that 155 million smartphones would be used for BYOD in the Asia-Pacific region in 2014.

But what about security? Employees who use their own hardware may be prone to mingling personal habits and data with corporate assets. A classic example is managing sensitive work documents through consumer applications such as Dropbox.

Enter VDI. Important data can be kept in cloud storage services and accessed exclusively via secure connections. Information is usually not retained locally, and all permissions are protected by authentication mechanisms. VDI thus gives IT a single point of control for managing application access in BYOD environments.

"The move to BYOD was a wakeup call for mobile security because information security is a key IT responsibility – regardless of whether the mobile device in question is company-provided or user-owned," observed Michael Finneran for TechTarget. "Unless an organization opts for a solution that avoids storing corporate data on a mobile device, systems will be needed to protect that information."

Virtualization vendors target health care, financial industries
VDI's potential for securing applications and data has caught the attention of organizations in health care, finance and other regulated sectors. At the same time, major technology vendors have worked on thin client solutions for these markets, crafting products and services that enable desktop virtualization through minimal infrastructure.

Still, as virtualization becomes more popular, there have been concerns about balancing performance and security. Network Computing's Jim O'Reilly dug into the dilemma, noting that many providers have added instance storage, typically solid-state drives that provide the speed to overcome common bottlenecks such as VDI boot storms (i.e., when everyone logs in at around the same time).

Instance storage enables outstanding performance, but it also means data states are preserved and, in theory, exposed to surveillance and theft. Persistent data could be an issue for health care organizations obligated to comply with legislation such as the Health Insurance Portability and Accountability Act. Organizations should understand the ins and outs of any virtualization solution before entrusting data to it.

Healthcare providers turn to network security, desktop virtualization to protect data

There's plenty of work to do in shoring up network security at healthcare organizations. While the retail sector has been making headlines for months due to oversights that led to record-setting breaches at Target, Neiman Marcus and Michaels, hospitals and clinics may be even more vulnerable to attack than these chains, even if they haven't been the subjects of similarly high-profile incidents yet.

Healthcare lags retail, finance in network security
A recent report from BitSight Technologies rated the security postures of different verticals on a scale from 250 to 900 (a higher figure means stronger protection). Healthcare received a 660, falling well behind retail at 685, with utilities and finance even farther up the ladder.

"Unlike the financial institutions and electric utilities in the S&P 500, the healthcare and pharmaceutical companies do not view cybersecurity as a strategic business issue," stated the authors of the BitSight report, according to Cruxial CIO. "They do not spend enough resources to protect their data, in part because cybersecurity has not received the executive level attention it deserves."

The results are surprising in light of how many regulations, including the Health Insurance Portability and Accountability Act, govern healthcare data. Security firm Redspin estimated that nearly 30 million records have been compromised in HIPAA breaches since 2009, and that the yearly total rose 138 percent between 2012 and 2014.

Mitigating risk with managed network security and virtualization
To avoid becoming victims, organizations can rely on a managed services provider to install and oversee mechanisms that shield important assets from surveillance and theft. Core capabilities may include:

  • Dedicated private IP networks that carry encrypted data
  • Secure remote access and collaboration
  • Network authentication and integrity checking
  • Firewalls for MPLS IP-VPN
  • Around-the-clock security management

These full-featured solutions have become increasingly appealing to healthcare providers, especially as initiatives such as bring your own device and technologies like cloud computing have revolutionized IT. Administrators may no longer feel confident in their networks' safety in the face of threats that could enter from any one of many possible attack surfaces, including smartphones or unauthorized cloud apps.

Health IT Security's Patrick Oullette chronicled how one healthcare security executive had recalibrated his organization's approach to network security in order to deal with today's threats and usage habits. In practice, this shift has entailed moving beyond data loss prevention and incorporating exfiltration monitoring to keep tabs on device activity and traffic flows across the entire network.

"We also have a robust data exfiltration capability that we've instituted at the core of the network and the perimeter so we can watch data flows," David Reis, vice president at Lahey Health, told Health IT Security. "Looked at that way, it becomes illuminating pretty quickly and easy to flesh things out. You ask where the data is moving in and out from, what devices are plugging in and out and what users are doing once they're plugged in."

The adoption of advanced network security measures is promising, especially in light of the healthcare sector having accounted for 43 percent of all breaches in 2013, according to the Identity Theft Resource Center. On this same front, healthcare providers are implementing technologies such as desktop virtualization to bolster security.

Virtual desktops are appealing to hospitals and clinics because the endpoints are little more than dumb terminals, with operating systems supplied from a remote server. Accordingly, there's less risk of misconfiguration or data theft than with a machine running a locally installed OS. Speaking to Health IT Security, Chris Logan, chief information security officer at Care New England Health, described desktop virtualization as "a huge win for security."