Clear policies and user education enable effective BYOD

Bring your own device (BYOD) has been all the rage in recent years, ever since iOS and Android smartphones and tablets entered the mainstream. Although modestly powered compared to a modern desktop PC, these devices have many built-in advantages over older hardware, including high-resolution, pixel-dense displays, 3G and 4G LTE cellular connectivity, and excellent portability.

Still, there are some key considerations to make when adopting a BYOD strategy. Are devices properly secured? How much will it cost to support each new endpoint? There are plenty of options out there for organizations seeking to make the most of BYOD and overcome common obstacles related to security and cost.

BYOD has solid momentum, raising the stakes for user education and sound implementation
One of the biggest benefits of BYOD is that it potentially frees the organization from having to shoulder the costs of additional hardware upgrades, since each user supplies his or her own device. On top of that, the freedom and flexibility conferred by BYOD can translate into new business opportunities, such as enabling sales teams to make presentations or access corporate data while on the road.

“BYOD strategies are the most radical change to the economics and the culture of client computing in business in decades,” stated Gartner analyst David Willis. “The benefits of BYOD include creating new mobile workforce opportunities, increasing employee satisfaction, and reducing or avoiding costs.”

Gartner predicts that by 2017, half of all businesses will require employees to adopt BYOD, in hopes of achieving these benefits and others. Setting up and enforcing BYOD policies could save companies a lot of money that would have otherwise gone toward building dedicated networks and procuring compatible hardware.

Solutions such as desktop virtualization have come to the fore alongside BYOD, making it increasingly possible to provide a consistent operating system experience to every device within the organization. Vendors such as Samsung have also created device-specific security suites designed to ease BYOD management.

Ensuring security and productivity with a BYOD strategy
It is important to have an actionable plan in place before implementing BYOD. As ZDNet’s Adrian Kingsley-Hughes pointed out, a seat-of-the-pants approach usually does not work since companies can run into trouble trying to ensure that, for instance, assets are not moved from the internal network to public-facing cloud storage services.

Instead, companies have to train employees on using BYOD-enabled hardware responsibly and regularly reinforce guidelines. More specifically, some considerations for a sensible BYOD policy might include:

  • Procedures for handling lost or stolen devices
  • Steps to take when a BYOD user leaves the organization
  • An inventory of which company data, if any, is governed by regulations

Ultimately, BYOD is an exciting opportunity for organizations, but one that must be approached with care. Common sense and technical know-how can transform employee devices into valuable company assets.

“We are now entering a period of transformation,” Samsung Telecommunications America vice president David Lowe told FierceMobileIT. “It started out with clients being very reluctant to support mobility in their enterprise, trying to figure a way to keep it out. We are now in the transformation stage where enterprises are finally embracing it. That’s where the real innovation is going to come.”

Government agencies show power, potential of VDI

Why use virtual desktop infrastructure? Many government agencies have found plenty of reasons to do so, including supporting their increasingly diverse device fleets and reducing overall power consumption.

What are the main benefits of VDI?
With VDI, the computing power needed to deliver a desktop environment moves from on-site PCs to servers housed in a data center. For public and private sector organizations alike, there are several key benefits to this arrangement, including:

  • Heightened device and data security – hardware running virtual desktops via VDI connects to servers through an encrypted channel, making it safe to grant these devices access to core applications such as enterprise resource planning and customer relationship management.
  • Streamlined system administration – IT personnel can worry less about implementing complex mobile device management for smartphones and tablets.
  • Lower costs – VDI can be a viable alternative to building entirely new applications and services tailored specifically for mobile screens. The use of zero/thin clients – minimal hardware with little to no installed software – also drives down electricity consumption compared to desktop PCs.
  • Support for mobility – teams working off-site can still access important assets by connecting to VDI.

U.S. government sees success with VDI implementations
Implementing virtualization and VDI has already produced real gains for the U.S. Department of Energy, as well as the Defense Intelligence Agency and the Navy. For instance, FCW reported that the DOE conducted a 500-seat VDI pilot program that delivered an excellent user experience and proved that VDI could also help trim expenses.

Going forward, VDI may evolve, moving off-premises and into cloud computing environments. More specifically, desktop-as-a-service may provide similar benefits to VDI, except that infrastructure is managed by a cloud services provider rather than the organization itself.

Freeing IT of this responsibility could potentially streamline costs even further. However, there are still the core issues of ensuring that data is kept safe in the cloud and that an organization's particular needs, especially for bandwidth, are being met.

VDI and bandwidth requirements
For organizations that adopt VDI, it's critical to establish up front what is expected from the implementation. That way, they reduce the risk of setting up something that doesn't align with their goals and ends up running over budget. In many cases, these issues manifest themselves as poor end-user experiences or insufficient bandwidth as a result of "boot storms" (many users connecting to VDI simultaneously).

“You also need to bear in mind that VDI almost always results in a change in usage patterns,” explained The Register’s Trevor Pott. “Whatever your usage patterns are today, expect that VDI deployments will ultimately see more people working remotely, be that telecommuting from home or pulling down their desktop at a hotel or business meeting. You need enough [wide area network] bandwidth to meet not just today’s needs, but tomorrow’s.”

Handling changes in bandwidth usage requires careful consideration of VDI storage and networking equipment such as switches and ports. Managers also have to learn more about what types of applications teams will be using via VDI. While word processors won't really push server CPUs to their limits, any software that works with graphics and/or video will significantly alter calculations of what kinds of resources will be required to ensure an optimal VDI experience.
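As a rough illustration of the sizing exercise described above, peak WAN bandwidth can be estimated by multiplying concurrent sessions by per-session bandwidth, with a headroom multiplier for boot storms. The per-session figures and multiplier below are illustrative assumptions only; real numbers depend on the display protocol and workload.

```python
# Rough, illustrative VDI bandwidth estimator.
# All per-session figures are assumptions for illustration only --
# actual values depend on the display protocol and workload mix.

def required_wan_mbps(concurrent_users, kbps_per_session, boot_storm_factor=1.5):
    """Estimate peak WAN bandwidth in Mbps for a VDI deployment.

    concurrent_users:   expected simultaneous sessions
    kbps_per_session:   steady-state bandwidth per session (kbps)
    boot_storm_factor:  headroom multiplier for login/boot storms
    """
    steady_kbps = concurrent_users * kbps_per_session
    peak_kbps = steady_kbps * boot_storm_factor
    return peak_kbps / 1000.0  # convert kbps to Mbps

# Office workloads (word processing) need far less than graphics-heavy ones.
office = required_wan_mbps(500, kbps_per_session=150)     # light office use
graphics = required_wan_mbps(500, kbps_per_session=1200)  # video/graphics use

print(f"Office workload peak:   {office:.0f} Mbps")    # 112 Mbps
print(f"Graphics workload peak: {graphics:.0f} Mbps")  # 900 Mbps
```

The roughly eightfold gap between the two scenarios is the point: the application mix, not just the headcount, drives the bandwidth requirement.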

With the help of a managed services provider, companies can set up VDI that works for them. When VDI first became a hot topic several years ago, organizations were eager to use it as a catch-all solution, which led to many underperforming implementations. If aligned to specific goals, though, VDI is an effective, economical way to use the same applications anywhere.

How desktop virtualization enables better BYOD management

Virtual desktop infrastructure has gained traction recently as more organizations adopt and support mobility initiatives. Employees, empowered by bring-your-own-device policies, are increasingly capable of working from anywhere. Companies that are phasing out Windows XP PCs may even choose to replace these aging machines with mobile devices that provide more streamlined user experiences and offer a slew of modern applications.

VDI’s role in enabling mobility
Where does VDI fit into this picture? With device fleets becoming more fragmented, VDI can be a cost-effective means of providing critical access to core company assets such as enterprise resource planning, customer relationship management and line-of-business applications. It's economical because it requires less investment, both in time and money, than crafting mobile experiences from scratch.

“Applications can also be enhanced for mobile access on the server end, without building a mobile development capability within your organization or hiring expensive outside help,” explained TechRepublic contributor Patrick Gray. “You could use your existing ERP developers to create a dozen screens and reports that have a limited number of fields, and space them more appropriately for mobile use, without writing a single line of mobile code.”

In practical terms, VDI can give sales teams access to full desktops so that they can make use of important tools such as CRM and PowerPoint while on the road. Even if the organization has a BYOD initiative in place, VDI simplifies common BYOD issues such as hardware management and security enforcement – each device communicates securely with the VDI servers via an encrypted session.

VDI and the growing uses of desktop virtualization
Moreover, VDI fits into many organizations’ growing interest in virtualization. Forrester Research’s David Johnson told InfoWorld that more than half of IT decision-makers cite desktop virtualization as a top priority for 2014. Although the market for PCs may be stagnant right now, there is still enough demand for virtual desktops that many companies have turned to VDI to deliver secure computing environments and access to applications on any device.

Certainly, there can be technical and financial challenges in implementing effective VDI, but these obstacles can be overcome with the expertise of an IT solutions and managed services provider. Organizations can also optimize VDI through the use of thin-client software to connect to VDI systems. Eventually, VDI implementations can pay for themselves by making workers more capable, regardless of where they are, while also streamlining mobile device management.

Why managed services are essential for security success

Amid the growing range of cybersecurity threats, companies are facing questions about how to secure their data center and application environments. A recent study by Courion found that 78 percent of IT security executives are worried about the possibility of a breach at their organization, with concerns that included loss of client data and negative publicity for the brand. At the same time, while 95 percent of IT security staff believed preventing breaches is a serious issue, they said they thought just 45 percent of employees share their concerns.

This discrepancy underscores the value of having clear governance practices and security standards in place. For companies looking to bring their security operations up to date, a managed services approach can be valuable. With a trusted managed services provider, companies can develop a clear information governance plan, laying out a strategy to keep files safe throughout their life in the company environment.

“In light of the constant changes in the IT environment, all enterprises should look to IT governance to secure information from the moment it is created to the time it is destroyed,” IT executive Dan Chenok wrote in an article that appeared on FCW.com. “That is why, in the past decade, IT governance has moved to the forefront of enterprise efforts to effectively manage and appropriately protect IT systems and assets, contributing to the success of risk-based security and supporting strategic decisions made by C-level executives across the public and private sectors.”

In addition to helping develop a plan for companies to have clear security policies and keep data locked down, a managed services provider can offer ongoing support in the form of managing regulatory compliance and compliance testing, as well as through services such as continuous network monitoring. Through economies of scale, a third-party provider can access state-of-the-art security technologies and round-the-clock staffing that a company might not be able to purchase on its own. And with the growing complexity of cybersecurity risks, companies can benefit from the expertise and knowledge of a specialized outside provider as well.

How can companies improve the disaster resilience of their data center infrastructure?

According to a recent benchmark survey by the Disaster Recovery Preparedness Council, nearly three quarters of companies worldwide are failing in terms of disaster readiness, struggling with downtime for critical applications or even entire data center environments. Close to 20 percent of companies reported losses of over $50,000 stemming from outages. Companies can protect themselves against this possibility by investing in resilient data center solutions from a colocation provider focused on business continuity.

"Reliability starts with high industry standards in a checklist of requirements: climate-controlled environments, intelligent security structure and state-of-the-art equipment, technologies and design," BizTimes.com contributor Kevin Knuese wrote in a recent article.

He noted that companies should look for data center solutions with redundant networking and power supplies, as well as redundant cooling systems and all-around state-of-the-art technology. Additional data center features such as 24/7 monitoring and physical security safeguards meant to withstand both break-ins and natural disasters such as floods and earthquakes are important as well. A hosting provider based in the Midwest can be particularly reliable due to the reduced likelihood of certain natural disasters like earthquakes, hurricanes and mudslides that are more common on the coasts.

A provider that offers backup and business continuity services is also important, Knuese wrote. Executives can sometimes be skeptical of "disaster recovery," seeing it as an alarmist term and frustrating cost driver, according to industry expert Steve Kahan. However, the argument for a reliable data center and backup solution is more clear-cut, as such technology resolves many IT headaches. As a result, a colocation provider with business continuity services can be key for maintaining brand credibility from an IT standpoint.

"Some audiences are more responsive when the conversation is focused on the crucial role that IT plays in ensuring 'business continuity' or the operational costs triggered by an 'extended outage,'" Kahan wrote for DRBenchmark.org. "Here's one more suggestion: think of disaster preparedness as 'an investment in brand security,' a way to protect your company's reputation."

Data center construction increases, driven by demand for colocation services

As more companies shift to an increasingly digital business model, the demand for colocation services is growing. In turn, data center construction is set to increase at a steady rate in the coming years, according to a recent study from Research and Markets. With new, state-of-the-art infrastructure coming online and the industry gravitating toward large-scale data center deployments, companies may want to reconsider how these trends can simplify their own IT strategies.

According to the Research and Markets study, the global data center construction market is set to grow at a compound annual rate of 21 percent through 2018. This trend is largely being driven by the increasing challenge of managing a data center as new demands in terms of efficient energy use, alternative power sources and industry regulations complicate the logistics of building and running an enterprise facility. Additionally, the growing complexity of network infrastructure is proving a challenge for many companies to handle internally, prompting them to look for outsourced solutions.
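To put the 21 percent figure in perspective, compound annual growth accumulates like interest, so a market growing at that rate roughly doubles over four years. The quick calculation below is a generic illustration of the arithmetic, not an extrapolation from the Research and Markets data itself.

```python
# Compound annual growth: how a fixed yearly rate accumulates over time.

def compound_growth(cagr, years):
    """Total growth multiple after `years` at a given compound annual rate.

    cagr:  compound annual growth rate as a decimal (0.21 == 21%)
    years: number of years the rate is sustained
    """
    return (1 + cagr) ** years

# A market growing at 21% per year more than doubles in four years.
multiple = compound_growth(0.21, 4)
print(f"Growth over 4 years at 21% CAGR: {multiple:.2f}x")  # 2.14x
```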

In addition to new construction, many existing facilities are also being forced to retrofit with new server, power and cooling equipment to meet the challenges of the contemporary tech landscape, ITBusinessEdge's Arthur Cole noted in a recent column. The result is a change in the profile of the average data center.

"Going forward, infrastructure will be leaner and meaner, but the individual pieces will be more powerful and flexible," Cole wrote. "And the [data centers themselves] will be fewer in number, but much, much bigger."

Rather than try to weather these changes themselves, companies may find it expedient to embrace the trend toward colocation and instead look for a trusted third-party data center provider. With the right partner, companies can position themselves to transition smoothly into the future.

Achieve IT savings with better data center management

IT departments face a wide variety of budgetary pressures, which means that finding more efficient ways to deliver the same services is always a goal for technology staff. One of the biggest sources of inefficiency for many companies is the corporate data center, which can generate substantial costs unrelated to actual computing, such as power and cooling. Companies are increasingly looking for ways to make these operations more efficient and turning to data center infrastructure management solutions as a result. Additionally, many businesses have found that by switching to a managed services provider for their data center, they are able to access the gains of instituting such technology without the upfront costs and complexity.

A recent Navigant Research study found that the market for data center infrastructure management technology is expected to grow more than sixfold in the next six years as data center operators take advantage of new solutions that offer visibility into both key facilities metrics and server management variables. A separate study of one DCIM solution conducted by Forrester found that the return on investment in terms of power and space planning was 93 percent, while the ROI in terms of energy monitoring was 216 percent.
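Return on investment, as in the Forrester figures cited above, is simply net benefit expressed as a fraction of cost. The dollar amounts in the sketch below are made up for illustration; only the formula is standard.

```python
# ROI as a percentage: net benefit relative to cost.

def roi_percent(benefit, cost):
    """Return on investment: (benefit - cost) / cost, as a percent."""
    return (benefit - cost) / cost * 100

# Illustrative figures (not from the Forrester study): a DCIM tool
# costing $100k that yields $316k in energy-monitoring savings
# produces a 216% ROI.
print(f"ROI: {roi_percent(316_000, 100_000):.0f}%")  # ROI: 216%
```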

“DCIM – the software, systems, and services that monitor, measure, and help control data centers’ IT and facilities infrastructure – is quickly becoming a must-have technology for managers of modern data centers,” said Eric Woods, research director at Navigant Research.

Given the substantial savings companies can achieve by using state-of-the-art monitoring and management tools, they should look to leverage data center solutions that incorporate these technologies. Managed services providers should have granular insight into their facilities that enables them to create tangible operational savings and, in turn, pass those savings along to clients.

Managed services equip companies to deal with changing cybersecurity landscape

Each year seems to bring a broader and more complex array of cyber threats to businesses, and many companies are struggling to keep up with the rapid pace of change. According to a recent survey from security software firm KnowBe4, more than half of IT managers – 51 percent – find security harder to maintain now than a year ago. Preventing cyberthreats and responding quickly to security issues are some of the biggest challenges for companies, which is why many are turning to managed services providers for a more secure infrastructure, as well as functions like malware removal and application support.

"Cybercriminals are constantly devising cunning new ways to trick users into clicking their phishing links or opening infected attachments," KnowBe4 CEO Stu Sjouwerman stated, adding that companies need to respond with thorough cybersecurity procedures, policies and training.

Another recent study from Solutionary and the NTT Group found that 54 percent of new malware goes undetected by antivirus software. As a result, companies need to make sure they are protected at the application level by using secure software and applying updates, ITBusinessEdge contributor Sue Poremba wrote in a recent column. Leveraging managed services for application support can help ensure software is kept updated and secured against threats, while external expertise can also be valuable in implementing state-of-the-art perimeter solutions and secure data center infrastructure.

Additionally, a managed services provider that offers malware removal can be a valuable partner in responding to and limiting the damage of an incident like an SQL injection attack, which the Solutionary study noted can easily cost a business $200,000 or more. Such protection might be unaffordable for a small business to implement in-house, but, by outsourcing certain IT management functions, companies can access state-of-the-art security solutions and industry-leading expertise. With the right portfolio of tools protecting it, a small business can avoid these ever-expanding threats.
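For context on the SQL injection attacks mentioned above, the standard application-level defense is the parameterized query, which keeps attacker-supplied input from being interpreted as SQL. The sketch below uses Python's built-in sqlite3 module purely for illustration; the same pattern applies to any database driver.

```python
# Illustrative only: parameterized queries vs. string interpolation.
# Shown with Python's built-in sqlite3; the pattern is universal.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "alice' OR '1'='1"  # classic injection payload

# UNSAFE: string interpolation lets the payload rewrite the query,
# turning the WHERE clause into a condition that matches every row.
unsafe_sql = f"SELECT role FROM users WHERE name = '{malicious}'"
leaked = conn.execute(unsafe_sql).fetchall()
print(leaked)  # rows leak despite the bogus name

# SAFE: the driver binds the value as data, never as SQL.
safe_rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (malicious,)
).fetchall()
print(safe_rows)  # [] -- the payload matches no real user
```

Secure coding practices like this, combined with the patching and monitoring a managed services provider supplies, address the threat at both the application and infrastructure levels.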

Disaster recovery services, cybersecurity critical to protecting electric grid from attacks

Over the past few years, the utilities industry has made a concentrated effort to make key infrastructure "smarter." The integration of data-capturing devices and automated, software-based management systems has the potential to create smart electric grids that can more effectively use and distribute power, reducing energy costs and environmental impact in the process.

However, turning power grids into connected devices has potentially harrowing implications – a concentrated cyberattack could cause lengthy and widespread outages, not only withholding electricity from businesses and residences, but disrupting communications, healthcare systems and the economy. According to many cybersecurity researchers, such an attack is less a question of "if" than "when."

Ramping up disaster recovery services and cybersecurity protocols is key to shielding the smart electric grid from a devastating attack. While the federal government tries to increase the efficacy and stringency of its own security measures, it's important that utility companies – from national generators to local distributors – build up their own prevention and backup systems, according to a recent white paper by the three co-chairs of the Bipartisan Policy Center's Electric Grid Cybersecurity Initiative. This effort will require a hybrid system that responds to both physical and cybersecurity threats. 

"Managing cybersecurity risks on the electric grid raises challenges unlike those in more traditional business IT networks and systems," the report stated. "[I]t will be necessary to resolve differences that remain between the frameworks that govern cyber attack response and traditional disaster response."

Disaster recovery efforts need to include backup digital systems that rival physical ones. Electric grids require faultless failover technology that can depend on a secondary backup network if the primary one is taken offline for any reason. As the Baker Institute pointed out in a recent Forbes article, the measure of a disaster recovery system's effectiveness is based on whether the grid can be restarted following a major breach, disruption or cyberattack. Without a system that can effectively monitor, prevent and immediately respond to such threats, the smart electric grid could be putting many key infrastructure systems in danger.

Disaster-recovery-as-a-service market emphasizes changing priorities

Disaster recovery, once a relative afterthought or nonentity in business planning, is now a central consideration. Advanced threats and high-profile data breaches have helped to convince organizations that it's time to stop dragging their feet and start taking disaster recovery more seriously. The rapid rise of the market for disaster-recovery-as-a-service highlights an important shift in priorities.

According to TechNavio, the global market for DRaaS is expected to grow at a compound annual growth rate of 54.6 percent between 2014 and 2018. Demand for hybrid and cloud-based disaster recovery has driven investment, especially among small and medium-sized businesses that have found that flying under the radar by virtue of their size is no longer a viable way to avoid the consequences of information security compromises.

Larger organizations have also realized that IT departments are generally unable to maintain complete oversight and disaster recovery protection amid data deluges and rapid network expansion. To cite one sector, the banking industry has begun to invest heavily in the cloud to reduce the amount of resources it has to spend on application updates, software patches and IT infrastructure, according to Bank Systems & Technology.

The report did note that relying too much on a generic cloud solution or paying insufficient attention to backup data could diminish the effectiveness of DRaaS investment. A company is better served by using a multi-service provider that focuses on customization, specificity and addressing pain points. This way, it can avoid any data integrity or governance issues stemming from a lackluster vendor.