Disaster-recovery-as-a-service market emphasizes changing priorities

Disaster recovery, once a relative afterthought or nonentity in business planning, is now a central consideration. Advanced threats and high-profile data breaches have helped to convince organizations that it's time to stop dragging their feet and start taking disaster recovery more seriously. The rapid rise of the market for disaster-recovery-as-a-service highlights an important shift in priorities.

According to TechNavio, the global market for DRaaS is expected to grow at a compound annual growth rate of 54.6 percent between 2014 and 2018. Demand for hybrid and cloud-based disaster recovery has driven investment, especially among small- and medium-sized businesses that have discovered that "flying under the radar" by virtue of their size is no longer a viable way to avoid the consequences of information security compromises.

Larger organizations have also realized that IT departments are generally unable to maintain complete oversight and disaster recovery protection amid data deluges and rapid network expansion. To cite one sector, the banking industry has begun to invest heavily in the cloud to relieve the amount of resources it has to spend on application updates, software patches and IT infrastructure, according to Bank Systems & Technology.

The report did note that relying too much on a generic cloud solution or paying insufficient attention to backup data could diminish the effectiveness of DRaaS investment. A company is better served by using a multi-service provider that focuses on customization, specificity and addressing pain points. This way, it can avoid any data integrity or governance issues stemming from a lackluster vendor. 

Getting cloud storage services at the right price

Cloud storage services offer organizations peace of mind by providing a secure location to store and back up data. But what happens when the company needs to recover it on a short timetable? Some cloud providers offer an easy road to retrieving data and resetting environments following an incident, but others may have hidden their helpful services behind a dense thicket of extra fees. It’s important to know how to avoid cloud storage pricing models that aren’t as cost-effective as they look on paper.

Although it’s not clear whether cloud storage price cutting is an effective means of winning clients, many cloud providers do it anyway, wrote TechTarget contributor Sonia Lelii. When selecting a vendor, it’s important to discern whether it is slashing services along with prices. Don’t make a decision based on price alone: network infrastructure, security and the cloud interface are important features that should not take a backseat to a price quote. Be aware that a low upfront cost could be masking higher expenses during a time of need.

Looking for flexibility is another way to maximize value in a cloud storage services investment, wrote Enterprise Storage Forum contributor Drew Robb. A fee structure that offers locked-in prices for regular or predictable needs, combined with an outline for services that can be purchased on-demand when they’re needed, is better than relying on one that determines fixed usage prices based on service offerings.

Why it's time to invest in a third-party data center

Rising data center complexity means that for many companies, continuing to support an in-house data center is rapidly becoming unsustainable. Spending on data center infrastructure management services and software is expected to top $4.5 billion by 2020, according to Navigant Research, and many organizations are already feeling the pinch on a local level.

Organizations that want to maintain an on-premises facility will likely follow one of two paths: The first is to enter a cycle of massive spending on everything – new equipment, management tools, software upgrades and IT staff – to handle an otherwise bloated IT environment. The other is to keep costs down by restricting expansion, which could create significant problems for continuity, security and the company's competitive advantages. Neither path is desirable, as both carry a fairly high potential for confusion, mismanagement and far-reaching operability issues.

Outsourcing data center needs, either through colocation or managed servers, enables an organization to gain control of its data center spending while retaining access to top storage, network and security technologies. It also offers companies scalability, which is incredibly difficult or highly expensive to come by in an on-premises facility. As an organization requires additional server space, network bandwidth or software support, it can work with the data center provider to increase available capacity. This way, it doesn't pay for what it doesn't use, and can depend on complete IT support for every aspect of its investment.

Data Center Journal contributor Yuri Rabover compared continued on-site data center development to reorganizing one's garage: Although space is at a premium, the realities of equipment size and non-expert human planning likely mean that it won't be used as effectively. Instead of playing a never-ending game of data center Tetris, hemorrhaging resources at every turn, it makes sense to eliminate an otherwise ongoing problem with a decisive, conclusive act. Outsourcing data center infrastructure and its management helps enterprises avoid the spending sprees that can curtail their competitive advantage.

Optimizing data center strategies for financial services firms

Data center investment strategies are vital to the lifeblood of financial services organizations. While finance firms have long used proprietary or third-party data centers for information storage and business continuity, big data has given rise to a new set of complications and considerations. Not the least of these is a variety of regulatory and compliance measures that place restrictions on information storage and archival practices. New technologies, rising costs and data management issues are exposing the limits of traditional data center models, and financial services firms need to adapt.

Data management in finance is a problem with several moving parts that impact each other. Accumulating and storing data is a relatively straightforward issue, albeit a resource-intensive one. Under the traditional model, a firm would procure additional servers for its onsite facility or enlarge its third-party data center investment, either through colocation or leasing the provider’s equipment.

The deluge of data can make this approach prohibitively costly, forcing organizations to rethink their infrastructure strategy, Wall Street & Technology editorial director Greg MacSweeney wrote. Firms with proprietary data centers now stand to save significantly by outsourcing their storage, architecture and management demands. A third-party data center can provide state-of-the-art server hardware, but more importantly it has the infrastructure to deploy next-gen network solutions such as virtualization, which drastically reduces the amount of physical equipment needed to contain rising petabytes of data and information-crunching applications.

Working with a third-party data center provider also helps businesses tackle more rapidly moving targets – data integrity and compliance. Data quality and validation are some “small data” issues that grow more problematic as firms accumulate more information from a wider source pool, said software developer Oleg Komissarov, according to a recent FierceFinanceIT article.

Keeping data clean, complete and consistent is a tough task that requires powerful tools and a dedicated team. A managed data center services provider can help offer this level of attention. It can also help in compliance efforts, as any blind spots or inconsistency in information or reporting leave the door open for compliance issues to crop up. As big data expands and accelerates, financial services firms need their data centers to stay one step ahead.

Target breach fallout highlights importance of comprehensive malware removal

Without proactive malware removal, organizations are putting themselves at serious risk. Recent developments in the Target data breach saga highlight the direct costs that can result from a lax approach to eliminating malware. As more details emerge about the hack, which resulted in the compromise of 40 million credit card numbers and 70 million pieces of personal information, it's become evident that the embattled retailer likely could have prevented the attack if it had a stronger, more comprehensive approach to malware removal.

The latest development, per Bloomberg Businessweek, is the discovery that Target was actually warned about the vulnerability that led to the breach by a malware detection tool. The $1.6 million technology monitored Target servers and computers around the clock, looking for anything amiss. The alert system worked the way it was supposed to, according to FireEye, the malware detection tool's producer, and the Bangalore-based security specialists in charge of scanning the retailer's network. They notified Target's Minneapolis-based security team according to procedure, but the team ended up doing nothing about it.

Of course, hindsight is 20/20, but it's worth pointing out that malware detection is only half the battle: Malware removal requires organizations to be proactive. Whether Target's security team failed to recognize the severity of the vulnerability and the need for swift action remains undetermined, but it's important to remember that cyberthreats don't wait. In an interview with NPR, Businessweek's Michael Riley said that Target's reactive, indecisive approach failed to keep the hacking attempt at bay.

"Whatever was going on inside Target's security team, they didn't recognize this as a serious breach," Riley told NPR. "There was no serious investigation that went on. They didn't go to the server itself to figure out what the malware was doing."

Insulating organizations against attacks and identifying malware are difficult tasks that require constant vigilance. A company unsure of whether it can provide this level of attention should strongly consider adopting a third-party malware removal service that can neutralize threats in a preventative fashion.

Managed services can help organizations avoid top 10 business hazards

Managed services enable businesses to more successfully navigate a threat-laden enterprise landscape. Although an organization’s biggest IT, operations and security anxieties vary by region, industry and company size, what they’re most afraid of is generally the same across the board – lost profitability, client churn and a tarnished reputation.

In the Twitter age, no confirmed threat goes unpublished or unanalyzed, and it’s difficult for an organization to escape blame even if it’s only affected as a byproduct of another incident. The woes of retailer Target, which reported a 22 percent decrease in its client base in January following a massive data breach during the 2013 holiday season, serve to underscore consumer response to an enterprise that demonstrates less-than-exemplary information security, data management and business continuity.

According to a recent Business Continuity Institute study of nearly 700 enterprise respondents in 82 different countries, the top 10 most common perceived threats to disaster recovery and business continuity are:

  1. Unplanned IT outages
  2. Cyberattacks
  3. Data breaches
  4. Adverse weather effects
  5. Utility supply interruptions
  6. Fires
  7. Security compromises
  8. Health or safety incidents
  9. Acts of terrorism
  10. New laws or regulations

How managed services assuage anxiety
Managed services offer companies vast potential to mitigate problems in many areas because a provider's solutions are customized to the needs of the company. The list above covers a variety of incidents stemming from a company's location, industry, employee behavior and general security management. Overseeing prevention and contingency plans that effectively respond to all of these hazards is time-consuming, resource-intensive and costly. While it's impossible to prevent adverse weather or control regulatory measures, it is possible to keep these threats from doing any real damage.

Managed services are scalable, so the extent of a provider’s involvement can correspond exactly to a company’s anxieties and potential hazards. One organization may simply require online backup services via an offsite server in order to increase its data loss prevention activities. Another may want to virtualize nearly all of its infrastructure so its employees can stay connected and productive during a wave of bad weather. As a company’s needs change over time, it doesn’t have to rearrange its entire back-end infrastructure to keep danger at bay.

Who really cares about BYOD?

The bring-your-own-device movement is well on its way to fundamentally reshaping enterprise communications. So why do so few organizations seem to care about device management? A fairly wide gap formed almost immediately between BYOD user excitement and enterprise policy engagement, and it's only going to expand.

Entrenched employee attitudes that absolve them of responsibility create problems for IT, and many organizations let worker preferences overwhelm clear-cut business priorities. The central problem with BYOD is a company's capacity to show that it cares – not only about the ways BYOD can be hazardous, but about creating strategies that cater to worker preferences while keeping security at the forefront.

BYOD is clearly important to employees. One recent LANDESK survey found that the average European worker spends more on BYOD every year than he or she does on tea or coffee. Employees care about having the devices, but not about protecting the data stored on them.

"It's not my problem" was a common refrain in a recent survey by Absolute Software about data security in the mobile enterprise. More than a quarter of those surveyed said they felt there should be no penalties for leaked or lost corporate data. Additionally, more than one-third of respondents who lost their mobile devices said they didn't make any changes to their security habits, while 59 percent estimated that their corporate data was barely worth more than the cost of replacing a phone.

Who is to blame for BYOD problems?
It's up to companies to exhibit the same passion for data security that employees have for using their own smartphones. Of those who acknowledged a penalty for data loss might be in order, most received nothing more than a slap on the wrist from employers, and often much less – 21 percent had a talking-to, 30 percent had to replace their lost device themselves and 34 percent reported that "nothing" happened when they lost company information. This reflects poorly on companies, observed Absolute mobile enterprise data expert Tim Williams, and will continue unless companies get proactive about BYOD management.

"If firms don't set clear policies that reflect the priority of corporate data security, they can't expect employees to make it a priority on their own," Williams said.

Establishing and enforcing BYOD practices is a good first step. Policies have to acknowledge the ways personnel actually use their devices and avoid limiting productivity as much as possible. Several technological tools can help a company secure mobile devices behind the scenes, and investing in managed infrastructure and IT support services provides a scalable, adaptable and continuous resource for effective network monitoring and data management.

You've got mail, and it's a virus: Why organizations need cloud storage services for email

Security researchers recently discovered a cache of personal records for sale on the Internet's black market, including 1.25 billion email addresses, according to the Independent. Finding roughly one email address for every six people in the world in the care of hackers is alarming. Email continues to be the central repository for the digital transmission and storage of confidential information, and it remains one of cybercriminals' prime targets. Cloud storage services are a must for organizations struggling to take control of email security and management.

Keeping on top of email storage and archival is challenging for organizations of any size. Smaller organizations lack the IT resources of their larger peers, making it difficult to process email and ensure that all files are stored safely. Bigger companies have dedicated IT departments, but they also have massive email systems generated by larger user bases and more diverse device profiles. The expertise and resources required to maintain in-house email storage are usually too costly. Either way, upholding the integrity of protection and system management at all times is beyond the capacity of virtually any organization on its own.

Adhering to traditional models of email storage simply won't suffice in the face of today's threat landscape. Moving email to cloud storage services, on the other hand, allows organizations to outsource hardware and storage support to a trusted third-party provider, wrote Nashville Business Journal contributor Richard Pinson.

"Hosting your own email requires constant upgrading, patching, backing up and monitoring," Pinson wrote. "Once email transitions to the cloud, the service provider is responsible for all storage maintenance tasks and provides the most-recent version of their product."

Cloud storage services are scalable, meaning that an organization won't pay for what it doesn't use. Over the long term, this is a much more cost-effective option than having to update legacy in-house environments every few years to respond to new security and productivity challenges. It only takes one malicious email ending up in a user's inbox to let hackers in. In this landscape, organizations need the help of a dedicated cloud provider to keep their confidential information safe.

Investing in the IoT? Consider data storage issues first

The Internet of Things is a game-changing force, not only in the technology sphere but in numerous other industries. IDC research analysts projected that the IoT will consist of 212 billion connected devices by 2020, generating $8.9 trillion in global revenues. Cisco's forecasts are even rosier, with the tech giant and IoT cheerleader predicting that the market will be worth $19 trillion within the next few years. Any way it's sliced, the IoT is poised to make a massive and far-reaching impact on both enterprises and personal lifestyles.

While many organizations look to ramp up their investment in connected electronics over the next few years, fewer have mapped a course for the data storage issues that will arise from the influx of linked, information-producing devices, as well as applications and analytics tools used to evaluate them. Companies already dealing with limitations in infrastructure support, network connectivity and IT management could be in for a rude awakening during the IoT investment process. Understanding the implications the IoT has for data storage can help organizations ensure that they're prepared.

Redefining data storage
Organizations may have to revamp their data center configurations to deal with machine-generated data, wrote InformationWeek contributor George Crump. Typically, an enterprise data center would process one of two data types: The first is large-file data, such as videos and images, which is accessed sequentially. The second kind is small-file data, which might come from a sensor log, but its massive volume compels random access. Machine-generated data comes in both types. In order for an organization to benefit fully from its network of sensors, it would need to outfit two separate storage systems to deal with the dual data types.

A company planning to approach its IoT investment with piecemeal, ad hoc storage purchases would be better served by outsourcing its storage needs to a provider that supports rapidly scaling infrastructure builds. Otherwise, a business risks limiting the value of its machine-generated data. As Crump noted, the point of the IoT is to use data to make better decisions. Investing in a managed data storage service enables a company to direct its attention away from the complexities of infrastructure management and toward improving its business model.

"The storage systems for these initiatives almost always start out ad hoc and then become a focal point," Crump wrote. "If you have sensors, or things, that are creating data, keep an eye on that data now. Protect it and be prepared for it to become more important to the organization."