Optimizing data center strategies for financial services firms

Data center investment strategies are vital to financial services organizations. While finance firms have long used proprietary or third-party data centers for information storage and business continuity, big data has given rise to a new set of complications and considerations. Chief among these are regulatory and compliance measures that place restrictions on information storage and archival practices. New technologies, rising costs and data management challenges are straining traditional data center models, and financial services firms need to adapt.

Data management in finance is a problem with several moving parts that impact each other. Accumulating and storing data is a relatively straightforward issue, albeit a resource-intensive one. Under the traditional model, a firm would procure additional servers for its onsite facility or enlarge its third-party data center investment, either through colocation or leasing the provider’s equipment.

The deluge of data can make this approach prohibitively costly, forcing organizations to rethink their infrastructure, Wall Street & Technology editorial director Greg MacSweeney wrote. Firms with proprietary data centers now stand to save significantly by outsourcing their storage, architecture and management demands. A third-party data center can provide state-of-the-art server hardware, but more importantly it has the infrastructure to deploy next-generation network solutions such as virtualization, which drastically reduces the amount of physical equipment needed to house rising petabytes of data and information-crunching applications.

Working with a third-party data center provider also helps businesses tackle more rapidly moving targets – data integrity and compliance. Data quality and validation are some “small data” issues that grow more problematic as firms accumulate more information from a wider source pool, said software developer Oleg Komissarov, according to a recent FierceFinanceIT article.

Keeping data clean, complete and consistent is a tough task that requires powerful tools and a dedicated team. A managed data center services provider can help offer this level of attention. It can also help in compliance efforts, as any blind spots or inconsistency in information or reporting leave the door open for compliance issues to crop up. As big data expands and accelerates, financial services firms need their data centers to stay one step ahead.

Managed services can help organizations avoid top 10 business hazards

Managed services enable businesses to more successfully navigate a threat-laden enterprise landscape. Although an organization’s biggest IT, operations and security anxieties vary by region, industry and company size, what they’re most afraid of is generally the same across the board – lost profitability, client churn and a tarnished reputation.

In the Twitter age, no confirmed threat goes unpublished or unanalyzed, and it’s difficult for an organization to escape blame even if it’s only affected as a byproduct of another incident. The woes of retailer Target, which reported a 22 percent decrease in its client base in January following a massive data breach during the 2013 holiday season, serve to underscore consumer response to an enterprise that demonstrates less-than-exemplary information security, data management and business continuity.

According to a recent Business Continuity Institute study of nearly 700 enterprise respondents in 82 different countries, the top 10 most common perceived threats to disaster recovery and business continuity are:

  1. Unplanned IT outages
  2. Cyberattacks
  3. Data breaches
  4. Adverse weather effects
  5. Utility supply interruptions
  6. Fires
  7. Security compromises
  8. Health or safety incidents
  9. Acts of terrorism
  10. New laws or regulations

How managed services assuage anxiety
Managed services give companies broad latitude to mitigate problems in many areas because a provider's solutions are customized to the needs of the company. The above list covers incidents stemming from a company's location, industry, employee behavior and general security management. Overseeing prevention and contingency plans that effectively respond to all of these hazards is time-consuming, resource-intensive and costly. While it's impossible to prevent adverse weather or control regulatory measures, it is possible to keep these threats from doing any real damage.

Managed services are scalable, so the extent of a provider's involvement can correspond exactly to a company's anxieties and potential hazards. One organization may simply require online backup services via an offsite server in order to strengthen its data loss prevention efforts. Another may want to virtualize nearly all of its infrastructure so its employees can stay connected and productive during a wave of bad weather. As a company's needs change over time, it doesn't have to rearrange its entire back-end infrastructure in order to keep danger at bay.

Who really cares about BYOD?

The bring-your-own-device movement is well on its way to fundamentally reshaping enterprise communications. So why do so few organizations seem to care about device management? A fairly wide gap formed almost immediately between BYOD user excitement and enterprise policy engagement, and it's only going to expand.

Entrenched employee attitudes absolving them of responsibility create problems for IT, and many organizations let worker preferences overwhelm clear-cut business priorities. The central problem with BYOD is a company's capacity to show that it cares – not only about the ways BYOD can be hazardous, but about creating strategies that cater to worker preferences while keeping security at the forefront.

BYOD is clearly important to employees. One recent LANDESK survey found that the average European worker spends more on BYOD every year than he or she does on tea or coffee. Employees care about having the devices, but not about protecting the data stored on them.

"It's not my problem" was a common refrain in a recent survey by Absolute Software about data security in the mobile enterprise. More than a quarter of those surveyed said they felt there should be no penalties for leaked or lost corporate data. Additionally, more than one third of respondents who lost their mobile devices said that they didn't make any changes to their security habits, while 59 percent estimated that their corporate data was barely worth more than the cost of replacing a phone.

Who is to blame for BYOD problems?
It's up to companies to exhibit the same passion for data security that employees have for using their own smartphones. Of those who acknowledged a penalty for data loss might be in order, most received nothing more than a slap on the wrist from employers, and often much less – 21 percent had a talking-to, 30 percent had to replace their lost device themselves and 34 percent reported that "nothing" happened when they lost company information. This reflects poorly on companies, observed Absolute mobile enterprise data expert Tim Williams, and will continue unless companies get proactive about BYOD management.

"If firms don't set clear policies that reflect the priority of corporate data security, they can't expect employees to make it a priority on their own," Williams said.

Establishing and enforcing BYOD practices is a good first step. Policies have to acknowledge the ways personnel use their devices and avoid limiting productivity as much as possible. Several technological tools can help a company secure mobile devices behind the scenes. Investing in managed infrastructure and IT support services provides a scalable, adaptable and continuous resource for effective network monitoring and data management.

You've got mail, and it's a virus: Why organizations need cloud storage services for email

Security researchers recently discovered a cache of personal records for sale on the Internet's black market, including 1.25 billion email addresses, according to the Independent. Finding one email address for every seven people in the world in the hands of hackers is alarming. Email continues to be the central repository for the digital transmission and storage of confidential information, and it remains one of cybercriminals' prime targets. Cloud storage services are a must for organizations struggling to take control of email security and management.

Keeping on top of email storage and archival is challenging for organizations of any size. Smaller organizations lack the IT resources of their larger peers, making it difficult to process email and ensure that all files are stored safely. Bigger companies have dedicated IT departments, but they also have massive email systems generated by larger user bases and more diverse device profiles. The expertise and resources required to maintain in-house email storage are usually too costly. Either way, maintaining the integrity of protection and system management at all times is beyond the in-house capabilities of virtually every organization.

Adhering to traditional models of email storage simply won't suffice in the face of today's threat landscape. Moving email to cloud storage services, on the other hand, allows organizations to outsource hardware and storage support to a trusted third-party provider, wrote Nashville Business Journal contributor Richard Pinson.

"Hosting your own email requires constant upgrading, patching, backing up and monitoring," Pinson wrote. "Once email transitions to the cloud, the service provider is responsible for all storage maintenance tasks and provides the most-recent version of their product."

Cloud storage services are scalable, meaning organizations pay only for what they use. Over the long term, this is a much more cost-effective option than having to update legacy in-house environments every few years to respond to new security and productivity challenges. It only takes one malicious email ending up in a user's inbox to let hackers in. In this landscape, organizations need the help of a dedicated cloud provider to keep their confidential information safe.

Why the higher education sector needs ITaaS

Data management continues to be an issue in the education sector. The recent flurry of information breaches highlights the lack of adequate information security practices at U.S. colleges and universities. Besides the sheer number of records potentially compromised, the leaks brought to light the dearth of IT infrastructure and governance policies capable of coping with the realities of today's cyberthreat landscape. As long as these institutions adhere to outdated IT security policies and questionable data management practices, they will be increasingly attractive targets to cyber espionage agents. IT-as-a-service can offer universities and colleges advanced IT support.

The recent university data breaches include:

  • A University of Maryland breach exposed more than 300,000 records containing Social Security numbers and other personal information. Some of the data had been kept in a poorly maintained system since 1998, The New York Times reported.
  • Another recent leak compromised the information of 146,000 students and recent graduates at Indiana University, according to the Chicago Tribune. Follow-up on the breach revealed that the data had been stored on an insecure server for 11 months.
  • Employee tax return problems at the University of Northern Iowa may be related to a compromised database, according to the Omaha World Herald.

Several unique issues contribute to poor data management at higher education institutions, including budgetary restrictions, work-study students with little experience serving as ad hoc IT support and sprawling networks with high user turnover. Migrating data storage, information security and other strategic IT planning demands to an ITaaS solution makes sense for universities and colleges that need to upgrade their IT support on a massive scale. ITaaS providers offer real-time data security, establish more stringent access and user protocols, and customize IT strategies to respond directly to the institution's most pressing needs. 

"Universities are a focus in today's global assaults on I.T. systems," said Wallace Loh, University of Maryland president, in a statement following the breach. "Obviously, we need to do more and better, and we will."

Desktop virtualization: Why companies need to stop dragging their feet

Desktop virtualization is a necessary investment that reflects the changing technological paradigm. With employees increasingly mobile and companies more globalized, personnel need to be able to access their desktop operating system and applications from anywhere. Many organizations are eagerly sending data storage to the cloud and investing in as-a-service solutions to better manage and protect growing application environments. However, this accelerated investment wanes when it comes to desktop virtualization. Why? Shouldn't location-independent services extend to the level of the end user?

Cost continues to be an impediment to desktop virtualization in the eyes of many companies. While organizations acknowledge that the Internet offers a much more cost-effective and centralized medium through which to provide enterprise application and information access, they are worried about the expenses involved in reconfiguring enterprise infrastructure to make it compatible, according to a recent TechNavio report. While it's true that this can represent a sizeable capital investment, the long-term operational savings are enormous.

Bearing this in mind, ZDNet contributor Ken Hess wrote that it's surprising that companies are "still having this conversation" about the merits of desktop virtualization. Many of the companies worried about the costs of deploying virtual desktops and other infrastructure are the same ones clinging to hardware that is approaching or past its fifth year in use. Old equipment breaks down more frequently and often costs more to repair, and the more outdated hardware is, the more difficult it is to transition to a new IT program. Newer hardware likely has virtualization capacity. It makes sense to upgrade now rather than waiting until the transition is unavoidable and far more complex.

Curing data management issues in the healthcare sector

Data management in the healthcare industry is reaching a tipping point. According to CDW Healthcare, the medical sector is gearing up for massive data growth – the 500 petabytes of data in 2013 are set to rise to 25,000 PB by 2020. By 2015, the average hospital could be producing around 665 terabytes of data.
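Those projections imply a steep compound growth rate. A quick back-of-the-envelope check, using only the CDW Healthcare figures cited above and the standard CAGR formula:

```python
# Implied compound annual growth rate (CAGR) from the CDW Healthcare
# projection: 500 PB in 2013 growing to 25,000 PB by 2020.
start_pb, end_pb = 500, 25_000
years = 2020 - 2013  # 7 years

cagr = (end_pb / start_pb) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")  # roughly 75% per year
```

In other words, storage demand would need to grow by about three-quarters every year to hit the projection, which underscores why incremental hardware purchases struggle to keep pace.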

It's not just the amount of data that's the issue, but the types of information organizations collect. About 80 percent of healthcare data is unstructured, with imaging, scans and video requiring enormous amounts of server space. Also, many healthcare providers are storing redundant information – the average hospital has 800,000 total records, but as many as 96,000 are duplicates. Duplicates are costly to store and make filing systems and data management efforts more complex without delivering additional value.
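Duplicate detection like the hospital scenario above usually starts with keying records on a few identity fields and flagging repeats. The sketch below is a minimal illustration; the field names and sample records are hypothetical, not drawn from any real hospital system:

```python
# Minimal duplicate-record detection sketch. Records and field names
# ("name", "dob", "mrn") are illustrative assumptions.
records = [
    {"name": "Ana Ruiz",   "dob": "1980-03-12", "mrn": "A1001"},
    {"name": "Ana Ruiz",   "dob": "1980-03-12", "mrn": "A2417"},  # duplicate
    {"name": "Ben Okafor", "dob": "1975-11-02", "mrn": "B3090"},
]

seen, duplicates = set(), []
for rec in records:
    # Key on normalized name plus date of birth; real systems use
    # fuzzier matching, but the principle is the same.
    key = (rec["name"].lower(), rec["dob"])
    if key in seen:
        duplicates.append(rec)  # flag for manual review or merging
    else:
        seen.add(key)

print(f"{len(duplicates)} of {len(records)} records flagged as duplicates")
```

At the 96,000-in-800,000 rate cited above, roughly 12 percent of a hospital's records would be flagged by a pass like this.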

While big data offers potential benefits in patient care, research and treatment, the healthcare sector is flailing. In part, it's due to a relatively unique set of circumstances. The healthcare sector is traditionally fairly tech-averse – that acres of file cabinets containing patient records in manila folders still persist is a testament to how difficult it is to go digital. Initiatives such as electronic health records and healthcare information exchanges that increase the value of data have to contend with a slew of compliance, privacy and confidentiality issues.

Data management services can help healthcare organizations wield their vast information reserves in a cost-effective and secure way. Modern information technology infrastructure and business intelligence tools are critical to the effective utilization and protection of game-changing data-driven strategies, wrote Forbes contributor John Foley. Not only are massive file systems difficult to back up in a comprehensive way, many medical providers don't have any idea how long it would take to make files available following an unplanned incident. A data management services provider can help the organization establish a customized storage and backup system that prioritizes continuity and compliance. With people's lives potentially hanging in the balance, it's vital that healthcare providers alleviate big data headaches.

Colocation provides balance in a precarious world

Colocation is an increasingly popular choice for companies that want to cut down on data center spending without relinquishing control over their equipment. The market for wholesale and retail colocation is expected to surpass $43 billion by 2018, according to MarketsandMarkets. This represents a compound annual growth rate of 11 percent from 2013 to 2018. Retail colocation, in which businesses lease space in a large data center that services multiple clients, is rising in demand, with retail colocation deals often topping 1 megawatt of critical power to satisfy scaling client needs.

Many organizations that have little experience with massive infrastructure needs are now faced with increasing convergence between business and IT. This dive into the deep end can quickly subvert budgeting, resourcing, tech support and data strategies that companies have carefully planned. Colocation provides an alternative to an endless cycle of purchasing new equipment, building additions to onsite data centers and retraining staff. As Computer Weekly contributor Clive Longbottom pointed out, it makes little sense to build a facility given so much uncertainty, when it’s nearly impossible to predict demand even a few years down the road.

Unlike managed services, in which a company outsources the oversight of its infrastructure to a provider, colocation enables it to use its own servers and retain control of installation, maintenance and management. This can be a good first step for an organization that may have less experience with IT outsourcing but knows that it can’t subsist much longer on the status quo.

3 ways cloud storage solves IT complexity issues

Cloud storage enables businesses to exert more control over increasingly complex IT environments. Many IT departments are struggling with the management-related issues and costs stemming from infrastructure expansion. It's a physical problem, in terms of the storage equipment and support needed for big data and application provisioning. It's also an issue of management, as rising device and networking demands put more pressure on IT resourcing and policymaking capacities. At the same time, pressure to keep costs down can leave IT systems fractured or bloated. 

Cloud storage is critical to reducing the costs and complications of IT for a better bottom line. Here are three ways it makes a difference:

  1. Simplifies backup and recovery: Many organizations struggle to get employees to back up files in anything approaching real time. This reality is compounded by growing IT environments, wrote ZDNet senior editor Jason Perlow. Cloud storage offers organizations scalable storage space that expands as a business's needs do, plus automated syncing and backup to ensure real-time recovery availability.
  2. Reduces CAPEX and OPEX: The cloud can reduce storage-related capital and operating expenses in one fell swoop, observed CSO Online contributor Gordon Makryllos. Cloud storage offers upfront advantages to organizations by drastically reducing the amount of equipment they need to buy. Its scalability also offers OPEX cost benefits through streamlined security management, greater flexibility and more centralized IT support that provides continuity as organizations' priorities change.
  3. Improves collaborative potential: Communication and collaboration are more critical than ever to establishing a vibrant, successful organization. By centralizing file storage in a cloud server instead of on individual devices, employees can view, edit and share documents and files easily and in real time. IT departments can also leverage cloud environments to provide enhanced encryption and other security measures, automating access and preserving data integrity in the face of cyberthreats.
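The automated syncing described in point 1 boils down to a change-detection loop: hash each file, and upload only what changed. The sketch below uses a local directory as a stand-in for a cloud bucket; real providers expose equivalent upload APIs, and the function names here are our own, not any vendor's:

```python
import hashlib
import shutil
from pathlib import Path

def _digest(path: Path) -> str:
    """Content hash used to detect changed files."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def sync(source: Path, bucket: Path) -> list[str]:
    """Copy files from `source` to `bucket` only when their content
    has changed. `bucket` is a local stand-in for a cloud target."""
    bucket.mkdir(parents=True, exist_ok=True)
    uploaded = []
    for f in source.rglob("*"):
        if not f.is_file():
            continue
        dest = bucket / f.relative_to(source)
        # Upload only if the remote copy is missing or differs.
        if not dest.exists() or _digest(f) != _digest(dest):
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, dest)
            uploaded.append(str(f.relative_to(source)))
    return uploaded
```

Running `sync` twice in a row uploads nothing the second time, which is the property that makes continuous, automated backup cheap enough to run in near real time.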

As complexity and costs rise, cloud storage can help relieve IT departments of many of the daily tasks that take up an increasing amount of their time. It enables them to spend more time on business-critical projects, with this alignment serving as another way to boost margins and take control of changing technological imperatives.

Real-world business continuity: The soaring costs of downtime

Many organizations approach business continuity as an afterthought. When a company is building up its hardware footprint and application investments in support of its growing business model, contingency plans are often relegated to the backseat and linger there. These organizations discover the costs of prolonged downtime and the difficulty of righting the ship only in the aftermath of an unplanned event. One recent report offers some fairly chilling statistics about the widespread shortcomings and expensive consequences of ignoring business continuity planning.

The Ponemon Institute report on the cost of data center outages in 2013 found that organizations lose $7,900 per minute of downtime. The mean cost of a single data center outage is $627,418 and the maximum amount lost to a single incident was more than $1.7 million. The total and per-minute costs correlated to the size of the facility and the duration of the outage, while IT equipment failure represented the most expensive root cause of unplanned data center downtime. Financial hits were worse for companies in data center-dependent industries such as e-commerce, financial services and telecommunications.
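The Ponemon per-minute figure makes it straightforward to translate outage duration into dollars. The sketch below simply multiplies it out, using only the figures from the report cited above:

```python
# Downtime cost at the Ponemon Institute's reported rate for 2013.
COST_PER_MINUTE = 7_900  # USD per minute of data center downtime

def downtime_cost(minutes: float) -> float:
    return minutes * COST_PER_MINUTE

# The report's mean outage cost of $627,418 implies an average outage
# of roughly 79 minutes at this rate.
print(f"90-minute outage: ${downtime_cost(90):,.0f}")  # $711,000
```

Even a single 90-minute incident approaches the report's mean outage cost, which is why prevention typically pays for itself after one avoided event.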

Costs can quickly escalate as a business recovers from an unplanned incident. From detection and containment to lost revenue and diminished productivity, the expenditures can be immense. An organization will suffer more for each area of its business continuity planning that is lackluster or poorly thought out.

These findings convey the importance of having an effective business continuity approach in place. The approach is twofold – prevention and recovery. Eliminating root causes of downtime is vital, especially expensive ones like IT equipment failure, which can be more effectively managed. Visibility and redundancy can help streamline efforts to get systems back on track following a surprise incident.

Virtualization can be a great asset to both aspects of business continuity planning, as a recent CIO.com webinar pointed out. It provides a more manageable, agile environment for continuity efforts, mitigates hardware vulnerabilities by slashing equipment needs and helps a company access its safely stored systems and applications immediately following an unplanned occurrence.