Posts

Target breach fallout highlights importance of comprehensive malware removal

Without proactive malware removal, organizations put themselves at serious risk. Recent developments in the Target data breach saga highlight the direct costs that can result from a lax approach to eliminating malware. As more details emerge about the hack, which compromised 40 million credit card numbers and 70 million pieces of personal information, it has become evident that the embattled retailer likely could have prevented the attack if it had taken a stronger, more comprehensive approach to malware removal.

The latest development, per Bloomberg Businessweek, is the discovery that Target was actually warned by a malware detection tool about the vulnerability that led to the breach. The $1.6 million technology monitored Target's servers and computers around the clock, looking for anything amiss. The alert system worked the way it was supposed to, according to FireEye, the detection tool's maker, and the Bangalore-based security specialists in charge of scanning the retailer's network. The specialists notified Target's Minneapolis-based security team according to procedure, but the team ended up doing nothing about it.

Of course, hindsight is 20/20, but it's worth pointing out that malware detection is only half the battle. Malware removal requires organizations to be proactive. Whether Target's security team failed to recognize the severity of the vulnerability or simply failed to act swiftly remains undetermined, but it's important to remember that cyberthreats don't wait. In an interview with NPR, Businessweek's Michael Riley said that Target's reactive, indecisive approach was unable to keep the hacking attempt at bay.

"Whatever was going on inside Target's security team, they didn't recognize this as a serious breach," Riley told NPR. "There was no serious investigation that went on. They didn't go to the server itself to figure out what the malware was doing."

Insulating organizations against attacks and identifying malware are difficult tasks that require constant vigilance. A company unsure of whether it can provide this level of attention should strongly consider adopting a third-party malware removal service that can neutralize threats in a preventative fashion.

Managed services can help organizations avoid top 10 business hazards

Managed services enable businesses to more successfully navigate a threat-laden enterprise landscape. Although organizations' biggest IT, operations and security anxieties vary by region, industry and company size, what they fear most is generally the same across the board: lost profitability, client churn and a tarnished reputation.

In the Twitter age, no confirmed threat goes unpublished or unanalyzed, and it's difficult for an organization to escape blame even when it's affected only as a byproduct of another incident. The woes of retailer Target, which reported a 22 percent decrease in its client base in January following a massive data breach during the 2013 holiday season, underscore how consumers respond to an enterprise that demonstrates less-than-exemplary information security, data management and business continuity.

According to a recent Business Continuity Institute study of nearly 700 enterprise respondents in 82 different countries, the top 10 most common perceived threats to disaster recovery and business continuity are:

  1. Unplanned IT outages
  2. Cyberattacks
  3. Data breaches
  4. Adverse weather effects
  5. Utility supply interruptions
  6. Fires
  7. Security compromises
  8. Health or safety incidents
  9. Acts of terrorism
  10. New laws or regulations

How managed services assuage anxiety
Managed services offer companies vast potential to mitigate problems in many areas because a provider's solutions are customized to the needs of the company. The list above covers incidents stemming from a company's location, industry, employee behavior and general security management. Overseeing prevention and contingency plans that respond effectively to all of these hazards is time-consuming, resource-intensive and costly. While it's impossible to prevent adverse weather or control regulatory measures, it is possible to keep these threats from doing any real damage.

Managed services are scalable, so the extent of a provider's involvement can correspond exactly to a company's anxieties and potential hazards. One organization may simply require online backup services via an offsite server to strengthen its data loss prevention. Another may want to virtualize nearly all of its infrastructure so its employees can stay connected and productive during a wave of bad weather. As a company's needs change over time, it doesn't have to rearrange its entire back-end infrastructure to keep danger at bay.

Who really cares about BYOD?

The bring-your-own-device movement is well on its way to fundamentally reshaping enterprise communications. So why do so few organizations seem to care about device management? A fairly wide gap formed almost immediately between BYOD user excitement and enterprise policy engagement, and it's only going to expand.

Entrenched employee attitudes that absolve workers of responsibility create problems for IT, and many organizations let worker preferences overwhelm clear-cut business priorities. The central problem with BYOD is a company's capacity to show that it cares – not only about the ways BYOD can be hazardous, but about creating strategies that cater to worker preferences while keeping security at the forefront.

BYOD is clearly important to employees. One recent LANDESK survey found that the average European worker spends more on BYOD every year than he or she does on tea or coffee. Workers care about having the devices, but not about protecting the data stored on them.

"It's not my problem" was a common refrain in a recent survey by Absolute Software about data security in the mobile enterprise. More than a quarter of those surveyed said they felt there should be no penalties for leaked or lost corporate data. Additionally, more than one third of respondents who lost their mobile devices said that they didn't make any changes to their security habits, while 59 percent estimated that their corporate data was barely worth more than the cost of replacing a phone.

Who is to blame for BYOD problems?
It's up to companies to exhibit the same passion for data security that employees have for using their own smartphones. Among respondents who acknowledged that a penalty for data loss might be in order, most received nothing more than a slap on the wrist from employers, and often much less: 21 percent received a talking-to, 30 percent had to replace their lost device themselves and 34 percent reported that "nothing" happened when they lost company information. This reflects poorly on companies, observed Absolute mobile enterprise data expert Tim Williams, and will continue unless they get proactive about BYOD management.

"If firms don't set clear policies that reflect the priority of corporate data security, they can't expect employees to make it a priority on their own," Williams said.

Establishing and enforcing BYOD policies is a good first step. Those policies have to acknowledge the ways personnel actually use their devices and avoid limiting productivity as much as possible. Several technological tools can help a company secure mobile devices behind the scenes. Investing in managed infrastructure and IT support services provides a scalable, adaptable and continuous resource for effective network monitoring and data management.

You've got mail, and it's a virus: Why organizations need cloud storage services for email

Security researchers recently discovered a cache of personal records for sale on the Internet's black market, including 1.25 billion email addresses, according to the Independent. One email address for every seven people in the world sitting in the hands of hackers is alarming. Email remains the central repository for the digital transmission and storage of confidential information, which makes it one of cybercriminals' prime targets. Cloud storage services are a must for organizations struggling to take control of email security and management.

Keeping on top of email storage and archival is challenging for organizations of any size. Smaller organizations lack the IT resources of their larger peers, making it difficult to process email and ensure that all files are stored safely. Bigger companies have dedicated IT departments, but they also have massive email systems generated by larger user bases and more diverse device profiles. The expertise and resources required to maintain in-house email storage are usually too costly. Either way, upholding the integrity of protection and system management at all times is beyond the capacity of virtually every organization.

Adhering to traditional models of email storage simply won't suffice in the face of today's threat landscape. Moving email to cloud storage services, on the other hand, allows organizations to outsource hardware and storage support to a trusted third-party provider, wrote Nashville Business Journal contributor Richard Pinson.

"Hosting your own email requires constant upgrading, patching, backing up and monitoring," Pinson wrote. "Once email transitions to the cloud, the service provider is responsible for all storage maintenance tasks and provides the most-recent version of their product."

Cloud storage services are scalable, meaning an organization won't pay for capacity it doesn't use. Over the long term, this is far more cost-effective than updating legacy in-house environments every few years to respond to new security and productivity challenges. It only takes one malicious email in a user's inbox to let hackers in. In this landscape, organizations need the help of a dedicated cloud provider to keep their confidential information safe.

Investing in the IoT? Consider data storage issues first

The Internet of Things is a game-changing force, not only in the technology sphere but across numerous other industries. IDC research analysts projected that the IoT will consist of 212 billion connected devices by 2020, generating $8.9 trillion in global revenues. Cisco's forecasts are even rosier, with the tech giant and IoT cheerleader predicting that the market will be worth $19 trillion within the next few years. Any way it's sliced, the IoT is poised to make a massive and far-reaching impact on enterprises and personal lifestyles alike.

While many organizations look to ramp up their investment in connected electronics over the next few years, fewer have mapped a course for the data storage issues that will arise from the influx of linked, information-producing devices, as well as the applications and analytics tools used to evaluate them. Companies already dealing with limitations in infrastructure support, network connectivity and IT management could be in for a rude awakening during the IoT investment process. Understanding the implications the IoT has for data storage can help organizations ensure that they're prepared.

Redefining data storage
Organizations may have to revamp their data center configurations to deal with machine-generated data, wrote InformationWeek contributor George Crump. Typically, an enterprise data center processes one of two data types: The first is large-file data, such as videos and images, which is accessed sequentially. The second is small-file data, such as sensor logs, which arrives in massive volumes and demands random access. Machine-generated data comes in both forms. To benefit fully from its network of sensors, an organization would need to outfit two separate storage systems to handle the dual data types, as the sketch below illustrates.
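
As a rough illustration of what that dual-system requirement means in practice, the toy router below sends large, sequentially accessed payloads to one tier and small, randomly accessed records to another. The tier names and the size threshold are illustrative assumptions, not a prescription.

    # Toy sketch of routing machine-generated data to two storage tiers:
    # large files (video, imaging) go to a sequential-access object store,
    # small records (sensor logs) go to a random-access key-value store.
    # The tier names and 10 MB threshold are illustrative assumptions.
    from dataclasses import dataclass, field

    LARGE_FILE_THRESHOLD = 10 * 1024 * 1024  # 10 MB, an arbitrary cutoff

    @dataclass
    class StorageTier:
        name: str
        objects: dict = field(default_factory=dict)

        def put(self, key: str, payload: bytes) -> None:
            self.objects[key] = payload

    sequential_tier = StorageTier("large-file object store")
    random_tier = StorageTier("small-file key-value store")

    def ingest(key: str, payload: bytes) -> StorageTier:
        """Pick a tier by payload size and store the record there."""
        tier = sequential_tier if len(payload) >= LARGE_FILE_THRESHOLD else random_tier
        tier.put(key, payload)
        return tier

    # A sensor reading lands in the random-access tier; a camera frame
    # dump lands in the sequential-access tier.
    ingest("sensor/42/2014-03-01T00:00:00", b'{"temp_c": 21.4}')
    ingest("camera/7/frame.bin", b"\x00" * (64 * 1024 * 1024))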

A company planning to approach its IoT investment with piecemeal, ad hoc storage purchases would be better served outsourcing its storage needs to a provider that supports quickly scaling infrastructure builds. Otherwise, a business risks limiting the value of its machine-generated data. As Crump noted, the point of the IoT is to use data to make better decisions. Investing in a managed data storage service lets a company direct its attention away from the complexities of infrastructure management and toward improving its business model.

"The storage systems for these initiatives almost always start out ad hoc and then become a focal point," Crump wrote. "If you have sensors, or things, that are creating data, keep an eye on that data now. Protect it and be prepared for it to become more important to the organization."

Why the higher education sector needs ITaaS

Data management continues to be an issue in the education sector. The recent flurry of information breaches highlights the lack of adequate information security practices at U.S. colleges and universities. Besides the sheer number of records potentially compromised, the leaks brought to light the dearth of IT infrastructure and governance policies capable of coping with the realities of today's cyberthreat landscape. As long as these institutions adhere to outdated IT security policies and questionable data management practices, they will be increasingly attractive targets to cyber espionage agents. IT-as-a-service can offer universities and colleges advanced IT support.

The recent university data breaches include:

  • A University of Maryland leak exposed more than 300,000 records containing Social Security numbers and other personal information. Some of the records had been kept in a poorly maintained system since 1998, The New York Times reported.
  • Another recent leak compromised the information of 146,000 students and recent graduates at Indiana University, according to the Chicago Tribune. A follow-up investigation discovered that the data had been stored on an insecure server for 11 months.
  • Employee tax return problems at the University of Northern Iowa may be related to a compromised database, according to the Omaha World Herald.

Several issues specific to higher education contribute to poor data management, including budgetary restrictions, work-study students with little experience serving as ad hoc IT support, and sprawling networks with high user turnover. Migrating data storage, information security and other strategic IT planning demands to an ITaaS solution makes sense for universities and colleges that need to upgrade their IT support on a massive scale. ITaaS providers offer real-time data security, establish more stringent access and user protocols, and customize IT strategies to respond directly to an institution's most pressing needs.

"Universities are a focus in today's global assaults on I.T. systems," said Wallace Loh, University of Maryland president in a statement following the breach. "Obviously, we need to do more and better, and we will."

Differentiating effective IT business continuity from disaster recovery

With constant threats posed by extreme weather and external attackers, companies have increasingly recognized the importance of protecting their IT assets when disaster strikes. But the nature of that protection plan is often up for debate. Recovering from disaster means leveraging tools like online backup services at the very least. However, true resilience in the face of a disaster requires a more all-encompassing business continuity approach.

The plan goes beyond data protection and recovery
While backing up data so it can be restored in the wake of an outage is the bedrock of any business continuity plan, it's only half the battle. Depending on a business's approach, its backup solution may do it little good in an actual disaster. For instance, some businesses relying on off-site tape storage have found themselves unable to restore their files at a secondary location after a storm because flooding prevented them from physically reaching the tape storage facility, industry expert Jarrett Potts explained in a column for Data Center Knowledge. Having a plan that encompasses the full recovery process is essential.
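
The contrast with online backup is easy to illustrate. The sketch below restores a backup set from an object store over the network, something a flooded tape vault cannot offer. The bucket name and prefix are hypothetical, and it assumes the boto3 library with credentials already configured.

    # Minimal sketch of restoring files from an online object-storage
    # backup over the network. Bucket name and prefix are hypothetical;
    # assumes the boto3 library and configured credentials.
    import os
    import boto3

    BUCKET = "example-corp-backups"     # placeholder bucket
    PREFIX = "fileserver/2014-03-01/"   # placeholder backup set

    def restore(dest_dir):
        os.makedirs(dest_dir, exist_ok=True)
        s3 = boto3.client("s3")
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
            for obj in page.get("Contents", []):
                key = obj["Key"]
                local = os.path.join(dest_dir, os.path.basename(key))
                s3.download_file(BUCKET, key, local)

    restore("/tmp/restore")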

"IT disaster recovery plans are very important when one considers how intertwined organizations are with technology, but it is important to note that IT disaster recovery plans are not, by themselves, a complete business continuity strategy," Continuity Central contributor Michael Bratton explained in a recent article.

The solution is oriented toward application uptime
A key differentiator between disaster recovery and business continuity is that the latter focuses on keeping core business operations running. As Bratton noted, this approach goes beyond IT alone. From a tech perspective, however, it primarily means keeping critical applications running with as little interruption as possible. Through technologies like virtualization and a distributed network of colocation facilities, businesses can establish a flexible application hosting model that can weather unexpected events. The exact nature of the plan is likely to vary from company to company, so working with a third-party solution provider to develop a custom response can also be beneficial.
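
As a toy illustration of that uptime orientation, the snippet below checks two hypothetical hosting sites and returns the first one that answers its health check. A real deployment would push this logic into DNS failover or a load balancer rather than the client; this is a sketch of the idea, not an implementation.

    # Toy sketch of application failover across two hosting sites. The
    # endpoints are hypothetical; production setups would rely on DNS
    # failover or a load balancer rather than client-side logic.
    import urllib.request

    ENDPOINTS = [
        "https://app.primary.example.com/health",      # primary data center
        "https://app.colo-backup.example.com/health",  # colocation site
    ]

    def first_healthy(endpoints=ENDPOINTS, timeout=2.0):
        """Return the first endpoint that answers its health check."""
        for url in endpoints:
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    if resp.status == 200:
                        return url
            except OSError:
                continue  # unreachable or timed out; try the next site
        raise RuntimeError("no healthy endpoint available")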

Desktop virtualization: Why companies need to stop dragging their feet

Desktop virtualization is a necessary investment that reflects the changing technological paradigm. With employees increasingly mobile and companies more globalized, personnel need to be able to access their desktop operating system and applications from anywhere. Many organizations are eagerly sending data storage to the cloud and investing in as-a-service solutions to better manage and protect growing application environments. However, this accelerated investment wanes when it comes to desktop virtualization. Why? Shouldn't location-independent services extend to the level of the end user?

Cost continues to be an impediment to desktop virtualization in the eyes of many companies. While organizations acknowledge that the Internet offers a much more cost-effective and centralized medium through which to provide enterprise application and information access, they are worried about the expenses involved in reconfiguring enterprise infrastructure to make it compatible, according to a recent TechNavio report. While it's true that this can represent a sizeable capital investment, the long-term operational savings are enormous.

Bearing this in mind, ZDNet contributor Ken Hess wrote that it's surprising that companies are "still having this conversation" about the merits of desktop virtualization. Many of the companies worried about the costs of deploying virtual desktops and other infrastructure are the same ones clinging to hardware that is approaching or past its fifth year in use. Old equipment breaks down more frequently and often costs more to repair, and the more outdated hardware is, the more difficult the transition to a new IT program becomes. Newer hardware likely has virtualization capacity. It makes sense to upgrade now rather than waiting until the transition is inevitable and far more complex.

Curing data management issues in the healthcare sector

Data management in the healthcare industry is reaching a tipping point. According to CDW Healthcare, the medical sector is gearing up for massive data growth: the 500 petabytes of data stored in 2013 are set to rise to 25,000 PB by 2020. By 2015, the average hospital could be producing around 665 terabytes of data.
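
For a sense of scale, the implied annual growth rate behind those projections works out to roughly 75 percent compounded, as this quick check shows:

    # Back-of-the-envelope check on the CDW Healthcare projection:
    # 500 PB in 2013 growing to 25,000 PB by 2020.
    growth_factor = 25000 / 500            # 50x overall
    years = 2020 - 2013
    cagr = growth_factor ** (1.0 / years) - 1
    print("%.0fx overall, about %.0f%% per year" % (growth_factor, cagr * 100))
    # -> 50x overall, about 75% per year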

It's not just the amount of data that's the issue, but the types of information organizations collect. About 80 percent of healthcare data is unstructured, with imaging, scans and video consuming vast amounts of server space. Many healthcare providers also store redundant information: the average hospital holds 800,000 total records, of which as many as 96,000 are duplicates. Duplicates are costly to store and make filing systems and data management efforts more complex without delivering any additional benefit.
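
Duplicate detection itself is conceptually straightforward. The sketch below groups records whose normalized identifying fields hash to the same value and flags them as candidates for review; the field names are assumed for illustration only.

    # Toy duplicate-record detector: patient records whose normalized
    # identifying fields hash identically are flagged as candidate
    # duplicates for review. The field names are illustrative assumptions.
    import hashlib
    from collections import defaultdict

    def record_key(record):
        """Hash the normalized identifying fields of a record."""
        basis = "|".join([
            record.get("last_name", "").strip().lower(),
            record.get("first_name", "").strip().lower(),
            record.get("dob", ""),
        ])
        return hashlib.sha256(basis.encode("utf-8")).hexdigest()

    def find_duplicates(records):
        groups = defaultdict(list)
        for rec in records:
            groups[record_key(rec)].append(rec)
        return [group for group in groups.values() if len(group) > 1]

    patients = [
        {"last_name": "Smith", "first_name": "Ann", "dob": "1970-01-01"},
        {"last_name": "SMITH ", "first_name": "ann", "dob": "1970-01-01"},
    ]
    print(find_duplicates(patients))  # both Smith records group together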

While big data offers potential benefits in patient care, research and treatment, the healthcare sector is flailing, in part due to an unusual set of circumstances. The sector is traditionally tech-averse: that acres of file cabinets full of patient records in manila folders still persist is a testament to how difficult it is to go digital. Initiatives such as electronic health records and healthcare information exchanges, which increase the value of data, have to contend with a slew of compliance, privacy and confidentiality issues.

Data management services can help healthcare organizations wield their vast information reserves in a cost-effective and secure way. Modern IT infrastructure and business intelligence tools are critical to the effective use and protection of game-changing data-driven strategies, wrote Forbes contributor John Foley. Not only are massive file systems difficult to back up comprehensively, but many medical providers also have no idea how long it would take to make files available following an unplanned incident. A data management services provider can help an organization establish a customized storage and backup system that prioritizes continuity and compliance. With people's lives potentially hanging in the balance, it's vital that healthcare providers alleviate their big data headaches.

Colocation provides balance in a precarious world

Colocation is an increasingly popular choice for companies that want to cut down on data center spending without relinquishing control over their equipment. The market for wholesale and retail colocation is expected to surpass $43 billion by 2018, according to MarketsandMarkets. This represents a compound annual growth rate of 11 percent from 2013 to 2018. Retail colocation, in which businesses lease space in a large data center that services multiple clients, is rising in demand, with retail colocation deals often topping 1 megawatt of critical power to satisfy scaling client needs.
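
Those two figures pin down the implied starting point; working the 11 percent rate backward from $43 billion in 2018 gives a 2013 base of roughly $25.5 billion:

    # Sanity check on the MarketsandMarkets forecast: a market passing
    # $43B in 2018 at an 11% CAGR implies a 2013 base of about $25.5B.
    market_2018 = 43.0   # $ billions
    cagr = 0.11
    years = 2018 - 2013
    market_2013 = market_2018 / (1 + cagr) ** years
    print("Implied 2013 market: $%.1fB" % market_2013)  # -> $25.5B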

Many organizations that have little experience with massive infrastructure needs are now faced with increasing convergence between business and IT. This dive into the deep end can quickly subvert budgeting, resourcing, tech support and data strategies that companies have carefully planned. Colocation provides an alternative to an endless cycle of purchasing new equipment, building additions to onsite data centers and retraining staff. As Computer Weekly contributor Clive Longbottom pointed out, it makes little sense to build a facility given so much uncertainty, when it’s nearly impossible to predict demand even a few years down the road.

Unlike managed services, in which a company outsources the oversight of its infrastructure to a provider, colocation enables it to use its own servers and retain control of installation, maintenance and management. This can be a good first step for an organization that may have less experience with IT outsourcing but knows that it can’t subsist much longer on the status quo.