Posts

Increase in healthcare data breaches highlights need for improved storage solutions

While much of the news on cybersecurity and data breaches has been focused on attacks aimed at retail stores, security experts are increasingly warning healthcare organizations that hackers are more frequently going after targets in this $3 trillion industry.

In the underground market where cybercriminals sell their stolen goods, medical information can go for more than 10 times what credit card numbers are worth. Due to the high price medical records can fetch, attacks are increasing at an alarming rate. Just last month, the FBI warned healthcare providers to be on high alert after Community Health Systems, one of the U.S.'s largest hospital operators, was hacked and the information of 4.5 million patients was compromised. A recent study by the Ponemon Institute found that the number of healthcare organizations reporting a data breach is rising, with 40 percent of providers reporting an intrusion in 2013 as opposed to 20 percent in 2009.

Lack of awareness makes healthcare great target
Unlike retail data breaches or personal identity theft, fraud involving medical information is rarely detected in a timely manner, making it more worthwhile for hackers to go after healthcare records instead of credit card numbers.

"As attackers discover new methods to make money, the healthcare industry is becoming a much riper target because of the ability to sell large batches of personal data for profit," said Dave Kennedy, CEO of TrustedSEC LLC in an interview with Reuters. "Hospitals have low security, so it's relatively easy for these hackers to get a large amount of personal data for medical fraud."

According to an FBI estimate, one medical record can sell for as much as $50 in an underground marketplace, in stark contrast to the few dollars a stolen credit card might bring in. Stolen medical information commonly on sale on the black market includes names, dates of birth, billing information, diagnosis codes and policy numbers. This data is then used by cybercriminals to create fake IDs in order to purchase prescriptions or medical equipment that can be resold, or to make phony insurance claims.

Low funding, high risk
One of the major drivers of the increase in healthcare data breaches is the recent switch to electronic medical records. In an interview with the Boston Globe, Beth Israel Deaconess Medical Center CIO John Halamka said that IT departments in the healthcare industry commonly receive only 2 to 3 percent of an organization's budget, compared with roughly 20 percent in the retail and financial industries, even as healthcare organizations are increasingly forced to rely on technical solutions. Perhaps because of this lack of funding, a recent study by security firm BitSight Technologies found that healthcare providers respond more slowly to data breaches than any other sector, compounding the problem.

The Ponemon Institute report found that the healthcare industry loses $5.6 billion a year due to security incidents. As cybercriminals continue to find more sophisticated attack methods and target larger amounts of information, healthcare providers will have to find a more secure way of storing their electronic medical records. A reliable way to protect patient data is to utilize cloud storage services. Data saved in the cloud can be easily encrypted and kept in a separate place from other enterprise information. Business continuity procedures are also improved by keeping health records in the cloud, as duplicate data can be stored offsite and kept safe in case a system is compromised or a disaster were to occur. Cloud services are a cost-effective storage option as they are highly scalable and require healthcare providers to only pay for the amount of service being used. This allows cash-strapped organizations to protect sensitive information without breaking the bank. 
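To illustrate the encryption step described above, here is a minimal Python sketch that encrypts a patient record before it is handed to a cloud storage service. The record fields, key handling and upload call are placeholder assumptions for demonstration; a production deployment would use a managed key service and the storage provider's own SDK.

```python
# Minimal sketch: encrypt a patient record before uploading it to cloud storage.
# Key handling and the upload call are placeholders, not a real provider SDK.
import json
from cryptography.fernet import Fernet

def encrypt_record(record: dict, key: bytes) -> bytes:
    """Serialize and encrypt a single patient record."""
    plaintext = json.dumps(record).encode("utf-8")
    return Fernet(key).encrypt(plaintext)

def decrypt_record(token: bytes, key: bytes) -> dict:
    """Decrypt and deserialize a record retrieved from storage."""
    plaintext = Fernet(key).decrypt(token)
    return json.loads(plaintext)

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, keep this in a key management service
    record = {"patient_id": "12345", "diagnosis_code": "J18.9", "policy_number": "ABC-001"}
    ciphertext = encrypt_record(record, key)
    # upload_to_cloud("records/12345.enc", ciphertext)  # hypothetical upload call
    assert decrypt_record(ciphertext, key) == record
```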

FCC considering proposal on net neutrality regulations

The Federal Communications Commission continues to consider proposed rules this week that would change the way broadband providers are able to handle traffic moving across their networks.

The FCC first enacted regulations on net neutrality – the concept of treating all Internet packets equally – back in 2010, barring Internet service providers from blocking or unreasonably discriminating against any type of Web traffic. However, a federal court struck those rules down earlier this year. Now FCC chairman Tom Wheeler is working to create new requirements that will be capable of surviving future legal challenges.

The proposal the commission is currently considering would ban providers from intentionally blocking or slowing down traffic to specific websites, but create the possibility for sites to pay for special access to faster service for their clients, essentially creating a tiered system. The proposed regulations have caused backlash amongst proponents of net neutrality, and the agency received a record 3.7 million public comments on the issue, many of them against paid prioritization.

“The U.S. government should ensure that entrepreneurs do not face arbitrary roadblocks that limit their potential to build products and services on the Internet,” National Venture Capital Association President Bobby Franklin said in an interview with The Wall Street Journal. “If the FCC were to allow this, it would create a competitive advantage for well-established companies while disadvantaging entrepreneurs.”

FCC considering changing definition of broadband
In an agency blog post, Julie Veach, chief of the FCC’s Wireline Bureau, expressed interest in concepts proposed by educational and library groups that would create a new Internet reasonableness standard to ban fast-lane deals with broadband providers. A large group of net neutrality supporters has called on the commission to reclassify broadband Internet access as a public utility under telecommunications law, which would subject it to greater regulation.

In an interview with NextGov, Harold Feld, senior vice president of the nonprofit public interest group Public Knowledge, said he believes it’s a good sign that the FCC is seriously considering proposals that rely at least partially on reclassification, as it shows the agency’s seriousness about combating fast lanes. However, in a speech last week, Wheeler emphasized his preference to avoid reclassification in favor of encouraging greater competition within the industry, which he believes will lessen the need for regulation.

Wi-Fi Alliance announces improvements to Wi-Fi Direct service

This week it was announced that new improvements are coming to the Wi-Fi Alliance’s peer-to-peer technology, Wi-Fi Direct. The service allows a variety of machines – including printers, PCs, phones and TVs – to communicate one-to-one without the need for a LAN, and the planned enhancements promise to make that even easier.

The Alliance claims to have certified more than 6,000 products as Wi-Fi Direct-capable over the last four years, IDG News Service reported. Next week, the group plans to introduce four new mechanisms to make carrying out basic tasks over Wi-Fi Direct simpler. Adding the services to a certified device is optional, but they allow users to “discover, connect and do” certain functions with a single click, according to Wi-Fi Alliance president and CEO Edgar Figueroa.

The new services will make a variety of tasks simpler, but especially focus on simplifying the ability to share and print documents from mobile devices. The enhancements include:

  • Wi-Fi Direct Send: This feature will allow content to be quickly sent and received by one or more devices while keeping user interaction to a minimum.
  • Wi-Fi Miracast: Enables screen mirroring and display sharing in a single step when devices have implemented the updated device and service discovery mechanisms of Wi-Fi Direct.
  • Wi-Fi Direct for DLNA: Simplifies the process of allowing devices supporting Digital Living Network Alliance interoperability to find each other before connecting to stream content.
  • Wi-Fi Direct Printing: Allows users to print documents directly from PCs, tablets and smartphones with a single command.

New services remove previous complications
With the previous iteration of Wi-Fi Direct, a user could send a presentation from a computer to a Wi-Fi-enabled projector as long as both devices were equipped with the basic technology. But after the initial connection, a variety of additional steps were required that made the process confusing for users. Interoperability between different vendors’ Wi-Fi Direct implementations was also poor, so some devices advertised as Wi-Fi Direct clients wouldn’t always be able to connect with a user’s device. The new services aim to improve interoperability between vendors’ products.

The new services do not require any additional hardware, so upgrades can be provided for products already in users’ hands. The Alliance is making applications for each of the services available to vendors, so only a user interface will need to be created. According to Figueroa, the organization is also making a toolkit available so similar capabilities can be built for other processes in a standardized way.

The simplicity offered by the Wi-Fi Direct service is making devices with the capability increasingly popular, according to a recent study by ABI Research. The firm estimates that 2 billion Wi-Fi Direct certified devices have been shipped to date, RCR Wireless News reported. Over the next four years, ABI expects 81 percent of devices with Wi-Fi capabilities to be certified for Wi-Fi Direct.

Health workers look to the cloud to prevent infectious diseases

As the cloud becomes more widely adopted, the uses for the technology continue to grow. One of the sectors where the uses for cloud computing are advancing rapidly is healthcare.

In hospitals across the country, doctors and nurses are using cloud-hosted virtual desktops to access their work environments wherever they are. With virtualization, medical staff can log in from the nearest thin client instead of returning to their offices. Not only does this improve patient care, as charts can be updated more quickly and checked more frequently, but less movement also helps stop the spread of infection and decreases contamination. Fewer doctors and nurses entering the rooms of highly contagious patients means a lower chance of spreading disease, and virtual desktops enable medical staff to continue treating patients with minimal risk of contamination.

Aid workers look to the cloud for data sharing
On a larger scale, the University of California, San Francisco is in the process of creating a cloud-based platform that would utilize data from Google Earth Engine to provide health workers around the world with actionable information for predicting areas where malaria transmission is most likely. Google Earth Engine aggregates trillions of satellite images dating back almost 40 years, paired with online tools that help researchers map trends, identify changes and quantify differences in the Earth's surface. The project aims to provide resource-poor nations with the tools to target campaigns more narrowly and effectively against malaria, which kills 600,000 people each year.

The new tool will look at the relationship between occurrences of the disease and environmental factors like rainfall. Maps of local areas in Earth Engine will also help scientists and aid workers learn more about what drives malaria transmission. The prediction tool will also allow health workers to share information from the field about where and when malaria cases have occurred. By combining real-time field reports with satellite data on environmental conditions within Earth Engine, the tool will be able to pinpoint where new cases are most likely to emerge. With more specific locations of expected outbreaks, healthcare officials can distribute bed nets, spray insecticides and deliver medicines directly to the people who need them most.
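As a rough illustration of how field reports and satellite-derived rainfall might be combined into a single risk ranking, the Python sketch below scores a handful of regions. The weights, thresholds and data fields are illustrative assumptions only, not the UCSF platform's actual model.

```python
# Illustrative sketch: rank regions by malaria outbreak risk from recent cases and rainfall.
# The weights and saturation points below are assumptions for demonstration only.
from dataclasses import dataclass

@dataclass
class RegionObservation:
    region: str
    recent_cases: int     # confirmed cases reported from the field in the last 30 days
    rainfall_mm: float    # satellite-derived rainfall estimate for the same period

def risk_score(obs: RegionObservation) -> float:
    """Toy risk score: more recent cases and heavier rainfall both raise the score."""
    case_component = min(obs.recent_cases / 50.0, 1.0)   # saturate at 50 cases
    rain_component = min(obs.rainfall_mm / 300.0, 1.0)   # saturate at 300 mm
    return 0.6 * case_component + 0.4 * rain_component

observations = [
    RegionObservation("District A", recent_cases=12, rainfall_mm=220.0),
    RegionObservation("District B", recent_cases=3, rainfall_mm=80.0),
    RegionObservation("District C", recent_cases=40, rainfall_mm=310.0),
]

# Print regions from highest to lowest estimated risk.
for obs in sorted(observations, key=risk_score, reverse=True):
    print(f"{obs.region}: risk {risk_score(obs):.2f}")
```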

The cloud platform will be launched in Swaziland, but there are plans to make the tool available to workers within the Global Health Group initiative operating in other countries. The program's creators are also looking into adapting the platform to help predict other infectious diseases.

Cloud helps hospitals treat patients more effectively
Cloud-based medical programs are also being used in hospitals across the country, including at Memorial Hermann Health System in Texas, which recently launched a cloud platform that monitors patients for signs of infection. The technology monitors all of its hospitals' patients simultaneously and continuously for signs of sepsis, a life-threatening complication of infection that affects nearly 750,000 people nationwide each year and has a 50 percent mortality rate.

The sepsis monitoring system runs calculations on patients' vital signs to detect indications of infection and alerts staff when at least two warning signs are found, such as rapid breathing, low blood pressure or fever. The alert enables medical staff to quickly begin procedures to treat the condition.
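As a loose illustration of this kind of rule-based alerting, the Python sketch below flags a patient when at least two warning signs appear in their vital signs. The thresholds are generic placeholders, not Memorial Hermann's actual clinical criteria.

```python
# Loose sketch of rule-based sepsis alerting: flag a patient when >= 2 warning signs are present.
# Thresholds are generic placeholders for illustration, not real clinical criteria.
from dataclasses import dataclass

@dataclass
class Vitals:
    respiratory_rate: int   # breaths per minute
    systolic_bp: int        # mmHg
    temperature_c: float    # degrees Celsius

def warning_signs(v: Vitals) -> list:
    """Return the list of warning signs present in this set of vitals."""
    signs = []
    if v.respiratory_rate > 22:
        signs.append("rapid breathing")
    if v.systolic_bp < 90:
        signs.append("low blood pressure")
    if v.temperature_c > 38.0:
        signs.append("fever")
    return signs

def should_alert(v: Vitals) -> bool:
    """Alert staff when at least two warning signs are found."""
    return len(warning_signs(v)) >= 2

patient = Vitals(respiratory_rate=26, systolic_bp=85, temperature_c=37.2)
if should_alert(patient):
    print("ALERT: possible sepsis -", ", ".join(warning_signs(patient)))
```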

Companies find increased reliability, flexibility with desktop virtualization

As bring-your-own-device policies and remote working have become increasingly popular and resource optimization has become more necessary, keeping enterprise IT current and efficient is growing more complex all the time. Between PCs and each employee's personal devices, upgrading the applications and operating systems on individual endpoints can consume time and resources that most companies just don't have. Luckily, desktop virtualization and remote application delivery have emerged as reliable alternatives to traditional network delivery.

As TechRadar contributor David Howell noted, moving to a virtual desktop environment offers small- and medium-sized businesses dramatic gains in control, as well as an effective way to future-proof IT systems. A recent study by VMware found that 90 percent of enterprise IT departments spend at least half of their time on routine administrative tasks. SMBs that have implemented virtualization, however, reported an increase in productivity, and 73 percent said they saw significant improvements in the amount of time spent on administrative tasks.

When an office transitions to a virtual desktop environment, it means that the computers employees use have desktops delivered and controlled directly from a central server room. This offers centralized management of the office's desktops, since each one is virtualized and provided in an isolated state, creating a highly secure network environment. 

"Desktop, or endpoint, virtualization enables a centralized server to deliver and manage individual desktops remotely," according to Symantec. "While users enjoy full access to their systems, IT staff can provision, manage, upgrade, and patch them virtually, instead of physically. This also means that users can access files and applications on a central server. Companies might also opt for a hybrid scenario where users can access some applications through a central virtualized server and others through their local computers."

Enterprises see a variety of benefits with virtualization
Transitioning to a virtual environment and leaving behind traditionally installed operating systems and applications enables businesses to be more flexible and agile, as virtual desktops can change in real time to reflect the work at hand while being managed from a single, central location. Virtualization also gives employees more mobility, allowing them to access data and applications from the same work environment no matter where they are. Workers can easily connect to servers from multiple devices, as all the necessary components are available at login.

Adopting desktop virtualization is also cost-effective and provides a high return on investment, as it offers a customized user experience that is more scalable and reliable than traditional options. Business continuity improves as well, with all data saved in an off-site data center so that lost, stolen or damaged devices don't disrupt the organization's daily processes. Virtualization is likewise a logical addition to any enterprise disaster recovery plan: because desktop applications are delivered from an off-site server, local power outages or extreme weather won't interrupt business. Running operating systems and applications through a virtual machine also increases enterprise security by giving employees a safe way to access sensitive corporate information.

Survey finds companies aren't taking advantage of cloud automation

A recent survey of cloud use among CIOs found that many organizations are using much less computing power than they're paying for.

Only about half of the capacity companies have bought is being used, according to the study, and 90 percent of organizations reported that they consider over-provisioning an unavoidable aspect of operating in the cloud. At the other end of the spectrum, 88 percent of CIOs surveyed said they have previously chosen to sacrifice performance at some of the busiest times in order to keep costs down.

According to the report, many companies continue the bad habits of over-provisioning their platforms and doing without peak performance because they carried those limitations over from their experience with on-premises solutions.

According to the study, companies are paying nearly twice as much as they should be for the capacity they actually use: when only about half of purchased capacity is consumed, the effective cost per unit used is roughly double. A major reason for this is that a large portion of enterprises manage their cloud services manually, which adds to the cost.

"…[a]s the research shows, and as half of respondents recognized, cloud as we have it today really isn't truly elastic — it does not expand and retract automatically to meet demands, and it is not paid for like a utility, based on consumption," said cloud computing expert Richard Davies. "However, with next-generation cloud and containerization technology, change is afoot." 

The report found that only 14 percent of CIOs surveyed had automated their cloud platform, something that can save both time and money. As ITProPortal contributor Darren Allan noted, virtual machines and application programming interfaces can help businesses scale and automate their cloud services. Once the task of manually adjusting capacity levels to match traffic volumes is handed off to automation, enterprise IT departments are freed up for more mission-critical projects that benefit the company.
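In broad strokes, such an automation loop might look like the Python sketch below: poll utilization, then call the provider's API to add or remove capacity. The CloudClient class and its methods are hypothetical stand-ins, not a real provider SDK.

```python
# Broad-strokes sketch of an autoscaling check: poll utilization, then resize capacity.
# CloudClient and its methods are hypothetical stand-ins, not a real provider SDK.

class CloudClient:
    """Stand-in for a provider SDK; replace with the real client in practice."""
    def get_cpu_utilization(self) -> float:
        return 0.55   # fraction of provisioned capacity currently in use

    def get_instance_count(self) -> int:
        return 8

    def set_instance_count(self, count: int) -> None:
        print(f"resizing pool to {count} instances")

def autoscale(client: CloudClient, low: float = 0.30, high: float = 0.75) -> None:
    """Scale down when utilization is low, up when it is high; otherwise do nothing."""
    utilization = client.get_cpu_utilization()
    count = client.get_instance_count()
    if utilization < low and count > 1:
        client.set_instance_count(count - 1)
    elif utilization > high:
        client.set_instance_count(count + 1)

if __name__ == "__main__":
    client = CloudClient()
    autoscale(client)   # in practice, run this check on a schedule, e.g. every few minutes
```

The point of handing this check to a scheduler rather than a person is exactly the time savings the report describes: capacity tracks demand without anyone manually resizing the environment.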

Cloud services provider ISG can help to implement reliable solutions utilizing virtual machines and APIs that take advantage of automation and help companies make the most out of their cloud investments.

University of Nebraska's Memorial Stadium upgrades Wi-Fi offerings

The students attending the University of Nebraska – Lincoln received a nice surprise when they returned for classes last week. The school’s football arena, Memorial Stadium, was given a $12.3 million makeover in an effort to improve its sound system and Wi-Fi capabilities.

Upgrading stadium Internet access is a fairly new concept among universities, with only four other schools providing Wi-Fi access to sports fans – Auburn, Texas Christian University, Penn State and Stanford. While other schools have offered students free Internet inside their arenas, University of Nebraska director of information technology Dan Floyd noted that none of those projects were as big as the one taken on by UNL. To give spectators in Memorial Stadium broad coverage, 900 antennas were installed around the premises. The hardware took three months to fully install, but Floyd says his team will continue making adjustments all season.

“When they do a large venue like a stadium or an arena, there are no people in it,” said Floyd in an interview with The Daily Nebraskan. “So you really don’t have the opportunity to test it until it’s full of people.”

Record-breaking upgrades
The Wi-Fi upgrade, dubbed Memorial Stadium Fan Experience Improvements, makes University of Nebraska’s football arena the largest collegiate stadium connected to Wi-Fi and the second largest connected stadium overall, behind only AT&T Stadium in Dallas. The improvements allow football spectators to access special features on the school’s mobile app that are only available within the stadium, like instant replays. Floyd said that he wants Husker fans to be able to bring their mobile devices to football games and connect them anywhere in the stadium.

“It’s very important for the stadium to be connected socially,” said Floyd. “You’re connected in the union as a student, you’re connected inside Starbucks as a client. Wherever you go, people have that expectation.”

According to Omaha.com, 80 percent of the stadium’s visitors should be able to access the Wi-Fi network at the same time without a problem, enabling fans to post pictures, Tweet about the game or view exclusive game footage as easily as they could at home. As students become increasingly attached to their tablets and smartphones, being able to provide reliable Internet access in the locations they spend most of their time is a boon to schools looking to increase student involvement and school spirit.

Cloud computing increases innovation, collaboration survey finds

With all of the new technological advances that have affected business in recent years, the one that has had the biggest impact is probably cloud computing. The cloud has changed the way technology is viewed by companies, as it provides a simple, effective way to implement changes, engage with clients and spark innovations. Because the cloud is cost-effective and easy to deploy, it is now possible to experiment with technology, develop new products more quickly and distribute more widely and to a scale once out of reach for all but the biggest organizations.

A recent Oxford Economics survey of 350 tech and business executives set out to find what drove them toward adopting the cloud for their companies. Researchers found that 36 percent of respondents implemented a cloud platform because they found the technology to be critical to their enterprise's innovation strategy.

“Cloud computing today is fundamentally altering business processes and changing the way organizations interact with clients, partners and employees,” read the report. “This transformation brings incredible opportunities, including the ability to build a real-time enterprise where interaction and innovation flourish.”

Collaboration flourishes in the cloud
One of the biggest benefits of the cloud is the ease with which it allows people to communicate and share. In fact, the survey found that 63 percent of executives believe use of the cloud is increasing collaboration among the business units of their companies. A recent development is the ability to connect records systems with engagement platforms, giving companies the most creative and productive use of all of the data they collect. If each department can see the information being collected by everyone else, new ways to use that data can be found much more quickly and applied more creatively than if data sets were kept separate by business unit.

In an article for Forbes, contributor Robert LeBlanc noted that this concept of information sharing is being put to use by El Corte Ingles, Europe’s largest department store. The retailer used the cloud to rapidly expand its online presence and currently employs the technology to monitor client preferences and buying habits, offering promotions and accurate pricing in real time.

For companies interested in implementing a similar initiative, hybrid cloud environments are a reliable solution. Services can be automated in a hybrid platform, and enterprises can see how those services are being used and control them to better protect the security and privacy of the business.

Companies embracing cloud for increased flexibility, lower costs

As technology becomes an increasingly important part of doing business, companies are realizing the benefits of the cloud. Utilizing cloud-based applications is a great way to enhance business operations by reducing costs, increasing flexibility and improving collaboration and productivity.

While traditional software can only be used on the device it was installed on, cloud applications can be accessed from any Internet-connected device. The accessible nature of the cloud makes it much more convenient for companies with employees who are frequently on the go or like to telecommute.

The cloud also allows organizations to break free from traditional software and the associated upgrades and high licensing fees for a more cost-effective option. Cloud applications work on a simpler subscription model, making it easier for enterprises to scale their service and pay only for what they really need. Small businesses can especially benefit from implementing a cloud infrastructure, as the lower costs and increased flexibility are ideal for companies with tighter budgets and smaller IT departments.

Another benefit of the cloud is an increase in collaboration between coworkers. The cloud’s ease-of-use and its ability to make documents and services available from anywhere with an Internet connection improves editing and sharing capabilities. Employees who might be miles away from one another physically can be on the same page virtually through the cloud, easily and effectively making edits to documents or changing presentations together over the Internet.

Companies of all sizes will also benefit from the cloud’s ability to serve as a disaster recovery center. Duplicate files can be easily stored in the cloud and kept offsite in case of an emergency, at a much lower cost than maintaining a physical data center for disaster recovery. Utilizing a cloud disaster recovery solution dramatically improves enterprise data security, ensuring critical information and systems won’t disappear because of a natural disaster. The cloud can also increase protection from cyberattacks, as data stored on the platform can be encrypted, making it far more difficult for hackers to steal sensitive information.

Hybrid cloud ideal for enterprise use
For companies interested in implementing a cloud environment, a hybrid cloud is a good option to consider. Hybrid clouds offer organizations a mix of private and public infrastructures, making it possible to utilize the best of both platforms. Applications can be run on a public platform while critical services and data can be stored privately to add an extra level of security. Hybrid clouds also offer service scalability, making it easier to meet business demands. When traffic is slow, companies can focus on the more critical platform and increase service on the secondary cloud when demand picks up.