Schools see benefits with cloud computing

Businesses have been reaping the benefits of cloud environments for years, but other organizations have been slower to follow. Now the education sector is also beginning to experience the advantages of virtualizing, storing and analyzing its data in the cloud. The benefits of cloud adoption for schools, however, go beyond simply automating daily operations.

Students are able to get more individual attention through use of the cloud, as teachers can track individual progress. This gives teachers a more comprehensive view of the classroom's strengths and weaknesses and allows them to identify potential issues before they arise. Sharing and collaboration are also enhanced through use of the cloud, as students can work on a document together from multiple locations or share notes for an upcoming test through a convenient Web portal. Cloud platforms also give students the ability to interact more with their lessons: presentations and assignments can be shared with the class through the cloud environment and can include videos, links to related sites and webinars.

Cloud computing beneficial to student learning
Educational cloud environments have been found to help improve student performance. Google performed a case study of NYC Intermediate School 339, tracking performance before and after a cloud platform was implemented. Prior to the use of cloud computing, 22 percent of the school's students completed grade-level math; after cloud technology was introduced, that figure rose to 47 percent. According to the principal, Jason Levy, behavior also improved and attendance increased after the school started using the cloud.

As well as enhancing learning and improving students' performance, cloud computing saves teachers time by reducing the number of mundane day-to-day tasks. Activities like photocopying, hole punching and making packets for students are no longer necessary because worksheets and homework can be provided electronically, which also reduces paper waste and saves money on materials.

The cost-effectiveness of the cloud can be especially beneficial for budget-conscious schools. According to Public School Review, Oregon has adopted cloud computing within all of its public schools and estimates it will save the state's Department of Education $1.5 million annually. Overhead and maintenance costs can be dramatically reduced by implementing an educational cloud environment. Machines such as printers and copiers are no longer needed, saving money on costly materials like ink. Physical storage space within school buildings can also be repurposed, freeing up valuable space at no additional cost.

Demand for data center power solutions growing

As the use of technology continues to increase, the amount of data created grows as well, and the need for a place to store all of that information becomes more urgent. Demand for data center space, and for the ability to process information safely and efficiently, is rising sharply, pushing data center providers to expand their capacity. Supporting the growing number of new data centers, however, requires a massive amount of energy.

Along with the creation of new computing facilities, the demand for efficient power solutions for data centers is also expected to grow at a strong pace over the next few years, according to a new study by MarketsandMarkets. Nearly half of the total cost associated with the operation of a data center comes from power usage, causing companies to develop increasingly efficient energy solutions to help facility managers reduce their power expenditures, as well as their total cost of ownership.

According to the study, providers of power solutions are beginning to broaden their offerings in order to meet the business demands of their data center clients. Vendors are helping facility managers reduce their infrastructure and operational costs by providing efficient electrical solutions, which also serves to increase data center capacity. A growing number of facilities are employing high-density zones in which each server rack uses more than 10kW, and efficient power solutions are necessary to fully leverage the benefits of these zones.

Data center power market to grow over next five years
As a result of the growing use of efficient energy solutions, growth in the global data center industry's electricity consumption has slowed over the last two years. Even so, demand for data center power continues to climb, and the report estimates that the global data center power market will rise from $15.19 billion this year to more than $23 billion by 2019, a compound annual growth rate of 9.3 percent.
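As a quick sanity check, the projection above can be reproduced with a simple compound-growth calculation (treating this year as the 2014 base of the five-year window, which is an assumption inferred from the dates given):

```python
def project(value_billion, annual_rate, years):
    """Compound a starting market value at a fixed annual growth rate."""
    return value_billion * (1 + annual_rate) ** years

# MarketsandMarkets figures: $15.19B base, 9.3 percent CAGR, five years (2014 -> 2019)
projected = project(15.19, 0.093, 5)
print(f"Projected 2019 market: ${projected:.2f}B")  # roughly $23.7B
```

Compounding $15.19 billion at 9.3 percent for five years lands just under $23.7 billion, consistent with the report's "more than $23 billion" estimate.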

Data center power solutions play a vital role across a variety of industry verticals, offering cost-effective ways to network and enabling facilities to increase their capabilities while extending the life cycle of their IT equipment. Energy-efficient data center practices are becoming more popular across the globe, but the report expects North America to remain the largest region in the global data center power market.

ISG operates multiple data centers in the Midwest and is actively employing energy-efficient strategies to keep costs down for clients and protect the environment. At ISG’s Wichita data center, the company has been testing a hot aisle/cold aisle containment system that captures heat from servers before it can circulate. Heat generated from active servers is captured and directed immediately out of the room to be cooled before it can raise the temperature of the surrounding area, while in alternating aisles cool air is pumped in through the floor to keep the room at a constant temperature. Curtis Mead, head of sales for ISG’s data center services segment, says the limited testing of the technique has been so successful that the company plans to expand its use to other aisles within the facility.

Virtualization driving global SDN market

As businesses increasingly look for the most effective and efficient technologies to power their operations, software-defined networking (SDN) continues to gain popularity for data center networking among large enterprises and cloud service providers.

SDN refers to an innovative architectural model that delivers network virtualization, automated provisioning and network programmability to enterprise networks and data centers. Companies are quickly realizing that SDN offers tremendous value to tech-based organizations, and the technology has shown itself to be a driving force for change and innovation in the sector. While SDN is still fairly new to many companies, the landscape of the technology is likely to change in the next three to four years as vendors continue to make large investments in the area and enterprises keep acquiring the technology, according to Cloud Times.

"SDN is taking center stage among innovative approaches to some of the networking challenges brought about by the rise of the third platform, particularly virtualization and cloud computing," said Rohit Mehra, Vice President of Network Infrastructure at International Data Corporation. "With SDN's growing traction in the datacenter for cloud deployments, enterprise IT is beginning to see the value in potentially extending SDN to the WAN and into the campus to meet the demand for more agile approaches to network architecture, provisioning, and operations."

Global SDN market expanding rapidly
A study conducted by IDC predicted that the enterprise SDN market will grow by 89 percent annually, increasing from $960 million in 2014 to $8 billion in 2018, due in large part to the implementation of software virtualization, physical infrastructure, network controllers and security services. A separate report published by MarketsandMarkets estimated that the software-defined data center market will rise to $5.4 billion over the next four years. The growth in that market is mostly attributed to more frequent use of network virtualization and the practice of corporate data center consolidation. The study also found that network controllers and switches will contribute to higher market share.

According to MarketsandMarkets, the major industries driving the SDN market are financial services, government, telecom and education. All of the sectors most commonly utilizing SDN can benefit from the technology's simplified network designs and operations, directly programmable network control, ability to increase the network's agility in adjusting to traffic flow and single interface management capabilities.

Cloud computing, virtualization offer benefits to healthcare industry

Recent advancements in technology have impacted every industry, but none more so than healthcare. The emergence of mobile devices like smartphones and tablets has influenced medical providers, as those devices are beginning to replace traditional monitoring and recording systems and allow patients more flexibility in their treatment. The growing use of cloud computing has had an especially strong effect on healthcare, improving communication, data storage and ease of use.

One of the biggest advantages technology has brought to the healthcare industry is an improvement in the way doctors and patients communicate with one another. It can often be hard for patients to reach their physicians, but with a variety of cloud communication options like voice and video conferencing, as well as technology focused solely on connecting doctors and patients, that burden is being eased.

Remote monitoring is another major benefit of the use of technology in medicine. Just as it can be difficult for patients to get in touch with their doctors, it can also be hard for many people to make it to the hospital at all. Home monitoring technology allows patients to use a small device designed specifically for their health problem from the comfort of their home. According to a report by Research and Markets, 2.8 million patients worldwide were utilizing home monitoring by the end of 2012. This provides patients with reliable care while reducing the cost of multiple visits to the doctor and lowering the risk of having to be readmitted. Readmission rates for cardiac patients using home monitoring dropped from 25 percent to just 2 percent, Becker's Hospital Review reported.

Cloud computing and virtualization are now also able to take remote monitoring one step further and provide patients with complete medical treatment from their homes using a telehealth platform. Just as home monitoring helped to reduce expenses, telemedicine is also cost-effective as it reduces travel times for patients and allows doctors to see more people each day. For patients living in rural or underserved areas, being able to have a doctor's visit over a video conference and receive prescriptions and medical records through a cloud-based portal is a dramatic improvement from having to travel long distances to see a physician, or not getting any treatment at all.

Increase in healthcare data breaches highlights need for improved storage solutions

While much of the news on cybersecurity and data breaches has been focused on attacks aimed at retail stores, security experts are increasingly warning healthcare organizations that hackers are more frequently going after targets in this $3 trillion industry.

In the underground market where cybercriminals sell their stolen goods, medical information can go for more than 10 times what credit card numbers are worth. Due to the high price medical records can fetch, attacks are increasing at an alarming rate. Just last month, the FBI warned healthcare providers to be on high alert after Community Health Systems, one of the largest hospital operators in the U.S., was hacked and the information of 4.5 million patients was compromised. A recent study by the Ponemon Institute found that the number of healthcare organizations reporting a data breach is rising, with 40 percent of providers reporting an intrusion in 2013 as opposed to 20 percent in 2009.

Lack of awareness makes healthcare a prime target
As opposed to retail data breaches or personal identity theft, fraud involving medical information is rarely detected in a timely manner, making it more worthwhile for hackers to go after healthcare records instead of credit card numbers.

"As attackers discover new methods to make money, the healthcare industry is becoming a much riper target because of the ability to sell large batches of personal data for profit," said Dave Kennedy, CEO of TrustedSEC LLC in an interview with Reuters. "Hospitals have low security, so it's relatively easy for these hackers to get a large amount of personal data for medical fraud."

According to an FBI estimate, one medical record can sell for as much as $50 in an underground marketplace, in stark contrast to the few dollars a stolen credit card might bring in. Stolen medical information commonly on sale on the black market includes names, dates of birth, billing information, diagnosis codes and policy numbers. This data is then used by cybercriminals to create fake IDs in order to purchase prescriptions or medical equipment that can be resold, or to make phony insurance claims.

Low funding, high risk
One of the major drivers of the increase in healthcare data breaches is the recent switch to electronic medical records. In an interview with the Boston Globe, Beth Israel Deaconess Medical Center CIO John Halamka said that IT departments in the healthcare industry commonly receive only 2 to 3 percent of an organization's budget, compared with the 20 percent offered to those in the retail and financial industries, yet those organizations are increasingly forced to rely on technical solutions. Perhaps because of this lack of funding, a recent study by security firm BitSight Technologies found that healthcare providers respond more slowly to data breaches than any other sector, compounding the problem.

The Ponemon Institute report found that the healthcare industry loses $5.6 billion a year due to security incidents. As cybercriminals continue to find more sophisticated attack methods and target larger amounts of information, healthcare providers will have to find a more secure way of storing their electronic medical records. A reliable way to protect patient data is to utilize cloud storage services. Data saved in the cloud can be easily encrypted and kept separate from other enterprise information. Business continuity procedures are also improved by keeping health records in the cloud, as duplicate data can be stored offsite and kept safe in case a system is compromised or a disaster occurs. Cloud services are a cost-effective storage option, as they are highly scalable and require healthcare providers to pay only for the amount of service being used. This allows cash-strapped organizations to protect sensitive information without breaking the bank.

FCC considering proposal on net neutrality regulations

The Federal Communications Commission continues to consider proposed rules this week that would change the way broadband providers are able to handle traffic moving across their networks.

The FCC first enacted regulations on net neutrality – the concept of treating all Internet packets equally – back in 2010, barring Internet service providers from blocking or unreasonably discriminating against any type of Web traffic. However, a federal court struck those rules down earlier this year. Now FCC chairman Tom Wheeler is working to create new requirements that will be capable of surviving future legal challenges.

The proposal the commission is currently considering would ban providers from intentionally blocking or slowing down traffic to specific websites, but would create the possibility for sites to pay for special access to faster service for their clients, essentially creating a tiered system. The proposed regulations have caused a backlash among proponents of net neutrality, and the agency received a record 3.7 million public comments on the issue, many of them opposing paid prioritization.

“The U.S. government should ensure that entrepreneurs do not face arbitrary roadblocks that limit their potential to build products and services on the Internet,” National Venture Capital Association President Bobby Franklin said in an interview with The Wall Street Journal. “If the FCC were to allow this, it would create a competitive advantage for well-established companies while disadvantaging entrepreneurs.”

FCC considering changing definition of broadband
In an agency blog post, Julie Veach, chief of the FCC’s Wireline Bureau, expressed interest in concepts proposed by educational and library groups that would create a new Internet reasonableness standard to ban fast-lane deals with broadband providers. A large group of net neutrality supporters has called on the commission to reclassify broadband Internet access as a public utility under telecommunications law, which would subject it to greater regulation.

In an interview with NextGov, Harold Feld, senior vice president of the non-profit public interest group Public Knowledge, said he believes it’s a good sign that the FCC is seriously considering proposals that rely at least partially on reclassification, as it shows the agency’s seriousness in combating fast lanes. However, in a speech last week, Wheeler emphasized his preference to avoid reclassification in favor of encouraging greater competition within the industry, which he believes will lessen the need for regulation.

Wi-Fi Alliance announces improvements to Wi-Fi Direct service

This week, the Wi-Fi Alliance announced new improvements coming to its peer-to-peer technology Wi-Fi Direct. The service allows a variety of machines – including printers, PCs, phones and TVs – to communicate one-to-one without the need for a LAN, and the planned enhancements promise to make that even easier.

The Alliance claims to have certified more than 6,000 products as Wi-Fi Direct-capable over the last four years, IDG News Service reported. Next week, the group plans to introduce four new mechanisms to make carrying out basic tasks over Wi-Fi Direct simpler. Adding the services to a certified device is optional, but they allow users to “discover, connect and do” certain functions with a single click, according to Wi-Fi Alliance president and CEO Edgar Figueroa.

The new services will make a variety of tasks simpler, but especially focus on simplifying the ability to share and print documents from mobile devices. The enhancements include:

  • Wi-Fi Direct Send: This feature will allow content to be quickly sent and received by one or more devices while keeping user interaction to a minimum.
  • Wi-Fi Miracast: Enables screen mirroring and display sharing in a single step when devices have implemented the updated device and service discovery mechanisms of Wi-Fi Direct.
  • Wi-Fi Direct for DLNA: Simplifies the process of allowing devices supporting Digital Living Network Alliance interoperability to find each other before connecting to stream content.
  • Wi-Fi Direct Printing: Allows users to print documents directly from PCs, tablets and smartphones with a single command.

New services remove previous complications
With the previous iteration of Wi-Fi Direct, a user could send a presentation from a computer to a Wi-Fi-enabled projector over the service as long as both devices were equipped with the basic technology. But after the initial connection, a variety of additional steps were required that made the process confusing for users. Interoperability between different vendors’ Wi-Fi Direct implementations was also poor, and some devices advertised as Wi-Fi Direct clients wouldn’t always be able to connect with a user’s device. The new services aim to improve interoperability between vendors’ products.

The new services do not require any additional hardware, allowing upgrades to be provided for products already in hand. The Alliance is making an application for each of the services available to vendors, so only a user interface will need to be created. According to Figueroa, the organization is also making a toolkit available so similar capabilities can be built for other processes in a standardized way.

The simplicity offered by the Wi-Fi Direct service is making devices with the capability increasingly popular, according to a recent study by ABI Research. The firm estimates that 2 billion Wi-Fi Direct certified devices have been shipped to date, RCR Wireless News reported. Over the next four years, ABI expects 81 percent of devices with Wi-Fi capabilities to be certified for Wi-Fi Direct.

Health workers look to the cloud to prevent infectious diseases

As the cloud becomes more widely adopted, the uses for the technology continue to grow. One of the sectors where the uses for cloud computing are advancing rapidly is healthcare.

In hospitals across the country, doctors and nurses are working over the cloud on virtual desktops, allowing them to access their workspaces wherever they are. With the use of virtualization, medical staff are able to reach their computers from the nearest thin client instead of going back to their offices. Not only does this improve patient care, as charts can be updated more quickly and checked more frequently, but less movement also helps stop the spread of infection and decreases contamination. Fewer doctors and nurses entering the rooms of highly contagious people means a lower chance of spreading disease, and virtual desktops enable medical staff to continue treating patients with a minimal risk of contamination.

Aid workers look to the cloud for data sharing
On a larger scale, the University of California, San Francisco is in the process of creating a cloud-based platform that would utilize data from the Google Earth Engine to provide health workers around the world with actionable information to predict the areas where malaria transmission is most likely. Google Earth Engine is an aggregator of trillions of satellite images dating back almost 40 years, paired with online tools to help researchers map trends, identify changes and quantify differences in the Earth's surface. The project aims to provide resource-poor nations with the tools to more narrowly and effectively target campaigns against malaria, which kills 600,000 people each year.

The new tool will look at the relationship between occurrences of the disease and environmental factors like rainfall, and maps of the local areas on the Earth Engine will help scientists and aid workers learn more about what drives malaria transmission. The tool will also allow health workers to share information from the field about where and when malaria cases have occurred. By combining real-time information with satellite data on environmental conditions within Earth Engine, the tool will be able to pinpoint where new cases are most likely to emerge. With more specific locations of expected outbreaks, healthcare officials can distribute bed nets, spray insecticides and give medicines directly to the people who need them most.
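The general idea of combining field case reports with satellite-derived environmental data to rank regions by risk can be sketched in a few lines. This is a toy illustration only: the weights, thresholds and region names below are invented for the example and have nothing to do with the actual UCSF model or Earth Engine data.

```python
from dataclasses import dataclass

@dataclass
class RegionObservation:
    region: str
    recent_cases: int     # field reports of confirmed malaria cases
    rainfall_mm: float    # satellite-derived rainfall estimate
    avg_temp_c: float     # satellite-derived surface temperature

def transmission_risk(obs: RegionObservation) -> float:
    """Toy risk score combining field reports with environmental data.
    Weights and thresholds are illustrative, not epidemiological."""
    score = min(obs.recent_cases / 10.0, 1.0) * 0.5    # local case pressure
    score += min(obs.rainfall_mm / 200.0, 1.0) * 0.3   # standing-water proxy
    if 20.0 <= obs.avg_temp_c <= 30.0:                 # mosquito-friendly temperature band
        score += 0.2
    return score

# Rank hypothetical regions so limited bed nets and insecticide go where risk is highest
regions = [
    RegionObservation("Region A", recent_cases=12, rainfall_mm=180.0, avg_temp_c=26.0),
    RegionObservation("Region B", recent_cases=1, rainfall_mm=40.0, avg_temp_c=17.0),
]
for obs in sorted(regions, key=transmission_risk, reverse=True):
    print(obs.region, round(transmission_risk(obs), 2))
```

Ranking regions by a score like this is what lets officials direct bed nets, insecticides and medicines to the highest-risk areas first, as described above.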

The cloud platform will be launched in Swaziland, but there are plans to make the tool available to workers within the Global Health Group initiative operating in other countries. The program's creators are also looking into adapting the platform to help predict other infectious diseases.

Cloud helps hospitals treat patients more effectively 
Cloud-based medical programs are also being used in hospitals across the country, including at Memorial Hermann Health System in Texas, which recently launched a cloud platform that monitors patients for signs of infection. The technology monitors all of its hospitals' patients simultaneously and continuously for signs of sepsis, a life-threatening infection complication that affects nearly 750,000 people nationwide each year and has a 50 percent mortality rate.

The sepsis monitoring system uses precise calculations to detect signs of infection in patients, alerting staff when at least two warning signs, such as rapid breathing, low blood pressure or fever, are found so they can quickly begin procedures to treat the condition.
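A minimal sketch of such a two-sign alerting rule might look like the following. The specific vital-sign thresholds are illustrative assumptions, loosely modeled on common SIRS screening criteria, and are not Memorial Hermann's actual calculations:

```python
def sepsis_alert(vitals: dict) -> bool:
    """Return True when at least two warning signs are present.
    Thresholds are illustrative, loosely modeled on SIRS-style criteria."""
    signs = 0
    if vitals.get("respiratory_rate", 0) > 20:       # rapid breathing
        signs += 1
    if vitals.get("systolic_bp", 120) < 90:          # low blood pressure
        signs += 1
    temp_c = vitals.get("temperature_c", 37.0)
    if temp_c > 38.0 or temp_c < 36.0:               # fever or hypothermia
        signs += 1
    return signs >= 2

# Rapid breathing plus low blood pressure trips the two-sign rule
patient = {"respiratory_rate": 24, "systolic_bp": 85, "temperature_c": 37.2}
print(sepsis_alert(patient))  # True
```

Running a check like this continuously against every patient's vitals is what allows staff to be paged the moment a second warning sign appears.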

Companies find increased reliability, flexibility with desktop virtualization

As bring-your-own-device policies and remote working have become increasingly popular and resource optimization has become more necessary, keeping enterprise IT current and efficient is growing more complex all the time. Between PCs and each employee's personal devices, upgrading the applications and operating systems on individual endpoints can consume time and resources that most companies just don't have. Luckily, desktop virtualization and remote application delivery have emerged as reliable alternatives to traditional network delivery.

As TechRadar contributor David Howell noted, moving to a virtual desktop environment offers small- and medium-sized businesses dramatic gains in control, as well as an effective way to future-proof IT systems. A recent study by VMware found that 90 percent of enterprise IT departments spend at least half of their time completing routine administrative tasks. SMBs that have implemented virtualization, however, reported an increase in productivity, and 73 percent said they witnessed significant improvements in the amount of time spent on administrative tasks.

When an office transitions to a virtual desktop environment, it means that the computers employees use have desktops delivered and controlled directly from a central server room. This offers centralized management of the office's desktops, since each one is virtualized and provided in an isolated state, creating a highly secure network environment. 

"Desktop, or endpoint, virtualization enables a centralized server to deliver and manage individual desktops remotely," according to Symantec. "While users enjoy full access to their systems, IT staff can provision, manage, upgrade, and patch them virtually, instead of physically. This also means that users can access files and applications on a central server. Companies might also opt for a hybrid scenario where users can access some applications through a central virtualized server and others through their local computers."

Enterprises see a variety of benefits with virtualization
Transitioning to a virtual environment and leaving behind traditionally installed operating systems and applications enables businesses to be more flexible and agile, as virtual desktops can change in real time to reflect the work at hand while all being managed from a single, central location. Virtualization also gives employees more mobility: they can access data and applications from the same work environment no matter where they are, and can easily connect to servers from multiple devices, as all the necessary components are available at login.

Adopting desktop virtualization is also cost-effective and provides a high return on investment, as it offers a customized user experience that is more scalable and reliable than traditional options. Business continuity is improved, with all data saved in an off-site data center so that lost, stolen or damaged devices cannot disrupt the organization's daily processes. Virtualization is also a logical addition to any enterprise disaster recovery plan: because desktop applications are served from an off-site location, local power outages or extreme weather won't affect business. Finally, running operating systems and applications through a virtual machine increases enterprise security by giving employees a safe way to access sensitive corporate information.

Survey finds companies aren't taking advantage of cloud automation

A recent survey of cloud use among CIOs found that many organizations are using much less computing power than they're paying for.

Only about half of the capacity companies have bought is being used, according to the study, and 90 percent of organizations reported that they consider over-provisioning as an unavoidable aspect of operating in the cloud. On the other side of the spectrum, 88 percent of CIOs surveyed said they have previously chosen to sacrifice performance at some of the busiest times in order to keep costs down.

According to the report, many companies continue the bad habits of forgoing peak performance and over-provisioning their platforms because those limitations were unavoidable with the on-premises solutions they are accustomed to.

According to the study, companies are paying nearly twice as much as they should be, considering the capacity they are actually utilizing, and a major reason for this is that a large portion of enterprises are manually managing their cloud services, adding to cost.
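The arithmetic behind that claim is straightforward: if only half of purchased capacity is used, each unit of work effectively costs twice what it would at full utilization. A one-line sketch:

```python
def effective_cost_multiplier(utilization: float) -> float:
    """How many times more each unit of used capacity costs
    compared with running at full utilization."""
    if not 0 < utilization <= 1:
        raise ValueError("utilization must be in (0, 1]")
    return 1.0 / utilization

print(effective_cost_multiplier(0.5))  # 2.0: half-used capacity doubles the unit cost
```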

"…[a]s the research shows, and as half of respondents recognized, cloud as we have it today really isn't truly elastic — it does not expand and retract automatically to meet demands, and it is not paid for like a utility, based on consumption," said cloud computing expert Richard Davies. "However, with next-generation cloud and containerization technology, change is afoot." 

The report found that only 14 percent of CIOs surveyed had automated their cloud platform, something that can save time and money. As ITProPortal contributor Darren Allan noted, virtual machines and application programming interfaces can help businesses scale and automate their cloud services. Once the task of manually adjusting capacity levels to match traffic volumes is handed off to a machine, enterprise IT departments are freed up for more mission-critical projects, benefiting the company immensely.
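The kind of automation the survey points to can be sketched as a simple threshold rule. This is a minimal illustration, not any vendor's implementation; the thresholds and the way demand is measured are assumptions, and in a real deployment the decision would be wired to a cloud provider's scaling API.

```python
def target_capacity(current_instances: int, utilization: float,
                    low: float = 0.30, high: float = 0.75) -> int:
    """Threshold autoscaler: keep measured utilization between low and high."""
    if utilization > high:
        return current_instances + 1     # scale out before performance suffers
    if utilization < low and current_instances > 1:
        return current_instances - 1     # scale in to stop paying for idle capacity
    return current_instances

# An over-provisioned fleet running at 25 percent utilization shrinks step by step
instances = 8
load = 2.0   # demand measured in "instances' worth" of work
for _ in range(5):
    instances = target_capacity(instances, load / instances)
print(instances)  # settles at 6 with these thresholds
```

Notice how the rule addresses both bad habits from the survey: it scales in when capacity sits idle (over-provisioning) and scales out when utilization spikes (sacrificed peak performance).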

Cloud services provider ISG can help to implement reliable solutions utilizing virtual machines and APIs that take advantage of automation and help companies make the most out of their cloud investments.