Posts

Global unified communications market to reach $75 billion by 2020

A recent report from Grand View Research projects the global unified communications market will exceed $75 billion by 2020.

The study found that enterprises were the largest market for UC applications last year and are expected to maintain their market share over the next six years. Small and medium-sized companies have also begun deploying the appropriate IP infrastructure to support unified communications, helping to accelerate market growth.

"Increase in mobile workforce as well as enterprise mobility is expected to favorably impact the global market over the next six years," stated a release accompanying the report. "Growing demand and proliferation of smartphones is also expected to fuel market growth over the forecast period. Increasing adoption of BYOD initiatives by large enterprises as well as SMEs [are] a driving force for the industry."

The adoption of unified communications by government and private-sector organizations has also helped to drive the market. UC platforms have proven their ability to improve emergency response capabilities, operational continuity and situational awareness, encouraging further adoption.

The report also found that 60 percent of spending for unified communications systems went to on-premise kits. However, managed services, hosted systems and cloud platforms are projected to increase their market share as companies begin to adopt more of those technologies.

According to the study, North America makes up 35 percent of the global unified communications market. However, the European and Asia-Pacific regions are growing quickly and are expected to lead the market over the next six years. The fast-paced growth in these areas is due in large part to the need for effective communication systems and improved collaboration, as well as the savings that come with implementing UC platforms.

New USB-based malware means big trouble for businesses

A pair of security researchers recently discovered a major vulnerability present in nearly every USB-connected device. Karsten Nohl and Jakob Lell created the BadUSB malware as a proof-of-concept virus that they are presenting at the Black Hat security conference in Las Vegas this week. According to the duo, the malware shows that malicious software attacks on the firmware of USB devices can remain undetected for long periods of time through the use of reformatting techniques that enslave devices including smartphones, keyboards, mice and thumb drives.

Nohl and Lell discovered the vulnerability when they realized that the controller chips used in common USB devices aren't protected against malicious reprogramming. The firmware of a thumb drive can be reformatted to make it execute malicious commands without a user knowing anything is wrong, meaning that the BadUSB malware won't infect just a user's computer, but any device the USB drive is plugged into. Most people don't realize that connecting a USB device to a computer is more complicated than simply allowing a connection. It opens a portal that gives connected devices nearly unlimited access to hardware and software, creating a major security concern.

When plugged into an infected computer, Android smartphones can be exploited and turned into compromised network cards, fooling the computer into visiting malicious pages that pose as popular sites like Facebook and Google. An infected device could also impersonate a keyboard and type commands that could lead to a variety of issues, including installing more malware and deleting important files from a hard drive. BadUSB is embedded directly into the firmware of USB devices, making it nearly impossible for an average user to remove the malware from the device. Extreme measures would have to be taken to fully disinfect the firmware, such as disassembling and reverse-engineering a compromised device.

"The next time you have a virus on your computer, you pretty much have to assume your peripherals are infected, and computers of other people who connected to those peripherals are infected," said Nohl.

No help in sight
Unfortunately, there don't seem to be any effective ways of preventing a BadUSB-type attack, or of removing the malware from an infected device. The anti-virus software used by most companies can't scan the firmware of a device, and firewalls can't block USB devices carrying this kind of infection, according to the researchers. The malicious software associated with BadUSB can infiltrate a computer's embedded USB devices or compromise the PC's basic input-output system on the motherboard, meaning it can't be removed simply by reformatting a hard drive or reinstalling an operating system.

According to Nohl and Lell, the best short-term protection is to use only thumb drives and other USB-connected devices that have never left a secure environment, and to never connect a device to an unknown computer or share it with an unknown user.
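
Nohl and Lell offered no tooling beyond that advice, but a small compensating control is easy to picture. The sketch below is purely illustrative and is not part of their research: it assumes a Linux host with the pyudev library installed and simply flags any newly attached USB device that registers itself as a keyboard, since a thumb drive that suddenly starts "typing" is the classic BadUSB symptom.

```python
# Illustrative sketch only -- not from Nohl and Lell's work.
# Assumes a Linux host and the pyudev library (pip install pyudev).
import pyudev

context = pyudev.Context()
monitor = pyudev.Monitor.from_netlink(context)
monitor.filter_by(subsystem="input")  # watch newly added input devices

print("Watching for new input devices; press Ctrl+C to stop.")
for device in iter(monitor.poll, None):
    if device.action != "add":
        continue
    # A USB device that enumerates as a keyboard can inject keystrokes,
    # which is exactly how a reprogrammed BadUSB drive attacks the host.
    if device.get("ID_BUS") == "usb" and device.get("ID_INPUT_KEYBOARD") == "1":
        vendor = device.get("ID_VENDOR", "unknown vendor")
        model = device.get("ID_MODEL", "unknown model")
        print(f"New USB keyboard detected: {vendor} {model}")
        print("If you did not just plug in a keyboard, unplug the device.")
```

A watcher like this cannot see into firmware any more than anti-virus software can; it only surfaces the visible side effect of an attack so a user can react.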

"If you put anything into your USB [slot], it extends a lot of trust," Nohl said. "Whatever it is, there could always be some code running in that device that runs maliciously. Every time anybody connects a USB device to your computer, you fully trust them with your computer. It's the equivalent of [saying] 'here's my computer; I'm going to walk away for 10 minutes. Please don't do anything evil.' "

Alternatives to USB
This new vulnerability poses a major problem for enterprises that share files between employees on thumb drives. It's a convenient method for collaboration, but one that can create drastic cybersecurity issues. One way to avoid falling victim to a BadUSB infection is to utilize cloud storage services. Enterprises that keep documents in the cloud can offer employees easy access to files while still ensuring security. Cloud storage allows documents to be accessed from anywhere with an Internet connection, without connecting an unfamiliar device and exposing a system to malicious activity.

New report finds cloud offers increased business agility

A recent study by Harvard Business Review Analytic Services has found that companies that move aggressively to adopt cloud services are gaining a competitive advantage by increasing business agility and reducing system complexity. The report included responses from over 500 Harvard Business Review readers who work in large and mid-sized companies in a variety of industries worldwide.

According to the study, 70 percent of participants had already adopted the cloud and 35 percent of those adopters “are very aggressively moving forward wherever it makes sense.” Among those companies that had already begun to implement cloud technology, 37 percent reported that it had simplified internal operations. Other respondents said they experienced better delivery of internal resources and increased collaboration between employees. More than half of aggressive adopters reported seeing significant advantages from the cloud.

The report also found that the use of cloud technology helps businesses to expand their operations. Of adopters classified as aggressive, 49 percent reported having entered a new market in the last three years. Of the companies cautiously adopting the cloud, 36 percent had done the same.

Businesses see multiple benefits from cloud adoption
When it came to reasons for adopting the cloud, 32 percent of all respondents reported an increase in business agility as the top factor for their transition. Among aggressive adopters, the number rose to 41 percent.

“Not even the cautious adopters led with ‘it really saves money,'” said Angelia Herrin, research director for HBR, according to CIO. “If you’re stuck on using new technologies like cloud just to save money, you’re really losing out. Agility leads to being able to do things like enter new markets, improved productivity and improved responsiveness to clients.”

Other drivers toward cloud adoption reported in the survey were increased innovation, lower costs and the ability to scale to demand.

Herrin noted that the companies that have been able to successfully leverage the cloud all seem to have CIOs who are willing and able to drive change in the enterprise.

“We’re really seeing companies that are making big impacts have a lot of involvement from the top,” said Herrin. “I think the conversation about technology is one where the companies that are moving fast and really experiencing digital transformation have a CEO that is really embracing it and pushing it. Those companies who say technology is an advantage for them also say that their CIOs don’t just have a seat at the table, they’re helping to lead the charge.”

Cloud services level the playing field 
Investment in cloud technologies is beneficial to companies of all sizes, but can also help small enterprises compete on more equal ground with larger firms. With the increased business agility the cloud offers, smaller companies can respond more quickly to change and accomplish work faster and with more accuracy, helping to reduce time to market.

With help from a knowledgeable partner, enterprises can avoid costly infrastructure replacements, security and compliance mistakes and expensive server sprawl. Utilizing colocation and cloud services allows in-house IT professionals to focus on priorities that are more aligned with a company’s specific business goals, putting the organization in a better position to grow.

Cities increasingly utilizing the cloud for disaster recovery services

With state and local governments increasingly feeling the pressure to streamline IT operations to control costs and enhance performance, a growing number of cities are beginning to pursue the most up-to-date tools and hardware architectures to modernize their data centers.

Even as they invest in physical devices, city IT managers and CIOs are also utilizing the cloud in their data center renovations. Instead of spending tight budgets on new data center facilities, cities can implement pay-as-you-go cloud services to consolidate data and programs from different government agencies in an effective way. Many local agencies are employing the cloud to handle spikes in data center workloads, or as a backup or disaster recovery service.

Under the supervision of CIO Vijay Sammeta, the city of San Jose is implementing plans to use the cloud as a backup mechanism for the city’s critical IT infrastructure. Over the next 12 to 18 months, San Jose will be transitioning virtual machines to the cloud and using the technology to manage various applications, as well as for backup and disaster recovery services.

“When you think about all the components of a highly available service delivery stack: network, servers, database and the applications, it starts [to] make a lot of sense to simply let someone else worry about that and just build redundancy to the Internet,” said Sammeta.

The cloud an alternative to physical facilities
The city of Asheville, North Carolina has also turned to the cloud for its disaster recovery plan. The city had planned to build a $200,000 disaster recovery center as part of a fire station construction project, but when that project fell through, Asheville needed a plan B. Utilizing the cloud allows the city to enter disaster recovery mode only when it is critically necessary. The ability to scale to need saves Asheville thousands of dollars a year compared to the cost of maintaining hardware in a physical facility. With the new system, the city is also able to include in the disaster recovery plan a number of applications that were previously not covered.

In Michigan, Oakland County is using the cloud to supplement its overworked data center facilities, according to CIO Phil Bertolini. Implementing a cloud infrastructure allows the county to transition some systems to the cloud, taking computing pressure off of the data centers’ servers. The town of Newington, Massachusetts is also getting in on the cloud craze, implementing services to extend the city’s business continuity and disaster recovery capabilities.

FBI in search of cloud storage services

The FBI announced this month that it is seeking ideas and suggestions from the private sector about how to construct and implement large-scale cloud infrastructure. The agency's Criminal Justice Information Services Division, which manages the criminal background check system, crime statistics and fingerprint services, is hoping to transition its systems and databases to a cloud environment.

Experts say the move could help cut costs and make the agency's operations more efficient. According to industry expert Trey Hodgkins, the FBI could enhance its mission by transferring services and applications to a cloud platform. In an interview with Federal Times, Hodgkins said that FBI systems and databases would be able to run more efficiently and at a lower cost than legacy systems that frequently run into trouble when trying to connect to new technology.

"Building a cloud infrastructure gives the FBI the flexibility to decide how much they want to use and what controls and authentications they want to deploy," Hodgkins said.

The cloud environment employed by the FBI must be based between two data centers at least 1,500 miles apart, be able to scale to 2.3 petabytes of memory and replicate data between the two facilities. The platform should also be able to support a wide range of services, including pay-as-you-go policies, scalability and the ability to access all stored information securely and in real time. The agency also requires the infrastructure to include the use of virtualization, rapid elasticity, resource pooling, continuous monitoring and centrally managed multi-site operations.
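
The solicitation describes requirements rather than a platform, but the geo-replication requirement is easy to make concrete. The sketch below is a hypothetical illustration only: it uses AWS S3 cross-region replication as a stand-in (the FBI has not named a provider), and the bucket names, regions and IAM role ARN are invented for the example.

```python
# Hypothetical illustration of replicating data between two distant regions.
# Provider, bucket names and the IAM role ARN are invented; the FBI's actual
# platform and configuration are not public. Both buckets must already exist
# with versioning enabled.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

s3.put_bucket_replication(
    Bucket="cjis-records-east",  # hypothetical primary bucket
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/replication-role",
        "Rules": [
            {
                "ID": "replicate-to-west",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {},  # replicate every object
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    # hypothetical secondary bucket well over 1,500 miles away
                    "Bucket": "arn:aws:s3:::cjis-records-west",
                    "StorageClass": "STANDARD",
                },
            }
        ],
    },
)
print("Replication rule applied; new objects will be copied to the second site.")
```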

The FBI is hoping to make a five-year commitment with a contractor to help create and run the public cloud system.

New study finds companies increasingly utilizing cloud for disaster recovery

As technology becomes more prevalent in business and companies increasingly rely on massive amounts of data to complete work, a secure backup service and disaster recovery plan are more necessary than ever. In a recent webinar sponsored by Microsoft, Forrester analyst Noel Yuhanna recommended that enterprises strategically implement public cloud services for disaster recovery to ensure business continuity.

According to Yuhanna, more than 70 percent of enterprises currently have to manage at least two terabytes of data, but at the rate new information is being created that could become petabytes in just a few years. In the webinar, Yuhanna praised the cloud for its ability to automate the data backup process and include encryption while not requiring staff to manage the day-to-day operations of the servers and storage platform.
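
The webinar described capabilities rather than code, but the automation Yuhanna praised can be sketched in a few lines. The example below is a hypothetical nightly job, not anything presented by Forrester or Microsoft: it assumes a local PostgreSQL database, the cryptography package for encryption and an S3-compatible object store reachable through boto3 with credentials already configured.

```python
# Hypothetical sketch of an automated, encrypted cloud backup job.
# Assumes PostgreSQL, `pip install cryptography boto3`, cloud credentials
# already configured, and a pre-generated Fernet key on disk.
import datetime
import subprocess

import boto3
from cryptography.fernet import Fernet

BUCKET = "nightly-db-backups"        # hypothetical bucket name
KEY_FILE = "/etc/backup/fernet.key"  # hypothetical key location


def nightly_backup(database="orders"):
    stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%SZ")
    object_name = f"{database}-{stamp}.sql.enc"

    # 1. Dump the database to bytes.
    dump = subprocess.run(["pg_dump", database],
                          check=True, capture_output=True).stdout

    # 2. Encrypt before the data ever leaves the building.
    with open(KEY_FILE, "rb") as f:
        ciphertext = Fernet(f.read()).encrypt(dump)

    # 3. Ship the encrypted dump offsite to the object store.
    boto3.client("s3").put_object(Bucket=BUCKET, Key=object_name, Body=ciphertext)
    return object_name


if __name__ == "__main__":
    print("Uploaded", nightly_backup())
```

Run from a scheduler, a job like this gives the offsite, encrypted, hands-off backups the webinar described, with no staff managing the storage servers themselves.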

Forrester recently conducted a survey of more than 200 database backup and operations professionals on three continents and found that 15 percent of companies are currently utilizing the cloud for database backups. This number has doubled in the last year, according to Yuhanna. The report also found that users were driven to the cloud for backup and disaster recovery services due to the need for constant application availability, cost savings and organizational agility.

Cloud offers multiple DR benefits
The cloud is ideally suited for disaster recovery because it is able to replicate data that resides in a physical location without having to create a redundant facility to house it. It is also a cost-effective option, as backups and archived data often sit unused for years at a time with few updates and don’t need to be stored in an expensive physical facility. The cloud therefore creates a dual benefit of storing information in a cost-effective environment that is also offsite in case of a disaster.

The Forrester survey also discovered that the key reasons companies utilized the cloud for backup and disaster recovery services were the ability to save money on data storage and administrative costs and provide more frequent backups.

“You could almost be guaranteed that if you decide to put some data in the cloud that, whether it’s an archive or backup, the next year it’s going to be cheaper to store it there,” explained Forrester principal analyst Dave Bartoletti.

Finally, the report found that 57 percent of respondents reported the use of cloud backup and disaster recovery services actually helped to improve their company’s service level agreements, as processes and systems become more reliable with the cloud.

FCC approves plan to increase Wi-Fi access in schools

Earlier this month the Federal Communications Commission approved a plan to spend $2 billion over the next two years on providing schools and libraries with enhanced Wi-Fi capabilities.

The proposal aims to modernize the FCC’s existing E-Rate program in an effort to meet goals set by President Obama in a directive last year to expand broadband access to 99 percent of U.S. students. Major industry players like Facebook, Netflix and Bloomberg LP all sent letters to the commission supporting the initiative because “the plan will make dramatic progress in bringing high-speed connectivity to classrooms.”

More schools have Internet, but that’s not enough 
According to Businessweek, the number of U.S. classrooms with a connection to the Internet has increased by 83 percent since the E-Rate program was created in 1998. School administrators, however, say simply being connected isn’t enough. Higher speeds and better service are becoming increasingly necessary, and obtaining them increases costs at a time when school budgets are being dramatically reduced. Sixty percent of schools in the U.S. do not have sufficient Wi-Fi access, according to FCC Chairman Tom Wheeler, and so far the E-Rate program has only been able to improve that access in 5 percent of schools and 1 percent of libraries.

“Technology has changed, the needs of students and library users have changed, and now E-Rate has changed,” said Wheeler. “No responsible business would stick with an IT plan developed in 1998.”

Removing obsolete services, increasing funding
The recently approved plan seeks to provide a larger portion of schools with improved Wi-Fi by revamping the E-Rate program. The initiative pays for telecom services for schools and libraries, but those still include obsolete services like paging and landline phones. By redirecting funding from outdated systems to the Wi-Fi program, more schools will be able to benefit. According to FCC acting managing director Jon Wilkins, the phaseout of obsolete services will create savings of $350 million next year, growing to $950 million by the program’s fifth year. The remainder of E-Rate’s budget comes from monthly fees telecom providers are required to charge their clients.

According to Wheeler, the commitment of $1 billion to schools and libraries in 2015 means that millions of students will have access to increased opportunities.

“The new plan will make E-rate dollars go farther by creating processes to drive down prices and increase transparency on how program dollars are spent,” said Wheeler. “And it will simplify the application process for schools and libraries, making the program more efficient while reducing the potential for fraud and abuse.”

The approval vote also made it possible for E-Rate’s annual funding, which has been capped at $2.25 billion since the program started, to be increased later this year. Currently, the initiative’s formula for allocating funding is based on schools’ student numbers and libraries’ physical size, but this method has come under scrutiny by members of Congress and will likely be revised.

Cloud computing in education market grows as schools see benefits

As the benefits of cloud computing become increasingly obvious, a growing number of industries are beginning to embrace it. The newest sector adopting cloud services is education, as they offer students and teachers the ability to access a variety of applications and resources easily and economically. Added security, cost-effectiveness and ease-of-use are also driving the adoption of cloud computing in educational institutions of all levels and sizes, as well as disaster recovery services and the promise of stronger communication and collaboration between students.

Due to the increase in adoption of cloud technology by schools, the global cloud computing in education market is growing at a rapid pace. MarketsandMarkets recently released a report projecting the global market will grow to more than $12 billion by 2019, an increase of $7 billion over five years. The study found that reduced costs, increased flexibility and enhanced infrastructure scalability were major market drivers, as was the growing need for schools to be technologically advanced. 
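
Taking the report's rounded figures at face value, a market of roughly $5 billion growing to about $12 billion over five years implies a compound annual growth rate of around 19 percent; the quick check below shows the arithmetic.

```python
# Back-of-the-envelope check using the article's rounded figures:
# roughly $5B today growing to about $12B over five years.
start, end, years = 5.0, 12.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # about 19.1%
```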

"The significant production of inexpensive computers, Internet broadband connectivity, and loaded learning content has created a worldwide trend in which Information and Communication Technology is being used to alter the education process," the report stated. "Cloud computing is beginning to play a key role in this revolution."

The report went on to say that North America is expected to remain the leader of the market, but the Asia-Pacific and European regions are projected to show the most significant traction.

A separate survey from a technology provider found that almost 50 percent of respondents in higher education made adopting cloud computing a priority because their employees were increasingly utilizing cloud applications and mobile devices in their work. The study also found that IT professionals in higher education expect to save an average of 20 percent over the next three years due to the implementation of cloud services.

Schools see multiple benefits with cloud 
A major reason many schools are adopting cloud technology is the ability to cut spending, not just on the cloud services themselves but also on office supplies like paper and ink. With cloud-based services, teachers are able to make lesson plans, homework and reading available online instead of having to print and copy hundreds of pages each semester.

Another major benefit of the cloud is that disaster recovery and online backup services are built right into the infrastructure, which comes in very handy as students increasingly complete work electronically. Schools are also benefiting from the large amounts of cloud storage available. Student records can be kept in the cloud and encrypted, making them easy to share with necessary parties while at the same time improving security.

With the cloud, students are able to collaborate more easily and effectively, as they can work on and edit documents simultaneously. Sharing and transmitting documents is also easier, which improves the ability to receive feedback and improve work. The increased accessibility offered by the cloud also allows students to work on assignments from anywhere with an Internet connection.

New tests discover 'no-wait data center' technology

Researchers from the Massachusetts Institute of Technology recently announced that they have created what they are calling a 'no-wait data center'. According to ZDNet, the researchers were able to conduct experiments in which network transmission queue length was reduced by more than 99 percent. The technology, dubbed FastPass, will be fully explained in a paper being presented in August at a conference for the Association for Computing Machinery special interest group on data communication.

The MIT researchers were able to use one of Facebook's data centers to conduct testing, which showed reductions in latency that effectively eliminated normal request queues. The report states that even in heavy traffic, the latency of an average request dropped from 3.65 microseconds to just 0.23 microseconds.

While the system's increased speed is a benefit, the aim is not to use it for increased processing speeds, but to simplify applications and switches to shrink the amount of bandwidth needed to run a data center. Because of the minuscule queue length, researchers believe FastPass could be used in the construction of highly scalable, centralized systems to deliver faster, more efficient networking models at decreased costs.

Centralizing traffic flow to make quicker decisions
In current network models, packets spend a lot of their time waiting for switches to decide when each packet can move on to its destination, and those switches have to decide with limited information. Instead of this traditional decentralized model, FastPass works on a centralized system and utilizes an arbiter to make all routing decisions. This allows network traffic to be analyzed holistically and routing decisions to be made based on the information derived from that analysis. In testing, researchers found that a single eight-core arbiter was able to handle 2.2 terabytes of data per second.

The arbiter is able to process requests more quickly because it divides the work of calculating transmission timing among its cores. FastPass arranges workloads by time slot and assigns requests to the first available server, passing the rest of the work on to the next core, which follows the same process.
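
The paper itself had not yet been presented at the time of writing, so the following is only a toy, single-core rendering of the idea described above, not the MIT implementation: a central arbiter walks the list of pending transmission requests and gives each one the earliest time slot in which both its sender and its receiver are still free. The real arbiter parallelizes this search across cores and also chooses network paths, which the sketch omits.

```python
# Toy illustration of centralized time-slot allocation, loosely inspired by
# the FastPass idea described above. Single-core and greatly simplified.
from collections import defaultdict


def allocate_timeslots(requests):
    """requests: list of (source, destination) pairs -> {request_index: slot}."""
    busy = defaultdict(set)  # slot -> endpoints already sending or receiving
    schedule = {}
    for i, (src, dst) in enumerate(requests):
        slot = 0
        # Find the earliest slot in which both endpoints are free.
        while src in busy[slot] or dst in busy[slot]:
            slot += 1
        busy[slot].update((src, dst))
        schedule[i] = slot
    return schedule


if __name__ == "__main__":
    demo = [("A", "B"), ("A", "C"), ("D", "B"), ("C", "D")]
    for i, slot in allocate_timeslots(demo).items():
        print(f"request {demo[i]} -> time slot {slot}")
```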

"You want to allocate for many time slots into the future, in parallel, " explained Hari Balakrishnan, an MIT professor in electrical engineering and computer science. " According to Balakrishnan, each core searches the entire list of transmission requests, picks on to assign and then modifies the list. All of the cores work on the same list simultaneously, efficiently eliminating traffic.

Arbiter provides benefits for all levels
Network architects will be able to use FastPass to make packets arrive on time and eliminate the need to overprovision data center links for traffic that can arrive in unpredictable bursts. Similarly, developers of distributed applications can benefit from the technology by using it to split up problems and send them to different servers around the network for answers.

"Developers struggle a lot with the variable latencies that current networks offer," said the report's co-author Jonathan Perry. "It's much easier to develop complex, distributed programs like the one Facebook implements."

While the technology's inventors admit that processing requests in such a manner seems counterintuitive, they were able to show that using the arbiter dramatically improved overall network performance even after the lag necessary for the cores to make scheduling decisions.

The FastPass software is planned to be released as open source code, but the MIT researchers warn that it is not production-ready as of yet. They believe that the technology will begin to be seen in data centers sometime in the next two years.
