Big data causing big changes in the enterprise

A lot of IT buzzwords get thrown around without there ever really being any context as to what the technology does for a business or how many companies are actually utilizing it. For 2015, that buzzword is definitely “big data”. It pops up everywhere, but what is the real picture? According to a recent study by EMC, big data is more than just a buzzword; it’s a necessary tool for enterprise success.

The recent report “Big & Fast Data: The Rise of Insight-Driven Business” sponsored by Capgemini revealed that a growing number of companies are investing in big data initiatives and are seeing positive results. According to the study, 70 percent of IT decision-makers believe their organization’s ability to extract value from big data is critical to their future success. Another 65 percent said that they risk becoming irrelevant or losing a competitive advantage if they don’t utilize big data.

Businesses bracing for shifts as big data takes hold
The study, which included interviews with more than 1,000 senior executives and decision-makers across nine industries in 10 countries, provides a variety of insights into how companies are responding to the changes big data has brought to the enterprise. More than half of respondents believe that investments in big data will outstrip past investment in information management over the next three years. This is due in part to the fact that 63 percent of participants believe the monetization of data could potentially become as valuable as existing products and services. This is especially true among those in the telecommunications sector, where 83 percent of respondents agreed with the statement.

One of the most significant statistics is the fact that 47 percent of senior executives believe their organizations’ IT systems are not properly optimized to allow business decision-makers to do their jobs effectively. These executives reported a need to increase the pace of IT system improvements to keep up with growing client, supplier and stakeholder requirements outside their organizations.

In order to accommodate all of the changes brought about by the increased use of big data, businesses will need to ensure that their data center solutions and IT infrastructure are up to the challenge. Working with a trusted service provider to upgrade infrastructure and improve data center performance is the most reliable way to ensure that new big data initiatives will be implemented successfully.

New survey finds clients willing to pay for stronger Wi-Fi

For most people, it would be hard to go even a few hours without an Internet connection to power a laptop or mobile device. According to a new report, consumers worldwide are more eager than ever to have access to stronger, faster and easier-to-use Wi-Fi services. So eager, in fact, that they are willing to pay a premium to get them.

A recent survey conducted by IE Market Research found that Wi-Fi is more in demand than ever, and subscribers are even willing to relinquish some of their privacy to get a better client experience and personalized offers. The study, which included responses from more than 4,000 Wi-Fi clients in 11 countries, revealed some interesting, though perhaps not surprising, facts about consumer Wi-Fi use.

“Canadian and U.S. clients are willing to pay almost 10% more on average for broadband Internet with certain amenities.”

According to the report, the biggest pain points when it comes to Wi-Fi service are coverage outside the house and poor connection speed. Consumers are ready to pay extra for carrier-grade services: clients in Canada and the U.S. are willing to pay almost 10 percent more on average for broadband Internet that includes service outside the home, high connection speeds, seamless connections across various endpoints and automatic handoff to cellular networks.

Consumers are looking for an improved Wi-Fi experience.

Stronger, more personalized Wi-Fi top priorities
Consumers are so interested in stronger Wi-Fi connections that nearly 66 percent of those surveyed said they would consider replacing their cellphone plans with a Wi-Fi-first offering. Kristin Dolan, chief operating officer of Cablevision, explained this growing trend by saying that cellular networks were built to carry voice, while Wi-Fi connections were meant to handle data. As consumers spend more time using the Internet on their phones and doing things other than making calls, such as watching video, Wi-Fi becomes the channel of choice for many users.

“Connectivity, particularly wireless, is going to become more and more important to our consumers,” said Cablevision CEO James Dolan. “Connectivity has surpassed video as the primary product for a company like ours. And we need to continue to strategize our product offerings to reflect that with different packaging, etc., which is something I think we will do in 2015.”

Another interesting fact the survey uncovered is that 80 percent of participants said they would feel comfortable allowing their service provider to collect personal data if it would make the marketing and client service experience more personalized and satisfactory. Another 7 percent of respondents said they would even be willing to pay more for their service each month if it meant they would get customized offers and personalized service.

“Clients are looking for customized care,” said Nizar Assanie, vice president of IE Market Research. “This question wasn’t asking whether they’d pay more as a line item for customized care. But they did see the value in it. There is a demand for personalized client support and better quality of service and consumers are willing to pay to get it.”

Businesses look toward converged infrastructures to boost data center performance

As collecting and storing data becomes an increasingly critical part of the enterprise, businesses are starting to pay more attention to the infrastructure needed to handle such key workloads. In order to ensure reliable operations with such an influx of information, data center operators are turning to a variety of innovative methods to improve data handling while lowering costs. One of the most popular of these methods is convergence.

Practically every major cloud platform provider now offers some type of converged infrastructure, and some are even going so far as to realign their business models to work more effectively with the concept. HP is one company making major strides toward adopting converged architecture. The tech giant is looking to combine blade servers and its CI division to increase the speed of development and provide channel partners with more integrated solutions that help deployment and integration processes happen more quickly.

Convergence is the way of the future for data center operations.

Changes to networking essential for improved data management
When talking about a converged infrastructure, the key element is networking. Server and storage components function basically the same in a converged solution as they would traditionally, but they work in closer proximity to one another. However, as convergence gains more popularity among service providers, networking will evolve to become more of a fabric architecture, according to Information Age contributor Ben Rossi. This change will bring about a variety of challenges.

"As convergence gains more popularity, networking will have to evolve."

Providers will have to take a different approach to virtual networking. Provisioning and setup may be possible with only a simple overlay, but such a solution may inhibit performance as scale increases. A high degree of application awareness will also be necessary to optimize performance in key workloads, meaning simple automation won't be enough to deliver an optimal user experience. In order to address this issue, converged platforms will have to be provisioned to address specific workloads and support an overarching, integrated architecture that allows for simplified migration and data connectivity.

One of the biggest mistakes enterprises make when changing their internal IT infrastructure is trying to do all of the work themselves despite a lack of training and expertise. In order to avoid this common problem, enterprise decision-makers should work alongside a trusted service provider to ensure a successful implementation. By working together with a reliable industry partner, companies can create a customized infrastructure that works for them.

How the cloud is like PCs: An IT history lesson

Technology has always played a role in creating freedom within an organization, either by breaking down boundaries or by providing an avenue through which to reach new horizons. For a long time, the most innovative force in enterprise technology was the computer. When the first PC was introduced, it changed everything by making real computing power affordable and available to businesses and individual employees. The analysis made possible by PCs resulted in increased operating efficiency, faster innovation and dramatically improved client experiences. While computers are still the main focus of every organization, they are no longer driving the freedom of innovation they once did. Cloud computing, however, has taken up the mantle, and has changed the face of enterprise IT in much the same way PCs did when they were first introduced. Businesses can learn from their own IT history and put the cloud to work for them the way they did with PCs in the following three ways:

“Cloud computing has changed the face of enterprise IT.”

1) Embrace the freedom to build
One of the reasons PCs became so popular so quickly was because they offered employees the ability to build applications, which freed them from the practical constraints of the IT department. Each user was able to pursue his or her own ideas independently and follow the ones that would make the biggest difference to the company. Any technology that expands a user’s possibilities is unstoppable, and cloud enables the same freedom as PCs before it.

Before the cloud arrived, innovative employees who wanted to create a new application to improve operating procedures had to go through an endless series of steps to get approval before anything could move forward. Now the cloud puts a massive number of resources right at users’ fingertips, allowing them to create, test and distribute programs that may never have gotten made otherwise.

The cloud is poised to change the enterprise the same way PCs did in the ’80s.

2) Focus on the value of data
One of the biggest benefits PCs offered businesses in the ’80s and ’90s was the ability to gather and use data at a level previously unheard of. Now, the cloud offers businesses a similar opportunity. Not only can massive amounts of data be created through countless apps and services, but an even greater amount can be collected and analyzed through those same features to offer insights into business processes and operations.

As a recent Forbes article noted, “This changes the way IT practitioners and leaders need to think about IT. Now it’s not just about building and running data centers. It’s about marshaling tools and applications that acquire, transform, apply and protect the data that runs the organization.”

3) Recognize the power to disrupt
After PCs crashed onto the tech scene in the early ’80s, network storage systems followed closely behind. After that, PC technology moved into the data center and created even more innovations. PCs quickly became a dominant force in the data center, fundamentally changing the economics of how data centers were built and operated. Now cloud is here to usher in the next wave of data center disruption.

Cloud is poised to create a deep and lasting impact on the future of IT. Hybrid cloud especially is becoming a defining trend. The majority of enterprises around the world are already using multiple cloud environments for at least part of their IT workloads, changing the way people think about data.

The bottom line is that the cloud won’t be going anywhere anytime soon, and organizations would do well to look at the examples set by earlier disruptive technologies and apply them now to make the most out of their technological investments.

Increased use of technology causing changes in the enterprise

A recent study by management consulting and technology services firm Accenture revealed that a growing number of enterprise decision-makers are relying more heavily on technology to make changes in their business.

“90% of senior decision-makers expect technology to transform their companies.”

According to the report – which surveyed nearly 2,000 senior decision-makers in 15 countries – 90 percent of respondents expect digital technologies to transform their companies. At the same time, 87 percent of participants said their organization had made significant inroads to adopting digital technologies within the last 12 months.

While enterprise technology is useful for numerous reasons, one of the key drivers for adoption was the ability to increase mobility. When participants were asked which technologies their companies had already successfully adopted, nearly two-thirds responded with mobility.

A variety of benefits can be realized with the adoption of digital technologies, but one of the most widely reported was the creation of new revenue opportunities, with 48 percent of respondents experiencing that advantage. Another 46 percent reported faster time to market for products and services, and 45 percent said they were now able to provide more rapid responses to client demands.

As technology becomes a more essential part of the enterprise, organizations are shifting their focus.

Use of technology causing companies to restructure 
Businesses are beginning to completely restructure their departments to make the best use of technology possible, as well as to ensure the most beneficial decisions are being made. The report found that 83 percent of organizations have implemented a holistic strategy and a central team to oversee the implementation and management of new technologies. Another 80 percent of companies have appointed a chief digital officer to help with large-scale adoption and ensure the technology being employed is being used in the best way.

“The benefits of digital technology are not just being talked about anymore, but are being put into action as organizations are reshaping themselves to take advantage,” said Jim Bailey, global managing director for Accenture Mobility. “A vast majority of respondents said their business had made significant inroads in using digital technologies over the past year – to grow their client base and/or to enhance their overall enterprise efficiency – but acknowledged there is still some way to go.”

One of the most reliable ways for businesses to ensure a smooth and successful adoption of new technological infrastructure is to partner with a trusted service provider. An organization like ISG Technology is able to offer decades of industry experience to create a customized program that will work for each individual business. ISG enables companies to access the support and network capacity necessary for a successful deployment.

CIOs look to find a balance between tech innovations, enterprise security

With technology playing a much more integral part in the enterprise, the role of the CIO has become more complicated in recent years. A variety of factors that previously didn’t affect the position are now shaping everyday processes, and there is an increasing degree of change continuously facing IT staff. According to the 14th Annual State of the CIO survey conducted by CIO Magazine, 91 percent of CIOs say the role has gotten more challenging recently, and 74 percent say it is becoming increasingly difficult to find a balance between business innovation and operational excellence.

The rising frequency of data breaches has put a premium on strict security practices to protect critical infrastructure. But, at the same time, CIOs must be able to focus on just a few key priorities that will help to propel their organizations forward. In order to achieve this balance, there are a few main technology drivers that CIOs look to for guidance on IT priorities: cloud computing, big data analytics, enterprise mobility and data centers.

In many cases, the advantages of multiple areas are being combined to create solutions that benefit companies even more. Business continuity/disaster recovery and security will always be – or should always be, at least – a top priority for businesses, but innovations in cloud computing and data center design are helping to improve these processes by increasing overall security and enhancing recovery efforts so network intrusions cause as little disruption as possible.

Big data analytics and enterprise mobility are also teaming up to provide operational insights that were previously unavailable to most organizations. In the modern enterprise, data serves as a new form of currency, and the more information businesses can get out of their data, the richer they will become. Practically every company has some form of mobility or bring-your-own-device program by now, and many organizations also offer a mobile application for employees and clients to access information on the go. The data created through those programs is proving invaluable to enterprises hoping to learn more about their client base and streamline operating procedures.

Enterprises are experiencing numerous benefits with new technologies.

Tech innovations offer benefits to companies, but expertise is lacking
While these areas of IT are becoming the most important for many businesses, they are also some of the categories in which many CIOs are seeing skills shortages. According to the State of the CIO survey, big data, security and mobile technologies are three of the top five areas in which businesses are finding it difficult to find qualified candidates. The study also found that 56 percent of CIOs believe they will experience an IT skills shortage over the next year.

“ISG Technology offers expertise to help companies implement solutions right for them.”

In order to ensure they are able to experience the benefits of these technologies despite a lack of IT talent, many businesses are turning to third party service providers to receive the help they need. Organizations like ISG Technology offer expertise in data center management, security, enterprise mobility and cloud computing and can help companies implement solutions that are right for them quickly and conveniently.

New study finds Internet of Things continuing to expand

A new study recently released by Gartner has found that use of the Internet of Things is growing, and an increasing number of devices now have IoT capabilities.

According to the report, 4.9 billion connected things are expected to be in use next year, an increase of 30 percent from 2014. The number of IoT devices is believed to be on track to reach 25 billion by 2020. Gartner researchers estimated that total spending on services supported by the IoT will reach $70 billion in 2015 before rising dramatically to $263 billion in 2020.

Part of the reason connected devices have seen such dramatic growth recently is the powerful force the IoT has shown itself to be in terms of business transformation. The report discovered that while the increased number of connected things is being driven by consumer applications, enterprises will account for most of the revenue in the market.

"The number of connected intelligent devices will continue to grow exponentially, giving 'smart things' the ability to sense, interpret, communicate and negotiate, and effectively have a digital 'voice,'" said Steve Prentice, Gartner fellow and vice president. "CIOs must look for opportunities to create new services, usage scenarios and business models based on this growth."

Researchers also noted that traditional, mainstream products will start to be reinvented to include computing capabilities and provide them with a digital voice. The enhancement of objects once viewed as passive products will completely change their value propositions and create new services and business models. The study found that by 2020, the three industries with the highest level of IoT use will be utilities, manufacturing and government.

Security a major part of IoT expansion 
A major point touched on by the report is the security repercussions of the IoT, as dozens of new platform options are brought into enterprise digital security architecture. Increased use of the IoT will also bring new security standards to each industry individually and provide a new view of applications. These changes will cause IT leaders to create a more comprehensive technological approach to IoT risk and security going forward. According to the study, 20 percent of companies will have digital security services devoted to protecting business initiatives using IoT devices and services in the next two years.

"The IoT highlights the tight linkages between information security, information technology security, operational technology security and physical security like never before," a statement from Gartner noted. "Executives now face a decision regarding the future of security in their enterprise and who governs, manages and operates it."

New tests discover ‘no-wait data center’ technology

Researchers from the Massachusetts Institute of Technology recently announced that they have created what they are calling a 'no-wait data center'. According to ZDNet, the researchers were able to conduct experiments in which network transmission queue length was reduced by more than 99 percent. The technology, dubbed FastPass, will be fully explained in a paper being presented in August at a conference for the Association for Computing Machinery special interest group on data communication.

The MIT researchers were able to use one of Facebook's data centers to conduct testing, which showed reductions in latency that effectively eliminated normal request queues. The report states that even in heavy traffic, the latency of an average request dropped from 3.65 microseconds to just 0.23 microseconds.

While the system's increased speed is a benefit, the aim is not to use it for increased processing speeds, but to simplify applications and switches to shrink the amount of bandwidth needed to run a data center. Because of the minuscule queue length, researchers believe FastPass could be used in the construction of highly scalable, centralized systems to deliver faster, more efficient networking models at decreased costs.

Centralizing traffic flow to make quicker decisions
In current network models, packets spend a lot of their time waiting for switches to decide when each packet can move on to its destination, and the switches have to do so with limited information. Instead of this traditional decentralized model, FastPass works on a centralized system and utilizes an arbiter to make all routing decisions. This allows network traffic to be analyzed holistically and routing decisions to be made based on the information derived from the analysis. In testing, researchers found that a single eight-core arbiter was able to handle 2.2 terabytes of data per second.

The arbiter is able to file requests more quickly because it divides the processing power needed to calculate transmission timing among its cores. FastPass arranges workloads by time slot and assigns requests to the first available server, passing the rest of the work on to the next core, which follows the same process.

"You want to allocate for many time slots into the future, in parallel," explained Hari Balakrishnan, an MIT professor in electrical engineering and computer science. According to Balakrishnan, each core searches the entire list of transmission requests, picks one to assign and then modifies the list. All of the cores work on the same list simultaneously, efficiently working through the traffic.
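As a rough sketch of the idea only – the function and data structures below are invented for illustration and are not the MIT researchers' actual code, which uses a more sophisticated parallel matching algorithm – a centralized arbiter that assigns each transmission request the earliest time slot in which both its source and destination are free might look like this:

```python
# Toy illustration of centralized time-slot arbitration (not the real
# FastPass implementation). The arbiter books every request into the
# earliest slot where both endpoints are idle, so no packet ever has
# to wait in a switch queue for a busy link.

def arbitrate(requests):
    """requests: list of (src, dst) pairs.
    Returns a dict mapping (src, dst) -> assigned time slot."""
    busy = {}       # endpoint -> set of time slots already booked
    schedule = {}
    for src, dst in requests:
        t = 0
        # Find the first slot where both source and destination are free.
        while t in busy.get(src, set()) or t in busy.get(dst, set()):
            t += 1
        busy.setdefault(src, set()).add(t)
        busy.setdefault(dst, set()).add(t)
        schedule[(src, dst)] = t
    return schedule

if __name__ == "__main__":
    reqs = [("A", "B"), ("A", "C"), ("D", "B"), ("D", "C")]
    print(arbitrate(reqs))  # each endpoint used at most once per slot
```

Because every endpoint is booked into at most one transfer per time slot, packets never pile up in switch queues waiting for a contended link, which is the property the researchers exploit to collapse queue lengths.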

Arbiter provides benefits for all levels
Network architects will be able to use FastPass to make packets arrive on time and eliminate the need to overprovision data center links for traffic that can arrive in unpredictable bursts. Similarly, distributed application developers can benefit from the technology by using it to split up problems and send them to different servers around the network for answers.

"Developers struggle a lot with the variable latencies that current networks offer," said the report's co-author Jonathan Perry. "It's much easier to develop complex, distributed programs like the one Facebook implements."

While the technology's inventors admit that processing requests in such a manner seems counterintuitive, they were able to show that using the arbiter dramatically improved overall network performance even after the lag necessary for the cores to make scheduling decisions.

The FastPass software is planned to be released as open source code, but the MIT researchers warn that it is not production-ready as of yet. They believe that the technology will begin to be seen in data centers sometime in the next two years.

BYOD policies support majority of Americans who can't go 24 hours without their phone

A recent survey from Bank of America found that 96 percent of Americans between the ages of 18 and 24 consider mobile phones to be very important. While that may not be so surprising, only 90 percent of respondents in the same group said the same of deodorant. The report, which involved interviews with 1,000 adults who owned smartphones, found that the devices ranked as more important than almost anything else, including toothbrushes, television and coffee.

The survey also discovered that 35 percent of Americans check their smartphones constantly throughout the day. Forty-seven percent of respondents said they wouldn't be able to last an entire day without their mobile phone, and 13 percent went so far as to say they couldn't even last an hour.

As the Bank of America report shows, people are more attached to their devices than ever. Millennials are especially dependent on their phones and tablets, and they are also the group making up the biggest portion of new workers. Companies are increasingly able to benefit from implementing BYOD policies, as employees who have grown accustomed to a particular phone expect to be able to continue using it at work. Allowing workers to keep their own devices boosts productivity, since they aren't constantly checking a second phone, and improves employee satisfaction.