Posts

4 trends for the data center in 2015

Data centers have become increasingly important to enterprise operations in recent years, and as a result the technologies and processes that run them have evolved rapidly. This year will be the biggest one yet for the data center industry, with more demand for its services than ever before. Below are four of the biggest trends poised to change the industry in 2015:
1) Increase in virtualization
This IT constant has been driving change in the data center for years, and as more organizations come to understand its benefits, virtualization will become an even more prominent fixture. It offers improved testing, faster redeployments and simpler backups, along with a whole host of other advantages. Its use is a relatively new development, and supporting technologies are emerging all the time. One such improvement, the virtual storage area network, makes the data center more flexible while increasing automation. Similar innovations will start to appear more rapidly, continually reshaping data center operations.

2) AI finds the data center
A growing number of companies are making strides in artificial intelligence and machine learning, and the next logical step for such technology is the data center. Google announced last year that it had started using machine learning via neural networks to optimize its computing facilities. In Google's case, AI was used mainly to manage and optimize data center operations concerning IT load, temperature and the effectiveness of its cooling equipment, allowing the company to achieve greater energy efficiency.

Big changes are coming to the data center in 2015.

3) Shift to IPv6
As use of the Internet has increased dramatically over the last decade, the pool of addresses available under the current version of the Internet Protocol, IPv4, has been almost entirely exhausted, and routing tables have grown too large, creating serious problems. IPv6 is intended to replace the existing protocol and alleviate these issues, though large-scale adoption may still be a ways away.

Though it hasn't arrived globally yet, IPv6 will have a major impact on the data center. IPv6 addresses are made up of eight colon-separated groups of four hexadecimal digits. This structure opens up a massive new set of possible IP addresses, though the new protocol will also need to coexist with IPv4 during the transition. Not only will the new version offer more addresses, it will also increase efficiency, improve security and support new services. U.S. adoption is currently around 14.5 percent, but that number will continue to grow in the coming years, and data centers have to be ready.
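As a quick illustration of that structure, Python's standard ipaddress module can show both the full eight-group form and the common compressed form of an IPv6 address (the address below comes from the 2001:db8::/32 range reserved for documentation examples, not a real host):

```python
import ipaddress

# 2001:db8::/32 is reserved for documentation (RFC 3849)
addr = ipaddress.IPv6Address("2001:db8::8a2e:370:7334")

# Full form: eight colon-separated groups of four hexadecimal digits
print(addr.exploded)    # 2001:0db8:0000:0000:0000:8a2e:0370:7334

# Compressed form: leading zeros dropped, one run of zero groups collapsed to "::"
print(addr.compressed)  # 2001:db8::8a2e:370:7334

# IPv6 addresses are 128 bits long, versus IPv4's 32 bits
print(addr.max_prefixlen)  # 128
```

The jump from 32 to 128 bits is what opens up the "massive new set" of addresses: 2^128 possibilities instead of IPv4's roughly 4.3 billion.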

4) More companies adopt cloud solutions
Hybrid cloud seems to be the enterprise trend of 2015, with a rising number of organizations transitioning to a mixed cloud environment. A hybrid solution allows enterprises to mix security with performance, bringing in those who were once skeptical about the ability of the technology to handle all aspects of their business' workload. As more companies migrate to the cloud, data centers will have to be prepared to handle the increase in demand.

Changes in store for enterprise cloud in 2015

While 2014 was a big year for technology in general, cloud computing in particular saw multiple advances last year. Major tech companies enhanced their public cloud offerings, and investment in enterprise cloud solutions increased dramatically. Gartner has named cloud computing one of the top 10 strategic trends for 2015, and as the cloud becomes an even more prominent fixture in the business world, more big changes are in store for the technology this year.

Increased use of hybrid cloud
One of the biggest changes coming for enterprise cloud storage services is a more rapid shift from either public or private environments to a mixture of both. Many organizations are realizing that a single infrastructure is not sufficient to meet their availability and security needs at once. By employing a hybrid platform, companies can increase access for less secure applications by hosting them in a public environment, while more business-critical programs are kept secure in a private infrastructure.

CIOs already ramped up their adoption of hybrid cloud platforms in 2014, and experts expect this trend to continue into this year. The number of organizations shifting to a hybrid solution will rapidly increase as they realize the benefits of utilizing a platform that allows them to achieve a properly managed and governed IT portfolio.

Rise of hybrid cloud management
With adoption of hybrid cloud environments growing at a dramatic rate, use of a management platform to handle the new infrastructure is bound to follow. The ability to maintain governance over business-critical data and increase compliance within an organization will become a major selling point in 2015, leading to a rise in the use of hybrid cloud management platforms.

There are a lot of changes happening in cloud computing, and it can be difficult to make the transition without a little help. When looking to implement any type of cloud infrastructure, the most important step is finding a reliable service provider to make the transition as simple and seamless as possible. ISG Technology offers the experience and knowledge required to create the right environment for every client, as well as unmatched service to keep it running at prime capacity. Working with a trusted partner like ISG can make the process simpler and more straightforward, ensuring reliable access and service that will improve business.

Cloud computing market to increase through 2016

While many new innovative technologies are surely going to emerge in 2015, one proven technology is poised to become even more prominent in the enterprise this year. According to a recent report by Market Monitor, revenue from the cloud computing market is expected to continue increasing throughout the next two years, likely resulting in a value of just under $20 billion by the end of 2016.

The study projected that the cloud computing market will rise at a compound annual growth rate of approximately 36 percent, and individual segments of the market were also analyzed. The largest cloud computing offering is infrastructure-as-a-service, with IaaS making up the majority of total market revenue and more than half of the total market share for public cloud. Annual growth for IaaS offerings is expected to be even higher than the overall market, at 37 percent. However, platform-as-a-service is predicted to be the fastest-growing segment of the cloud market, with a projected annual growth rate of 41 percent between 2012 and 2016. PaaS offerings accounted for almost one-quarter of total public cloud revenue.
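For a sense of how such projections work, compound annual growth simply multiplies a base-year figure by (1 + rate) for each year elapsed. The sketch below uses the report's roughly 36 percent overall rate and the approximately $20 billion 2016 endpoint; the base-year figure here is back-calculated for illustration and is not a number from the report:

```python
def project(value_0: float, rate: float, years: int) -> float:
    """Compound annual growth: value_n = value_0 * (1 + rate) ** years."""
    return value_0 * (1 + rate) ** years

end_2016 = 20.0  # approximate 2016 market value, $ billions
rate = 0.36      # compound annual growth rate

# Back out the implied base two years earlier (2014 -> 2016);
# this is an inference for illustration, not a reported figure.
implied_2014_base = end_2016 / (1 + rate) ** 2
print(f"Implied 2014 base: ${implied_2014_base:.1f}B")

# Projecting that base forward at 36% recovers the ~$20B endpoint
print(f"Projected 2016:    ${project(implied_2014_base, rate, 2):.1f}B")
```

The same formula explains why the 41 percent PaaS segment, despite being smaller in absolute terms, compounds into the fastest-growing slice of the market.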

"Cloud computing is on the upswing and demand for public cloud services remains strong," stated Yulitza Peraza, a quantitative services analyst with 451 Research and co-author of the report. "However, public cloud adoption continues to face hurdles including security concerns, transparency and trust issues, workload readiness and internal non-IT-related organizational issues."

Because of the security concerns associated with solely public cloud environments, many organizations have started to adopt hybrid solutions. A mix of public and private platforms, hybrid cloud allows enterprises to experience the accessibility and availability of a public environment for less sensitive applications and the security of a private infrastructure for programs that are more business-critical.

Moving to the cloud? Better have a strategy in place

As an increasing number of organizations begin to transition from legacy systems to those hosted in the cloud, there is a growing trend of CIOs being left out in the cold when it comes to deployment. IT departments are shrinking within enterprises across every industry, creating a shortage of skilled workers with the ability to implement effective cloud programs while still keeping information secure. While the cloud should be employed across all business units, security and compliance must remain primary concerns, and that can't happen without knowledgeable decision-makers involved.

For organizations lacking the manpower to do all the heavy lifting themselves, deploying resources and applications on an "as-a-service" basis allows CIOs to create the cloud strategy that best suits their needs while leaving the details to a trusted service provider. Before utilizing Infrastructure-as-a-Service, however, companies need to make sure they have a strong cloud sourcing strategy in place that ensures they will receive the best user experience available and be able to respond quickly and effectively to changing market conditions.

Know what you're working with
It's important for IT decision-makers to know where they stand with their existing infrastructure before they can implement a new one. Conducting an inventory of a company's cloud consumption allows CIOs to grasp which services are working and which aren't, as well as uncover any instances of shadow IT. According to Cloud Tweaks contributor Nick Earle, many companies that complete an audit of their cloud use find that unauthorized cloud application use is 10 times higher than they expected. Shadow IT presents a major security risk for enterprises, as a large portion of the data stored within those applications is unencrypted and not password-protected.

Examining which applications were used without IT input can help decision-makers understand what employees are looking for from the company's cloud strategy and allow for more informed adoption with the new strategy. It also presents an opportunity for CIOs to explain the security risks of utilizing unauthorized programs, creating a learning experience for the entire company.

Look ahead
Successfully deploying a cloud infrastructure doesn't just entail creating an environment that addresses an enterprise's immediate needs; it also requires provisions for future changes. Currently, the Internet of Things is causing quite a stir among IT departments as companies struggle to integrate this growing technology into their existing infrastructure. By 2020, more than 50 billion devices are expected to be connected to the IoT, posing a problem for organizations without a flexible strategy in place. No one knows what the future holds for business technology, but those who can't adapt quickly to new changes will be left behind by competitors that can, so an agile IT environment is essential to retaining competitive advantage.

Focus on flexibility
One of the biggest mistakes enterprises make when employing a cloud infrastructure is utilizing a platform that doesn't fit all the needs of the business. Many focus either on a private environment that offers the security necessary to keep data safe but isn't agile enough to support rapidly changing business segments, or on a public solution that offers the access needed but leaves information unprotected. To combat this problem, CIOs should focus on a flexible infrastructure that can handle both tasks with ease.

By partnering with a trusted third-party service provider, companies can create a customized infrastructure that works for them. Innovative as-a-service options enable CIOs and other IT decision-makers to control the enterprise environment while still being able to access the necessary flexibility to move business forward.

Preparing for changes to BYOD in 2015

By now, the majority of enterprises have put in place some form of bring-your-own-device policy. Employees have been somewhat slow to take advantage of BYOD options, with only one-third currently using their own smartphones in the office, but that number is expected to increase to 60 percent in the next five years, according to information from Gartner. As BYOD gains more traction in the enterprise, companies should begin to look at how such policies can change the way they do business in the new year. David Willis, a vice president at Gartner, noted that BYOD can bring flexibility to an organization, but it can also bring a variety of new concerns, so IT decision-makers should be prepared for a shifting mobile landscape.

Enterprise mobility management 
One of the major changes happening to enterprise mobility is the integration of information and application management. Traditionally, mobile device management treated application management and information management as separate disciplines. Recently, however, these two categories have converged into a single realm: enterprise mobility management. EMM emerged in 2014 and will continue to flourish throughout 2015. Having all of an organization's enterprise management needs covered under one system will provide multiple benefits for internal IT staff, but it will also begin to blur the line between information security and application security, requiring a new, more integrated approach to both.

BYOD as a recruitment tool
Another big shift that businesses will start to experience is the recruitment and retention potential of BYOD. A recent survey conducted by Samsung found that companies have extended BYOD options to 80 percent of support and line-level employees and 94 percent of non-executive managers. Mobility and flexible working options are becoming increasingly important to applicants, and organizations that offer comprehensive programs will have a bigger advantage when it comes to attracting the best candidates for the job. With the ability to telecommute, enterprises can recruit the best applicants no matter where they're located.

Bring-your-own-data
Another trend that has just started to emerge is the idea of bring-your-own-data. As enterprise mobile device policies have evolved, a rising number of programs are looking to leverage the personal information of employees to gain insights. According to information from Gartner, nearly one-third of all BYOD policies will utilize employee data, applications and social connections for business purposes by 2016. Workers have become very comfortable sharing their personal information with companies like Google and Facebook, so it was only a matter of time before organizations took advantage of the data sitting right under their noses.

Hotel group applies to FCC for ability to block Wi-Fi signals

As mobile devices have become an increasingly important part of our daily lives and the Internet makes us feel more connected than ever, the ability to connect to Wi-Fi while away from home is a necessity for the average traveler. A variety of devices allow users to set up their own Wi-Fi networks wherever they’d like, providing constant, free access anywhere they go. This may not be the case in some hotel chains soon, however.
A variety of hotels, including Marriott International Inc., are appealing to the Federal Communications Commission for permission to block the signals created by personal Wi-Fi devices so guests would have to use the in-house network inside conference halls and meeting spaces. While the group insists it wants to block other networks in order to prevent criminals from tricking visitors into using phony networks that mimic the hotel Wi-Fi, many opponents are crying foul.

Those who are against the proposal believe the move is intended to force guests to use the hotels’ networks and, in most cases, pay the steep fees that come along with them.

“If a client arrives at a hotel with her own Mi-Fi device, and the hotel interferes with the client’s connection to that personal hotspot, the hotel can effectively force the client to purchase the hotel’s WiFi services to gain access, even though the client has already paid her mobile operator for personal hotspot capability,” said officials with Microsoft.

Hotels argue they should be able to block Wi-Fi signals to help security

Deciding between public and private airwaves
While the airwaves set aside for television companies and cellphone service providers belong to specific organizations and require licenses to operate, Wi-Fi networks run on unlicensed frequencies that are meant to be available to anyone, like those used by garage door openers and baby monitors.

A law enforced by the FCC makes it very clear that no one is allowed to “willfully or maliciously” interfere with “any radio communications of any station licensed or authorized” by the government. Therefore, devices like signal jammers are strictly forbidden by the agency. The group of hotels argues that the law preventing the use of jammers should not apply to Wi-Fi because it doesn’t operate on a licensed spectrum. Furthermore, the group argues that a hotel jamming a signal is not maliciously interfering as it is attempting to “monitor and mitigate threats to the security and reliability of its network,” according to an FCC filing.

Cloud computing trends in 2015

According to a recent report by Gartner, cloud computing is one of the top 10 strategic technology trends for 2015. A strategic technology trend has "the potential for significant impact on the organization in the next three years."

David Cearley, vice president and Gartner Fellow, said there are three major themes within the tech trends for the new year: 

  • the merger of real and virtual worlds
  • technological impact of the digital business shift
  • the concept of intelligence everywhere

The last theme is related to a variety of the trends predicted for 2015, including computing everywhere, smart machine learning and the Internet of Things. Cloud computing is a key element of all of those technologies.

The Gartner study suggested that in 2015, companies will begin to focus on promoting applications that are centrally coordinated and can port across multiple devices.

"Cloud is the new style of elastically scalable, self-service computing, and both internal applications and external applications will be built on this new style," said Cearley. "While network and bandwidth costs may continue to favor apps that use the intelligence and storage of the client device effectively, coordination and management will be based in the cloud."

The large scale shift to the cloud is undeniable, and many industry experts believe the technology will only become more popular in the new year. But what kinds of changes and improvements will be seen with the cloud in 2015? InformationWeek editor-at-large Charles Babcock made some predictions about what the next 12 months have in store for cloud computing. Below are some of the top predictions:

Enterprises will move an increased number of workloads to the cloud
While much has been made in the past about enterprise cloud adoption, it has mostly been talk up until this point. Now, companies are starting to actually implement the technology, and this trend will continue into 2015. A recent study by IDG Enterprise revealed that 69 percent of organizations currently have at least part of their IT infrastructure hosted in the cloud, and investments in the technology have increased 19 percent in the last two years.

Software-defined security will become the new norm
With the rising success of software-defined networking technology, a growing number of providers are beginning to offer software-defined security. Babcock suggested that this method will become a bigger part of the software-defined data center and will be used to protect workloads in the cloud.

"In the software-defined data center, software mapping systems identify system perimeters and feed intelligence into a central monitoring system," Babcock explained. "That mapping capability must be extended to define the permissions and activities allowed to the software system, with a surveillance agent ensuring that it adheres to only those activities. Any exceptions must trigger an inspection and potential intervention."

Greater use of public cloud infrastructures in business
As cloud environments have gotten safer, a rising number of CIOs have started using both public and private clouds in order to give business teams the tools they need to succeed quickly. While most organizations will likely always employ a private cloud environment for at least a portion of their businesses, many are starting to see the benefits of utilizing public platforms for less sensitive systems and processes.

IoT and big data platforms increase cloud use
As the Gartner study suggested earlier, the IoT will become even more popular in the new year; as it does, an increasing number of organizations will utilize the information created by the connected devices to benefit big data initiatives. Such a massive amount of data will have to be stored somewhere, and the cloud will see a major boost in enterprise investment as companies deploy Hadoop and other big data programs.

Protect against data center downtime with cloud disaster recovery

As technology becomes a bigger part of people's everyday lives, the systems and facilities that house the data we need are more important than ever. So when these things experience problems that cause downtime, the effects can be catastrophic.

Such was the case last week when the generator room of a Maryland State Police building had a small fire and caused an outage of the force's data center. According to the Baltimore Sun, the downtime denied state troopers access to central crime databases and caused the state police website to be offline for an afternoon. In an interview with the Sun, a police spokeswoman said that the outage made it impossible for police officers around the state to access shared documents, making it difficult for work to continue as normal.

The fire broke out while the data center was operating on generator power during a planned IT maintenance period. The blaze in the generator room triggered the sprinklers, and the water shut the generator down.

While fires are rare, they do still occur in data centers around the world. In 2013, Michigan's Macomb County lost IT services after a fire at its data center facility. More recently, a Samsung data center in South Korea experienced a fire in April that affected access to the network used by Samsung device users across the globe.

Severe weather biggest data center threat
Instead of a fire, most companies are more likely to experience data center connectivity problems due to severe weather or unreliable power supplies. Whatever the cause, though, downtime is damaging if a response plan isn't in place ahead of time. According to a recent federal IT survey by SolarWinds, more than 20 percent of participants admitted their organization didn't have a disaster preparedness strategy.

In order to avoid the downtime and mitigation costs that come along with a data center disruption, it is important for organizations to back up their information at a remote site or in the cloud to maintain access during an outage. Utilizing the cloud as a disaster recovery solution enables companies to access remote network management. This allows IT managers to remotely manage and fix the problems affecting a network so they don't have to brave the same storm that knocked out the data center in the first place.
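The first half of any such backup plan is producing a backup artifact that can be verified after it reaches the remote site. The sketch below is a minimal illustration of that step, not a complete DR solution: it creates a timestamped archive of a directory plus a checksum, and the upload step is deliberately omitted because it depends on the cloud provider's own tooling.

```python
import hashlib
import tarfile
import time
from pathlib import Path

def make_backup(source_dir: str, dest_dir: str) -> tuple[Path, str]:
    """Create a timestamped tar.gz of source_dir; return the archive path and its SHA-256."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = Path(dest_dir) / f"backup-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        # Store the directory under its own name so a restore recreates it cleanly
        tar.add(source_dir, arcname=Path(source_dir).name)
    # The checksum lets the remote copy be verified after transfer and before restore
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    return archive, digest
```

In a real disaster recovery workflow, the archive and its digest would then be shipped to the remote site or cloud object store and the digest re-checked on the other side, so an outage never forces a restore from a corrupted copy.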

Maintain winter business continuity with UC solutions

As the new year fast approaches, it will bring with it colder weather and snowier conditions. While many love some snow in the wintertime, it can spell trouble for commuters and make business continuity a major struggle. For example, according to a recent report by telecom provider Daisy Group, an estimated 3 million workers in the U.K. are prevented from completing their normal work duties each year due to adverse weather conditions.

With employees stuck in the house and clients unable to fly in from other cities, maintaining an easy and continuous method of contact is necessary to keep normal business operations running smoothly.

“Businesses [must] be ready to respond in any imaginable scenario, even when your systems get knocked offline or your staff is unable to get to the office due to inclement weather,” said telecom expert Lindsay Kintner in a blog post. “The good news is that by taking proactive steps and putting a comprehensive [plan] in place prior to disaster striking, you ensure that your business is able to handle all client concerns.”

Beat the weather with UC solutions
While no one can control the weather, enterprises can control their responses to it. Modern companies operate in an environment that increasingly relies on IP-based tools, and those tools are the best defense against the unpredictability winter brings. Implementing unified communications tools within an organization gives workers access to a business communication solution that integrates voice, email, video, instant messaging and presence into a single interface. This way, work can continue from anywhere, whether in the office or snowed in at home.

With cloud-based communications, employees are able to collaborate on necessary projects and receive responses from one another in real time. Conferencing tools allow large groups to gather together from multiple locations and discuss topics just as they would in an office, meaning no time is wasted and business isn’t halted.

Client inquiries can also be easily handled with the use of a UC suite. Calls to an office phone can easily be re-routed to a mobile device with distributed workforce functionality, meaning a client call will never be missed because an employee is working remotely. This attention to detail and level of service helps to increase client satisfaction and brand loyalty.

“In today’s ultra-competitive business world, you can’t afford to compromise on the caliber of the client service you deliver,” said Kintner. “By planning ahead and leveraging modern communications tools, you guarantee that you’re able to respond to your clients immediately, even when the weather interferes.”

VoLTE service to increase in 2015

As 2014 comes to a close, it’s time to take a look into what the new year may hold. One trend that will likely earn a much larger foothold in the telecom industry during 2015 is voice-over-LTE. According to Network World contributor Larry Hettick, a majority of cloud providers experienced double-digit growth in their IP telephony and unified communications portfolios this year. This growth is expected to continue as an increasing number of providers offer the ability to connect to 4G LTE networks.

In a recently released report, industry market research firm Visiongain said it expected there to be more than 101 million active VoLTE subscribers around the world by the end of 2015.

Major carriers like AT&T and Verizon are working on plans for more widespread adoption of VoLTE in 2015 in the hopes of retiring their 3G voice networks. Employing LTE service for Internet-based calling provides users with high-definition voice and enhanced interoperability with collaborative media such as video and conferencing.

VoLTE improves call quality, efficiency 
According to PC Magazine contributor Sascha Segan, VoLTE service bumps up the quality of an incoming call from an 8kbps codec to a 13kbps codec that uses more modern compression methods. The new service offers operators the ability to optimize their spectrum efficiency, utilize their IMS infrastructure to the fullest extent and add value to their existing voice plans. Industry analysts have suggested that using LTE networks for voice services could help carriers achieve 40 percent more spectral efficiency compared to legacy systems.

Both Verizon and AT&T are hoping to replace their legacy switch networks with ones that are entirely IP-based in the coming years. AT&T is looking to retire its traditional infrastructure by 2020, giving carriers a long timeline to make the switch before being left out in the cold.

Verizon and AT&T are also partnering to offer users interoperability between the two carriers, so any call between the two will remain on VoLTE the entire time.

“Interoperability of VoLTE between wireless carriers is crucial to a positive client experience,” said Krish Prabhu, president of AT&T Labs and chief technology officer at AT&T. “Clients expect to be able to connect anywhere, anytime – and as LTE technology continues to evolve, it’s imperative that we provide a seamless experience between carriers.”

With 2015 just around the corner, VoLTE service will become more prominent in the industry as users discover the benefits it has to offer.