Data center construction increases, driven by demand for colocation services

As more companies shift to an increasingly digital business model, the demand for colocation services is growing. In turn, data center construction is set to increase at a steady rate in the coming years, according to a recent study from Research and Markets. With new, state-of-the-art infrastructure coming online and the industry gravitating toward large-scale data center deployments, companies may want to consider how these trends can simplify their own IT strategies.

According to the Research and Markets study, the global data center construction market is set to grow at a compound annual rate of 21 percent through 2018. This trend is largely being driven by the increasing challenge of managing a data center, as new demands in terms of efficient energy use, alternative power sources and industry regulations complicate the logistics of building and running an enterprise facility. Additionally, the growing complexity of network infrastructure is proving difficult for many companies to handle internally, prompting them to look for outsourced solutions.

In addition to new construction, many existing facilities are also being retrofitted with new server, power and cooling equipment to meet the demands of the contemporary tech landscape, ITBusinessEdge's Arthur Cole noted in a recent column. The result is a change in the profile of the average data center.

"Going forward, infrastructure will be leaner and meaner, but the individual pieces will be more powerful and flexible," Cole wrote. "And the [data centers themselves] will be fewer in number, but much, much bigger."

Rather than try to weather these changes themselves, companies may find it expedient to embrace the trend toward colocation and instead look for a trusted third-party data center provider. With the right partner, companies can position themselves to transition smoothly into the future.

Achieve IT savings with better data center management

IT departments face a wide variety of budgetary pressures, so finding more efficient ways to deliver the same services is a constant goal for technology staff. One of the biggest sources of inefficiency for many companies is the corporate data center, which can generate substantial costs, such as power and cooling, that have nothing to do with actual computing. Companies are increasingly looking for ways to make these operations more efficient and are turning to data center infrastructure management (DCIM) solutions as a result. Additionally, many businesses have found that by switching to a managed services provider for their data center, they can realize the benefits of such technology without the upfront costs and complexity.

A recent Navigant Research study found that the market for data center infrastructure management technology is expected to grow more than sixfold in the next six years as data center operators take advantage of new solutions that offer visibility into both key facilities metrics and server management variables. A separate study of one DCIM solution conducted by Forrester found that the return on investment in terms of power and space planning was 93 percent, while the ROI in terms of energy monitoring was 216 percent.
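For context, ROI percentages like those above follow from a simple ratio of net gain to cost. The figures in the sketch below are hypothetical, chosen only to illustrate how a number such as 216 percent is computed; they are not taken from the Forrester study.

```python
def roi_percent(total_benefit: float, total_cost: float) -> float:
    """Return on investment: net gain expressed as a percentage of cost."""
    return (total_benefit - total_cost) * 100 / total_cost

# Hypothetical example: a DCIM deployment costing $250,000 that yields
# $790,000 in monitored energy savings over the study period.
print(roi_percent(790_000, 250_000))  # → 216.0
```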

“DCIM – the software, systems, and services that monitor, measure, and help control data centers’ IT and facilities infrastructure – is quickly becoming a must-have technology for managers of modern data centers,” said Eric Woods, research director at Navigant Research.

Given the substantial savings companies can achieve by using state-of-the-art monitoring and management tools, they should look to leverage data center solutions that incorporate these technologies. Managed services providers should have granular insight into their facilities that enables them to create tangible operational savings and, in turn, pass those savings along to clients.

Developing the customized cloud in the data center

The past few years have seen a wholesale embrace of cloud storage and application hosting approaches, and companies are continuing to look for solutions that meet their evolving computing needs. Despite the rapid growth of cloud services, however, the majority of cloud deployments are still private, running in on-premises or managed data centers, according to VMware CEO Pat Gelsinger. While the move to public cloud services is set to continue as companies look for cheaper delivery models for certain services, this general reality is expected to stay consistent for the foreseeable future.

"On-premise cloud is a $2 trillion market … 92 percent of cloud is on-premise," Gelsinger said at the Cloud Factory conference in Banff, Alberta, according to VentureBeat. "And Gartner says that by 2020 it will still be 77 percent."

Companies are keeping their clouds on-premise for reasons tied to security, cost, government regulations and availability, Gelsinger added. However, there's another reason for the ongoing use of on-premise cloud: There are many ways to use the cloud, and not all of them are best suited for public cloud deployments. Companies can leverage virtual servers in a variety of ways, and the dominant model for new IT deployments is moving toward custom implementations on a per-use basis, ITBusinessEdge's Arthur Cole wrote in a recent column. Rather than following industry trends, companies are looking for the right way to meet their needs for individual applications, whether they have scale requirements that necessitate using the public cloud or regulatory demands that make it easier to keep data in a single colocation facility.

"If ever there was an example of a technology being all things to all people … it is the cloud," Cole wrote. He later added, "Cloud infrastructure, then, is likely to become as diverse as today's legacy environments, but with the added twist that it can be made and remade according to the needs of the moment."

As companies look for the best balance of cloud deployments, they can benefit from working with a managed services provider or IT consulting firm to determine the optimal data center infrastructure and virtual server architecture solutions to meet their needs.

Managed services equip companies to deal with changing cybersecurity landscape

Each year seems to bring a broader and more complex array of cyber threats to businesses, and many companies are struggling to keep up with the rapid pace of change. According to a recent survey from security software firm KnowBe4, more than half of IT managers – 51 percent – find security harder to maintain now than a year ago. Preventing cyberthreats and responding quickly to security issues are some of the biggest challenges for companies, which is why many are turning to managed services providers for a more secure infrastructure, as well as functions like malware removal and application support.

"Cybercriminals are constantly devising cunning new ways to trick users into clicking their phishing links or opening infected attachments," KnowBe4 CEO Stu Sjouwerman stated, adding that companies need to respond with thorough cybersecurity procedures, policies and training.

Another recent study from Solutionary and the NTT Group found that 54 percent of new malware goes undetected by antivirus software. As a result, companies need to make sure they are protected at the application level by using secure software and applying updates, ITBusinessEdge contributor Sue Poremba wrote in a recent column. Leveraging managed services for application support can help ensure software is kept updated and secured against threats, while external expertise can also be valuable in implementing state-of-the-art perimeter solutions and secure data center infrastructure.

Additionally, a managed services provider that offers malware removal can be a valuable partner in responding to and limiting the damage of an incident like an SQL injection attack, which the Solutionary study noted can easily cost a business $200,000 or more. Such protection might be unaffordable for a small business to implement in-house, but, by outsourcing certain IT management functions, companies can access state-of-the-art security solutions and industry-leading expertise. With the right portfolio of tools in place, a small business can defend against these ever-expanding threats.

ISG Expands Sales Coverage in Springfield, Missouri

Experienced Team Delivers Customized IT Solutions with Regional Network of Data Centers and Local Support 

ISG Technology has been helping Springfield, Missouri, businesses protect their operations and rein in the cost of growth since 1982. ISG experts deliver infrastructure, cloud services, bandwidth, IT as a service and unified communications, as well as disaster recovery and business continuity solutions. ISG clients have seen a reduction in IT sprawl, with corresponding cost savings in CAPEX, ongoing maintenance and energy costs, and costs associated with staff and real estate.

ISG recently hired two experienced salespeople to help local businesses solve their toughest IT challenges. Leslie Willcockson brings an extensive background in voice and data solutions, and Ryan Walker is an expert in desktop virtualization.

“ISG works hard to become a partner with our clients,” said Dustin Dasal, Regional Vice President of Sales for Springfield, Missouri. “Our full range of services helps clients manage, access, transport, store and secure their critical business data. Growing the Springfield team will enable existing clients to take advantage of our entire portfolio of services. We’ll also be able to introduce ourselves to local businesses that have challenges managing their computing environment.”

Leslie Willcockson has joined ISG Technology as an Account Executive. A native of Warsaw, MO, and a long-time resident of Springfield, Willcockson brings 11 years of experience selling customized voice and data services to a variety of clients, from single locations to large national enterprises.

“First, I learn about my client’s future needs and goals, and then develop a customized solution that grows with them,” said Willcockson. “The ISG portfolio of services delivers all the technology needed to create the right solution, whether built on a client’s own IT infrastructure or hosted in ISG’s regional network of data centers. I was also drawn to the integrity of the company, and appreciate its focus on local support.”

Ryan Walker, Senior Account Representative at ISG, grew up in Republic, MO and has close to 10 years of experience selling complex IT solutions, including data virtualization and infrastructure. “I wanted to work for an experienced IT company with a concentrated focus on service, and enjoy the ability to serve clients in my home market,” Walker said. “I work very closely with my clients—whether it’s a large business or small business—and deliver one-on-one attention. Several of my clients have told me they aren’t used to this level of service.”

Disaster recovery services, cybersecurity critical to protecting electric grid from attacks

Over the past few years, the utilities industry has made a concentrated effort to make key infrastructure "smarter." The integration of data-capturing devices and automated, software-based management systems has the potential to create smart electric grids that can more effectively use and distribute power, reducing energy costs and environmental impact in the process.

However, turning power grids into connected devices has potentially harrowing implications – a concentrated cyberattack could cause lengthy and widespread outages, not only withholding electricity from businesses and residences, but disrupting communications, healthcare systems and the economy. According to many cybersecurity researchers, such an attack is less a question of "if" than of "when."

Ramping up disaster recovery services and cybersecurity protocols is key to shielding the smart electric grid from a devastating attack. While the federal government tries to increase the efficacy and stringency of its own security measures, it's important that utility companies – from national generators to local distributors – build up their own prevention and backup systems, according to a recent white paper by the three co-chairs of the Bipartisan Policy Center's Electric Grid Cybersecurity Initiative. This effort will require a hybrid system that responds to both physical and cybersecurity threats. 

"Managing cybersecurity risks on the electric grid raises challenges unlike those in more traditional business IT networks and systems," the report stated. "[I]t will be necessary to resolve differences that remain between the frameworks that govern cyber attack response and traditional disaster response."

Disaster recovery efforts need to include digital backup systems as robust as their physical counterparts. Electric grids require reliable failover technology that can switch to a secondary backup network if the primary one is taken offline for any reason. As the Baker Institute pointed out in a recent Forbes article, the measure of a disaster recovery system's effectiveness is whether the grid can be restarted following a major breach, disruption or cyberattack. Without a system that can effectively monitor, prevent and immediately respond to such threats, the smart electric grid could put many key infrastructure systems in danger.
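The failover principle described above can be sketched in a few lines. This is an illustration of the routing logic only, not grid-control software, and every name in it is hypothetical:

```python
# Minimal failover sketch: carry traffic on whichever network passes its
# health check, preferring the primary. Names are hypothetical.

def choose_active_network(primary_healthy: bool, secondary_healthy: bool) -> str:
    """Select which network should carry traffic right now."""
    if primary_healthy:
        return "primary"
    if secondary_healthy:
        return "secondary"
    # Both networks down: escalate to the disaster recovery / restart plan.
    return "offline"

# Normal operation, primary outage, and total outage:
print(choose_active_network(True, True))    # → primary
print(choose_active_network(False, True))   # → secondary
print(choose_active_network(False, False))  # → offline
```

In a real deployment the health checks would be continuous probes, and the "offline" branch is exactly the case the Baker Institute's restart criterion addresses.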

Disaster-recovery-as-a-service market emphasizes changing priorities

Disaster recovery, once a relative afterthought or nonentity in business planning, is now a central consideration. Advanced threats and high-profile data breaches have helped to convince organizations that it's time to stop dragging their feet and start taking disaster recovery more seriously. The rapid rise of the market for disaster-recovery-as-a-service highlights an important shift in priorities.

According to TechNavio, the global market for DRaaS is expected to rise at a compound annual growth rate of 54.6 percent between 2014 and 2018. Demand for hybrid and cloud-based disaster recovery has driven investment, especially among small and medium-sized businesses, which have found that flying under the radar by virtue of their size is no longer a viable way to avoid the consequences of information security compromises.

Larger organizations have also realized that IT departments are generally unable to maintain complete oversight and disaster recovery protection amid data deluges and rapid network expansion. To cite one sector, the banking industry has begun to invest heavily in the cloud to relieve the amount of resources it has to spend on application updates, software patches and IT infrastructure, according to Bank Systems & Technology.

The report did note that relying too much on a generic cloud solution or paying insufficient attention to backup data could diminish the effectiveness of DRaaS investment. A company is better served by using a multi-service provider that focuses on customization, specificity and addressing pain points. This way, it can avoid any data integrity or governance issues stemming from a lackluster vendor. 

Getting cloud storage services at the right price

Cloud storage services offer organizations peace of mind by providing a secure location to store and back up data. But what happens when the company needs to recover it on a short timetable? Some cloud providers offer an easy road to retrieving data and resetting environments following an incident, but others may have hidden their helpful services behind a dense thicket of extra fees. It’s important to know how to avoid cloud storage pricing models that aren’t as cost-effective as they look on paper.

Although it’s not clear whether cloud storage price cutting is an effective means of winning clients, many cloud providers do it anyway, wrote TechTarget contributor Sonia Lelii. When selecting a vendor, it’s important to discern whether it is slashing services along with prices. Don’t make a decision based on price alone. Network infrastructure, security and the cloud interface are important features that should not take a backseat to a price quote. Be wary: a low upfront cost could be masking higher expenses during a time of need.

Looking for flexibility is another way to maximize the value of a cloud storage services investment, wrote Enterprise Storage Forum contributor Drew Robb. A fee structure that locks in prices for regular, predictable needs and adds clearly outlined services that can be purchased on demand offers better value than one that fixes usage prices based solely on service offerings.
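As a back-of-the-envelope comparison of the two fee structures, one can model a month's bill under a single flat rate versus a hybrid of locked-in base capacity plus on-demand overage. All rates and usage figures below are hypothetical, for illustration only:

```python
def flat_cost(used_tb: float, rate_per_tb: float) -> float:
    """Pay a single per-TB rate for everything used."""
    return used_tb * rate_per_tb

def hybrid_cost(used_tb: float, reserved_tb: float,
                reserved_rate: float, on_demand_rate: float) -> float:
    """Pay a locked-in rate for reserved capacity, a higher rate for overage."""
    overage = max(0, used_tb - reserved_tb)
    return reserved_tb * reserved_rate + overage * on_demand_rate

# Hypothetical rates: $25/TB flat, vs. 40 TB reserved at $18/TB
# plus $30/TB for anything on demand beyond the reserved block.
print(flat_cost(50, 25))            # → 1250
print(hybrid_cost(50, 40, 18, 30))  # → 1020
```

Under these assumed numbers the hybrid structure wins even with a 10 TB overage, because the predictable baseline is billed at the cheaper locked-in rate; the crossover point depends entirely on how spiky a company's actual usage is.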

Why it's time to invest in a third-party data center

Rising data center complexity means that for many companies, continuing to support an in-house data center is rapidly becoming unsustainable. Spending on data center infrastructure management services and software is expected to top $4.5 billion by 2020, according to Navigant Research, and many organizations are already feeling the pinch on a local level.

Organizations that want to maintain an on-premises facility will likely follow one of two paths: The first is to enter a cycle of massive spending on everything – new equipment, management tools, software upgrades and IT staff – to handle an otherwise bloated IT environment. The other is to keep costs down by restricting expansion, which can create significant problems for continuity, security and the company's competitive advantages. Neither situation is desirable, as both carry a fairly high potential for confusion, mismanagement and far-reaching operability issues.

Outsourcing data center needs, either through colocation or managed servers, enables an organization to gain control of its data center spending while retaining access to top storage, network and security technologies. It also offers companies scalability, which is incredibly difficult or highly expensive to come by in the on-premises facility. As an organization requires additional server space, network bandwidth or software support, it can work with the data center provider to increase available capacity. This way, it doesn't pay for what it doesn't use, and can depend on complete IT support for every aspect of its investment.

Data Center Journal contributor Yuri Rabover compared continued on-site data center development to reorganizing one's garage: Although space is at a premium, the realities of equipment size and non-expert human planning likely mean that it won't be used as effectively. Instead of playing a never-ending game of data center Tetris, hemorrhaging resources at every turn, it makes sense to eliminate an otherwise ongoing problem with a decisive, conclusive act. Outsourcing data center infrastructure and its management helps enterprises avoid the spending sprees that can curtail their competitive advantage.
