Posts

Cloud infrastructure market showing steady growth

The cloud Infrastructure-as-a-Service market is growing at an accelerated rate, with providers bringing in increased revenue, according to IT analyst firm Gartner.

A recent Gartner report found that global spending on cloud IaaS solutions will reach almost $16.5 billion in 2015, an increase of more than 32 percent from last year. As more businesses move an increasing number of workloads to the cloud, the market is expected to grow at a compound annual growth rate of 29 percent through 2019.

"10% of CIOs consider cloud IaaS their default infrastructure option."

Last year the absolute growth of public IaaS workloads surpassed on-premises workload growth of any type for the first time, the Gartner report revealed. According to a survey of CIOs conducted by Gartner, cloud IaaS is considered an infrastructure option by 83 percent of CIOs and 10 percent already deem it their default choice.

This growth in the IaaS market is also causing a consolidation of service providers, according to Gartner vice president and analyst Lydia Leong. The market is rapidly coalescing around a small number of trusted service providers, so IT buyers will need to select their vendors carefully.

"We urge buyers to be extremely cautious when selecting providers; ask specific and detailed questions about the provider's roadmap for the service, and seek contractual commitments that do not permit the provider to modify substantially or to discontinue the offering without at least 12 months' notice," said Leong.

The cloud IaaS market is growing and providers are consolidating.

IaaS proves a versatile tool
Cloud IaaS solutions can be put to work for practically any use case that can reasonably be hosted on virtual servers, but the most common are development and testing environments, high-performance computing and batch processing, Web-based apps and non-critical internal business applications. Gartner suggested that businesses adopting a cloud IaaS solution operate in two essential modes, otherwise known as bimodal IT. This allows them to keep sight of what is needed to maintain IT operations while at the same time innovating with new, digital possibilities.

"Cloud IaaS can now be used to run most workloads, although not every provider can run every type of workload well," said Leong. "Cloud IaaS is not a commodity. Providers vary significantly in their features, performance, cost and business terms. Although in theory, cloud IaaS has very little lock-in, in truth, cloud IaaS is not merely a matter of hardware rental, but an entire data center ecosystem as a service. The more you use its management capabilities, the more value you will receive from the offering, but the more you will be tied to that particular service offering."

When first starting out, most organizations deploy cloud IaaS for Mode 2: agile IT projects that may be on the periphery of the organization's IT needs but can still have a major impact on the business. As the company becomes more comfortable with IaaS over time, some organizations may choose to use it in Mode 1, for traditional IT projects.

As time goes on, many enterprises, especially those in the mid-market, will likely migrate away from operating their own computing facilities, instead hosting their workloads in data centers run by service providers and relying primarily on cloud infrastructure.

Basic tips to avoid cybercrime


With people using the Internet for banking, shopping, socializing and everything in between, the risk of falling victim to a cybercrime scheme has never been more real. In fact, a CNNMoney report found that 110 million American adults were hacked in 2014 alone. Considering this number is 47 percent of the U.S. adult population, it’s obvious that online security is just as much of a concern for people as physical security.

And while these numbers are frightening in their own right, an even scarier thought is how drastically unprepared some businesses are for cyberattacks. A McAfee report found that about 90 percent of small-to-medium-sized businesses don’t use data protection of any kind for themselves or their clients.

Statistics like these make smaller businesses seem risky to deal with, which is why every small business should make cybersecurity a top priority. To that end, the following list of basic online practices has been compiled to make navigating the Internet safer.

Don’t click on something unless you are 100 percent sure what it is: This may seem obvious, but many people don’t fully understand how risky it is to click a link they aren’t absolutely sure about. Frank Heidt, CEO of Leviathan Security Group, gave a TED talk on this very topic. In the speech, he stated that the easiest way to hack someone in a particular company is through their loved ones.

A CEO’s computer will have very specific security protocols, but his child’s computer probably won’t. All a hacker has to do is gain access to the child’s computer and then send the CEO a message using the child’s email address. This email, which will look identical to any other sent by the child, will contain a piece of malware in the form of a seemingly harmless clickable link. Once clicked, this virus will run through the CEO’s computer and will eventually require malware removal across the entire company’s network. The takeaway from this is that no part of your online experience is 100 percent safe, and any and all links should remain suspect.

Update your software: With the fast-paced nature of modern technologies, keeping software up-to-date can be extremely hard for some smaller companies. And even though your business might be getting along just fine with Windows XP, for example, you’re actually putting yourself in danger.

On April 8, 2014, Microsoft ended support for Windows XP, meaning it would no longer put out updates for the operating system. Any security holes in the software that need patching will not be fixed, and hackers will be free to exploit them. Having up-to-date software not only gives clients a better experience with the company, but it also makes sure their data is better protected.

Keep passwords complex: Although most people know to keep passwords hard to crack, few know how truly important it is to keep passwords complex. A Bloomberg study about password complexity showed how making a few tiny changes could drastically change a hacker’s ability to access your computer. The article stated that a six-character, all-lowercase password takes a hacker’s computer about 10 minutes to crack on average. However, if you were to add an uppercase letter and a number/symbol to that password, the time to crack it jumps to 463 years.
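To make the arithmetic concrete, the sketch below compares keyspace sizes under a simple brute-force model. The 500,000-guesses-per-second rate is an illustrative assumption, not a figure from the Bloomberg study; real attack speeds depend heavily on hardware and how passwords are hashed, so exact crack times vary enormously.

```python
# Rough brute-force cost model: keyspace size divided by an assumed
# guessing rate. The rate below is an illustrative assumption only.

def keyspace(alphabet_size: int, length: int) -> int:
    """Number of possible passwords of a given length."""
    return alphabet_size ** length

def crack_seconds(alphabet_size: int, length: int,
                  guesses_per_second: float = 500_000) -> float:
    """Worst-case time to exhaust the keyspace at the assumed rate."""
    return keyspace(alphabet_size, length) / guesses_per_second

# Six lowercase letters: 26^6, about 309 million possibilities --
# roughly ten minutes at the assumed rate.
print(crack_seconds(26, 6) / 60)          # minutes, roughly 10

# Widen the alphabet to ~94 printable characters (upper, lower,
# digits, symbols) and the keyspace grows by a factor of thousands.
print(keyspace(94, 6) / keyspace(26, 6))  # > 2000x larger
```

The takeaway survives the simplification: keyspace grows exponentially in both alphabet size and length, which is why a small change to a password produces an enormous change in cracking time.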

Contact a company that deals in cybersecurity: If you had to pick only one of these tips to follow, this one would definitely be the one to choose. Companies like ISG Technology stay current with their cybersecurity knowledge, and can help with everything from malware removal to safe data storage. They can also help assess the security of your company’s network, ensuring that both company and client information stay out of the hands of hackers.

Benefits of outsourcing IT needs

Outsourcing work that cannot effectively be done in-house is a huge part of how companies should do business. A study by The Economist on modern outsourcing found that manufacturers typically outsource 70 to 80 percent of their finished products. The same study also found that 90 percent of these companies saw outsourcing as playing a key role in their growth strategies.

So, if these companies can outsource their manufacturing needs, why are so many businesses afraid of doing the same for their IT department and data management?

What can outsourced IT provide companies?
Of course, companies should never completely do away with their own IT departments. That being said, allowing a separate entity to handle certain aspects of the company's IT needs allows the business's internal IT personnel the chance to work on new ideas.

As Howard Baldwin stated in an article for Forbes, IT departments regularly get bogged down in "fire-fighting mode – reacting to every issue that (comes) up to keep things up and running." When this happens, IT employees aren't free to pursue interesting endeavors that may innovate the company. By outsourcing these everyday occurrences and other projects to an outside firm, employees can truly change how a company uses the technology available to it.

A perfect example of this would be IT support for all the software and technologies businesses deal with on a regular basis. Sure, software providers are usually there to give support on their products. However, these providers rarely understand the system-wide consequences of running their products alongside the specific mix of technologies a company uses.

This is something an internal IT department could certainly handle; however, tasks this routine would simply take away from its ability to develop the company. Outsourced IT companies are more than happy to take on this responsibility, as well as other mundane tasks such as malware removal and helping companies move data to a colocation facility.

Business continuity and brand image

Murphy’s Law states that anything that can go wrong, will go wrong. In business terms, this means that downtime is simply inevitable. And while downtime is extremely expensive in terms of getting the business up and running again, it carries another steep cost: a damaged brand image. Disaster recovery planning is perhaps the most important part of a company’s brand. This is because of the huge stress clients put on consistency and data protection with the companies they do business with.

Consistency is key
Being able to do business in a consistent and reliable manner is what separates successful companies from the ones that fall by the wayside. As Kristi Jackson, founder of Women CEO Project, stated, “consistency helps build trust with consumers and other businesses because they see your brand or results frequently and know you mean business.” When a person walks into an Italian restaurant, it’s because they want Italian food. If they arrive one day and find the restaurant is serving sushi – or worse, no food at all – they are going to be extremely disappointed with their visit. The same can be applied to any other business. If a business has had a lot of downtime due to poor disaster recovery, clients are going to be frustrated.

This is exactly why business continuity plans are so important to a business’s clients. As Continuity Insights stated, a company’s brand belongs to the clients more than it does to the company itself. This is because clients can help build a company’s brand image or destroy it if they see fit.

Robust business continuity plans, such as those offered by ISG Technology, right the ship when things are not going well. The reason a good plan is so important is that it guides how employees act in a crisis. DisasterRecovery.org explained that when companies have good business continuity plans, employees can take a more educated and confident role in the company’s recovery process. This means less panic on the side of the employees, which means less downtime for the company, which means less money and brand image lost. Getting the company back on its feet after a run-in with a disaster is at the heart of consistency and good brand imaging, and should be a key concern for those trying to take their business to the next level.

Virtualization of the classroom

It's no secret that many schools and districts in America are severely underfunded. Despite being the best way to guarantee an educated and prosperous populace, education in America has gone through some serious budget cuts since the recession in 2008. In fact, a state-by-state review of the 2013-2014 school year found that 35 states are providing less funding per student than they did before the recession. And while this needs to be worked on at a governmental policy level, as it stands, schools have to make do with what they have.

That being said, costs need to be cut where they can. With this in mind, virtualization of computer and IT systems within schools is emerging as an effective way to stretch a budget while still providing the technological education necessary to thrive in the modern world.

Virtualization: What is it and how is it cost-effective for schools?
In order to grasp why virtualization is so beneficial for the education system, it's important to fully understand what it is. Basically, there are two types of virtualization. The first has to do with a school's use of servers. Server virtualization allows a single physical server to act as many by housing multiple virtual servers on one machine.

This has a dual effect in terms of cost-cutting. First, it cuts down on the physical costs of multiple servers. Under the current model, a non-virtualized server typically runs a single workload, leaving as much as 95 percent of its capacity unused. If schools were to virtualize, their server hardware costs would plummet as they began to use servers more effectively and efficiently.
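A back-of-the-envelope consolidation estimate shows where the savings come from. The utilization figures below are illustrative assumptions, not measurements from any particular school district, and the model assumes workloads can be freely combined onto hosts of comparable capacity.

```python
import math

def hosts_needed(physical_servers: int,
                 avg_utilization: float,
                 target_utilization: float) -> int:
    """Estimate how many virtualized hosts can absorb the same workload.

    Simplified for illustration: assumes workloads consolidate freely
    onto hosts of comparable capacity.
    """
    total_load = physical_servers * avg_utilization
    return math.ceil(total_load / target_utilization)

# A district running 20 lightly loaded servers at ~5% utilization
# could, in principle, consolidate onto 2 hosts run at a safer 60%.
print(hosts_needed(20, 0.05, 0.60))  # 2
```

Real consolidation ratios are lower once peak loads, redundancy and growth headroom are factored in, but the direction of the savings holds.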

Server virtualization would also reduce costs by allowing different school districts to share files more easily. As new instructional materials are distributed, sharing them in a virtual environment is typically as simple as a file copy operation.

The second common type of virtualization is that of the desktop. As knowledge of computers becomes more and more necessary to function within the modern world, classrooms will need to continue to add them into curricula. As this happens, students will need a desktop specific to their classes and their schedule. Before virtualization, a student would need to be given a specific laptop if the school wished for them to have a desktop catered to the student's needs in each of their classrooms.

However, desktop virtualization allows a student's workspace to follow them from class to class without the need for a specific assigned laptop. This practice also keeps students' files safer in a disaster recovery scenario. If a computer the student is working on crashes or stops working for whatever reason, their entire desktop can be moved from one workstation to another with minimal difficulty. This not only saves money on technical support, as a student who can't access their files would need help or risk missing the day's lesson, but it also saves the headache of having to start from scratch.

What to know about letting employees bring devices from home

The bring-your-own-device movement has been getting serious traction lately, as the amount of technology owned by employees continues to go up. Whether it be their personal phone, tablet or even laptop, people really seem to like the idea of having their own tech at work. And this recent BYOD trend doesn't seem to be slowing down. In fact, research firm Gartner said approximately 70 percent of mobile workers will be using their own smart devices rather than those given to them by their company by 2018. With those kinds of numbers, it's no wonder that many companies are gearing up for the BYOD revolution. 

BYOD: The risks and rewards
Just like every trend, there are positives and negatives to letting employees bring their own electronics to work. The simplest and perhaps the most obvious of the positives is that people like their own devices. Employees don't just grab the first phone or computer that catches their eye. They take enormous amounts of time and energy to figure out what device is right for them. There is a multitude of things a device can offer someone, and allowing employees to tailor their work devices to their own wants and needs is certainly a benefit to both productivity and employee morale.

Another positive point to consider with BYOD is the fact that employees upgrade their own devices at a much more rapid pace than their employers. According to eMarketer, about 54 percent of smartphone users plan to buy a new device within the next 12 months. Any company attempting to keep their own hardware this current would most certainly run itself into the ground. It's best to let employees worry about having the most current device.

Despite the many positives to BYOD, there are also some risks every company should consider. The biggest problem a company implementing a BYOD plan is likely to run into is that it is extremely hard to tell employees what they can and can't do with their own devices. People get used to surfing the Web on their personal laptops, and while this is fine if the device is strictly for personal use, it becomes a problem when the device is brought into the office. It's very easy to tell an employee what they can and can't do with company-owned property, but it becomes a little harder when the employees own those devices. Any company considering BYOD should express these concerns to employees if it wishes to keep productivity at peak levels.

Aside from productivity, there is also a sizeable security risk when employees use BYOD hardware like they would at home. Something as simple as an employee downloading apps on their phone can give third parties access to company data, as mobile malware is relatively easy to develop and can create the need for malware removal. Again, companies considering letting employees bring their own devices to work need to instruct them in the correct use of those devices.

But companies shouldn't let the risks scare them away. With help from BYOD experts like ISG Technology, allowing employees to bring their own equipment can have a multitude of benefits without any downsides. 

Data lost is money lost

When people think of Google, they tend to imagine the search engine giant as an indestructible force in the technical world. It would seem that Google is such a big player that it can handle any and all obstacles thrown in its way. Google's recent foray into practically any part of the technology industry it sees fit to invest in shows the scale at which the company operates. And while Google's data is probably more secure from hacking attempts than the average company's, one force it simply can't match is that of Mother Nature.

Nature's power over man was proven recently when one of Google's Belgian data centers was struck by lightning four times. Although a vast majority of data survived (well over 99 percent, in fact), the point still remains that there are some problems that even the biggest of companies simply can't avoid. If Google's data storage is at risk, how can any other company even hope to protect itself from every single threat imaginable?

The real cost of data loss: Downtime
As any company that has lost data can attest, one of the most frustrating parts about losing data is the amount of time and money it takes to get the company up and running again. In order to find out just how much money data loss and downtime cost companies, EMC Corporation spoke to 3,300 IT professionals from 24 countries in 2014. A SecurityWeek article about the study reported that organizations of more than 250 people lost some $1.7 trillion due to downtime and data loss in 2014 alone.

That's quite a lot of money to be lost and only speaks to the sheer, unimaginable magnitude of the problem that is data loss. To compound this, the massive amount of lost revenue felt by companies due to this problem is only made worse considering the study's finding that 64 percent of surveyed enterprises ran into downtime or data loss in the past year.

This kind of data should be a rude awakening to companies that think they are above the threat of data loss. No one is completely safe from data loss, and the only way a company can truly protect itself is to back up its data as much as possible. A robust online backup service, such as what is offered by ISG Technology, is peace of mind in a world where industry heavy-hitters like Google can lose data in a freak accident.

Growing number of IoT devices calls for enhanced data storage solutions

When the Internet started gaining prevalence in the workplace two decades ago, it would have been hard for most users to imagine how big a role it would end up playing in business. From our almost non-stop use of Wi-Fi to the growing list of smart devices that are able to connect to Wi-Fi networks, the Internet is changing the way end users interact with devices to accomplish tasks. The next step in the evolution of the Internet, however, is changing the way devices interact with one another.

The Internet of Things is creating a vast web of machines that are able to communicate and share information with one another, changing the way we use devices and the data they create.

"The Internet of Things revolves around increased machine-to-machine communication; it's built on cloud computing and networks of data-gathering sensors; it's mobile, virtual, and instantaneous connection; and they say it's going to make everything in our lives from streetlights to seaports 'smart,'" explained Wired contributor Daniel Burrus. "The Internet of Things really comes together with the connection of sensors and machines. That is to say, the real value that the Internet of Things creates is at the intersection of gathering data and leveraging it. All the information gathered by all the sensors in the world isn't worth very much if there isn't an infrastructure in place to analyze it in real time." 

"By 2020 the number of IoT devices will reach 38.5 billion."

According to a recent study conducted by Juniper Research, the number of devices connected to the IoT is expected to grow dramatically in the next few years. By 2020 the number of IoT devices will reach 38.5 billion, a 285 percent increase from this year. Juniper's report, The Internet of Things: Consumer, Industrial & Public Services 2015-2020, also found that one of the biggest hurdles businesses will have to face due to the massive influx of IoT devices is how to handle the increase in storage space necessary for the newly created data, as well as gathering and analyzing that information.

The IoT is changing the way data is used.

Breaking health care barriers with the IoT
The IoT is having an impact on practically every industry, but its effects are being felt especially strongly by businesses within the health care sector. The problems associated with data collection, storage and analysis can be especially difficult for medical organizations because of strict security regulations and limited budgets. In order to make the technology behind the IoT work in the most effective way possible for health care companies, industry experts have identified some key elements that need to be addressed by IT administrators.

  • First, employees and users need to start adopting an IoT mindset so they will become more comfortable using the technology.
  • Second, operability between IT systems needs to be increased, so devices and programs will work seamlessly with one another.
  • Third, IT administrators need to think about moving forward and how current technologies will work with and influence future innovations.

The best way to address all of these issues is to implement a cloud-based data management solution. Cloud computing has proven to be the most beneficial way to leverage enterprise data. The IoT already utilizes cloud-based applications to interpret and transmit data coming from all of the sensors on connected devices, so using cloud storage services and other cloud applications to manage the data greatly increases interoperability. Using a cloud environment will also help end users become more comfortable with the technology, as many employees already use the cloud to accomplish a variety of tasks throughout the day and are familiar with the platform. Finally, working with the cloud to store, manage and analyze data being collected from IoT sensors will help health care organizations transition more easily into the future as the cloud can be leveraged for a variety of technologies.

Enterprise cloud adoption continues to grow as companies see benefits

Since its introduction nearly a decade ago, cloud computing has been changing the way companies do business, and it is perhaps the most transformative technology since the Internet itself. The cloud is becoming so important within the enterprise that market research firm IDC has predicted spending on public cloud services alone will reach $70 billion this year. The five industries investing the most in cloud deployments are discrete manufacturing, banking, professional services, process manufacturing and retail, according to the IDC report.

"Spending on public cloud services will reach $70 billion in 2015."

The study found that the biggest opportunities for success with cloud deployments are in "the development of intelligent industry solutions, which are built on top of a new platform that includes cloud, big data and analytics, mobile and social."

"We have already seen such platforms and innovation communities in place in retail, financial services, media, and other industries," said Eileen Smith, Program Manager of IDC's Global Technology and Research Group. "This will reshape not only how companies operate their IT but also how they compete in their own industry. Technology suppliers will continue to see significant demand for their industry-specific solutions."

So what kinds of benefits can businesses expect when they deploy a cloud solution?

Increased mobility
By hosting IT assets in a cloud environment, information and applications can be accessed and synced from anywhere with an Internet connection. The ability to access information remotely in the same way you would in the office makes it possible for businesses to enable their employees to work from just about anywhere. This also dramatically increases the available talent pool companies can draw from and it makes it much easier to open and maintain satellite offices around the globe.

Improved collaboration
Because cloud services make accessing data remotely so much easier, collaborating is greatly improved as well. Contractors, remote team members, clients and any other interested party can access the necessary files and programs through a cloud portal instead of through massive attachments on needlessly complicated email chains. Anyone with the appropriate access to information can view, edit and share files, making group projects and collaborative efforts simpler than ever.

More companies are adopting cloud services than ever before because of the competitive advantages it offers.

Enhanced online backup services
Outages, system failure and unplanned downtime are all a part of working with technology, but how a business comes back from such disruptions is what sets it apart from other companies. With files stored in traditional digital solutions, users typically can't access them if the network goes down, bringing work – and revenue streams – to a grinding halt. With the cloud, however, critical IT assets are still accessible through off-site storage features allowing business to continue even after a disruptive event.

Reduced storage and hardware needs
One of the most expensive parts of enterprise IT infrastructure is the equipment and storage capacity necessary to run a business. Using cloud storage services means organizations do not need as much hard drive space. As a result, hardware requirements are reduced because necessary components are maintained by the cloud service provider.

By partnering with a trusted third-party service provider like ISG Technology, companies can create a customized infrastructure that works for them. Innovative as-a-service options enable CIOs and other IT decision-makers to control the enterprise environment while still being able to access the necessary flexibility to move business forward.

Introducing second wave Wi-Fi

In the world of technology, a lot can change in just a few years, with new innovations emerging all the time. With users employing a growing number of devices to connect to the Internet while also demanding increasing speed and download capabilities, a lot has changed with the way wireless Internet connections are expected to function.

When the first Wi-Fi certified ac products came onto the scene two years ago, they implemented core features of the IEEE 802.11ac standard. While those products – known as "first wave" 11ac – used more spatial streams, wider channels and higher-density modulation to triple the speed of comparable Wi-Fi certified 11n products, the features necessary to reach the standard's full potential of 7 Gbps were left out because the technology was still immature and a variety of engineering challenges had to be overcome before moving forward.

With the rapid development of Internet capabilities and Wi-Fi engineering, the Wi-Fi Alliance has announced that it is currently evaluating features that can be added to the "second wave" 11ac products for an updated certification program that will be available in mid-2016.

The next wave of Wi-Fi products will greatly improve capacity and functionality.The next wave of Wi-Fi products will greatly improve capacity and functionality.

What will second wave Wi-Fi have to offer?
In an article for TechTarget, contributor Lisa Phifer noted that the first wave of 11ac products built off of the technology used in products on the IEEE 802.11n standard.

"The first wave of 11ac was built upon the same technologies used by 11n — most notably, multiple input multiple output (MIMO) antennas that transmit data along several spatial streams, optionally combined with double- or quadruple-wide channels to achieve faster data rates," wrote Phifer. "But unlike 11n, 11ac focuses exclusively on 5 GHz band transmission, leaving the congested 2.4 GHz band for use by older, less capable devices and other technologies, such as Bluetooth. Similarly, the second wave of 11ac will build upon the first wave features."

"Second wave Wi-Fi will double capacity and add support for 80+80 and 160 MHz channels."

Just as the first wave doubled the maximum channel width available at the time, the second wave will double it again, adding support for 80+80 and 160 MHz channels. The number of spatial streams expected from access points will also grow, rising from three to four transmit and receive streams. While these changes may seem small, they have the potential for major improvements: under favorable conditions, second wave Wi-Fi offers the possibility of quadruple the maximum data rates currently available.
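As a rough sanity check on these figures, an 802.11ac PHY rate can be estimated from channel width, modulation, coding rate and stream count. The subcarrier counts and the 3.6-microsecond short-guard-interval symbol time below come from the 802.11ac specification; the specific configurations shown are illustrative.

```python
# Back-of-the-envelope 802.11ac PHY rate: data subcarriers x bits per
# symbol x coding rate x spatial streams, divided by symbol duration.

SYMBOL_TIME = 3.6e-6                     # seconds, short guard interval
DATA_SUBCARRIERS = {80: 234, 160: 468}   # per channel width in MHz

def phy_rate_mbps(width_mhz: int, streams: int,
                  bits_per_symbol: int = 8,   # 256-QAM
                  coding_rate: float = 5 / 6) -> float:
    """Theoretical 11ac data rate in Mbps for one configuration."""
    bits = DATA_SUBCARRIERS[width_mhz] * bits_per_symbol * coding_rate
    return bits * streams / SYMBOL_TIME / 1e6

# First wave hardware: 80 MHz channel, 3 streams -> 1300 Mbps.
print(phy_rate_mbps(80, 3))
# Full 11ac spec: 160 MHz channel, 8 streams -> ~6933 Mbps,
# i.e. the roughly 7 Gbps ceiling the standard allows.
print(phy_rate_mbps(160, 8))
```

Shipping second wave access points sit between these extremes (for example, 160 MHz with 4 streams), which is why real-world gains land closer to a doubling or tripling than the theoretical maximum.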

The other big difference with second wave Wi-Fi is the introduction of MU-MIMO (multi-user, multiple input, multiple output) technology. It offers the ability to dramatically increase the throughput of wireless networks and make a noticeable difference in dense, high-capacity networks. Earlier technologies like 11n and first wave 11ac were able to improve data rates, but only for one user at a time. MU-MIMO, however, allows multiple streams to be sent from access points to multiple users simultaneously, creating a greater impact across the network.

"Wi-Fi has always suffered from density and capacity issues, especially in the small and crowded 2.4GHz band," explained Network World contributor Eric Geier. "Using 802.11n or 802.11ac in the 5GHz band helps by providing many more channels and faster data rates. However, MU-MIMO helps even more as multiple devices can be served simultaneously. This leads to increased throughput, frees up more airtime, and allows access points to serve larger crowds of devices."

The first devices featuring second wave Wi-Fi and MU-MIMO are already starting to appear on the market, offering improved capacity for business-class access points and smartphones, as well as laptops and routers.