Wi-Fi delivers multiple benefits to schools

As technology becomes increasingly ubiquitous, traditionally low-tech industries are having to adopt more modern devices and systems. The education sector is slowly beginning to implement new technology to better serve students and teachers. Many school districts are realizing the benefits of Wi-Fi in classrooms, but a whole host of schools still rely on wired connections to access the Internet.

A school that uses a wired Internet connection provides a fundamentally different learning environment than one that offers wireless access. Wi-Fi gives students and teachers improved mobility and connectivity between campus buildings, increasing productivity and collaboration. Wireless is also a more cost-effective solution than traditional wired services.

Teachers find advantages with Wi-Fi
A recent Pew Research Center survey of more than 2,000 teachers in the Advanced Placement and National Writing Project programs found that digital technologies have had positive effects in their classrooms and helped them teach their middle and high school-aged students. Of the teachers surveyed, 92 percent reported that Internet access has a major impact on their ability to find content, resources and materials for their lesson plans. Wi-Fi opens up a vast number of learning opportunities for students and instructional ones for teachers.

The survey went on to show that 45 percent of students used e-readers to complete assignments in class and 43 percent used tablets for the same task. Both kinds of devices typically reach the Internet only over Wi-Fi. An entire generation of new technology is rendered useless by wired-only Internet, meaning schools without Wi-Fi are blocking students from devices that are often easier for them to use than traditional computers.

One of the major benefits of Wi-Fi is the mobility it offers. Wired computers restrict Internet access to specific locations, like computer labs, which greatly reduces students' and teachers' ability to collaborate. Access to wireless Internet increases communication between everyone in an educational environment: student to student, student to teacher and teacher to teacher. The Pew survey found that 69 percent of teachers said the Internet has a major impact on their ability to share ideas with other teachers, and that ability only increases with the mobility of Wi-Fi.

Data center networking market to reach $22 billion

A recent study by research firm MarketsandMarkets projects the global data center networking market to reach $21.8 billion by 2018. According to the report, North America is expected to hold the largest share of that market over the next five years.

The study noted the dramatic market potential created by the demand for cloud technologies and software-defined networking in data centers. The increased use of mobile devices, driven by bring-your-own-device policies, and the adoption of cloud services have pushed data center providers to shift their network offerings from traditional models to ones flexible enough to quickly transfer workloads between servers.

This shift in data center architecture was originally driven by the demand for virtualization, but a variety of new changes in the market have persuaded providers to favor faster, flatter models over traditional core-distribution-edge designs. Some of the new challenges facing data center managers include heavy inter-server traffic, burst speeds faster than 1 gigabit per second and the shift from Fibre Channel to Ethernet networks.

Data centers can no longer be built the way they were even just a few years ago, as the fundamental structure of enterprise applications has changed, and with it the needs of users. The adoption of new, more advanced hardware is placing greater demands on data center networks and fueling a boom in the market.

"Data center networks are being re-architected as part of a transition to the next generation of data centers, reimagining how applications and data centers are built," wrote Biztech Magazine contributor Joel Snyder. "This change extends from the power and cooling to the servers and storage, as well as the networking."

As new data centers are built and their designs continue to shift, requirements for increased security and greater demand for distributed and managed services will be front of mind. Other factors will help shape the next generation of data centers, including higher speeds, reduced latency, Layer 2 flattening and high availability. Demand for the installation of new virtualization and storage equipment will offer data center providers the opportunity to rethink facility design and create truly modern data centers.

FCC approves plan to increase Wi-Fi access in schools

Earlier this month the Federal Communications Commission approved a plan to spend $2 billion over the next two years on providing schools and libraries with enhanced Wi-Fi capabilities.

The proposal aims to modernize the FCC's existing E-Rate program in an effort to meet the goal President Obama set in a directive last year: expanding broadband access to 99 percent of U.S. students. Major industry players like Facebook, Netflix and Bloomberg LP all sent letters to the commission supporting the initiative because "the plan will make dramatic progress in bringing high-speed connectivity to classrooms."

More schools have Internet, but that’s not enough 
According to Businessweek, the number of U.S. classrooms with an Internet connection has increased by 83 percent since the E-Rate program was created in 1998. School administrators, however, say simply being connected isn't enough. Higher speeds and better service are becoming increasingly necessary, and obtaining them raises costs at a time when school budgets are being dramatically reduced. Sixty percent of schools in the U.S. do not have sufficient Wi-Fi access, according to FCC Chairman Tom Wheeler, and so far the E-Rate program has only been able to improve that access in 5 percent of schools and 1 percent of libraries.

“Technology has changed, the needs of students and library users have changed, and now E-Rate has changed,” said Wheeler. “No responsible business would stick with an IT plan developed in 1998.”

Removing obsolete services, increasing funding
The recently approved plan seeks to provide a larger portion of schools with improved Wi-Fi by revamping the E-Rate program. The initiative pays for telecom services for schools and libraries, but those still include obsolete services like paging and landline phones. By redirecting funding for outdated systems to the Wi-Fi program, more schools will be able to benefit. According to FCC acting managing director Jon Wilkins, the phaseout of obsolete services will create savings of $350 million next year, growing to $950 million by the program's fifth year. The remainder of E-Rate's budget comes from monthly fees telecom providers are required to charge their clients.

According to Wheeler, the commitment of $1 billion to schools and libraries in 2015 means that millions of students will have access to increased opportunities.

“The new plan will make E-rate dollars go farther by creating processes to drive down prices and increase transparency on how program dollars are spent,” said Wheeler. “And it will simplify the application process for schools and libraries, making the program more efficient while reducing the potential for fraud and abuse.”

The approval vote also made it possible for E-Rate's annual funding, which has been capped at $2.25 billion since the program started, to be increased later this year. Currently, the initiative's formula for allocating funding is based on schools' enrollment numbers and libraries' physical size, but this method has come under scrutiny from members of Congress and will likely be revised.

BYOD policies support majority of Americans who can't go 24 hours without their phone

A recent survey from Bank of America found that 96 percent of Americans between the ages of 18 and 24 consider mobile phones to be very important. That may not be so surprising; what is surprising is that only 90 percent of respondents in the same group said the same of deodorant. The report, based on interviews with 1,000 adults who owned smartphones, found that the devices ranked as more important than almost anything else, including toothbrushes, television and coffee.

The survey also discovered that 35 percent of Americans check their smartphones constantly throughout the day. Forty-seven percent of respondents said they wouldn't be able to last an entire day without their mobile phone, and 13 percent went so far as to say they couldn't even last an hour.

As the Bank of America report shows, people are more attached to their devices than ever. Millennials are especially dependent on their phones and tablets, and they also make up the biggest portion of new workers. Companies are increasingly able to benefit from implementing BYOD policies, as employees who have grown accustomed to a particular phone expect to keep using it at work. Allowing workers to keep their own devices increases productivity, since they aren't constantly checking a second phone, and boosts employee satisfaction.

Determining bandwidth requirements in the data center

How much bandwidth does a data center really need? It depends on how many workloads and virtual machines are in regular operation, as well as what the facility is designed to support. For example, a data center providing resources to a public cloud requires much more bandwidth than one that is simply powering internal systems and operations shielded by the company firewall. The increasing uptake of remote data centers and colocation arrangements, in tandem with server virtualization, has added to organizations' bandwidth considerations.
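
As a rough illustration, the sizing question can be reduced to arithmetic. The following Python sketch is a back-of-the-envelope estimate only; the VM counts, per-VM traffic figures and the concurrency and headroom factors are all assumptions chosen for the example.

    # Back-of-the-envelope bandwidth sizing; every figure is an assumption.
    # Each workload class maps to (vm_count, avg_mbps_per_vm, peak_mbps_per_vm).
    workloads = {
        "web_frontend":  (120, 2.0, 10.0),
        "database":      (30, 8.0, 40.0),
        "file_services": (20, 5.0, 25.0),
    }

    avg_total = sum(n * avg for n, avg, _peak in workloads.values())
    # Peaks rarely align, so apply an assumed 40 percent concurrency factor.
    peak_total = 0.4 * sum(n * peak for n, _avg, peak in workloads.values())

    # Provision for the larger figure, plus an assumed 30 percent growth headroom.
    required_mbps = max(avg_total, peak_total) * 1.3
    print(f"Average load: {avg_total:.0f} Mbps")
    print(f"Concurrent peak estimate: {peak_total:.0f} Mbps")
    print(f"Suggested provisioning: {required_mbps:.0f} Mbps")

A facility serving a public cloud would push the per-VM numbers far higher, which is exactly why the exercise has to be repeated for each environment.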

How virtualization complicates bandwidth requirements
Server and desktop virtualization have made companies less reliant on physical infrastructure and the specific sites that house it. Here's how they work:

  • With desktop virtualization, or VDI, desktop operating systems are hosted centrally and delivered to endpoint machines (even aging ones), simplifying management of both software and hardware while reducing costs
  • Server virtualization involves a single physical server being turned into multiple virtual machines. Each instance is isolated, and the end user usually cannot see the technical details of the underlying infrastructure (see the consolidation sketch after this list).
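
To make the consolidation idea concrete, here is a toy calculation; the estate size and the consolidation ratio are assumptions for illustration, not figures from any particular vendor.

    # Toy consolidation arithmetic; both input figures are assumptions.
    import math

    legacy_servers = 40   # one application per physical box before virtualization
    vms_per_host = 15     # assumed consolidation ratio after virtualization

    hosts_needed = math.ceil(legacy_servers / vms_per_host)
    print(f"{legacy_servers} legacy workloads consolidate onto "
          f"{hosts_needed} virtualized hosts")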

By getting more out of IT assets via virtualization, companies have reshaped IT operations. More specifically, they have spread out their infrastructure across multiple sites and put themselves in position to move toward cloud computing.

With increased reliance on virtualization, organizations have looked to ensure that remote facilities receive the bandwidth needed to provide software, instances and data to users. However, liabilities still go overlooked, jeopardizing reliability – especially when data centers are too far apart from each other.

Ensuring low latency is just one piece of the data center optimization puzzle, though. Sufficient bandwidth must also be supplied to support the organization's particular workloads. In the past, Microsoft has advised Exchange users to think beyond round trip latency.

"[R]ound trip latency requirements may not be the most stringent network bandwidth and latency requirement for a multi-data center configuration," advised Microsoft's Exchange Server 2013 documentation. "You must evaluate the total network load, which includes client access, Active Directory, transport, continuous replication and other application traffic, to determine the necessary network requirements for your environment."

Knowing how much bandwidth is needed
Figuring out bandwidth requirements is a unique exercise for each enterprise. In a blog post, data center networking expert Ivan Pepelnjak looked at the nitty-gritty of assessing bandwidth-related needs, homing in on some of the problems that reveal a need to rethink how bandwidth is allocated and utilized.
These issues include:

  • Over-reliance on slow legacy equipment
  • Oversubscription to services
  • Miscalculation of how much traffic each virtual machine generates 

In addition, data center operators sometimes overlook bottlenecks such as virtual machines interacting slowly with storage. If VMs have to frequently access data stored on an HDD, for example, quality of service may degrade. Networks may require extra bandwidth in order to avoid data transfer hiccups.
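
A quick capacity check can flag that kind of storage bottleneck before users feel it. In this sketch, the VM count and all IOPS figures are assumptions for illustration.

    # Rough check of whether shared storage becomes the bottleneck.
    import math

    vm_count = 200
    iops_per_vm = 25      # assumed steady-state IOPS per virtual machine
    hdd_iops = 150        # rough figure for a single spinning disk
    ssd_iops = 20000      # conservative figure for a single SATA SSD

    demand = vm_count * iops_per_vm
    print(f"Steady-state demand: {demand} IOPS")
    print(f"HDD spindles needed to keep up: {math.ceil(demand / hdd_iops)}")
    print(f"SSDs needed to keep up: {math.ceil(demand / ssd_iops)}")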

HealthKit, healthcare and managing BYOD

As smartphones become faster and increasingly capable of running sophisticated applications and services, health care organizations are faced with a dilemma. Do they allow doctors, nurses and staff to participate in bring-your-own-device policies and potentially unlock productivity gains that enable higher-quality care? Or do they hold back out of legitimate concerns about data security and compliance with regulations?

The growing interest of technology firms in health care tracking only complicates the situation. Individuals may now use devices such as wristbands, in addition to smartphones, to record and share health information, making it critical for providers to keep tabs on BYOD activity to ensure compliance.

HealthKit and the larger issue of sharing health information
At this year's Worldwide Developers Conference, Apple announced HealthKit, a platform built into iOS that underscores how healthcare on mobile devices is rapidly evolving and sparking questions about how sensitive data is handled. HealthKit isn't a discrete solution but a system of APIs that would allow, say, an application that tracks steps to share its information with medical software that could provide actionable advice.

Major health care organizations are already on board. The Mayo Clinic created an application that monitors vital signs and then relays anomalous readings to a physician. Given the already considerable presence of mobile applications in health care, HealthKit could give hospital and clinic staff additional tools for providing efficient care.

At the same time, HealthKit turns any iOS device into a potential compliance pain point. Data that is stored on an iPhone, for example, would not fall under the purview of the Health Insurance Portability and Accountability Act, but if shared with a provider or one of its business associates, HIPAA would likely apply. Stakeholders will need time to adjust to the nuances of how healthcare applications interact with each other in the HealthKit ecosystem.

"The question would be whether the app is being used by a doctor or other health care provider. For example, is it on their tablet or smartphone?," asked Adam Greene of Davis Wright Tremaine LLP, according Network World. "Where the app is used by a patient, even to share information with a doctor, it generally will not fall under HIPAA. Where the app is used on behalf of a healthcare provider or health plan, it generally would fall under HIPAA."

Tracking and securing privileged health information
HealthKit is just one platform on a single OS, but it is part of a broader shift in data control, away from centralized IT departments and organizations and toward end users. For healthcare, this change is particularly challenging since providers have to ensure that the same compliance measures are enforced, even as BYOD and cloud storage services become fixtures of everyday operation.

A recent Ponemon Institute survey of more than 1,500 IT security practitioners found that almost 60 percent of respondents were most concerned about where sensitive data was located. BYOD complicates compliance, and healthcare organizations will have to ensure that they have well-defined policies in place for governing security responsibilities.

"People trained in security also view IT as accountable for the security domain," Larry Ponemon, chair of the Ponemon Institute, stated in a Q&A session on Informatica's website. "But in today's world of cloud and BYOD, it's really a shared responsibility with IT serving as an advisor, but not necessarily having sole accountability and responsibility for many of these information assets."

It's no longer enough to rely on IT alone to enforce measures. Security teams and IT must work together and implement BYOD security as well as network monitoring to ensure that only authorized devices can connect to the system, and that data is safely shared.

UC market continues to grow as IT becomes more consumerized

Most enterprises are probably familiar with bring your own device, the practice of employees supplying their own hardware, typically smartphones and tablets, to supplement or replace traditional office PCs. Recently, the BYOD buzzword has given way to discussion of "shadow IT," a similar phenomenon that is usually cast in a more negative light. Whereas BYOD is regularly construed as a potential boon to productivity, shadow IT is framed as a threat to the IT department's control, especially as organizations increasingly migrate from on-premises to cloud-based software.

Unified communications' place as BYOD, shadow IT come to the fore
Unified communications solutions are in a unique position as BYOD and shadow IT infiltrate the enterprise:

  • UC may be hosted on-premises or provided through cloud resources, making it both a traditional and cutting-edge technology, depending on the implementation.
  • The widespread use of over-the-top (OTT) voice, messaging and chat solutions – Apple, for instance, has pegged iMessage as the single most-used iOS app – is changing how companies approach communications infrastructure. Circuit-switched telephony and email alone no longer suffice.
  • With such consumerization sweeping across enterprise messaging, technologies such as Wi-Fi are being advanced to make voice calls and Internet access more seamless.

Overall, UC has so far benefited from the widespread shift of IT toward the cloud and mobile devices. In a 2014 report, Infonetics Research estimated that the voice-over-IP market alone reached $68 billion in 2013, up 8 percent from 2012. Revenues could rise another $20 billion by 2018.

UC and Wi-Fi-enabled VoIP
Employees are now accustomed to seamless connectivity and high-quality, feature-rich software on mobile devices. For example, apps such as Skype and LINE are much more versatile than standard SMS and voice dialers.

A big part of achieving a better user experience with enterprise UC is getting the installation right. Firms that handle high daily call volumes may choose to host UC on-premises for maximum reliability. If VoIP is a major part of the solution, it is important to ensure that it is supported by sufficient bandwidth and Wi-Fi access points.
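
A first-pass VoIP bandwidth estimate multiplies an approximate per-call figure by the number of simultaneous calls an access point must carry. The per-call figures below approximate common codec bandwidth including packet overhead, and the concurrent-call count is an assumption for illustration.

    # First-pass VoIP bandwidth estimate per Wi-Fi access point.
    # Per-call figures are approximations including packet overhead;
    # the concurrent-call count is an assumption.
    codec_kbps = {"G.711": 87, "G.729": 32}
    concurrent_calls = 12

    for codec, kbps in codec_kbps.items():
        total = kbps * concurrent_calls
        print(f"{codec}: ~{total} kbps ({total / 1000:.2f} Mbps) "
              f"for {concurrent_calls} concurrent calls")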

"Believe it or not, not all antennas are created equal," stated. "[K]eep an eye out for the following details: the size of the antenna, the quality of the construction, the choice of metal used, corrosion prevention, the bracket, and other characteristics such as focus and radiation patterns. The more stable your pole, the more stable your connection will be."

Unified communications brings real benefits to education

As its name suggests, a unified communications suite gives the user a variety of business services, from instant messaging and email to voice calling and video conferencing, in a single convenient platform. On top of that, UC can serve as a natural path into cloud computing. Many of its key functionalities do not even require on-premises physical equipment since they can be run from remotely hosted servers.

UC solutions are ideal for organizations looking to consolidate their communications processes and save money while doing so. Let's look at some of the general benefits of UC, as well as how it has worked in practice for institutions in higher education.

Why organizations should consider replacing legacy systems with UC
The well-known limitations of legacy hardware and software – inflexibility, difficult maintenance and total cost – can hold back businesses that are in the midst of rapid growth. Rather than deal with arduous upgrades of traditional phone systems, for instance, companies can adopt a UC platform that provides a broader set of communications services with a lower price tag.

Some of the most notable perks of UC include:

  • Better overall user experience: Thanks to the rise of mobile computing, individuals now expect immersive, intuitive interactions with all devices and applications. UC offerings may feature rich interfaces, plus they're usually compatible with smartphones and tablets, making it possible to work from anywhere.
  • Reduced equipment- and support-related costs: Investing in a UC system may not even require purchasing new hardware upfront. Services are delivered through an IP network, an appealing arrangement for cost-conscious small and midsize businesses. The UC provider may also handle support issues, freeing up the IT department to attend to other matters.
  • Flexibility and scalability: Backed by cloud infrastructure, UC solutions can be easily modified and extended as business requirements evolve and the user base grows.

University puts UC to work in modernizing practices
How does UC look in the real world? EdTech chronicled the Florida State University College of Medicine's adoption of a UC suite that included video conferencing and was supported by server virtualization.

Five years ago, the institution upgraded its network to support the added bandwidth requirements of video conferencing, and more recently it virtualized its servers to save rack space. The result has been a UC video conferencing platform that enables remote work and easy video viewing by students and guests.

"We now have more than 2,000 recordings that take up in excess of 2 terabytes of data," college media specialist Patrick Sparkman told EdTech. "While virtualizing our UC servers was part of the college's effort to modernize its server infrastructure, adding the [storage area network] gave us the storage capabilities we needed. And, through the server virtualization, we now have the redundancy we didn't have when we started."

The college is still moving toward full implementation of UC. Over the next few years, it hopes to continue making use of platforms such as Microsoft Lync and also integrate voicemail with email.

Overcoming common obstacles to VDI implementation

Implementing virtual desktop infrastructure is a big change for any organization. It almost always leads to significant shifts in how the network is used, and VDI can strain the storage and bandwidth resources in company data centers. If an enterprise is unprepared, its VDI efforts could get off to a rocky start.

The VDI 'boot storm' and other issues to keep in mind
Last year, ZDNet's Steven Vaughan-Nichols examined some of the common obstacles to successfully setting up VDI. These include:

  • Insufficient bandwidth: The key advantage of VDI is that it enables everyone to work more easily from anywhere, through the delivery of a consistent desktop. But once employees are working outside the office, there's no guarantee that they'll have Internet connection speeds that are suitable for an optimal VDI experience.
  • Bring-your-own-device security: In many scenarios, it makes sense for VDI users to use a virtual private network, which is not always easily accomplished if they're connecting from, say, a public Wi-Fi hotspot.
  • License and storage management: VDI licensing can become really complex on Microsoft Windows. On top of that, accommodating user habits can require large amounts of storage, while "boot storms" (everyone connecting to VDI within a short timeframe) push servers to capacity.

The latter phenomenon is particularly noteworthy, since it not only compromises the end user's ability to be productive through VDI, but also reveals which parts of the IT infrastructure are insufficient, or at least unsuited to VDI. With VDI now a popular method for facilitating corporate mobile device usage, the boot storm can make virtual desktops seem out of place alongside instant-on native mobile applications.

"The issue with virtual desktops is the so-called 'boot storm" when everyone fires up their computers at 8 AM. As any PC user knows, a hard drive running flat out at 150 IOPS takes a couple of minutes to complete boot," wrote Jim O'Reilly for Network Computing. "A quicker boot time will be important for VDI, especially with most users having instant-on experience with tablets and mobile phones. These are becoming the endpoint for the virtual desktops to be displayed, and a long period to boot up every time the desktop is accessed isn't going to fly."

These problems are not intractable, though. Managed services providers can assist with desktop virtualization and ensure that organizations get the levels of storage, licensing and bandwidth that they need to make VDI work for them.

Data storage and VDI
O'Reilly also looked at some data storage considerations for organizations weighing VDI. For example, replacing some or all traditional hard disk drives with solid-state drives can provide the performance boost required for first-rate VDI. While SSDs are more expensive than HDDs on a per-gigabyte basis, they can support many more VDI instances.

Organizations such as the Bank of Stockton in California have shifted their storage strategies to respond to surging VDI traffic. The bank used a combination of DRAM, SSDs and flash memory, as well as virtualization and decompression, to ensure that its appliances could keep pace with VDI usage. Implementing VDI requires new approaches to hardware, security and device management, but it is possible to get it right with help from vendors and IT services providers.

Connecting the dots: Bandwidth as a business model

Few developments have affected businesses in the past few years as much as the burning desire for bandwidth. As enterprise environments expand, complications are inevitable. Proper information storage and security are increasingly vital as more businesses transition to data-driven initiatives. They're also becoming harder to attain. Many organizations find themselves caught in a tangled web of carriers, data centers, service providers and connectivity requirements. A lack of interoperability between services and poor communication among stakeholders can make undoing these knots an expensive and resource-intensive slog, one that induces broadband rage and burns a lot of bandwidth in the process.

Optimizing connectivity needs to be a foremost concern in today's business model. In theory, it means providing enough bandwidth to create sufficient breathing room for all locations and stakeholders. In practice, an organization needs to centralize its connectivity support. Data Center Knowledge contributor Bill Kleyman recently discussed some fundamental changes in information technology that should compel companies to consider building their business model around their data center network. 

"Business used to establish their practices and then create their IT department. Now big (and smart) businesses are approaching data centers and technology from a completely different angle," Kleyman wrote. "These visionaries see that the future revolves around complete mobility and true device-agnostic connectivity."

Examples Kleyman highlighted included cloud-based data distribution models, which support expanding application development and processing environments. He also observed that new ways of computing, such as virtualization and software-defined networking, place more emphasis on minimizing granular infrastructure management and centralizing IT. Complexity in digital compliance and data governance can also be eased by a centralized connectivity platform.

Looking at bandwidth as a business model involves seeing technology as a critical role player rather than simply as a means to get things done. Connectivity infrastructure can and should contribute directly to bottom-line thinking. Paring down the number of service providers to a basic carrier-agnostic data center model can provide more bandwidth integrity and fewer headaches.