Posts

Recent Postal Service data loss highlights need for disaster recovery solutions

It was discovered in a recent government audit of the U.S. Postal Service that the agency lost sensitive data after the device containing both the original and backup copies of the information suffered a hardware failure. The machine that crashed contained the database for the Computer Incident Response Team, which was "used to record and monitor computer incidents." The database was lost in April after an unspecified malfunction occurred. The information was considered essential, meaning it was necessary to the maintenance of daily operations.

"…[T]he Postal Service did not ensure all database backups were being stored on separate hardware," stated the audit report. 'Specifically, the CIRT database was lost due to a hardware failure and the data was not recovered due to the absence of a backup on a separate piece of hardware."

Currently, the Postal Service's security standards do not require that backup and original files be stored on separate devices. Ironically, the USPS was given an award by CSO Magazine earlier this year for innovative use of online security. The award was accepted by the CIRT's Information Systems Security Manager Andrew Kotynski.

Disaster recovery: More important than you think 
While it may seem like what happened to the USPS was just an embarrassing oversight, hundreds of companies make the same mistake each year. Even if duplicate copies of information aren't stored together, they can still be lost if the appropriate disaster recovery and business continuity policies aren't implemented. A recent survey conducted by Forrester found that 33 percent of companies have declared a disaster in the last five years. Four years ago, that number was 20 percent. The study also found that the downtime caused by disasters can be extremely expensive, with respondents reporting costs of up to $3.5 million.

When putting disaster recovery and business continuity plans in place, it is important for organizations to consider where documents and important information are currently stored and how employees access them. For critical information that is used frequently and by many different people, cloud storage services are the best choice.
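Whatever storage target an organization chooses, the separate-hardware rule the USPS audit found missing can be sketched as a minimal backup routine. This is only an illustration: the paths and helper names are hypothetical, and in practice the backup directory would live on physically separate hardware or a remote cloud target.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_file(original: Path, backup_dir: Path) -> Path:
    """Copy `original` into `backup_dir` and verify the copy by checksum.

    In a real deployment, `backup_dir` would be on separate hardware
    (or a remote/cloud target), which is exactly the safeguard the
    USPS audit found missing.
    """
    backup_dir.mkdir(parents=True, exist_ok=True)
    copy = backup_dir / original.name
    shutil.copy2(original, copy)  # preserves metadata along with contents
    if sha256(copy) != sha256(original):
        raise IOError(f"backup of {original} failed checksum verification")
    return copy
```

The checksum comparison matters: a backup that was written but never verified offers little more protection than no backup at all.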

Using content management systems and cloud-based solutions allows companies to store important data in an easily accessible place that will stay safe during a disaster and keep business running as usual. Employing managed services also lets small- and medium-sized businesses enjoy the same benefits as large companies while having lower costs and the security of a fully redundant, reliable data center.

California counties join together to bring broadband to underserved areas

It was announced earlier this month that a partnership has been formed between four California counties in an effort to extend fast and affordable Internet services to underserved areas. The North Bay North Coast Broadband Consortium is made up of Marin, Sonoma, Napa and Mendocino counties and was created in the last few months in the hopes of modernizing the area's service offerings.

Over the next two years, and with a $250,000 grant from the California Public Utilities Commission, each of the four counties in the consortium will be developing regional maps to identify which areas are served by what types of Internet connections.

"One of the biggest issues we're confronting is closing the digital divide, and this mapping is really a data-based approach that will identify where we have the greatest need," said Sonoma County Supervisor Efren Carrillo, who was appointed to represent the county in the consortium. "We have to get broadband access to those who are underserved, especially in our rural communities in the county."

One of the driving forces behind the consortium's efforts is Marin County which, despite having a highly educated population, has the highest percentage of people without broadband access among the nine Bay Area counties, according to the Utilities Commission. The goal of the initiative is to demonstrate to major providers that the need for broadband access exists and to push for future state and federal funding to build a network of underground fiber optic cables that would connect rural fire and sheriff stations, schools, libraries and businesses to reliable Internet.

Proving a need exists
In an interview with the Marin Independent Journal, Carrillo said that the consortium hopes to offer Internet providers data that would push them to make better services available to rural and underserved areas like Marin County.

"This is a chance to bring many more Marin residents online," said Marin County Supervisor Steve Kinsey. "It has moved pretty swiftly. We just started this last November and enticed the Mendocino and Sonoma folks and brought along Napa to make a compelling consortium. We got word in the last month or so that they are on board, and we are moving ahead."

Providing service for the people
The access made possible by the new initiative will also allow more state parks, like Fort Ross State Historic Park in Jenner, California, to access high-speed Internet and give visitors the ability to post pictures and send emails about their trip. About 200,000 people visit Fort Ross each year, but the park currently has only spotty cell phone reception and poor Internet access.

In an interview with The Press Democrat, Public Policy Institute of California researcher Dean Bonner said that an increasing number of California citizens are viewing the Internet as a service that should be provided in the same way as power and water. According to Bonner, about two-thirds of those surveyed believe high-speed Internet is a public utility that everyone should have access to. Another 67 percent of respondents said that they would support a program offered by the government and funded by telecom providers that would increase broadband access for residents in rural or low-income areas.

New data center technology leverages SDN for security

It was announced this week that Israeli security startup GuardiCore had closed a round of fundraising to begin production on its new security system designed to internally secure data centers. The technology takes advantage of recent improvements in network virtualization and uses software-defined networking methods to defend data centers operating at multi-terabit rates of traffic.

"SDN is an opportunity to introduce advanced security controls and capabilities into the data center network in a way that can scale to the demands of a large [data center] and offer a dynamic and proactive security control framework, detecting and mitigating an attack at an early stage,"  said the company in a statement.

According to GuardiCore CEO Pavel Gurvich, a weakness created by modern facilities' tendency to run applications that cross security perimeters has been exacerbated by intra-data center traffic moving at multi-terabit rates. The new technology aims to address the increase in cyberattacks committed within a data center that go unnoticed due to insufficient security measures. Traditional methods of defense, including sandboxing, intrusion detection and deep packet inspection, are not capable of keeping pace with the speeds at which data center traffic currently operates.

The first component of this new security system, Active Honeypot, surreptitiously re-routes network traffic to counterattack cybercriminals by sending data to an "ambush" server. The secret server is closely monitored and is capable of quickly providing information about the attack in order to effectively eliminate the threat. Active Honeypot is currently being evaluated in a variety of data centers and private cloud environments.
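GuardiCore has not published Active Honeypot's internals, so the following is only a toy sketch of the general idea: a controller flags a flow as suspicious, and a redirect rule rewrites that flow's destination to a monitored ambush server. Every name and address here is a hypothetical stand-in, not GuardiCore's actual design.

```python
from dataclasses import dataclass

# Hypothetical address for the monitored honeypot server.
AMBUSH_SERVER = "10.0.99.9"

@dataclass
class Flow:
    """A simplified network flow: source, destination, and a flag
    an SDN controller might set after spotting anomalous behavior."""
    src: str
    dst: str
    suspicious: bool = False

def apply_redirect(flow: Flow) -> Flow:
    """Pass normal traffic through unchanged; re-route flows flagged
    as suspicious toward the ambush server for close observation."""
    if flow.suspicious:
        return Flow(src=flow.src, dst=AMBUSH_SERVER, suspicious=True)
    return flow
```

The attacker keeps receiving responses, but from an instrumented decoy rather than the production workload, which is what lets defenders study the attack without exposing real data.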

The recent round of fundraising was led by Battery Ventures, whose general partner Scott Tobin noted that tracking and eliminating intra-data center threats is the next important skill for the industry to master.

"Traditional security techniques have focused on keeping the bad guys out of the perimeter. GuardiCore's approach assumes you have already been compromised and provides levels of visibility and protection that were previously unattainable," said Tobin. 

NASA successfully completes first phase of cloud migration

A massive move to transition NASA's websites and applications to a cloud platform has successfully completed its first phase, migrating more than 1 million files so far.

The agency's huge amount of information made the move quite an undertaking. NASA has more than 1,500 public-facing websites and thousands of applications and networks on top of the agency's huge data offerings and holdings. Sites being moved to the cloud include the internal NASA Engineering Network, which contains the documents of 3 million engineering projects, and NASA.gov. In all, the first phase of the move included more than 100 sites and applications and took 22 weeks to complete, according to NextGov.

Making sure applications 'don't go dark'
During the initial migration to the cloud infrastructure, the NASA.gov portal – which itself contains multiple sites – was redesigned to make the transition smoother. The rest of the websites were moved as-is so NASA could still save on infrastructure, according to Raj Ananthanpillai, who is overseeing the migration.

The applications and sites being moved to the cloud were previously housed in a commercial data center where redundancy and uptime were a top priority, so it was important to the agency that nothing fell through the cracks. In an interview with NextGov, Ananthanpillai likened migrating multiple, dispersed sites running on proprietary systems to changing a tire on a moving car. He stressed the importance of the sites being able to stay online, saying that none of them could go dark.

The Office of Management and Budget's federal cloud-first policy was a driving force behind NASA's move to a cloud platform. At the same time, the agency's own Open Government Initiative, which dealt with the utilization of open-source projects to consolidate internal and external websites, fit in nicely with the OMB's policy. NASA's cloud migration allowed the agency to introduce open-source components to overhaul technology in a cost-effective way, while also employing new content management systems within the agency's enterprise tool kit.

Overall, the use of cloud storage services has already generated cost savings of 40 percent, according to Roopangi Kadakia, web services executive with NASA's office of the CIO. Looking to the future, the infrastructure is projected to cut the agency's monthly operations and maintenance costs by about 25 percent.

Universities increasingly look to the cloud for data storage solutions

The demand for access to data at large universities is increasing at an incredible rate with the advent of online classes, analytics services and expanding levels of research. In an interview with TechTarget, Michigan State University CIO Joanna Young explained that the current influx of data is posing a challenge for universities in regards to how best to store information and retain records in the most secure, efficient way possible.

Young noted that it's important for schools to keep up with students' growing demand for round-the-clock online access to the multimedia content teachers share in class. As professors start to offer more content to students that is based somewhere besides a textbook, schools need to become more effective and efficient in their use of data storage, and the cloud is an especially helpful solution. At the same time, cloud storage is almost a necessity for universities looking to offer online education options, according to Young.

"Because the video requirements for these online classes are huge – every week, two to four hours or more worth of video content – that would have quickly overwhelmed the storage we had on campus," Young said.

The cloud as a recruitment tool
In her interview with TechTarget, Young mentioned that data storage options can also be a helpful tool in incentivizing professors to come to the university to perform groundbreaking research or start important programs.

"As a CIO, the trick is to say to people…'You don't have to worry about storage. You don't have to worry about servers. Here's how we can provide that for you in a way that's easy for you to use, is going to give you enough space and access that you need, and the type of speed set is OK for you,'" explained Young. "[You] become a partner and get them to align with you, because I find particularly in higher education, you've got to stick with the carrot approach."

The increased ability to conduct advanced research provided by the cloud has even gotten the attention of the National Science Foundation. The NSF recently announced that it would be launching two $10 million projects to create test beds for cloud computing at universities. The aim is to enable the academic research community to pursue and develop new ways to utilize the cloud for next-generation applications used in medical devices, power grids and transportation systems. The first cloud program will be co-located at the University of Chicago and the University of Texas at Austin, while the second will be a joint project with a large-scale, distributed infrastructure shared between the University of Wisconsin, Clemson University and the University of Utah.

With cloud platforms growing larger and more complex, Young noted that it can become impractical to solely purchase cloud storage services at such great volumes. In her previous role as CIO for the University of New Hampshire, she looked into software-as-a-service offerings that included storage as a package deal as a way to reduce costs. She also mentioned the need for schools interested in implementing a cloud infrastructure, especially large universities, to have a strong network and reliable broadband service.

LA parks receive access to free Wi-Fi

Park-goers in the Los Angeles area will now have access to free Wi-Fi in six of the city's parks. The City of Los Angeles Department of Recreation and Parks partnered with American Park Network, which creates guides for national parks and public land, to bring the program live. The Wi-Fi service was fully paid for by Toyota, The Los Angeles Times reported.

The service, which has been in beta testing since July, was officially launched last week. The public will have access to the "Oh! Ranger Wi-Fi" network at designated spots throughout Cabrillo Beach, Echo Park Lake, Griffith Observatory, Pershing Square, Reseda Park and Venice Beach. According to Mark Saferstein, publisher and editor-in-chief of American Park Network, the aim of the program is to get more people outside and enjoying the city's parks.

"It's a way to get families who might not go to a park to go there and share with their friends on social media," he said.

By visiting a city website, people visiting the park will be able to log onto the network and receive Internet access comparable to what they have in their homes, according to Councilman Bob Blumenfield.

At the same time as the Oh! Ranger Wi-Fi is being introduced, the city of LA has also rolled out a new mobile website that provides citizens with information on parks and recreational activities, like upcoming events, available services and programs and a hub for residents to post service requests.

According to Saferstein, the speed of the Wi-Fi varies depending on which park it is used in, but visitors will have connections fast enough to check email and post pictures. The program is also expanding to New York, Saferstein said, and the hope is to eventually expand to more parks in Los Angeles and across the country.

Top 4 benefits of cloud storage services

As technology becomes an increasingly important part of business, many companies are looking for solutions that will provide the most advantages for the least amount of money, time and complexity. One technology that is growing in popularity is cloud computing, and specifically cloud storage services. While there are many benefits to storing sensitive documents and information in the cloud, keep reading to find out the top four.

1) Cost-Effectiveness:
Backing up data can be extremely expensive, especially when considering the necessary equipment and hardware. Labor costs become an issue too, as manual backups are time-consuming and complicated. Cloud storage solves these problems by leaving the maintenance and equipment costs to a third-party provider. Cloud storage solutions are easily scaled, allowing businesses to pay only for the storage they need and making it simple to increase or reduce space as client needs change.
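The pay-for-what-you-use point can be made concrete with some back-of-the-envelope arithmetic. The prices below are illustrative assumptions for the sake of the comparison, not real provider rates.

```python
def monthly_cloud_cost(used_gb: float, price_per_gb: float) -> float:
    """Cloud model: pay only for the storage actually consumed."""
    return used_gb * price_per_gb

def monthly_inhouse_cost(hardware_cost: float, lifespan_months: int,
                         monthly_labor: float) -> float:
    """In-house model: amortized hardware plus the labor spent
    running and verifying manual backups."""
    return hardware_cost / lifespan_months + monthly_labor

# Illustrative figures only, chosen for the example:
cloud = monthly_cloud_cost(used_gb=500, price_per_gb=0.03)   # roughly $15/month
inhouse = monthly_inhouse_cost(hardware_cost=6000,
                               lifespan_months=36,
                               monthly_labor=200)            # roughly $367/month
```

The gap narrows or widens with the assumptions, but the structural difference holds: the cloud figure scales smoothly with usage, while the in-house figure is dominated by fixed costs that are paid whether the capacity is used or not.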

2) Security:
Storing information in the cloud is much more secure than keeping paper documents or using physical devices for file storage. Hard drives and USB drives can be stolen or lost, while information in the cloud remains available regardless of what happens to any single device. At the same time, security is not a core competency for many companies, but it is for cloud service providers. Because of this, providers who are mainly focused on data security are much more adept at keeping information protected than a business with an IT team focused on dozens of projects and problems at once.

Cloud storage also creates an extra layer of security between privileged data and cybercriminals. Backup files are kept separate from originals so hackers cannot steal everything at once.

3) Disaster Recovery:
In the same way that it is safer to keep duplicate files away from the originals to protect them from malicious actors, it also helps to protect against natural disasters. After a storm or fire, regular systems may not be accessible, but information stored in the cloud will be.

4) Accessibility:
Professionals are using more devices than ever before, and cloud storage allows files to be accessed from any of them. Sharing is also made easier with this increased flexibility, as files can be put in the cloud and then accessed by any authorized party. This helps to increase collaboration between in-house and remote employees, as well as to improve productivity.

Wi-Fi delivers multiple benefits to schools

As technology becomes increasingly ubiquitous, traditionally low-tech industries are having to adopt more modern devices and systems. The education sector is slowly beginning to implement new technology to better serve students and teachers. Many school districts are realizing the benefits of Wi-Fi in classrooms, but a whole host of schools still rely on wired connections to access the Internet.

A school that uses a wired Internet connection provides a fundamentally different learning environment than one that offers wireless access. Wi-Fi gives students and teachers improved mobility and connectivity between campus buildings, increasing productivity and collaboration. Wireless is also a more cost-effective solution than traditional wired services.

Teachers find advantages with Wi-Fi
A recent Pew Research Center survey of more than 2,000 teachers in the Advanced Placement and National Writing Project programs found that digital technologies have had positive effects on their classrooms and helped them in teaching their middle and high school aged students. Of the teachers surveyed, 92 percent reported Internet access having a major impact on their ability to access content, resources and materials for their lesson plans. Wi-Fi opens up a vast number of learning opportunities for students and instructional ones for teachers.

The survey went on to show that 45 percent of students used e-readers to complete assignments in class and 43 percent used tablets for the same task. Both of these devices can only access the Internet through a Wi-Fi connection. A whole host of new technology is rendered useless by wired Internet, meaning schools without Wi-Fi are blocking students from accessing an entire generation of devices, most of which are easier for them to use than traditional computers.

One of the major benefits of Wi-Fi is the mobility it offers. Wired computers restrict Internet access to specific locations, like computer labs, which greatly reduces the ability of students and teachers to collaborate. Access to wireless Internet increases communication between everyone in an educational environment: student to student, student to teacher, teacher to teacher and so on. The Pew survey found that 69 percent of teachers experienced a major impact on their ability to share ideas with other teachers by utilizing the Internet, and that ability only increases with the mobility of Wi-Fi.

Small businesses increasingly utilizing the cloud, studies find

As technology becomes an increasingly important part of conducting business, companies are starting to home in on what really works and what doesn't. A growing number of small and medium-sized businesses are realizing that one technology that is worth their time is cloud computing. Cloud computing essentially democratizes business technology, reducing costs and increasing ease-of-use. The cloud makes it cheaper and easier to start a business or create a new product, as well as providing access to technology and capabilities that were once solely the domain of large companies.

Because of the benefits offered by the cloud, more and more small businesses are adopting it. A recent study on SMB cloud use projected the global market for cloud services to expand to $95 billion over the next year, and SMB Group estimates the number of small and medium-sized businesses using cloud computing will grow to 44 percent by 2015.

"I think eventually every business has to have somewhere in its portfolio and go-to-market approach a range of cloud services," said IBM Midmarket Business General Manager John Mason in an interview with Forbes. "This is changing the landscape for small and midsize businesses by allowing them to focus on their own innovations and making them more competitive with larger, established companies."

Mason went on to say that the cloud, along with other innovative business tools, has three distinct impacts on SMBs.

  • It makes it possible for companies to go to market with products much quicker by removing large, upfront investments in technology and personnel.
  • Cloud dramatically increases scalability and allows for greater flexibility.
  • It removes geographic strains holding organizations back and offers the ability to work collaboratively with anyone from anywhere.

Reasons for cloud use vary, but all find benefits 
A new report conducted by Intuit and Emergent Research has also highlighted the benefits the cloud offers SMBs, as well as some of the driving factors behind why companies are adopting the technology.

"Today, the U.S. and global economy is going through a series of shifts and changes that are reshaping the economic landscape," said Steve King, a partner at Emergent Research, in an interview with Small Business Trends. " In this new landscape, many people are using the power of the cloud to re-imagine the idea of small business and create new, innovative models that work for their needs."

The study projected that 78 percent of small businesses will have adopted a cloud platform within the next six years. Research from the two companies also identified specific types of cloud use among SMBs. Hives, for instance, are small businesses that are able to work together through the cloud and rarely meet in person. Another group, plug-in players, are organizations that utilize cloud services to handle back-end tasks so they are able to stay focused on tasks and processes that are more critical to the business. This group implements solutions such as cloud storage services and applications for accounting, marketing or human resources.

Data center networking market to reach $22 billion

A recent study by research firm MarketsandMarkets projects the global data center networking market to reach $21.8 billion by 2018. According to the report, North America is expected to hold the largest share of the market over the next five years.

The study noted the dramatic market potential created by the demand for cloud technologies and software-defined networking in data centers. The increased use of mobile, driven by bring-your-own device policies, and the use of cloud services have caused data center providers to shift their network offerings from traditional models to those more capable of providing the flexibility necessary to quickly transfer workloads between servers.

This shift in data center architecture was originally driven by the demand for virtualization, but a variety of new changes in the market have persuaded providers to favor faster and flatter models over traditional core-distribution-edge designs. Some of the new challenges facing data center managers include heavy inter-server traffic, burst speeds faster than 1 gigabit and the shift from fiber channels to Ethernet networks.

Data centers can no longer be built the way they were even just a few years ago, as the fundamental structure of enterprise applications has changed, and with it the needs of users. The adoption of new, more advanced hardware is placing greater demands on data center networks and fueling a boom in the market.

"Data center networks are being re-architected as part of a transition to the next generation of data centers, reimagining how applications and data centers are built," wrote Biztech Magazine contributor Joel Snyder. "This change extends from the power and cooling to the servers and storage, as well as the networking."

As new data centers are built and their designs continue to shift, requirements for increased security and greater distributed and managed services will be front of mind. Other factors will help to shape the creation of the next generation of data centers, including higher speed, reduced latency, layer 2 flattening and high availability. Demand for the installation of new virtualization and storage equipment will offer data center providers the opportunity to rethink facility design and create truly modern data centers.