Posts Tagged ‘data backup’

5 Ways To Secure Your Data

April 17th, 2012 Comments off

It’s never been more important to have comprehensive data security. Hacking is on the rise, as is malware and data loss. Many business entities and individuals are investing
major money into ensuring that their information is safe. Here are the top 5 ways to secure your data:

Use cloud services and/or data centers—More and more companies are securing their data by utilizing online storage in the cloud, or cloud services, in order to centralize data in real-time environments. Data centers like Digital Realty Trust allow companies to colocate their servers, host, and use cloud services in order to cut down on data loss and back up their data.

Steganography—Steganography has been around for a while. It involves hiding data inside of files so that they cannot be easily found. This can be as simple as storing programming code within a folder named “Pictures of Lily,” or it can be a complex process of using security schemes to embed data.
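One classic embedding scheme of the kind described above is least-significant-bit (LSB) steganography: the message is written into the lowest bit of each byte of a cover file, changing each byte by at most one. Here is a minimal, illustrative sketch (the function names are made up for this example, not taken from any particular tool):

```python
def hide(cover: bytes, message: bytes) -> bytearray:
    """Embed `message` in the least significant bits of `cover`."""
    payload = len(message).to_bytes(4, "big") + message  # 4-byte length header
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover is too small to hold the message")
    out = bytearray(cover)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit  # overwrite only the lowest bit
    return out

def reveal(stego: bytes) -> bytes:
    """Recover the embedded message from the low bits."""
    bits = [b & 1 for b in stego]
    def read_bytes(count, bit_offset):
        result = bytearray()
        for i in range(count):
            value = 0
            for j in range(8):
                value = (value << 1) | bits[bit_offset + i * 8 + j]
            result.append(value)
        return bytes(result)
    length = int.from_bytes(read_bytes(4, 0), "big")
    return read_bytes(length, 32)
```

In a real image file you would apply this only to the pixel data, not the file headers; the point is that the cover looks essentially unchanged to a casual observer.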

Protect your network—This can be a cumbersome process, but some would say it is crucial to ensuring that your information cannot be accessed by hackers. The first step is to change your wireless router password. Forget about hiding the Service Set Identifier (SSID), disabling DHCP, and filtering MAC addresses—these options will do little to actually protect your network. Your best bet is to throw out WEP and adopt WPA or WPA2, encryption standards that use 128-bit encryption keys and the Temporal Key Integrity Protocol (TKIP). This security should cover your Xbox, Wii, laptops, and smartphones.
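For the curious: the WPA/WPA2 pre-shared key your router uses is derived from your passphrase and network name (SSID) using PBKDF2 with 4,096 rounds of HMAC-SHA1, per the IEEE 802.11i standard. A short sketch (the passphrase and SSID here are made up):

```python
import hashlib

def wpa_psk(passphrase: str, ssid: str) -> bytes:
    # IEEE 802.11i: PSK = PBKDF2-HMAC-SHA1(passphrase, SSID, 4096 iterations, 32 bytes)
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

# Example with a made-up network; this 256-bit key is what actually protects traffic.
psk = wpa_psk("correct horse battery staple", "HomeNetwork")
print(psk.hex())
```

This is also why a long, unusual passphrase matters: the derivation is deterministic, so a short dictionary passphrase on a common SSID can be attacked with precomputed tables.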

Encrypt your data—Do your own encrypting and make sure that even if a hacker or thief does get a hold of your hardware they won’t be able to access its data. A great option for this is TrueCrypt, a cross-platform software application for natively securing drives and folders.
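TrueCrypt works at the volume level, but the core idea of symmetric encryption can be sketched with a toy stream cipher built from SHA-256 in counter mode. This is an illustration only, not a vetted cipher—for real data, use an audited tool such as TrueCrypt itself:

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream by hashing key + nonce + counter."""
    stream = bytearray()
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(stream[:length])

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

key, nonce = os.urandom(32), os.urandom(16)
ciphertext = xor_crypt(key, nonce, b"my tax records")
assert xor_crypt(key, nonce, ciphertext) == b"my tax records"
```

The takeaway is the property the paragraph describes: without the key, the stored bytes are useless to a thief, even with full physical access to the drive.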

Acquire browsing and downloading privacy—Let’s face it, much of your most important data is entered by you as usernames and passwords while you’re browsing the Internet and downloading files. That’s why securing your Internet privacy and downloads is extremely important. Use an anonymous proxy service such as Vidalia in order to hide your online activity.

These are five options for securing data. Combined, they should provide you with an extremely valuable safety net against hacking, malware, and viruses. Many of these options, such as cloud computing, come with additional benefits as well.

Data BackUp Now 100x Faster, Claims Symantec

February 12th, 2012

Symantec revealed new versions of its flagship NetBackup enterprise-class and Backup Exec midrange backup programs — Backup Exec 2012 and NetBackup v7.5.

Vijay Mhaskar, VP, Information Management Group, stated that backup teams worldwide are frustrated with missed backup windows and non-integrated solutions for physical and virtual backups, deduplication, and snapshots. Presently, organizations face a scenario where recovery, the end goal of why backups exist, is complex and unnecessarily costly.

“Our survey of 1,425 organizations worldwide shows that only 28 percent of companies are completely confident that 100 percent of their backed-up data can be recovered; virtualization backup ranks second lowest of successful IT initiatives because the complexity demands more agile systems,” states Mhaskar.

NetBackup v7.5, aimed at larger businesses, includes features such as the integration of incremental backups and deduplication to reduce I/O bandwidth and the time required for backups, as well as to offer eDiscovery without having to first replicate data into a separate repository.
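The deduplication idea can be illustrated with a toy content-addressed chunk store: each unique chunk is stored once, and a backup is just a list of chunk hashes. This is a hypothetical sketch of the general technique, not Symantec’s implementation:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: each unique chunk is kept exactly once."""

    def __init__(self, chunk_size: int = 4096):
        self.chunk_size = chunk_size
        self.chunks = {}  # sha256 hex digest -> chunk bytes

    def backup(self, data: bytes) -> list:
        """Split into chunks, store only new ones, return the 'recipe' of hashes."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # skip chunks already stored
            recipe.append(digest)
        return recipe

    def restore(self, recipe: list) -> bytes:
        """Reassemble a backup from its list of chunk hashes."""
        return b"".join(self.chunks[digest] for digest in recipe)
```

Backing up two mostly identical datasets stores their shared chunks only once, which is where the bandwidth and storage savings come from.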

“Our tests of the new system versus existing backup techniques show that a 61GB file being backed up from the US to China, which earlier took about 4 hours 18 minutes 7 seconds, now takes only 1 minute 33 seconds,” states Mhaskar.
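Taken at face value, the quoted timings imply a speedup even larger than the headline 100x figure; a quick check:

```python
old_seconds = 4 * 3600 + 18 * 60 + 7   # 4 h 18 min 7 s = 15487 s
new_seconds = 1 * 60 + 33              # 1 min 33 s = 93 s

speedup = old_seconds / new_seconds    # roughly 166x
print(f"{speedup:.1f}x faster")
```

Of course, most of that gain comes from moving far less data (incremental changes plus deduplication), not from faster wires.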

Symantec also announced an expanded partnership with NetApp to integrate with NetBackup Replication Director, enabling clients to unify snapshot and backup management and remove the cost and risk associated with multiple backup and recovery tools.

The V-Ray Edition of NetBackup v7.5 can back up data on virtual machines running either VMware or Microsoft’s Hyper-V. Furthermore, backup managers can also add physical machines to V-Ray’s backup policy and manage both environments via a single console view.

The update adds a new feature called Virtual Machine Intelligent Policy (VIP) to NetBackup v7.5, which can automatically identify and back up new, moved, or cloned virtual machines, using deduplication to reduce the amount of data stored. Again, physical machines can be added to the virtual machine backups.

According to Symantec, many companies face cost exposure caused by over-retention of backup tapes. NetBackup Search identifies what information to archive and what to remove based on relevance to legal discovery or compliance cases.

Symantec said it also added bare-metal backup and restore disaster recovery capabilities to Backup Exec 2012, which allow a failed system to be recovered to a physical server, or to a Hyper-V or VMware machine. The new disaster recovery features also allow entire virtual machines, single files, Active Directory objects, Exchange emails, or SharePoint documents to be recovered from a single-pass physical, VMware, or Hyper-V backup.

The NetBackup Accelerator feature works by copying only incremental file changes; those incremental changes are then also deduplicated to further reduce network traffic. When Accelerator is used to recover data, it has an index that enables it to rapidly restore single documents or files.

“The system tracks the files that have changed and the blocks that have changed within those files. So when a backup is performed, the system knows which blocks to back up and where they are,” said Mhaskar.
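The block tracking Mhaskar describes can be sketched by hashing fixed-size blocks and comparing against the hashes recorded at the previous backup. This is an illustrative sketch of the general idea, not NetBackup’s actual mechanism:

```python
import hashlib

BLOCK_SIZE = 4096

def block_hashes(data: bytes) -> list:
    """Hash each fixed-size block of a file."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).digest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(previous_hashes: list, data: bytes) -> list:
    """Indices of blocks that differ from the previous backup (or are new)."""
    current = block_hashes(data)
    return [i for i, digest in enumerate(current)
            if i >= len(previous_hashes) or previous_hashes[i] != digest]
```

An incremental backup then only has to read and send the blocks whose indices come back from `changed_blocks`, rather than the whole file.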

Licensing is available on a SaaS model, or the products can be purchased on a capacity-based licensing model. The company expects the products to be available before summer 2012.

“Companies can pay for the amount of data they choose to store, or pay on a per-machine license policy,” Mhaskar clarified.


Heat Makes Data Storage Faster

February 8th, 2012

A new way of magnetic recording using heat will allow data processing hundreds of times faster than current hard drive technology, British researchers say.

International research led by the physics department at the University of York found heat could be used to record information onto magnetic media at a much faster rate than current technologies, a York release said Tuesday.

“Instead of using a magnetic field to record information on a magnetic medium, we harnessed much stronger internal forces and recorded information using only heat,” York physicist Thomas Ostler said.

“This revolutionary method allows the recording of Terabytes (thousands of Gigabytes) of information per second, hundreds of times faster than present hard drive technology. As there is no need for a magnetic field, there is also less energy consumption.”

Until now it has been believed that in order to record one bit of information — by inverting the poles in a magnetic medium — there was a need to apply an external magnetic field.

The researchers demonstrated the positions of the poles of a magnet can be inverted by an ultrashort heat pulse, harnessing the power of much stronger internal forces.

“For centuries it has been believed that heat can only destroy magnetic order. Now we have successfully demonstrated that it can, in fact, be a sufficient stimulus for recording information on a magnetic medium,” said Alexey Kimel of Radboud University Nijmegen in the Netherlands.

Cleversafe Launches 10 Exabyte Data Storage System

February 1st, 2012

Enterprises are now routinely storing workloads comprised of terabytes of data, which eventually add up to petabytes of storage. Next stop? Exabytes.

We’re going to be hearing the prefix “exa-” in the data storage industry a lot more as time goes on, so we might as well get used to it.

A recent illustration of this point is object-based storage provider Cleversafe, which launched a new multi-rack array system Jan. 30 that can hold billions of objects inside up to 10 exabytes of capacity.

That’s a serious amount of space. While some people describe it as limitless, it isn’t—but it’s pretty close.

For those who would like to see the actual numbers that describe just 1 exabyte, here they are:

1 exabyte = 1,000 petabytes = 1,000,000 terabytes = 1,000,000,000 gigabytes = 1,000,000,000,000 megabytes = 1,000,000,000,000,000 kilobytes = 1,000,000,000,000,000,000 bytes.

Terabyte Loads Now Routine

All those packs of petabytes pile up as time goes on, so what’s the next level of storage needed? Right: exabytes.

Realistically, only the true high-end enterprise systems—such as those deployed by scientific researchers, online game providers, digital video studios, stock markets, government and military installations and high-end financial services companies—are using petabyte-type storage now and will be looking at exabyte-able storage in 2012 or 2013.

But Chicago-based Cleversafe is one storage provider that figures no time is better than the present for planning for the future.

In its new 10-exabyte configuration, Cleversafe uses the same object-based dispersed storage system it developed on its own six years ago; only now it has been expanded to allow for independent scaling of storage capacity through what it calls a “portable datacenter,” a collection of storage and network racks that can be easily deployed or moved.

Each portable datacenter contains 21 racks with 189 storage nodes per PD and forty-five 3TB drives per storage node. This geographically distributed model allows for rapid scale and mobility and is optimized for site failure tolerance and high availability, Cleversafe said.

The company’s own configuration includes 16 sites across the U.S. with 35 PDs per site and hundreds of simultaneous readers/writers to deliver instantaneous access to billions of objects.
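Multiplying out the figures quoted above gives the raw drive capacity behind the 10-exabyte headline (raw, before the redundancy overhead that dispersed storage adds):

```python
sites = 16            # Cleversafe's own configuration across the U.S.
pds_per_site = 35     # portable datacenters per site
nodes_per_pd = 189    # storage nodes per portable datacenter
drives_per_node = 45  # 3TB drives per storage node
tb_per_drive = 3

raw_tb = sites * pds_per_site * nodes_per_pd * drives_per_node * tb_per_drive
print(raw_tb)               # total raw capacity in terabytes
print(raw_tb / 1_000_000)   # the same figure in (decimal) exabytes
```

That works out to about 14.3 exabytes of raw disk, so a 10-exabyte usable figure implies roughly 70 percent storage efficiency after the erasure-coded dispersal overhead, which is plausible for this kind of system.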

Traffic Volumes Increasing at 32 Percent Rate Per Year

“Internet traffic volumes are increasing at a rate of 32 percent globally each year. It’s not unrealistic to think companies looking to mine that data would need to effectively analyze 80 exabytes of data per month by 2015,” said Russ Kennedy, Cleversafe vice president of product strategy, marketing and customer solutions.

“To any company, data is a priceless component. However, it’s only valuable if a company can effectively look across that data over time for trends or to analyze behavior and to do it cost effectively.”

Pricing and other information can be obtained on an individual basis via email or on the Cleversafe Website.

Rise in Global Disasters Means Rise in Data Backup Solutions

January 6th, 2012

On November 18, the UN Intergovernmental Panel on Climate Change (IPCC) released a report, Managing the Risks of Extreme Events and Disasters, claiming that scientists are “virtually certain” the world will have more extreme heat spells. By 2050, heat waves could be as much as 5 degrees hotter, and 9 degrees hotter by 2100. This increase in global temperatures means that heavy rainfall will occur more often, and tropical cyclones will become more severe.

“By the end of this century, intense, heavy rainstorms that now typically happen only once every 20 years are likely to occur about twice a decade,” the report stated.

So far, the predictions made in the report seem to be accurate:  The last few years have seen a heightened level of natural disasters, prompting thousands of businesses around the world to reassess their continuity and disaster recovery plans. With more and more sensitive data being stored electronically, these companies have begun to take online data backup services seriously in a bid to reduce downtime and avoid extensive loss in revenue.

Last month, Thailand experienced its worst flooding in seven years. Bangkok – the center of commerce and trade in the country – was completely inundated, and hundreds of businesses lost a considerable amount of revenue as a result. Most businesses in Thailand had been slow to adopt cloud computing and backup data centers as part of a contingency plan, but in the wake of these historic floods, companies are taking their data backup seriously.

Monsinee Keeratikrainon, manager of global research firm Frost & Sullivan, said demand in cloud computing services was set to expand more quickly because of natural disasters like the flooding in Thailand.

“The cloud will likely get more attention from companies as they prepare business continuity plans for any future crisis,” she said.

So far, backup centers in Thailand have received a lot of attention:  Demand for offsite data storage centers in Thailand rose by nearly 300% during the floods, mainly from manufacturers who needed to transfer their data to a safe location.

“The cloud market in Thailand is expected to grow by 50% to 1.5 billion baht (USD $47,800,000) next year,” Keeratikrainon said. This growth has the potential to prompt an investment of at least 500 million baht (USD $15,000,000) in expanding online data centers to accommodate demand.

But the growth of the cloud market isn’t confined to Thailand. As natural disasters increase worldwide, the demand for backup data centers is increasing with it.  The United States has recently experienced one of its worst years in natural disasters. According to the National Oceanic and Atmospheric Administration, the US experienced over $55 billion in damages from natural disasters in 2011, the worst in US history.  This has prompted US businesses to embrace data backup and disaster recovery systems. IT companies that provide support and strategy services for small and medium-sized businesses have experienced significant growth, citing the rise in natural disasters as one cause. ProviDyn, one such company, has increased revenue by 75% in the past year, and recently expanded their staff by 25 percent.

“At its basic level, controlling data is about controlling risk, which means being prepared in the event of disaster so that you can restore your business without losing its most important asset – information,” said Blaine Rigler, general manager of US-based Iron Mountain Data Backup and Recovery. “The amount of information that needs to be protected is growing at an incredible pace, creating new data challenges every day.”

And no other industry feels these challenges more than healthcare. Natural disasters can cause a healthcare facility more than financial loss; they can potentially affect the lives of patients if personal charts or prescriptions are lost. Having data stored remotely is crucial to restoring a facility’s operations. Last May, a tornado leveled St. John’s Regional Medical Center in Joplin, Mo., destroying its on-site Electronic Health Records (EHR) systems. But because the medical records were all stored in a remote data center, the hospital was able to limit the devastation of the tornado and quickly return to providing for its patients.

“Within seven days, we had the EHR system up and running again, having retrieved the data from a mobile medical unit,” said Michael McCreary, chief of technology services for Sisters of Mercy Health Systems,  the organization in charge of St. John’s rebuilding efforts.

“We were lucky to have a paperless system that could be restored fairly quickly,” McCreary continued. “Some of the hospital’s old paper records got blown 70 miles away.”