5 Ways To Secure Your Data

It’s never been more important to have comprehensive data security. Hacking is on the rise, as are malware and data loss. Many businesses and individuals are investing heavily in ensuring that their information is safe. Here are the top five ways to secure your data:

Use cloud services and/or data centers—More and more companies are securing their data by utilizing online storage in the cloud, or cloud services, in order to centralize data in real-time environments. Data centers like Digital Realty Trust allow companies to colocate their servers, host, and use cloud services in order to cut down on data loss and back up their data.

Steganography—Steganography has been around for a while. It involves hiding data inside files so that it cannot be easily found. This can be as simple as storing programming code within a folder named “Pictures of Lily,” or it can be a complex process of using security schemes to embed data.
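
As a toy sketch of the embedding idea, the Python below hides a message in the least-significant bits of a raw pixel buffer (the bytearray stands in for real image data; the function names are invented for illustration). Real steganographic schemes are considerably more sophisticated.

```python
def hide(pixels: bytearray, secret: bytes) -> bytearray:
    """Embed `secret` in the least-significant bits of a pixel buffer."""
    out = bytearray(pixels)
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    if len(bits) > len(out):
        raise ValueError("cover buffer too small for secret")
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return out

def reveal(pixels: bytearray, length: int) -> bytes:
    """Recover `length` bytes previously hidden by hide()."""
    secret = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        secret.append(byte)
    return bytes(secret)

cover = bytearray(range(256)) * 4          # stand-in for image pixel data
stego = hide(cover, b"lily")
assert reveal(stego, 4) == b"lily"         # secret survives the round trip
```

Because only the lowest bit of each pixel byte changes, the carrier looks essentially unchanged to a casual observer, which is the whole point of the technique.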

Protect your network—This can be a cumbersome process, but some would say it is crucial to ensuring that your information cannot be accessed by hackers. The first step is to change your wireless router password. Forget about hiding the Service Set Identifier, disabling DHCP, and filtering MAC addresses—these options will do little to actually protect your network. Your best bet is to throw out WEP and adopt WPA or WPA2, which are encryption standards that use 128-bit encryption keys and the Temporal Key Integrity Protocol. This security should include your Xbox, Wii, laptops, and smartphones.

Encrypt your data—Do your own encrypting and make sure that even if a hacker or thief does get a hold of your hardware they won’t be able to access its data. A great option for this is TrueCrypt, a cross-platform software application for natively securing drives and folders.
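
To illustrate the principle only (not TrueCrypt's actual implementation, which uses vetted ciphers such as AES), here is a minimal standard-library sketch of symmetric encryption: a keystream is derived from a passphrase and nonce, then XORed with the data. This is a teaching toy, not something to protect real data with.

```python
import hashlib
from itertools import count

def keystream(key: bytes, nonce: bytes):
    """Derive a pseudorandom byte stream from key + nonce (illustrative only)."""
    for counter in count():
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        yield from block

def xor_cipher(data: bytes, key: bytes, nonce: bytes) -> bytes:
    """XOR data with the keystream; calling it again with the same key decrypts."""
    ks = keystream(key, nonce)
    return bytes(b ^ next(ks) for b in data)

secret = b"quarterly financials"
blob = xor_cipher(secret, b"passphrase", b"nonce-01")
assert blob != secret                                      # ciphertext differs
assert xor_cipher(blob, b"passphrase", b"nonce-01") == secret   # round trip
```

The takeaway mirrors the advice above: without the key, the stolen bytes are useless; for real drives, use an audited tool rather than home-grown cryptography.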

Acquire browsing and downloading privacy—Let’s face it, much of your most important data is entered by you as usernames and passwords while you’re browsing the Internet and downloading files. That’s why securing your Internet privacy and downloads is extremely important. Use an anonymous proxy service such as Vidalia in order to hide your online activity.

These are five options for securing data. Combined, they should provide you with an extremely valuable safety net against hacking, malware, and viruses. Many of these options, such as cloud computing, will include additional benefits as well.


Data Backup Now 100x Faster, Claims Symantec

Symantec revealed new versions of its flagship NetBackup enterprise-class and Backup Exec midrange backup programs — Backup Exec 2012 and NetBackup v7.5.

Vijay Mhaskar, VP of the Information Management Group, stated that worldwide backup teams are frustrated with missed backup windows and non-integrated solutions for physical and virtual backups, deduplication and snapshots. Presently, organizations face a scenario where recovery, the end goal of why backups exist, is complex and unnecessarily costly.

“Our survey of 1,425 organizations worldwide shows that only 28 percent of the companies are completely confident that 100 percent of the backed-up data can be recovered; virtualization backup ranks second lowest of successful IT initiatives because the complexity demands more agile systems,” said Mhaskar.

NetBackup v7.5, aimed at larger businesses, includes features such as the integration of incremental backups and deduplication to reduce I/O bandwidth and lower the time required for backups, as well as eDiscovery without having to first replicate data into a separate repository.

“Our tests of the new system versus existing backup techniques show that a 61GB file being backed up from the US to China, which earlier took about 4 hours 18 minutes 7 seconds, now takes only 1 minute 33 seconds,” said Mhaskar.

Symantec also announced an expanded partnership with NetApp to integrate with NetBackup Replication Director, enabling clients to unify snapshot and backup management, and to remove the cost and risk associated with multiple backup and recovery tools.

The V-Ray Edition of NetBackup v7.5 can back up data on virtual machines running either VMware or Microsoft’s Hyper-V. Furthermore, backup managers can also add physical machines to V-Ray’s backup capacity policy and manage both environments via a single console view.

The update adds a new feature known as Virtual Machine Intelligent Policy (VIP) to NetBackup v7.5, which can automatically identify and back up new, moved, or cloned virtual machines, using deduplication to reduce the amount of data stored. Again, physical machines can be added to the virtual machine backups.

According to Symantec, many companies face the issue of cost exposure caused by over-retention of backup tapes. NetBackup Search identifies what information to archive and what to remove based on relevance to legal discovery or compliance cases.

Symantec said it also added bare-metal backup and restore disaster recovery capabilities to Backup Exec 2012, which allow a failed system to be recovered to a physical server, or to a Hyper-V or VMware machine. The new disaster recovery features also allow entire virtual machines, single files, Active Directory objects, Exchange emails, or SharePoint documents to be recovered from a single-pass physical, VMware or Hyper-V backup.

The NetBackup Accelerator feature works by copying only incremental file changes; those incremental changes are then also deduplicated to further reduce network traffic. When Accelerator is used to recover data, it has an index that allows it to rapidly restore single documents or files.

“The system tracks the files that have changed and the blocks that have changed within those files. So when a backup is performed, the system knows which blocks to back up and where they are,” said Mhaskar.
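
The block tracking and deduplication described above can be sketched in a simplified model: split the data into fixed-size blocks, hash each block, and ship only blocks whose hash hasn't been seen before. This is an illustrative sketch, not Symantec's implementation; the block size and in-memory pool are assumptions.

```python
import hashlib

BLOCK = 4096
store = {}                      # content hash -> block bytes (the dedup pool)

def backup(data: bytes) -> list:
    """Split data into blocks; store only blocks not already in the pool."""
    recipe = []
    for off in range(0, len(data), BLOCK):
        block = data[off:off + BLOCK]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:     # new or changed block: ship and store it
            store[digest] = block
        recipe.append(digest)       # every block is referenced by its hash
    return recipe

def restore(recipe: list) -> bytes:
    """Rebuild the file from its recipe, akin to Accelerator's index."""
    return b"".join(store[d] for d in recipe)

v1 = b"A" * 8192 + b"B" * 4096      # first full backup
backup(v1)
assert len(store) == 2              # the two identical "A" blocks dedupe to one

v2 = b"A" * 8192 + b"C" * 4096      # one block changed since v1
r2 = backup(v2)
assert len(store) == 3              # only the changed block was transferred
assert restore(r2) == v2            # fast restore from the block index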

Licensing is available on a SaaS model or can be purchased on a capacity-based licensing model. The company expects the products to be available before summer 2012.

“Companies can pay for the amount of data they choose to store, or pay on a per-machine license policy,” Mhaskar clarified.



Heat Makes Data Storage Faster

A new way of magnetic recording using heat will allow data processing hundreds of times faster than current hard drive technology, British researchers say.

International research led by the physics department at the University of York found heat could be used to record information onto magnetic media at a much faster rate than current technologies, a York release said Tuesday.

“Instead of using a magnetic field to record information on a magnetic medium, we harnessed much stronger internal forces and recorded information using only heat,” York physicist Thomas Ostler said.

“This revolutionary method allows the recording of Terabytes (thousands of Gigabytes) of information per second, hundreds of times faster than present hard drive technology. As there is no need for a magnetic field, there is also less energy consumption.”

Until now it has been believed that in order to record one bit of information — by inverting the poles in a magnetic medium — there was a need to apply an external magnetic field.

The researchers demonstrated the positions of the poles of a magnet can be inverted by an ultrashort heat pulse, harnessing the power of much stronger internal forces.

“For centuries it has been believed that heat can only destroy the magnetic order. Now we have successfully demonstrated that it can, in fact, be a sufficient stimulus for recording information on a magnetic medium,” said Alexey Kimel of the Radboud University Nijmegen in the Netherlands.


Cleversafe Launches 10 Exabyte Data Storage System

Enterprises are now routinely storing workloads comprising terabytes of data, which eventually add up to petabytes of storage. Next stop? Exabytes.

We’re going to be hearing the prefix “exa-” as it refers to the data storage industry a lot more as time goes on, so we might as well get used to it.

A current illustration of this point is object-based storage provider Cleversafe, which launched a new multi-rack array system Jan. 30 that can hold billions of objects inside up to 10 exabytes of capacity.

That’s a serious amount of space. While some people describe it as limitless, it isn’t—but it’s pretty close.

For those who would like to see the actual numbers that describe just 1 exabyte, here they are:

1 exabyte = 1,000 petabytes = 1,000,000 terabytes = 1,000,000,000 gigabytes = 1,000,000,000,000 megabytes = 1,000,000,000,000,000 kilobytes = 1,000,000,000,000,000,000 bytes.

Terabyte Loads Now Routine

All those packs of petabytes pile up as time goes on, so what’s the next level of storage needed? Right: exabytes.

Realistically, only the true high-end enterprise systems—such as those deployed by scientific researchers, online game providers, digital video studios, stock markets, government and military installations and high-end financial services companies—are using petabyte-type storage now and will be looking at exabyte-able storage in 2012 or 2013.

But Chicago-based Cleversafe is one storage provider that figures no time is better than the present for planning for the future.

In its new 10-exabyte configuration, Cleversafe uses the same object-based dispersed storage system it developed on its own six years ago; only now it has been expanded to allow for independent scaling of storage capacity through what it calls a “portable datacenter,” a collection of storage and network racks that can be easily deployed or moved.

Each portable datacenter contains 21 racks with 189 storage nodes per PD and forty-five 3TB drives per storage node. This geographically distributed model allows for rapid scale and mobility and is optimized for site failure tolerance and high availability, Cleversafe said.

The company’s own configuration includes 16 sites across the U.S. with 35 PDs per site and hundreds of simultaneous readers/writers to deliver instantaneous access to billions of objects.
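
Those figures can be sanity-checked with a little arithmetic (decimal units, raw capacity before any redundancy overhead). The comparison to the quoted 10 exabytes is our own reading: Cleversafe's dispersal coding consumes part of the raw space, which the article doesn't break out.

```python
drives_per_node = 45
tb_per_drive = 3
nodes_per_pd = 189
pds = 16 * 35                        # 16 sites x 35 portable datacenters

tb_per_pd = nodes_per_pd * drives_per_node * tb_per_drive
raw_tb = pds * tb_per_pd
raw_eb = raw_tb / 1_000_000          # 1 EB = 1,000,000 TB (decimal units)

print(f"{tb_per_pd:,} TB per PD")    # 25,515 TB, roughly 25.5 PB per PD
print(f"{raw_eb:.1f} EB raw")        # about 14.3 EB raw; dispersal-coding
                                     # overhead plausibly brings usable
                                     # capacity nearer the quoted 10 EB
```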

Traffic Volumes Increasing at 32 Percent Rate Per Year

“Internet traffic volumes are increasing at a rate of 32 percent globally each year. It’s not unrealistic to think companies looking to mine that data would need to effectively analyze 80 exabytes of data per month by 2015,” said Russ Kennedy, Cleversafe vice president of product strategy, marketing and customer solutions.

“To any company, data is a priceless component. However, it’s only valuable if a company can effectively look across that data over time for trends or to analyze behavior and to do it cost effectively.”

Pricing and other information can be obtained on an individual basis via email or on the Cleversafe Website.


Rise in Global Disasters Means Rise in Data Backup Solutions

On November 18, the UN Intergovernmental Panel on Climate Change (IPCC) released a report, Managing the Risks of Extreme Events and Disasters, claiming that scientists are “virtually certain” the world will have more extreme heat spells. By 2050, heat waves could be in the range of 5 degrees hotter, and 9 degrees hotter by 2100. This increase in global temperatures means that heavy rainfall will occur more often, and tropical cyclones will become more severe.

“By the end of this century, intense, heavy rainstorms that now typically happen only once every 20 years are likely to occur about twice a decade”, the report stated.

So far, the predictions made in the report seem to be accurate:  The last few years have seen a heightened level of natural disasters, prompting thousands of businesses around the world to reassess their continuity and disaster recovery plans. With more and more sensitive data being stored electronically, these companies have begun to take online data backup services seriously in a bid to reduce downtime and avoid extensive loss in revenue.

Last month, Thailand experienced its worst flooding in seven years. Bangkok – the center of commerce and trade in the country – was completely inundated, and hundreds of businesses lost a considerable amount of revenue as a result. Most businesses in Thailand had been slow to adopt cloud computing and backup data centers as part of a contingency plan, but in the wake of these historic floods, companies are taking their data backup seriously.

Monsinee Keeratikrainon, manager of global research firm Frost & Sullivan, said demand in cloud computing services was set to expand more quickly because of natural disasters like the flooding in Thailand.

“The cloud will likely get more attention from companies as they prepare business continuity plans for any future crisis,” she said.

So far, backup centers in Thailand have received a lot of attention:  Demand for offsite data storage centers in Thailand rose by nearly 300% during the floods, mainly from manufacturers who needed to transfer their data to a safe location.

“The cloud market in Thailand is expected to grow by 50% to 1.5 billion baht (USD $47,800,000) next year,” Keeratikrainon said. This growth has the potential to prompt an investment of at least 500 million baht (USD $15,000,000) in expanding online data centers to accommodate demand.

But the growth of the cloud market isn’t confined to Thailand. As natural disasters increase worldwide, the demand for backup data centers is increasing with it.  The United States has recently experienced one of its worst years in natural disasters. According to the National Oceanic and Atmospheric Administration, the US experienced over $55 billion in damages from natural disasters in 2011, the worst in US history.  This has prompted US businesses to embrace data backup and disaster recovery systems. IT companies that provide support and strategy services for small and medium-sized businesses have experienced significant growth, citing the rise in natural disasters as one cause. ProviDyn, one such company, has increased revenue by 75% in the past year, and recently expanded their staff by 25 percent.

“At its basic level, controlling data is about controlling risk, which means being prepared in the event of disaster so that you can restore your business without losing its most important asset – information,” said Blaine Rigler, general manager of US-based Iron Mountain Data Backup and Recovery. “The amount of information that needs to be protected is growing at an incredible pace, creating new data challenges every day.”

And no other industry feels these challenges more than healthcare. Natural disasters can cause a healthcare facility more than financial loss; they can potentially affect the lives of its patients if their personal charts or prescriptions are lost. Having data stored remotely is crucial to restoring the facility’s operating conditions. Last May, a tornado leveled St. John’s Regional Medical Center in Joplin, Mo., completely wiping out its Electronic Health Records (EHR). But because its medical records were all stored in a remote data center, the hospital was able to reduce the devastation of the tornado and quickly return to providing for its patients.

“Within seven days, we had the EHR system up and running again, having retrieved the data from a mobile medical unit,” said Michael McCreary, chief of technology services for Sisters of Mercy Health Systems,  the organization in charge of St. John’s rebuilding efforts.

“We were lucky to have a paperless system that could be restored fairly quickly,” McCreary continued. “Some of the hospital’s old paper records got blown 70 miles away.”


Data Storage Corporation Partners With inFORM Decisions

Cloud storage firm Data Storage Corporation announced on Thursday that it has formed a partnership with document automation solutions provider inFORM Decisions to offer a one-stop-shop for automating document processes and protecting the data, applications and systems to ensure business continuity.

This partnership comes a month after Data Storage Corp signed a $20 million equity line agreement with investment firm Southridge Partners.

According to the press release, inFORM Decisions provides solutions specifically for IBM i Power System/iSeries environments.

“With document automation solutions specifically designed for the IBM i environments, inFORM is a logical partner for Data Storage Corp. As inFORM focuses on helping organizations more efficiently manage their electronic documents, we can help protect the IBM infrastructure to ensure that data is protected, recoverable and available during any potential system downtime – planned or unplanned,” Peter Briggs, executive VP at DSC, said in a statement.

inFORM Decisions specializes in electronic document automation and management, and accounts payable solutions for IBM System i, AS/400, iSeries and IBM Power Systems, according to the press release.

“Our solutions enable organizations to reduce costs, increase productivity and save trees by automating document processes and eliminating dependency on printed paper. Partnering with Data Storage Corp. makes sense, as we can now mutually offer our clients not only state-of-the-art document management solutions, but also Data Storage solutions to safeguard their document and report data on the IBM i system,” Alex Rodriguez, business development manager at inFORM Decisions, said in a statement.

About inFORM Decisions

inFORM Decisions specializes in electronic Document Automation and Management, and Accounts Payable solutions for IBM System i, AS/400, iSeries and IBM Power Systems. inFORM’s acclaimed iDocs Suite makes it easy to design, distribute and print laser forms and MICR checks directly from IBM i output; intelligently burst, sort, format and distribute reports; and provide easy, 24-7 Web access to all electronic documents, saving thousands of dollars in paper document management, inventory and mailing costs. iDocs works with any IBM i-based ERP/accounting solution with no additional coding. IFD was one of the first IBM Business Partners to implement a comprehensive e-document distribution system powered by intelligent routing capabilities for fax, email, archive-retrieval and laser forms. The company’s products are organized into two product families, compatible with more than 30 popular brands of application software. An IBM Business Partner since 1998, inFORM’s worldwide headquarters are located in Rancho Santa Margarita, California. To learn more, visit www.informdecisions.com or call 949.709.5838.

About Data Storage Corp.

Data Storage Corporation was incorporated in the state of Delaware on August 21, 2001. DSC is the resulting company of a merger between Emergent LLC, a broadband service company, and Data Storage Corporation. Following the merger, DSC quickly became a leading-edge service bureau for offsite backup, offering and providing disaster recovery solutions. Over the years DSC has evolved into a one-stop shop for all your disaster recovery, business continuity and information technology integration needs. Working with strategic partners such as Microsoft, Cisco, Dell and many others, DSC can provide the solutions your business requires.


10 Trends in Cloud Data for 2012

When 2010 came to a close, the introduction of new private and hybrid cloud systems was on everybody’s trending-up list. Now, as 2011 plays out, we can see that those forecasts were generally correct: it was indeed a year of cloud adoption. Thousands of new clouds were architected, built and used during the last 12 months, and they came online in markets of all sizes. Where is the trend headed for the next 12 months? Most of the IT business prognosticators are in agreement on at least one thing: the curve for cloud-based IT buying will continue “up and to the right.” There are too many cost, deployment and monitoring benefits involved for companies to ignore this.

With this as a backdrop, eWEEK presents here a few forecasts for 2012 in the cloud infrastructure space. Our source is TwinStrata CEO and co-founder Nicos Vekiarides. TwinStrata offers the CloudArray storage-area network (SAN), which comes either as a cloud service or on a commodity server and can be plugged into a data center. CloudArray finds data stores wherever they are and combines them.

Existing Storage Remains in Use
For many companies, the idea of moving all their data to the cloud isn’t achievable. However, continuously growing data storage is fueling a need for more capacity. What better way to address this need than cloud storage? The advantages are access to a secure, unlimited pool of storage capacity that never requires upgrade/replacement and reduces capital expense.

Private Clouds Growing in Large Businesses
Businesses looking to leverage the economies, efficiencies and scale that cloud providers have achieved are deploying cloud models in-house, such as OpenStack for compute and storage environments. These private clouds offer scale, agility and price/performance typically unmatched by traditional infrastructure solutions and, at the same time, can reside inside the company’s firewall.

Disaster Recovery to the Cloud Becomes a Viable Option
Traditionally, firms that need disaster recovery and business continuity have always had to rely on dedicated, duplicated infrastructure in an off-site location to be able to recover from a physical disaster. This means paying capital expenses for often-idle hardware before the disaster strikes. A DR cloud service means not having to purchase this infrastructure, except when it’s needed. The trade-off? While zero-downtime disaster recovery will be unlikely, look for service-level agreements (SLAs) that offer recovery-time objectives (RTOs) within hours.

DR for the Cloud Becomes Essential
What happens to all your data that resides in the cloud, trapped within a software-as-a-service (SaaS) application, in the case of a disaster? The fact is, most providers do have a disaster recovery strategy, but how can you create an additional level of protection that’s under your control? Look for a new variety of solutions that back up SaaS data either locally or to another provider.

Simpler On-Boarding of Applications to the Cloud
Certain business applications can be moved entirely to the cloud, saving the administration and maintenance of their hardware/software platforms on-site. Companies are looking for tools to make this migration viable, especially the IT-strapped organizations that could benefit the most. Look for new robust toolsets that can migrate applications to a range of cloud providers.

Nonrelational Databases for Big Data Workloads
NoSQL databases, such as Apache CouchDB, enable great scalability to meet the requirements of terabytes and petabytes of data for millions of users. Big data workloads will pressure many companies to consider these alternatives to traditional databases, and cloud deployment models will simplify the rollout. Look for vendors providing supported NoSQL solutions.

Solid-State Disk Storage Tiers in the Cloud
Moving higher-performance applications into the cloud does not always guarantee that they’ll get the level of performance they require from their data storage. By offering high-performance tiers of storage that are SSD-based (mainly NAND flash), cloud providers will be able to address the requirements for predictable and faster application response times.

Data Reduction Gets Better
With data storage still commanding a per-GB operating expense in the cloud, deduplication and compression technologies have become rather ubiquitous in helping minimize costs. While some may argue the capacity optimization game has played out, there’s still the challenge of capacity optimization on a more global scale to reduce aggregate capacity usage across multiple tenants. Furthermore, there remains a challenge for rich media content, which doesn’t fare particularly well with present-day technologies. Look for the introduction of new data reduction technology that addresses both needs.

More Use of the Cloud for Analytics
Analytics requires a scalable compute and storage environment that can be very costly to build from dedicated hardware (e.g., Oracle’s $1 million Exalogic database machine). Analytics software can also be a very costly part of the proposition. Much like hardware remaining idle for disaster recovery purposes, analytics for many companies may be a periodic exercise that only runs in short bursts and may not be suited or viable for dedicated environments. Analytics environments in the cloud turn the periodic expense into a “pay-per-use” bill, meeting business goals at a cheaper price point.

‘Cloud Envy’ Becomes More Commonplace
Although many companies will adopt clouds in 2012, others may still wait and ponder well past 2012. In most cases, the realization of the economics and efficiencies of the cloud is apparent, although the strategy may differ. In response, some of the laggards may find ideas in, and apply, proven cloud methods that improve IT efficiency on-premise or, regrettably, some may fall victim to cloud-washing by buying traditional IT infrastructure with a cloud title in a feeble attempt to satisfy their “cloud envy.”


Data Storage to Be Priority in 2012 – KeepItSafe

The storage, security, access and management of growing volumes of information will mark 2012 in the technology sector, managed online backup and disaster recovery services provider KeepItSafe predicts.

This rise in the amount of data also means the use of USB sticks for data storage will decrease as companies become aware of the potential risks that can come with using these products.

USB sticks are small and portable, and therefore they can easily end up lost or stolen. Research by IT security and data protection company Sophos also shows that two-thirds of 50 USB sticks found on Australian public transport were infected with malware and contained unencrypted details about their former owners.

Therefore, KeepItSafe said, the virtualisation of servers and desktop computers will increase as companies have to adjust their storage systems to support the growing amount of data they produce.

“It is impossible for service providers to continue to handle storage needs with traditional local storage; this will force a natural evolution to disaster-recovery solutions and data backup to the cloud. This change won’t be seen only among large businesses but also among SMBs,” said Eoin Blacklock, managing director, KeepItSafe.

“All companies will have to look for flexible, scalable and affordable storage options that can grow with their businesses, putting data backup high on the agenda for 2012.”

Tape is becoming progressively unreliable and is falling out of favour with many organisations, Blacklock added.

Technology research and advisory firm Gartner found that 71pc of backup tape restores fail.

The move away from tape can also benefit companies financially, according to EMC Ireland’s country manager Jason Ward.

“Companies and public-sector organisations can significantly cut costs and become leaner by eliminating tape, which is cumbersome and susceptible to security breaches,” Ward said.


Top Data Storage Acquisitions: SSD Technology in Demand

In December 2011, the closely scrutinized $1.4 billion Seagate-Samsung deal closed to place an exclamation point on a year in which hard disk drives (HDDs) and solid-state drive (SSD) technology were in demand. There was a lot of consolidation among hard disk vendors throughout the year, following a flurry of storage array acquisitions the previous year. But storage array acquisitions didn’t entirely disappear this year.

1. Seagate-Samsung make deadline, Western Digital-HGST deal pending.

Seagate Technology acquired Samsung Electronics’ M8 hard drive business for $1.4 billion in December. The deal needed seven months to close, mainly due to major regulatory hurdles. The Seagate-Samsung acquisition gave Seagate its archrival’s enterprise hard disk business, which led authorities to have serious concerns about what that means for competition in the market. The deal gives Seagate Samsung’s line of 2.5-inch high-capacity hard disk drives. Samsung will also provide Seagate with chips for enterprise SSDs, while Seagate will supply hard disk drives to Samsung for PCs and consumer products.

With the alarms sounding in the New Year, the biggest hard disk deal of all — Western Digital’s proposed $4.3 billion takeover of Hitachi Global Storage Technologies (HGST) — didn’t close in time to make the 2011 calendar. The Western Digital-HGST deal is expected to close next March, a year after it was first announced. The close was delayed by anti-trust rules, but Western Digital finally won European Union approval in November to acquire Hitachi’s HDD business after agreeing to sell off some assets to eliminate concerns regarding competition.

2. Hitachi and BlueArc make it official.

The only real surprise with this data storage technology acquisition is that it took so long. Before completing the $600 million Hitachi-BlueArc deal, Hitachi Data Systems (HDS) sold BlueArc NAS systems for five years through an OEM deal. And although BlueArc filed paperwork to complete an IPO and go public, it depended on HDS for more than 40% of its revenue and never had a profitable quarter before joining the HDS fold.

3. Oracle brings Pillar in to the fold.

It’s tough to put a price on this one. What we do know for certain is that Oracle Corp. CEO Larry Ellison and some key business associates were owed $544 million by Pillar Data Systems, the result of loans and interest that funded Pillar from the beginning. In July, Oracle acquired Pillar, calling it a vital fourth element of its data storage strategy. Just how much Oracle really pays Ellison, Pillar and the many other players involved won’t be revealed until 2014, when Pillar’s performance will be evaluated.

But enough about the numbers; let’s talk storage. Pillar’s Axiom storage array handles block and file workloads at the same time, and it is application-aware — using faster disk drives for data that’s more dynamic than data assigned to slower disks. The Oracle purchase of Pillar was described by Oracle as part of its strategy to “redefine storage” and, essentially, build out a line of hardware products that makes its applications run faster.

4. NetApp grabs LSI’s Engenio.

NetApp Inc.’s $480 million purchase of LSI’s Engenio means NetApp has two platforms after many years of positioning its single, unified platform as a competitive advantage versus the different network-attached storage (NAS) and storage-area network (SAN) options from other vendors. But NetApp executives played up the two different markets the company is focusing on, saying the Engenio block-storage systems will be positioned in video and other high-performance computing (HPC) markets while the NetApp FAS platform is aimed at mainstream storage and virtualized infrastructures. As with the HDS-BlueArc and Oracle-Pillar deals, the players in this acquisition knew one another well — NetApp CEO Tom Georgens held the same post at Engenio before joining NetApp in 2005.

5. SanDisk bets on Pliant and multi-level cell (MLC).

SanDisk Corp. acquired startup Pliant Technology for $327 million, giving SanDisk entry into the enterprise market and providing more resources for development of Pliant’s technology. SanDisk’s purchase of Pliant was the company’s way of betting on Pliant’s MLC-based flash technology as the next big thing. SanDisk intends to continue selling Pliant’s single-level cell (SLC) flash as well, but SanDisk executives see MLC as the best way to push more flash technology into enterprise shops.

6. Fusion-io acquires caching software startup IO Turbine.

PCI Express (PCIe)-based solid-state storage vendor Fusion-io Inc. went public this year and made its first acquisition a few months later. Feeling the heat from other flash vendors such as STEC Inc. and the OCZ Technology Group Inc., which each introduced new PCIe-based SSD offerings this year, Fusion-io bought caching software startup IO Turbine for roughly $95 million. Fusion-io’s purchase of IO Turbine highlighted the growing popularity of SSD-based, or flash-based, caching.

7. Quantum pockets Pancetera.

Backup and archive player Quantum Corp. acquired Pancetera Software for $12 million. Quantum acquired the virtual machine (VM) backup specialist to enhance its DXi data deduplication line and, eventually, its StorNext file system. Most of Pancetera’s employees joined Quantum, including the co-founders and CEO, and the acquisition was seen as a way for Quantum to gain credibility in the area of protecting data on virtual servers.

Read More

Data Backup Glossary (Letter S)

SaaS: Software as a Service
Software as a Service (SaaS) is a software delivery method that provides access to software and its functions remotely as a web-based service. Software as a Service allows organizations to access business functionality at a cost typically less than paying for licensed applications since SaaS pricing is based on a monthly fee. Also, because the software is hosted remotely, users don’t need to invest in additional hardware. Software as a Service removes the need for organizations to handle the installation, setup, and often daily upkeep and maintenance. Software as a Service may also be referred to as simply hosted applications.

SaaS: Storage as a Service
Storage as a Service (SaaS) is a storage model in which a business or organization (the client) rents or leases storage space from a third-party provider. Data is transferred from the client to the service provider via the Internet and the client then accesses the stored data using software provided by the storage provider. The software is used to perform common tasks related to storage, such as data backups and data transfers. Storage as a Service is popular with SMBs because there usually are no start-up costs (for example, servers, hard disks, IT staff, and so on) involved. Businesses pay for the service based only on the amount of storage space used. Storage as a Service may also be called hosted storage.

SAN
A Storage Area Network (SAN) is a high-speed subnetwork of shared storage devices. A storage device is a machine that contains nothing but a disk or disks for storing data. A SAN’s architecture works in a way that makes all storage devices available to all servers on a LAN or WAN. As more storage devices are added to a SAN, they too will be accessible from any server in the larger network. In this case, the server merely acts as a pathway between the end user and the stored data. Because stored data does not reside directly on any of a network’s servers, server power is utilized for business applications, and network capacity is released to the end user.

SAN fabric
The hardware that connects workstations and servers to storage devices in a SAN. The SAN fabric enables any-server-to-any-storage device connectivity through the use of Fibre Channel switching technology.

SAN services
A technology used by businesses to obtain greater flexibility in their data storage. A Storage Area Network (SAN) provides raw storage devices across a network, and is typically sold as a service to customers who also purchase other services. SAN services may also be administered over an existing, local fiber network, and administered through a service subscription plan.

Scratch disk
Space dedicated on a hard drive for temporary storage of data. Scratch disks are commonly used in graphic design programs, such as Adobe Photoshop. Scratch disk space is only for temporary storage and cannot be used for permanently backing up files. Scratch disks can be set to erase all data at regular intervals so that the disk space is left free for future use. The management of scratch disk space is typically dynamic, occurring when needed.

Seed
The first full backup of company data.

Secret storage technology
A technology for encrypting and hiding data on a hard drive or flash drive, or when transferring files. Secret storage is a portion of encrypted data hidden in a file or in FAT/FAT32/NTFS partitions. To the end user, it looks like an ordinary folder to which they may add files and folders and which they can protect with a password.

Selective backup
A type of backup where only the user-specified files and directories are backed up. A selective backup is commonly used for backing up files which change frequently or in situations where the space available to store backups is limited. Also called a partial backup.
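A selective backup is essentially a filter over the full file list. A minimal sketch in Python (the file names, glob patterns, and function name are illustrative, not part of any particular backup product):

```python
import fnmatch

def select_files(all_files, patterns):
    """Return only the files matching any of the user-specified glob patterns."""
    return [f for f in all_files
            if any(fnmatch.fnmatch(f, p) for p in patterns)]

# Back up only office documents, skipping everything else:
files = ["report.docx", "notes.txt", "photo.jpg", "budget.xlsx"]
print(select_files(files, ["*.docx", "*.xlsx"]))  # ['report.docx', 'budget.xlsx']
```

In practice the pattern list is the part the user maintains; everything matching it is handed to the backup engine, which is what keeps a selective backup small.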

Serial storage architecture
Serial storage architecture (SSA) is an open industry-standard interface that provides a high-performance, serial interconnect technology used to connect disk devices and host adapters. SSA serializes the SCSI data set and uses loop architecture that requires only two wires: transmit and receive. The SSA interface also supports full-duplex, so it can transmit and receive data simultaneously at full speed.

Server cage area
The area where a company stores its data center equipment. This area is protected from personnel access.

Service-level agreement
A service-level agreement (SLA) is an agreement between a service provider, such as an IT department, an Internet services provider, or an intelligent device acting as a server, and a service consumer. A service level agreement defines parameters for measuring the service, and states quantitative values for those parameters.
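Because an SLA states quantitative values, compliance checks reduce to simple arithmetic. A hedged sketch (the 99.9% target and downtime figures are hypothetical examples, not values from any real agreement):

```python
def meets_sla(uptime_seconds: float, period_seconds: float,
              target_percent: float) -> bool:
    """Compare measured availability against the SLA's quantitative target."""
    measured = 100.0 * uptime_seconds / period_seconds
    return measured >= target_percent

# A 30-day month with 40 minutes of downtime against a 99.9% uptime target:
month = 30 * 24 * 3600          # 2,592,000 seconds
downtime = 40 * 60              # 2,400 seconds
print(meets_sla(month - downtime, month, 99.9))  # True (99.907% measured)
```

A 99.9% monthly target allows roughly 43 minutes of downtime, which is why the 40-minute example passes.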

Slack space
The unused space in a disk cluster. The DOS and Windows file systems use fixed-size clusters. Even if the actual data being stored requires less storage than the cluster size, an entire cluster is reserved for the file. The unused space is called the slack space. DOS and older Windows systems use a 16-bit file allocation table (FAT), which results in very large cluster sizes for large partitions. For example, if the partition size is 2 GB, each cluster will be 32 K. Even if a file requires only 4 K, the entire 32 K will be allocated, resulting in 28 K of slack space. Windows 95 OSR 2 and Windows 98 reduce this problem by using a 32-bit FAT (FAT32), which supports much smaller cluster sizes on large partitions (for example, 4 K clusters on a 2 GB partition).
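The slack-space arithmetic in the entry above is easy to verify with a few lines of Python (the function name is made up for illustration):

```python
def slack_space(file_size: int, cluster_size: int) -> int:
    """Return the bytes wasted in the final cluster allocated to a file."""
    if file_size == 0:
        return 0
    remainder = file_size % cluster_size
    return 0 if remainder == 0 else cluster_size - remainder

# The example from the glossary entry: a 4 K file stored in 32 K clusters
# wastes 28 K of slack space.
print(slack_space(4 * 1024, 32 * 1024))   # 28672 bytes = 28 K
print(slack_space(32 * 1024, 32 * 1024))  # 0 (the file fills its cluster exactly)
```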

Small and medium enterprise (SME)
Companies whose headcount or turnover falls below certain limits. In the United States, a small business is often defined as having fewer than 100 employees. A medium-size business is often defined as having fewer than 500 employees.

Small to mid-size business (SMB)
Companies whose headcount or turnover falls below certain limits. In the United States, a small business is often defined as having fewer than 100 employees. A mid-size business is often defined as having fewer than 500 employees.

Snapshot backup
A virtual copy of a device or file system. Snapshots imitate the way a file or device looked at the precise time the snapshot was taken. It is not a copy of the data, only a picture in time of how the data was organized. Snapshots can be taken according to a scheduled time and provide a consistent view of a file system or device for a backup and recovery program to work from.
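The "picture in time, not a copy of the data" idea can be sketched with a toy block map: the snapshot copies only the pointers to blocks, so later writes to the live file system do not disturb the snapshot's view (the class and block layout are hypothetical, far simpler than a real copy-on-write implementation):

```python
class BlockMap:
    """Toy file system: a mapping of block number -> block contents."""
    def __init__(self):
        self.blocks = {}

    def write(self, block_num, data):
        self.blocks[block_num] = data

    def snapshot(self):
        # Copy the mapping (the "picture"), not the underlying data.
        return dict(self.blocks)

fs = BlockMap()
fs.write(0, "original")
snap = fs.snapshot()        # point-in-time view taken here
fs.write(0, "modified")     # live data keeps changing afterwards
print(snap[0], fs.blocks[0])  # original modified
```

A backup program reading from `snap` sees a consistent state even while writes continue, which is exactly why snapshots are useful as a backup source.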

Solid state disk
A solid state disk (SSD) is a high-performance plug-and-play storage device that contains no moving parts. SSD components include either DRAM or EEPROM memory boards, a memory bus board, a CPU, and a battery card. Because SSDs contain their own CPUs to manage data storage, they are much faster (18 MBps for SCSI-II and 35 MBps for UltraWide SCSI interfaces) than conventional rotating hard disks and therefore produce the highest possible I/O rates.

Spin valve
Another name for a giant magnetoresistive (GMR) head. The term was coined by IBM.

Storage

  • The capacity of a device to hold and retain data.
  • Short for mass storage.

Storage bridge bay
Storage bridge bay (SBB) is a specification that defines mechanical, electrical, and low-level enclosure management requirements for an enclosure controller slot that will support a variety of storage controllers from a variety of independent hardware vendors and system vendors. Any storage controller design based on the SBB specification will be able to fit, connect, and operate within any storage enclosure controller slot design based on the same specification.

Storage Consolidation
The concept of centralized storage where resources are shared among multiple applications and users. Traditionally, organizations have deployed servers with direct-attached storage (DAS) as file servers. However, many organizations are facilitating server consolidation by deploying network-attached storage (NAS). NAS provides a single-purpose device that can provide CIFS- and NFS-connected storage that can scale from gigabytes to petabytes.

Storage device
A device capable of storing data. The term usually refers to mass storage devices, such as disk and tape drives.

Storage footprint
The amount of energy, physical space, and other equipment necessary to run a data storage management system.

Storage management
The tools, processes, and policies used to manage storage networks and storage services such as virtualization, replication, mirroring, security, compression, traffic analysis, and other services. The phrase also encompasses other storage technologies, such as process automation, storage management and real-time infrastructure products, and storage provisioning. In some cases, the phrase storage management may be used in direct reference to storage resource management (SRM).

Storage management initiative specification
Storage management initiative specification (SMI-S) is an interface standard that enables interoperability in both hardware and software between storage products from different vendors used in a SAN environment. The interface provides common protocols and data models that storage product vendors can use to ensure end user manageability of the SAN environment.

Based on the CIM and Web-Based Enterprise Management (WBEM) standards, SMI-S adds common interoperable and extensible management transport, automated discovery, and resource locking functions. SMI-S was developed by the Storage Networking Industry Association (SNIA) in 2002.

Storage networking
A high-speed network of shared storage devices. The storage network is used by IT departments to connect different types of storage devices with data servers for a larger network of users. As more storage devices are added to the storage network, they too will be accessible from any server in the larger network. Storage networking is a phrase most commonly associated with enterprises and data centers.

Storage optimization
The implementation and management of tiered storage solutions to obtain a lower cost per capacity across a corporation or enterprise. Storage optimization is an information lifecycle management (ILM) strategy.

Storage over IP
Storage over IP (SoIP) technology refers to the merging of Fibre Channel technologies with IP-based technology to allow for accessing storage devices over TCP/IP networks. SoIP is the framework for storage area networking (SAN) using Internet Protocol (IP) networks to directly connect servers and storage. SoIP products are designed to support transparent interoperability of storage devices based on Fibre Channel, SCSI, and a new class of Gigabit Ethernet storage devices using iSCSI and iFCP. Existing Fibre Channel or SCSI devices, such as servers with host bus adapters (HBAs) or storage subsystems, can be included in an SoIP storage network without modification.

Storage resource management
Storage resource management (SRM) refers to software that manages storage from a capacity, utilization, policy, and event management perspective. SRM includes bill-back, monitoring, reporting, and analytic capabilities that allow you to drill down for performance and availability.
Key elements of SRM include asset management, charge back, capacity management, configuration management, data and media migration, event management, performance and availability management, policy management, quota management, and media management.

Storage service provider
A storage service provider (SSP) is a company that provides computer storage space and related management services. SSPs also offer periodic backup, archiving, and the ability to consolidate data from multiple company locations so that data can be effectively shared.

Storage virtualization
Storage virtualization is the amalgamation of multiple network storage devices into what appears to be a single storage unit. Storage virtualization is often used in a SAN (storage area network), a high-speed subnetwork of shared storage devices, and makes tasks such as archiving, backup, and recovery easier and faster. Storage virtualization is usually implemented via software applications.

Store
To copy data from a CPU to memory, or from memory to a mass storage device.

Stripe
The process of distributing data across several storage devices to improve performance.
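Striping is usually round-robin: chunk the data, then deal the chunks out across the devices like cards. A minimal sketch (chunk size, device count, and the in-memory "device" lists are all illustrative; real striping happens in RAID controllers or volume managers):

```python
def stripe(data: bytes, num_devices: int, chunk_size: int):
    """Split data into fixed-size chunks and assign them round-robin,
    returning one list of chunks per device."""
    devices = [[] for _ in range(num_devices)]
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    for index, chunk in enumerate(chunks):
        devices[index % num_devices].append(chunk)
    return devices

striped = stripe(b"ABCDEFGH", num_devices=2, chunk_size=2)
# Device 0 holds [b"AB", b"EF"]; device 1 holds [b"CD", b"GH"].
```

The performance gain comes from the devices reading or writing their chunks in parallel rather than one disk handling the whole stream.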

Superparamagnetism
In magnetic disk drive storage technology, the fluctuation of magnetization due to thermal agitation. When the areal density—the number of bits that can be stored on a square inch of disk media—of a disk medium reaches 150 gigabits per square inch, the magnetic energy holding the bits in place on the medium becomes equal to the ambient thermal energy within the disk drive itself. When this happens, the bits are no longer held in a reliable state and can “flip,” scrambling the data that was previously recorded. Because of superparamagnetism, hard drive technologies are expected to stop growing once they reach a density of 150 gigabits per square inch.

Synchronization
In Fibre Channel, a receiver’s identification of a transmission word boundary.

Synthetic backup
A synthetic backup is identical to a regular full backup in terms of data, but it is created when data is collected from a previous, older full backup and assembled with subsequent incremental backups. The incremental backup will consist only of changed information. A synthetic backup is used when time or system requirements do not allow for a full complete backup. The end result of combining a recent full backup archive with incremental backup data is two kinds of files which are merged by a backup application to create the synthetic backup. Benefits to using a synthetic backup include a smaller amount of time needed to perform a backup, and lower system restore times and costs. This backup procedure is called “synthetic” because it is not a backup created from original data.
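The merge step can be sketched by treating each backup as a mapping of file path to contents: start from the older full backup and apply the incrementals in order, letting changed files overwrite the old copies. This is a deliberately simplified model (real tools also track deletions and work at the block level, which this sketch ignores):

```python
def synthesize(full_backup: dict, incrementals: list) -> dict:
    """Combine an older full backup with incremental backups (oldest first)
    to produce the equivalent of a fresh full backup."""
    synthetic = dict(full_backup)
    for incremental in incrementals:
        synthetic.update(incremental)  # changed files replace older versions
    return synthetic

full = {"a.txt": "v1", "b.txt": "v1"}
incs = [{"a.txt": "v2"}, {"c.txt": "v1"}]
print(synthesize(full, incs))  # {'a.txt': 'v2', 'b.txt': 'v1', 'c.txt': 'v1'}
```

Because only the incrementals are read from recent activity, the result is "synthetic": a full backup assembled from existing archives rather than from the original data.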

Read More