Modern Data Management and Protection Challenges
Customers of all types and sizes are seeking new and innovative ways to overcome challenges associated with data growth and storage management. While these challenges are not necessarily new, they continue to become more complex and more difficult to overcome due to the following:
- Pace of data growth has accelerated
- Location of data has become more dispersed
- Linkages between data sets have become more complex
Data and storage management challenges are compounded by the need for companies to protect critical data assets against disaster through backup and recovery solutions. In order to maintain backups of critical data assets, additional secondary storage resources are required. This additional layer of backup storage must be implemented wherever backups occur, including central data centers and remote offices.
Storage Efficiencies through Data Deduplication
Backup Exec 2012 includes advanced data deduplication technology that allows companies to dramatically reduce the amount of storage required for backups, and to more efficiently centralize backup data from multiple sites for assured disaster recovery. These data deduplication capabilities are available in the Backup Exec 2012 Deduplication Option.
Backup Exec 2012 Data Deduplication Technology
The data deduplication technology within Backup Exec 2012 breaks down streams of backup data into “blocks.” Each data block is identified as either unique or non-unique, and a tracking database is used to ensure that only a single copy of a data block is saved to storage by that Backup Exec server. For subsequent backups, the tracking database identifies which blocks have been protected and only stores the blocks that are new or unique. For example, if five different client systems are sending backup data to a Backup Exec server and a data block is found in backup streams from all five of those client systems, only a single copy of the data block is actually stored by the Backup Exec server. This process of reducing redundant data blocks that are saved to backup storage leads to significant reduction in storage space needed for backups.
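The block-tracking logic described above can be sketched in a few lines of Python. Everything here is illustrative: the fixed 128 KB block size, the SHA-256 fingerprints, and the plain dictionary standing in for the tracking database are assumptions for the sketch, not Backup Exec's actual internals.

```python
import hashlib

BLOCK_SIZE = 128 * 1024  # illustrative block size; the product's real value is not documented here


def deduplicate(stream, tracking_db):
    """Split a backup stream into blocks and store only the unique ones.

    `tracking_db` plays the role of the tracking database: it maps a block's
    fingerprint to the stored block. Returns the fingerprints that make up
    the stream plus the number of bytes actually written to storage.
    """
    fingerprints = []
    bytes_stored = 0
    for offset in range(0, len(stream), BLOCK_SIZE):
        block = stream[offset:offset + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        if fp not in tracking_db:   # unique block: save it to storage
            tracking_db[fp] = block
            bytes_stored += len(block)
        fingerprints.append(fp)     # non-unique block: record a pointer only
    return fingerprints, bytes_stored


# A second backup of identical data stores nothing new.
tracking_db = {}
fps1, stored_first = deduplicate(b"A" * BLOCK_SIZE * 4, tracking_db)
fps2, stored_rest = deduplicate(b"A" * BLOCK_SIZE * 4, tracking_db)
```

In this toy run, four identical blocks collapse to a single stored copy, and the repeat backup writes nothing at all, which is exactly the effect described for the five-client example above.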
Figure 1: Deduplication Process
The deduplication technology within Backup Exec is applied across all backups managed by a deduplication-enabled Backup Exec server.
Deduplication Methods within Backup Exec 2012
The Backup Exec 2012 Deduplication Option gives administrators the flexibility to choose when and where deduplication calculations take place. Three deduplication methods are supported by Backup Exec 2012. These are as follows:
Client-side Deduplication
The client-side deduplication method is a software-driven process. Deduplication takes place at the source or protected client, and backup data is sent over the network in deduplicated form to the Backup Exec server. Only unique blocks of backup data are sent to the backup server and saved to backup storage; non-unique blocks are skipped.
Backup Exec Server-side Deduplication
The Backup Exec server-side deduplication method is also a software-driven process. Deduplication takes place after backup data has arrived at the Backup Exec server and just before data is stored to disk (also known as inline deduplication). Only unique blocks of backup data are stored; non-unique blocks are skipped.
Appliance Deduplication
The appliance deduplication method is a hardware-driven process. Deduplication takes place on the appliance itself, which may perform inline or post-process deduplication (for example, ExaGrid or Quantum devices). The third-party deduplication device handles all aspects of deduplication.
Administrators can mix and match deduplication methods to fit their unique needs. For example, a single Backup Exec server enabled for deduplication can simultaneously use client-side deduplication for some jobs, Backup Exec server-side deduplication for others, and appliance deduplication for yet another set of jobs.
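The key difference between the client-side and server-side methods is where the fingerprinting happens. A minimal sketch of the client-side flow is below; the `server_has` and `send` callables are hypothetical interfaces standing in for the client's conversation with the Backup Exec server's tracking database, and the 64 KB block size is an assumption.

```python
import hashlib


def client_side_backup(stream, server_has, send):
    """Client-side dedup sketch: fingerprint locally, ship only missing blocks.

    `server_has(fp)` asks the backup server whether it already holds a block
    with this fingerprint (hypothetical interface); `send(fp, block)`
    transmits a unique block. Non-unique blocks cross the network as
    fingerprints only, which is why this method saves bandwidth.
    """
    BLOCK = 64 * 1024  # illustrative
    manifest = []
    bytes_sent = 0
    for i in range(0, len(stream), BLOCK):
        block = stream[i:i + BLOCK]
        fp = hashlib.sha256(block).hexdigest()
        if not server_has(fp):
            send(fp, block)
            bytes_sent += len(block)
        manifest.append(fp)        # pointer for later restore
    return manifest, bytes_sent


# Three identical blocks: only the first crosses the wire.
store = {}
manifest, bytes_sent = client_side_backup(
    b"z" * 64 * 1024 * 3,
    server_has=lambda fp: fp in store,
    send=lambda fp, block: store.update({fp: block}),
)
```

Server-side deduplication runs the same fingerprinting loop, but only after the full stream has already arrived at the Backup Exec server, so it saves storage without saving network bandwidth.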
Figure 2: Deduplication Methods
Each deduplication method supported by Backup Exec 2012 is best suited to particular configurations. The benefits of each method, and the configurations for which it is best suited, will be detailed in the following weeks.
Cancer research group endorses Backup Exec 2012 upgrade, by Dave Raffo, Tech Target, September 6, 2012
Not all Backup Exec users hated the new Backup Exec 2012 interface or demanded Symantec Corp. make changes before they upgraded. Scott Gould, senior network and systems analyst at the Gynecological Oncology Group’s statistical data center in Buffalo, N.Y., said he found the new version easier to use than Backup Exec 2010 practically from the start.
The Gynecological Oncology Group (GOG) uses Backup Exec to protect clinical trial data for cancer research generated by about 10,000 GOG members at more than 700 institutions. The group installed Backup Exec (BE) 2010 about two years ago and performed a Backup Exec 2012 upgrade early this year. Gould said the new interface was no problem for him or his two-person support staff to master.
“I liked it; no ifs, ands or buts about it,” he said. “There was a learning curve, but by the time I was setting up my fifth protected resource, I realized how easy it was to use. And it wasn’t just easy for me. I was responsible for designing and setting up the system from the ground up, but other users [at GOG] have picked it up easily. They can do simple restores on their own. That says a lot about the user interface.”
Not everyone agreed with that assessment, however, as Symantec soon found after releasing the BE 2012 upgrade. There was an angry user backlash, placated only by the vendor’s promise to fix some of the issues via service packs. But Symantec execs said newer users and heavily virtualized shops should take to the new interface more quickly than those who have used the product for many years.
Gould said he quickly realized how to create a resource group that showed him what he could see in the old interface. “It wasn’t that difficult to get a view similar to what we had before,” he said. “The whole job staging flow from resource to disk or tape has gotten easier to manipulate or manage.”
Gould said GOG’s servers are about 80% virtualized, with about 40 virtual machines on a five-host VMware cluster. GOG switched from CA ARCserve to Backup Exec 2010 at the same time it moved to Dell Compellent SANs. Gould said Backup Exec handled virtual machine backups better than ARCserve, and has superior data deduplication. He said BE’s one-pass backup lets him protect virtual and physical machines while backing up only once.
Gould said GOG struggled to do full backups in 24 hours with ARCserve, so it had to wait until the weekends to do full backups on some systems.
He said GOG now starts full backups Monday through Friday at around 5 p.m., and completes them by around 9 a.m. the next morning. GOG backs up data directly to a Dell PowerEdge server, and then copies it out to tape at night.
He said his dedupe ratio is about 32-1, with roughly 4.5 terabytes (TB) of disk space protecting around 150 TB of uncompressed data.
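Those figures are easy to sanity-check: 150 TB of protected data on 4.5 TB of disk works out to roughly 33:1, in line with the quoted "about 32-1". A quick back-of-the-envelope calculation:

```python
# Figures as quoted in the article
protected_tb = 150    # uncompressed data under protection
disk_used_tb = 4.5    # deduplicated disk actually consumed

ratio = protected_tb / disk_used_tb           # effective dedupe ratio, ~33:1
space_saved = 1 - disk_used_tb / protected_tb  # fraction of backup storage avoided, ~97%
```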
Over the last 12 months it’s been hard to miss all the messages from vendors that promise to modernize your backup infrastructure. I particularly like the messages from vendors that champion solutions addressing only one or two aspects of backup – such as deduplication, snapshots or tools for backing up just VMware and Hyper-V environments. These are quick fixes, not modern data protection. Throwing more solutions at a problem as a quick fix is the cause of backup complexity and cost. A solution that increases complexity, risk or cost is hardly a solution that will modernize your backups. Why? Because the very nature of modernization is moving forward – reducing complexity, risk and cost. Yet these vendors, ironically, still try to position themselves as the answer to backup modernization without really offering a solution that delivers it.
So what does it really mean to modernize your backup infrastructure? How do you know if your backup infrastructure is out of date? Is it the software, hardware or network that needs updating or is it a combination of all three?
To answer these questions and more, the Backup Exec team at Symantec has teamed up with leading analyst firm ESG to bring you the latest backup trends and pragmatic advice on applying them in ways that will truly modernize your environment. On Wednesday 9th May, during Symantec Vision 2012 at the MGM in Las Vegas (http://www.symantec.com/vision), Jason Buffington from ESG will be presenting: How to modernize your backups in 2012 (session number IM B01). In this session Jason will uncover his research on the latest trends in data protection. Not only will Jason give pragmatic advice on how to apply those ideas to your environment, he’ll also share backup and recovery best practices that will radically improve your backup and recovery performance. From virtualization to cloud, and from dedupe to replication, Jason will squeeze everything into 55 minutes.
If you are unable to attend Vision this year, don’t worry! This session will be recorded, so if you would like to get your hands on the replay, simply reply to this post and we will send you a link to view the recording as soon as it becomes available.
In the meantime, if you have a question in relation to modernizing your backups in 2012 send it via twitter to @JBUFF. Jason will be answering these questions and more during the session.
Before I sign out, I wanted to leave you with one closing thought. While many vendors talk about modernizing backup, there is only one company who is really delivering upon that promise and that’s Symantec. If you are attending Vision, stop by the Backup Exec booth and find out how.
How to help reduce the amount of data you backup …
BEDAT – What’s that?
The Backup Exec Deduplication Assessment Tool (BEDAT) is a utility designed to help partners demonstrate the value of Backup Exec and its deduplication technology to their customers – without even having Backup Exec installed on the system! BEDAT scans user-selected data sets on one or more Windows-based systems in a customer’s network environment and estimates the deduplication savings that would be achieved if the same systems were protected using Backup Exec or the Backup Exec 3600 Appliance with deduplication – using the same algorithms found in BE itself. BEDAT returns global deduplication results, per-resource deduplication results, and per-data-type deduplication results. BEDAT does not actually capture or transport any customer data during the assessment process; it only captures deduplication fingerprint information and transmits this data to be included in the deduplication results.
How does it work?
The Backup Exec Deduplication Assessment Tool (BEDAT) installs to almost any Windows-based, x86 or x64 computer system. When run, it can calculate deduplication results for the system on which it is installed, as well as other systems available on the network. When capturing deduplication data from remote network systems, a small agent is temporarily installed to the remote servers and removed after deduplication calculations have been completed. BEDAT is designed to be as simple and as easy to use as possible. It is a wizard-driven utility that does not require any specific IT expertise to use successfully.
Who is it for?
The Backup Exec Deduplication Assessment Tool (BEDAT) is designed to be used by Backup Exec partners as they help customers understand the storage optimisation benefits of the deduplication technology found in Backup Exec. If you are a customer please feel free to ask your Symantec IT supplier to provide this service.
Where can Partners get it?
The Backup Exec Deduplication Assessment Tool (BEDAT) is available for partners to download at the Symantec PartnerNet site. For end user customers interested in using BEDAT in their environments, please contact a local Symantec partner and talk to them about how Backup Exec can help optimise your backup storage and network utilisation.
What platforms and data types are supported?
The Backup Exec Deduplication Assessment Tool (BEDAT) supports Windows 2003 and Windows 2008 x86 and x64 platforms, including both physical and virtual systems. It supports estimating deduplication results for file system data, Exchange data, and SQL data.
Please note: while designed to be highly accurate, the results offered by the Backup Exec Deduplication Assessment Tool (BEDAT) represent estimates of the storage savings that would be gained by using Backup Exec deduplication technology.
Very loosely, we were instructed to delete everything from before the dot-com bubble burst (2000) and keep everything after it. Now we are fast running out of data centre disk allocation. Err?
In fact, it’s a wonder we manage to do anything given the amount of information we need to process. As a consequence we are now facing a greater threat: too much information. Somewhere between 60 and 160 billion emails are sent around the world every single day, including attachments such as reports, presentations, letters and pictures. Despite limitations such as privacy concerns and unwanted mail, email remains the most efficient, quickest and cheapest way to communicate. The danger with email, as with any other way of sharing information, is that too much information simply clogs the system up and becomes a bottleneck to productivity.
Here are some useful top tips that may help:
- Understand the new business user – organisations must better understand the challenges employees are facing when navigating the world of information management. Look at when and how employees are accessing their information, make sure that data is indexed and categorised, and that intelligent archiving and search tools are available
- Prepare the infrastructure – with the relentless flow of information only set to continue, IT infrastructure must be able to cost effectively manage the increasing requirements for storage by implementing solutions able to dedupe and archive appropriately, automate processes and monitor and report on system status across all different devices and environments
- Prepare people – create IT policies that educate employees on how to manage their information – from email practices like limiting the ‘CC’ and ‘reply to all culture’, to saving only the latest document version and overcoming the fear of the delete button. Help employees understand the company’s information retention strategy so they know what information is recoverable. This will empower them to take charge of information control and maintain productivity and efficiency
- Keep security front of mind – it seems like an obvious statement, but reinforcing company security policies around mobile devices could protect against significant and damaging data loss. Make sure employees know the company processes and take advantage of technologies that enable the IT department to see where the most important information is, at all times
- Encourage staff to switch off – with the information era in full swing and with more and more opportunity for employees to stay connected at all times, it’s important that organisations support staff welfare and encourage them to switch off every once in a while
Seriously consider optimising your storage to reduce overall front end storage usage. Improving capacity can be done through integrated archiving and deduplication as well as tiering your storage. Archiving moves old data to a separate store so you don’t have to backup the same data day-in, day-out – forever. Deduplication only backs up data (at a block level) once, using a pointer to the unique data. So you can both reduce the amount you backup as well as dramatically reducing your backup window with archiving and data deduplication.
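The combined effect of archiving and deduplication is easy to model: archiving shrinks the set of data each backup has to touch, and deduplication shrinks what actually gets written. The percentages and ratio in this sketch are purely illustrative assumptions, not product figures.

```python
def backup_footprint(primary_gb, archived_fraction, dedup_ratio):
    """Data actually written per full backup after archiving and dedup.

    archived_fraction: share of primary data moved to the archive store,
    so it no longer appears in the daily backup set.
    dedup_ratio: block-level reduction applied to what remains.
    """
    active = primary_gb * (1 - archived_fraction)  # archiving shrinks the backup set
    return active / dedup_ratio                     # dedup shrinks what is stored


# e.g. 10 TB of primary data, 40% archived, 10:1 dedup
footprint_gb = backup_footprint(10000, 0.4, 10)
```

Under those assumed numbers a 10 TB estate backs up as 600 GB of stored data, which is where the dramatic backup-window reductions come from.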
But, I hear you say, if I implement deduplication technology what are the benefits? Well, Backup Exec can help with that too. Read all about the Backup Exec Deduplication Assessment Tool in Part III.
Yes, it’s true – we are becoming a nation of information addicts – at least according to a survey Symantec recently carried out. Symantec wanted to find out more about how the so-called information explosion is affecting the everyday lives of British office workers. What was abundantly clear is that we are all suffering from this 21st century ailment – Information Overload – sounds like a Tom Cruise film, or AC/DC album – and it is overtaking not only our working lives, but our personal ones too.
Accessing work information out of hours, compulsively checking emails, texts and social media and hoarding endless emails and multiple versions of the same file are all symptoms of information overload experienced by those we surveyed. See the stats here.
But whereas the technology enabling us to do this (fantastic mobile devices and faster connectivity) all purport to make us more productive in the workplace, is our mismanagement of information actually counter-productive?
IDC recently estimated that over 1.8 zettabytes of information was created and replicated in 2011 (IDC, “The 2011 Digital Universe Study: Extracting Value from Chaos”), and it is forecast to keep growing almost immeasurably over the coming years. What does this mean for our state of mind and the systems we work with – will we reach a moment when we are essentially ‘drowning’ in information?
Not if the technologies that store and manage information also continue to improve. We are working very hard to make managing information easier, faster and more efficient for businesses of all sizes. This means making sure that what is actually useful and valuable is stored, archived and backed up correctly, while the rest is relegated to permanent deletion.
But technology can only go so far, some of the onus is still on businesses and individuals to moderate their work behaviour to take into account this new work paradigm.
Part II – What can we do about it?
– with improvements in virtualization backup, archiving, and security
Backup Exec 2010 R3 continues with our “R” series of releases – new features, new capabilities, and continued improvement to our award-winning backup and recovery application.
There’s quite a bit of goodness in this release. Since Symantec as a company has focused heavily in the virtualization space, it stands to reason that many of the improvements in Backup Exec 2010 R3 revolve around virtualization. But that’s not all we improved in this release!
If you back up and restore VMware with Backup Exec and the Deduplication Option, this update is for you. We’ve made significant improvements to the Deduplication engine for VMware backups – you should see notably better deduplication rates with Backup Exec 2010 R3 than with previous versions of Backup Exec. In our environments we’ve seen 30%, 50%, or higher data reduction rates with Backup Exec 2010 R3 compared to previous versions. If you use Backup Exec and you want better deduplication ratios with VMware backups, then this is the update for you.
(And don’t worry, Hyper-V users – we haven’t forgotten about you – we’ll have good news for you in the next major release of Backup Exec. So be patient, and we’ll get Hyper-V Deduplication into Backup Exec as soon as we can.)
We’ve added a new plug-in to the mix, too: the Backup Exec Management Plugin for VMware. This utility allows you to monitor backups made with the Agent for VMware from within a vSphere Client or vCenter installation. Administrators get all sorts of insight into Backup Exec jobs, including a status view of all VMs, the last backup run, the next scheduled backup and the type of backup that occurred, and they can drill down into Backup Exec backup job logs. This completely free update is very useful for VMware administrators who use Backup Exec.
Plus, we’ve improved a number of other areas involving deduplication in virtual environments since Backup Exec 2010 R2 was released. If you protect VMware, use Deduplication, or use Application GRT for SQL, Exchange or Active Directory, this update is also a great fit for you.
Archiving has seen some love, too – especially the Exchange Mailbox Archiving Option. We now support Archiving from Exchange 2010 SP1! So for those administrators who have been looking to move, or have already moved, to the latest and greatest messaging platform from Microsoft: we support backup, recovery, and storage management through Archiving with Backup Exec 2010 R3. In addition to new platform support, we’ve also introduced Virtual Vault for the Exchange Mailbox Archiving Option. This Outlook Plug-In allows end users to see Archived emails directly from within Outlook – meaning that users never need to leave the familiar Outlook interface to search, view, and reply to Archived emails.
The Agent for Enterprise Vault has seen some improved platform support. We now support backup, recovery, and migration from Enterprise Vault 10 installations. Enterprise Vault is the flagship Archiving product from Symantec, and we are committed to supporting the latest and greatest releases of the Enterprise Vault application.
We’ve also done some work on our underlying security infrastructure. Backup Exec 2010 R3 now provides TLS/SSL support from the agent to the server, providing an extra layer of security for customers that transmit backup data across the WAN or to a private cloud. Essentially, the Backup Exec Media Server becomes its own Certificate Authority (CA) with the power to sign certificates, establishing identity and trust. This encrypts the control and data connections between the media server and remote agents. These added security features help ensure that backed-up data sent over any network or Internet connection is secure.
And last but not least, Backup Exec 2010 R3 brings support for the latest versions of Microsoft Small Business Server 2011. Both SBS 2011 Essentials and SBS 2011 Standard, along with the SBS 2011 Premium Add-On, are supported with Backup Exec 2010 R3.
To sum up, we’ve done a lot of work to make Backup Exec more effective in VMware environments, more secure, and even better suited to backing up and protecting your environments. We’ve produced a short “What’s New in Backup Exec 2010 R3” webcast, with an introduction to Backup Exec 2010 R3, which you can find on www.BackupExec.com. If you already own Backup Exec 2010 or Backup Exec 2010 R2, this is a no-cost upgrade for you. If you are interested in evaluating Backup Exec 2010 R3, you can download trialware from www.BackupExec.com.
Aiden Finley, Backup Exec Product Management
See Symantec Connect
I need a new service, so I need an application, and a new server, and perhaps some storage … and if we’re lucky we ask ourselves “oh, yes, what about the backup?” Have you noticed how we never really turn IT off; we just add to it? So we end up with a backup strategy that encompassed everything 3 or 4 years ago, but one that falls pretty short today; that’s how it really works.
Even though we know that we really should backup all our data – just in case – are we absolutely convinced we actually are? Backup is our critical data protection solution and yet we rarely review our backup strategy.
Server virtualisation, the need for fast, reliable application recovery, the exponential growth of unstructured data and poor data lifecycle management are some of the root causes of operational inefficiency in IT, and are why we are changing the way we approach our backup strategies.
With more and more companies adopting virtualisation technologies to improve efficiencies and reduce CAPEX, organisations are looking for ways of protecting both virtual and physical environments with a single backup tool. It makes sense to use a solution that gives you granular recovery from a single-pass backup, saving time, money and no end of effort. Don’t use separate tools and end up backing up the backup; it turns the recovery process into a nightmare!
The backup and recovery of Microsoft applications is an inherently challenging process that becomes more difficult as databases grow and the demands on their online availability increase, further limiting the time available for backup and recovery operations. Granular Recovery of Exchange, SQL and Active Directory from a single-pass backup makes it easy and efficient to identify and recover only those objects needed.
Optimising storage to reduce overall storage capacity can be done through integrated archiving and deduplication. Archiving moves old data to a separate store so you don’t have to backup the same data day-in, day-out – forever. Deduplication only backs up data (at a block level) once, using a pointer to the unique data. You can reduce the backup window dramatically with both archiving and data deduplication.
Backup Exec 2010
Backup Exec Agents and Options enhance and extend platform and feature support for your backup environments: Microsoft applications, virtual environments (VMware and Windows Server 2008 R2 Hyper-V), as well as storage reduction and optimisation technologies.
… or so Computerworld UK said on Monday. Something that I’ve been saying for 3 years – but, hey, what do I know? According to Computerworld UK, the consumerisation of IT is an unavoidable phenomenon that will force businesses to rethink their security policies. What about all that stuff some poor soul will have to back up? CIOs need to deal with consumerisation. For many years we were able to say to our end users “no, you can’t have that”, or “no, we don’t support this”. But those days are over. The security of this phenomenon is certainly a concern, but IDC reckons that 80% of data created by individual consumers will end up on corporate networks. This will inevitably cause an overload on our already overloaded systems – at least until we have the management capabilities for streaming applications to the desktop (oh, sorry, Symantec already does that). OK, so once we get around to migrating to this model, where all our data is held inside the network, consumerisation won’t be nearly as scary – until then data growth will continue on its upward curve.
That’s why deduplication and archiving are key to our backup strategies. We are challenged with managing and protecting ever-increasing amounts of data. Backup Exec offers deduplication across physical and virtual machines to reduce the length and size of backups. Deduplication has the power to transform information management; it is great for backup, it is great for archiving, and it can even make virtualised server backup manageable. Symantec believes that deduplication should live in every part of the information architecture.
Much of the content produced now consists of email, documents, presentations, and other types of unstructured information. This explosion of information has a significant impact on storage spending and IT’s ability to meet the needs of its internal customers and business units. Backup Exec’s integrated archiving option is focused on reducing the amount of information backed up. Together with Enterprise Vault (EV), Backup Exec’s Agent for EV helps organisations to unify content sources, apply retention policies, reduce backup windows, shorten recovery times, and optimise storage resources making it easier for companies of all sizes to store, manage, and protect all unstructured data.
Actually, I’ve discovered it’s not just me – thank goodness. When you go through a product launch process there is always a chance that the general “noise” is just you banging on about stuff, confined to the inside of your own head.
Not so in BE 2010’s case … I’ve been running around Europe over the last month or so, and the feedback I’ve been getting is that there is an awful lot of interest in deduplication. Even smaller companies who thought that deduplication was probably too much for their needs are seriously looking at getting rid of some of the duplicate data on their primary storage, and even more are looking at deduplication as a way of improving backup and restore times.
I was at an event a couple of months back where every conversation I had was about the length of time it takes to back up. As data continues to grow everywhere across the IT infrastructure – laptops, disparate storage devices, remote offices, as well as the good old data centre – it has created a fundamental shift in the way organisations need to manage information. Keeping information on disk for faster DR restores is all fine and dandy, but there is simply too much data around. Disk-based backup is now getting as tricky to manage, and as cumbersome, as tape-based backup.
A number of customers are turning to deduplication technologies in order to facilitate faster backups, reduce primary storage, and reduce not just the amount of disk being used up but also improve tape media rotation and management.
Deduplication gives you the ability to take a strategic approach to storage and backups. Organisations now have the ability to deploy an integrated platform that is easy to manage and supports both source- and target-based deduplication.
Primary storage deduplication will become widely deployed in the next 6 to 12 months. Most organisations have not yet gone down this route; however, with the Option now built into Backup Exec 2010 this is all the more likely, because it’s simple to obtain, really easy to install, and the benefits are huge.