By Kate Lewis
Agents. Agentless. VADP integration. VSS integration. Image-based backups. File-based backups. Hypervisor-based snapshots. Array-based snapshots. Host-based backups. Guest-based backups. It's no wonder backup professionals are confused about the best approach for backing up their virtual machines. With a myriad of vendors each positioning their own way as the "best" way, the result is ambiguity and a good dose of confusion about what an agent is, or does.
With the help of my technical experts here at Symantec, this blog cuts through the confusion with unbiased information so you can make the right choice for your environment. By looking at each method and highlighting its pros and cons, you can make informed decisions without the distraction of smoke and mirrors.
Caution: before we dive in, it's important to mention that the phrases agentless backup and agent-based backup can mean different things to different vendors. To determine the best approach for your organization, you need to look under the covers and weigh the pros and cons of each method. Don't worry; we have done the hard work for you at Symantec. Now let's jump in and take a look at each one in turn.
1) Traditional Agent-Based Backup (also known as guest-based backup)
Traditional agent-based backup is also known as guest-based backup. An agent is installed in every virtual machine, treating each virtual machine as if it were a physical server. The agent in this scenario reads data from disk and streams it to the backup server. This method should not be confused with the agent-assisted backups we will cover later.
There are many people today using this approach to protect their virtual machines. According to ESG1, 46% of all environments use guest-based protection methods with a backup agent running inside the guest OS. Although newer methods are available, you may be asking yourself why so many people still use this method. The advantages include:
- Both physical and virtual machines are protected using the same method
- Application owners can manage backups and restores from guest OS
- Time tested and proven solution
- Meets their recovery needs
- This is the only way to protect VMware Fault Tolerant virtual machines
The drawbacks, however, are significant:
- Significantly higher CPU, memory, I/O and network resource utilization on virtual host machines while backups run
- Need to install and manage agents on each virtual machine
- Cost may be high for solutions that license on a per agent basis as opposed to per hypervisor based licensing
- Cannot accommodate virtual machine sprawl; lacks visibility into a changing virtual infrastructure
- No visibility for backups from VM administrators’ point of view; for example, backups are not visible at vSphere client level
- May need multiple kinds of backup and recovery methods; for example, separate backup policies may be needed for file and folder backups, Microsoft Exchange backups, bare-metal recovery, etc.
- Complex disaster recovery strategies
- Lack of SAN transport backups to offload backup processing job from virtual infrastructure
- No protection for offline virtual machines and virtual machine templates
- Slow file-by-file backups, with the agent sending even unchanged data over and over again
Verdict: A cumbersome, traditional backup and recovery method, but offers flexible recovery features.
2) Agentless backup (also known as host-based backup)
Agentless backup, also known as host-based backup, refers to solutions that do not require an agent to be installed on each VM by the administrator. However, it’s important to note that the software may be injecting an agent onto the guest machines without your knowledge.
These solutions integrate with VMware APIs for Data Protection (VADP) or Microsoft VSS, which create fast, high-performance snapshots of the virtual disks attached to VMs. The backup software communicates with VADP or VSS and tells it what to back up. VADP and VSS carry out a number of steps and in turn prepare the data to be backed up. The VSS/VADP provider snaps the volume and gives the backup solution access to this snapshot by presenting its files to the backup server. The backup solution then backs up the snapshot.
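The snapshot hand-off just described can be sketched in a few lines of Python. This is a toy model only; every class and function name here is invented for illustration and none of it is a real VADP or VSS API:

```python
import hashlib

# Illustrative model of snapshot-based backup: the provider freezes a
# point-in-time copy of the disk, the backup server reads that frozen copy,
# and the VM keeps writing to the live disk the whole time.

class VirtualDisk:
    def __init__(self, name, blocks):
        self.name = name
        self.blocks = blocks              # live, writable data

class Snapshot:
    """A point-in-time, read-only copy of a disk's blocks."""
    def __init__(self, disk):
        self.name = disk.name
        self.blocks = list(disk.blocks)   # frozen copy; the live disk keeps changing

def backup_via_snapshot(disk, repository):
    snap = Snapshot(disk)                 # 1. provider snaps the volume
    disk.blocks.append(b"new-write")      # 2. VM keeps writing; backup is unaffected
    for i, block in enumerate(snap.blocks):   # 3. backup server reads the frozen copy
        repository[f"{snap.name}:{i}"] = hashlib.sha256(block).hexdigest()
    return len(snap.blocks)               # 4. snapshot is released after the copy

repo = {}
vm_disk = VirtualDisk("vm01-disk0", [b"os", b"app", b"data"])
copied = backup_via_snapshot(vm_disk, repo)
print(copied)               # 3: blocks backed up from the frozen snapshot
print(len(vm_disk.blocks))  # 4: the live disk kept changing during the backup
```

The point of the sketch is the separation of concerns: the backup reads a consistent, frozen view while the guest stays online.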
While this method provides recovery for full VMs, files and folders, the recovery of applications and application data can be complex and time consuming, because it requires additional processing that engages resources external to the virtual machines. Applications on these hypervisors won't truncate their transaction logs or perform other database maintenance tasks. An Exchange Server is a perfect example: without an agent or agent-like executable in the VM gathering metadata about the Exchange information store, additional processing external to the Exchange VM is needed to map mailbox data. Skipping this process can result in unmanaged transactional applications that must be handled manually by the application owner, and data that might only be recovered by first restoring the entire VM and its virtual disks. One of the key differences between agentless and agent-assisted backups, therefore, is how this transactional post-processing happens.
The advantages:
- VMs can be backed up online or offline
- Less CPU, memory, I/O and network impact on the virtual host
- An agentless architecture doesn’t require the management of agent software
- No per VM agent licensing fees
The drawbacks:
- Extremely difficult to recover granular object data; you must first restore the entire VM and its virtual disks
- Relies on traditional login techniques to access the guest server
- Temporary “injected” drivers can destabilize the system and compromise data integrity
- Troubleshooting is more complex when using injected (temporary) agents
- A centralized controller is a single-point-of-failure
- Requires a fully-virtualized environment. Physical machines still require agent-based backup. If you have physical and virtual you will need two backup solutions – one for physical and the other for virtual.
- Additional processing, e.g. post-backup scripts and log truncation, engages resources external to the virtual machines
Verdict: Good method for protecting file and print servers, but not an optimal solution for VMs with applications. Recovery is operationally painful for applications and application data.
3) Agent-Assisted Backup: next-generation backup (also known as host-based backup)
Agent-assisted backups are also known as host-based backups and integrate with VMware's VADP and Microsoft VSS to provide fast and efficient online and offline backups of ESX, vSphere and Hyper-V. The primary difference from agentless backup is the design's perspective: it pairs VMware VADP or Microsoft VSS with an agent that gathers application metadata to enable multiple avenues of recovery (full VM, applications, databases, files, folders and granular objects). The agent-like executable in this instance does not carry out the backup and thus does not impact the performance of the VM. It simply handles metadata and necessary post-backup processing, such as log truncation.
- The backup is for the entire virtual machine. This is important because it means the entire VM can be recovered from the image. It also means that products like Backup Exec & NetBackup can offer “any-level” of recovery from the image contents: Files / Folders, Databases and Granular database contents, like email and documents.
- The backup can be offloaded from both VM as well as the hypervisor. This means that Backup Exec & NetBackup have the flexibility to offload VM backup onto an existing backup server, instead of burdening the hypervisor. It also means that users have the option of deploying a dedicated VM, e.g. a virtual appliance, when a physical backup server is not practical.
- Application owners can self-serve restore requests: the application owner can request restores directly from the guest operating system.
- Enhanced security: the agent installed to assist with VM backup can be managed by the application owner, avoiding the need to share guest OS credentials with the backup administrator.
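The division of labour described in this section can be illustrated with a hedged Python sketch: a hypothetical in-guest agent handles only metadata collection and log truncation, while the image backup itself happens outside the VM. All names here are invented for illustration and do not correspond to any real product API:

```python
# Conceptual model of agent-assisted backup: the hypervisor-level snapshot
# carries the bulk data, while a lightweight in-guest agent contributes only
# application metadata and post-backup processing (e.g. log truncation).

class GuestAgent:
    """Handles metadata and post-backup processing only, never bulk data."""
    def __init__(self, app_name):
        self.app_name = app_name
        self.log_entries = ["txn1", "txn2", "txn3"]   # pending transaction logs

    def collect_metadata(self):
        # Maps application objects (e.g. databases, mailboxes) to their
        # on-disk layout, enabling granular recovery from the image backup.
        return {"app": self.app_name, "objects": ["db1", "mailbox-a"]}

    def truncate_logs(self):
        truncated = len(self.log_entries)
        self.log_entries.clear()          # the step agentless backup skips
        return truncated

def agent_assisted_backup(disk_blocks, agent):
    image = list(disk_blocks)             # snapshot-based image backup of the VM
    catalog = agent.collect_metadata()    # metadata enables granular restores
    agent.truncate_logs()                 # post-processing inside the guest
    return {"image": image, "catalog": catalog}

agent = GuestAgent("Exchange")
result = agent_assisted_backup([b"edb", b"logs"], agent)
print(result["catalog"]["objects"])   # granular objects recoverable from the image
print(len(agent.log_entries))         # 0: transaction logs were truncated
```

The design point the sketch makes: because the agent never moves bulk data, the VM's performance is largely unaffected, yet the backup image still carries enough metadata for granular recovery.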
The most resource-efficient backups are those performed at the hypervisor level, which provide the following advantages:
- Backup is still image based. Leverages VADP / VSS.
- Ability to recover files and folders directly back to a virtual machine.
- Enables automatic discovery of applications inside the VM
- Granular Application Recovery
- For VSS-compliant applications, backups are application-consistent via VSS integration
- For non-VSS-compliant applications, backups are crash-consistent
- Less performance and I/O impact on the virtual machines
- Can run over a LAN or SAN interface
Verdict: Excellent method for VMs with applications like AD, Exchange, SQL and SharePoint.
Before I close out this blog, it is very important to understand that backup vendors who utilize VMware VADP or Microsoft VSS to perform backups are not all the same. Some are better than others; much depends on how thoroughly the solution validates its backups. So don't be fooled into thinking that all solutions with VMware VADP or Microsoft VSS integration offer the same functionality and benefits. A quality backup application that integrates with VMware VADP or Microsoft VSS to perform backups should, at the very minimum:
- Validate snapshots prior to backup
- Use VMware's Changed Block Tracking (CBT) mechanism to shrink the storage footprint by backing up only changed data
- Verify backup data
- Back up VMs directly from the storage location (for example SAN, iSCSI or NAS) without having to install any software, i.e. an agent, inside the VMs
- Offer flexible recovery options (full VM recovery, file/folder-level recovery, application-level recovery and granular object recovery)
- Centralized backups for virtual machines
- Dynamic inclusion of VMs
- Ability to transport your data offsite for disaster recovery
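The Changed Block Tracking idea from the checklist above can be illustrated with a small, self-contained Python sketch. This is a conceptual model only, not VMware's actual CBT interface; the class and method names are invented:

```python
# Conceptual model of changed-block tracking: the disk remembers which
# blocks were written since the last backup, so an incremental backup
# copies only those blocks instead of the whole disk.

class ChangeTrackedDisk:
    def __init__(self, blocks):
        self.blocks = dict(enumerate(blocks))
        self.changed = set(self.blocks)   # first backup: everything counts as changed

    def write(self, index, data):
        self.blocks[index] = data
        self.changed.add(index)           # tracking records the dirty block

    def incremental_backup(self):
        delta = {i: self.blocks[i] for i in sorted(self.changed)}
        self.changed.clear()              # reset tracking once the backup completes
        return delta

disk = ChangeTrackedDisk([b"a", b"b", b"c", b"d"])
full = disk.incremental_backup()          # first pass copies all 4 blocks
disk.write(2, b"c2")                      # only one block changes afterwards
incr = disk.incremental_backup()          # second pass copies just that block
print(len(full), len(incr))               # 4 1
```

This is why CBT-aware backups have a much smaller storage footprint: after the first full pass, each backup carries only the blocks that actually changed.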
As you embrace virtualization or increase your virtual footprint, selecting a backup solution that integrates with VMware VADP and Microsoft VSS will provide fast, snapshot-based image backups of online and offline guest virtual machines. For superior recovery capabilities, a solution that gathers metadata and executes post-processing tasks is a must. So we've learnt that the question isn't whether you have an agent or other differently named binary in the guest; the question is what the agent's function is. Jason Buffington, a Senior Analyst at ESG, wrote a great blog on "good agents" and "bad agents". If you would like to learn more, you can check out his blog here: http://www.esg-global.com/briefs/agent-best-practices-for-host-based-backups/.
In summary, an agent-assisted solution that integrates with VMware VADP and Microsoft VSS is the clear winner in today’s environments where physical and virtual machines require a holistic approach.
Finally, I couldn't close out this blog without mentioning Backup Exec. Yes – I am biased on this one, but Backup Exec provides a perfect solution for virtual environments, with technology designed for VMware and Hyper-V. Not only does Backup Exec provide superior data protection for virtual environments, it also provides market-leading technology for physical environments. With Backup Exec you get it all in a single solution. So here's my pitch: Backup Exec 2012 dramatically reduces the time to recover from small or major disasters by protecting all of your virtual machines and/or physical servers through a single-pass backup, while still allowing for individual file, folder and granular object-level recovery. In short, it's powerful, efficient, reliable and fast.
If you have any questions or would like to know more, email me at: email@example.com.
1 Source: ESG Research Report, 2012 Trends in Data Protection Modernization, August 2012.
Over the last 12 months it's been hard to miss all the messages from vendors promising to modernize your backup infrastructure. I particularly like the messages from vendors that champion solutions addressing just one or two aspects of backup, such as deduplication, snapshots or tools for backing up only VMware and Hyper-V environments. These are quick fixes, not modern data protection. Throwing more solutions at a problem as a quick fix is the cause of backup complexity and cost. A solution that increases complexity, risk or cost is hardly a solution that will modernize your backups. Why? Because the very nature of modernization is moving forward: reducing complexity, risk and cost. Yet these vendors, ironically, still try to position themselves as the answer to backup modernization without offering a solution that actually delivers it.
So what does it really mean to modernize your backup infrastructure? How do you know if your backup infrastructure is out of date? Is it the software, hardware or network that needs updating or is it a combination of all three?
To answer these questions and more, the Backup Exec team at Symantec has teamed up with leading analyst firm ESG to bring you the latest backup trends and pragmatic advice on how to apply those ideas in a way that will truly modernize your environment. On Wednesday 9th May during Symantec Vision 2012 at the MGM in Las Vegas (http://www.symantec.com/vision), Jason Buffington from ESG will be presenting How to Modernize Your Backups in 2012 (session number IM B01). In this session Jason will share his research on the latest trends in data protection. Not only will Jason give pragmatic advice on how to apply those ideas to your environment, he'll also uncover backup and recovery best practices that will radically improve your backup and recovery performance. From virtualization to cloud, and from dedupe to replication, Jason will squeeze everything into 55 minutes.
If you are unable to attend Vision this year, don't worry! This session will be recorded, so if you would like to get your hands on the replay, simply reply to this post and we will send you a link to view the recording as soon as it becomes available.
In the meantime, if you have a question in relation to modernizing your backups in 2012 send it via twitter to @JBUFF. Jason will be answering these questions and more during the session.
Before I sign out, I wanted to leave you with one closing thought. While many vendors talk about modernizing backup, there is only one company who is really delivering upon that promise and that’s Symantec. If you are attending Vision, stop by the Backup Exec booth and find out how.
Virtualization: what can I say other than it has "virtually" changed the IT world in which we all work and play? Why is virtualization so attractive to IT administrators? In short, the answer is easy: there are many uses and benefits to be gained through virtualization. For starters, the ability to have a single server's physical footprint represent many servers on the network has been a boon to administrators looking to consolidate space and reduce operating costs. And the ability to quickly stand up a VM copy of a major application or work server for patch testing, allowing administrators to test during business hours, is simply a game changer.
So, how else can we leverage this exciting technology? Well…how about recovery? What if I said we were talking about both physical and virtual environments?
I often speak with administrators who look for ways simply to protect their virtualized assets for the purpose of full recovery in the event of disaster; i.e. their backup solution only works to back up, but doesn't truly embrace their virtual platform. What if we began taking the approach of having the backup software actually use virtualization as a true extension of the recovery plan? Can we take virtualization from being a resource that is typically only backed up to one that can be leveraged as the platform for recovery for physical and virtual servers alike?
We at Symantec say YES! …and Backup Exec 2012 is just the catalyst needed to truly and finally unite virtual and physical environments.
Sure, the world is going virtual in a strong way but this is not something that is going to happen overnight. Although many early adopters have moved forward to become near 100% virtualized, most administrators are still governing environments that are heavily comprised of both physical and virtual server assets. As such, administrators need a solution that is not only purpose-built to work for their entire environment but one that takes advantage of the virtual infrastructure specifically to allow them to further leverage their IT investments.
Backup Exec 2012 is just the solution to deliver the proverbial goods. Not only does Backup Exec 2012 offer a fresh new user experience, it also leverages companies' existing virtual infrastructure for capabilities including instant recovery of any protected physical or virtual server, as well as the ability to leverage the cloud to recover, test or migrate any VMware virtual machine in your environment. And for good measure, imagine that when it is time to migrate a physical server to a new virtual body, you simply power on the virtual copy that was created and maintained as part of that server's standard backup with Backup Exec 2012. Well, it's time to stop imagining: the time has come to fully realize and embrace the power and flexibility of your virtual environment, and Backup Exec 2012 is here to help you do just that.
So again I ask….have you hugged your VM lately? If not you should- it can help you in more ways than you know!
Interested yet? If so, please take a few moments to visit us virtually on the web, or in person at Symantec VISION 2012 in May, to learn more about how Backup Exec 2012 can help you truly embrace your virtual infrastructure.
In the meantime, check out this short video on leveraging “No Hardware” recovery with Backup Exec 2012:
Planning on attending Symantec VISION US this year? Come learn more in our BE 2012 session!
Discover the power of your virtual environments with Backup Exec 2012
Session IM B15 @ Symantec Vision 2012
MGM Grand – Las Vegas – May 7-10
Come join us at VISION US and learn more about:
- How to embrace and leverage your virtual infrastructure for near-line recovery, DR and sandbox testing
- Migrating to virtual? Learn how to simply use your existing and future backups to make the process painless
- Need offsite DR for VMs? Come see how we are leveraging the cloud for DR and much more!
Very loosely, we were instructed to delete everything pre the dot-com bubble bursting (2000) and keep everything post, and now we are fast running out of data centre disk allocation space. Err?
In fact it's a wonder we manage to do anything given the amount of information we need to process. As a consequence we now face a greater threat: too much information. Somewhere between 60 and 160 billion emails are sent around the world every single day. These emails include attachments such as reports, presentations, letters and pictures. In spite of limitations such as privacy concerns and too much unwanted mail, email remains the best way to communicate efficiently, quickly and cheaply. The danger with email, as with any other way of sharing information, is that too much information simply clogs the system up and becomes a bottleneck to productivity.
Here are some useful top tips that may help:
- Understand the new business user – organisations must better understand the challenges employees are facing when navigating the world of information management. Look at when and how employees are accessing their information, make sure that data is indexed and categorised, and that intelligent archiving and search tools are available
- Prepare the infrastructure – with the relentless flow of information only set to continue, IT infrastructure must be able to cost effectively manage the increasing requirements for storage by implementing solutions able to dedupe and archive appropriately, automate processes and monitor and report on system status across all different devices and environments
- Prepare people – create IT policies that educate employees on how to manage their information – from email practices like limiting the 'CC' and 'reply to all' culture, to saving only the latest document version and overcoming the fear of the delete button. Help employees understand the company's information retention strategy so they know what information is recoverable. This will empower them to take charge of information control and maintain productivity and efficiency
- Keep security front of mind – it seems like an obvious statement, but reinforcing company security policies around mobile devices could protect against significant and damaging data loss. Make sure employees know the company processes and take advantage of technologies that enable the IT department to see where the most important information is, at all times
- Encourage staff to switch off – with the information era in full swing and with more and more opportunity for employees to stay connected at all times, it’s important that organisations support staff welfare and encourage them to switch off every once in a while
Seriously consider optimising your storage to reduce overall front-end storage usage. Capacity can be improved through integrated archiving and deduplication, as well as by tiering your storage. Archiving moves old data to a separate store so you don't have to back up the same data day-in, day-out, forever. Deduplication backs up data (at a block level) only once, using a pointer to the unique data. So with archiving and data deduplication you can both reduce the amount you back up and dramatically shrink your backup window.
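Block-level deduplication as just described can be sketched in a few lines of Python. This is a conceptual illustration using SHA-256 fingerprints as pointers, not any vendor's actual implementation:

```python
import hashlib

# Minimal sketch of block-level deduplication: each unique block is stored
# once; repeated blocks are replaced by a cheap pointer (here, the block's
# SHA-256 digest) into the shared store.

def dedupe_backup(stream, block_size, store):
    pointers = []
    for offset in range(0, len(stream), block_size):
        block = stream[offset:offset + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:       # only previously unseen blocks consume storage
            store[digest] = block
        pointers.append(digest)       # every block becomes a pointer in the catalog
    return pointers

store = {}
day1 = dedupe_backup(b"AAAABBBBAAAACCCC", 4, store)
print(len(day1), len(store))   # 4 3: four pointers, but only 3 unique blocks stored
day2 = dedupe_backup(b"AAAABBBBAAAADDDD", 4, store)   # mostly unchanged data
print(len(store))              # 4: the second backup added just one new block
restored = b"".join(store[p] for p in day1)   # pointers rebuild the original stream
```

The second backup consumes almost no extra storage because most of its blocks already exist in the store, which is exactly why deduplication shrinks both backup volume and backup windows for slowly changing data.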
But, I hear you say, if I implement deduplication technology what are the benefits? Well, Backup Exec can help with that too. Read all about the Backup Exec Deduplication Assessment Tool in Part III.
- Unite Virtual and Physical: Powered by Symantec V-Ray technology, Backup Exec 2012 enables visibility across both virtual and physical environments for fast and efficient backup and recovery while eliminating the need for specialised point products.
- Eliminate Backup Complexity with a New Administration Console: A newly redesigned administration console will provide users with fast, concise management and monitoring capabilities.
- Integrated Disaster Recovery: With bare-metal disaster recovery and Backup Exec’s “no hardware DR” built in, organizations will be able to easily recover a failed system to a physical server, or to a Hyper-V or VMware guest.
- Capacity Licensing: New capacity licensing model for Managed Service Providers (MSPs), mid-sized and lower enterprise organizations will provide easier purchasing and maintenance by capacity as an alternative to existing a la carte pricing.
- Small Business Edition: In less than 10 minutes and with just three simple steps, Backup Exec 2012 Small Business Edition will install and configure backups so small businesses with limited IT experience can protect their data with ease. The new Backup Exec Small Business Edition will bundle Symantec’s data and system recovery technology into one affordable solution with a single license that’s designed specifically for a growing business.
I'm still shocked by the clamour of vendors jumping on the virtualisation bandwagon. Platform proliferation has always been part of the Symantec strategy – something we almost take for granted. Any backup solution should be able to back up anything, regardless of whether the data resides in a physical or a virtual environment. No IT manager in their right mind would seek to complicate their management overhead by adding proprietary solutions for every platform. So it seems extraordinary that bespoke backup software companies should shout so loudly about capabilities that are in reality so limited.
Symantec is one of the longest-standing technical partners of both VMware and Microsoft, and Backup Exec was first to market with virtualisation capabilities. There was a recent announcement from one company that they are going to support Hyper-V later this year (Backup Exec has supported Hyper-V for the last 3 years). Hurrah! Bring out the banners, unfurl the flags! In today's businesses, extremely small, small, medium, large, tremendously large, right across every segment and every industry, across geography and culture, you cannot reduce a backup strategy to just VMware (and now Hyper-V).
Very few businesses will achieve total server virtualisation in the near future. Many organizations remain either partially virtualised or in various stages of testing and implementing virtualisation. Symantec Backup Exec and NetBackup support all mixes of physical and virtual environments, using a single product and a single console. Any solution that only supports virtual servers will mean customers still need a separate application for physical server backup. In fact, in some cases, other solutions on the market don't even run in a VM environment and actually need a physical machine to run on.
But even if you were to completely virtualise your environment, why would you not continue to rely on the leader in the market? More than 25,000 Hyper-V servers already rely on Symantec's innovative Hyper-V support. Backup Exec protects over 85,000 virtual hosts (approximately 60,000 VMware and 25,000 Hyper-V) and over 1 million guests, and an estimated 24,000 customers use NetBackup to protect their virtual environments. With a VM-disk-based backup solution you require a 3rd-party backup product to back up the backup application itself; in other words, customers are required to have multiple backup vendors to achieve a platform-agnostic level of data protection.
This makes the recovery process all the more trying: because you are backing up the backup, a series of staging processes will be required long before you can start looking for the file you're after. Even worse, if you are using incremental backups you'll have fun staging everything and trawling through the backups to find anything – it might take weeks!
There is some pretty aggressive and misleading marketing, and in some cases inaccurate stabs at Backup Exec, being touted by various companies. But they are right about a few things: "80% of CIOs prefer to manage virtual infrastructures with existing enterprise management systems." A no-brainer given the capabilities of Backup Exec and its long-standing relationship with VMware. Remember, VMware Ready does not include some of these point players, so make sure you dig deeper before committing to something that may not give you the best protection.
Backup Exec 2010 R3 – with improvements in virtualization backup, archiving, and security
Backup Exec 2010 R3 continues our "R" series of releases: new features, new capabilities, and continued improvement to our award-winning backup and recovery application.
There’s quite a bit of goodness in this release. Since Symantec as a company has focused heavily in the virtualization space, it stands to reason that many of the improvements in Backup Exec 2010 R3 revolve around virtualization. But that’s not all we improved in this release!
If you back up and restore VMware with Backup Exec and the Deduplication Option, this update is for you. We've made significant improvements to the deduplication engine for VMware backups; you should see significantly better deduplication rates with Backup Exec 2010 R3 compared to previous versions of Backup Exec. In our environments we've seen 30%, 50%, or higher data reduction rates with Backup Exec 2010 R3 compared to previous versions. If you use Backup Exec and you want better deduplication ratios with VMware backups, then this is the update for you.
(And don’t worry, Hyper-V users – we haven’t forgotten about you – we’ll have good news for you in the next major release of Backup Exec. So be patient, and we’ll get Hyper-V Deduplication into Backup Exec as soon as we can.)
We've also added a new plug-in to the mix: the Backup Exec Management Plugin for VMware. This utility allows you to monitor backups made with the Agent for VMware from within a vSphere Client or vCenter installation. Administrators get all sorts of insight into Backup Exec jobs, including a status view of all VMs, the last backup run, the next scheduled backup and the type of backup that occurred, and can drill down into Backup Exec backup job logs. This is a completely free update that is very useful for VMware administrators who use Backup Exec.
Plus, we've improved a number of other areas involving deduplication in virtual environments since Backup Exec 2010 R2 was released. If you protect VMware, use deduplication, or use application GRT for SQL, Exchange or Active Directory, this update is also a great fit for you.
Archiving has seen some love too, especially the Exchange Mailbox Archiving Option: we now support archiving from Exchange 2010 SP1! So for those administrators who have moved, or are looking to move, to the latest and greatest messaging platform from Microsoft, we support backup, recovery, and storage management through archiving with Backup Exec 2010 R3. In addition to new platform support, we've also introduced Virtual Vault for the Exchange Mailbox Archiving Option. This Outlook plug-in allows end users to see archived emails directly within Outlook, meaning that users never need to leave the familiar Outlook interface to search, view, and reply to archived emails.
The Agent for Enterprise Vault has seen some improved platform support. We now support backup, recovery, and migration from Enterprise Vault 10 installations. Enterprise Vault is the flagship Archiving product from Symantec, and we are committed to supporting the latest and greatest releases of the Enterprise Vault application.
We've also done some work on our underlying security infrastructure. Backup Exec 2010 R3 now provides TLS/SSL support from the agent to the server, providing an extra layer of security for customers that transmit backup data across the WAN or to a private cloud. Essentially, the Backup Exec Media Server becomes its own Certificate Authority (CA) with the power to sign certificates, establishing identity and trust. This encrypts the control and data connections between the media server and remote agents. These added security features will help ensure that backed-up data sent over any network or Internet connection is secure.
And last but not least, Backup Exec 2010 R3 brings support for the latest versions of Microsoft Small Business Server 2011. Both SBS 2011 Essentials and SBS 2011 Standard, along with the SBS 2011 Premium Add-On, are supported with Backup Exec 2010 R3.
To sum up, we've done a lot of work to make Backup Exec more effective in VMware environments, more secure, and even better suited to back up and protect your environments. We've produced a short "What's New in Backup Exec 2010 R3" webcast, with an introduction to Backup Exec 2010 R3, that you can find on www.BackupExec.com. If you already own Backup Exec 2010 or Backup Exec 2010 R2, this is a no-cost upgrade for you. If you are interested in evaluating Backup Exec 2010 R3, you can download trialware from www.BackupExec.com.
Aiden Finley, Backup Exec Product Management
See Symantec Connect
I need a new service, so I need an application, and a new server, and perhaps some storage … and if we're lucky we ask ourselves, "oh yes, what about the backup?" Have you noticed how we really never turn IT off? We just add to it. So we end up with a backup strategy that encompassed everything 3 or 4 years ago, but one that falls pretty short today; that's how it really works.
Even though we know that we really should backup all our data – just in case – are we absolutely convinced we actually are? Backup is our critical data protection solution and yet we rarely review our backup strategy.
Server virtualisation, the need for fast, reliable application recovery, the exponential growth of unstructured data and poor data lifecycle management are some of the root causes of operational inefficiency in IT, and the reasons why we are changing the way we approach our backup strategies.
With more and more companies adopting virtualisation technologies to improve efficiencies and reduce CAPEX costs, organisations are looking for ways of protecting both virtual and physical environments with a single backup tool. It makes sense to use a solution that gives you granular recovery from a single-pass backup, saving time, money and no small amount of effort. Don’t use separate tools and end up backing up the backup – it turns the recovery process into a nightmare!
The backup and recovery of Microsoft applications is an inherently challenging process that becomes more difficult as databases grow and the demands on their online availability increase, further limiting the time available for backup and recovery operations. Granular recovery of Exchange, SQL and Active Directory from a single-pass backup makes it easy and efficient to identify and recover only those objects needed.
Optimising storage to reduce overall storage capacity can be done through integrated archiving and deduplication. Archiving moves old data to a separate store so you don’t have to back up the same data day-in, day-out – forever. Deduplication backs up each block of data only once, using a pointer to refer back to the unique data thereafter. Together, archiving and deduplication can reduce the backup window dramatically.
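The block-level deduplication idea described above can be sketched in a few lines of Python. This is a simplified illustration, not Backup Exec’s implementation (real products often use variable-size chunking and persistent stores; `dedup_store` and `restore` are hypothetical names):

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for simplicity

def dedup_store(data, store):
    """Split data into blocks; keep each unique block once, keyed by its
    SHA-256 hash. Returns the list of block hashes ("pointers") needed
    to rebuild the original data."""
    pointers = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = block   # back the unique block up only once
        pointers.append(digest)     # a pointer replaces the duplicate data
    return pointers

def restore(pointers, store):
    """Rebuild the original data by following the pointers."""
    return b"".join(store[p] for p in pointers)
```

If two blocks are identical, only one copy lands in the store; the second backup of the same block costs just a pointer, which is where the storage and backup-window savings come from.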
Backup Exec 2010
Backup Exec Agents and Options enhance and extend platform and feature support for your backup environment – for Microsoft applications, virtual environments (VMware and Windows Server 2008 R2 Hyper-V), and storage reduction and optimisation technologies.
… or so Computerworld UK said on Monday. Something that I’ve been saying for 3 years – but, hey, what do I know? According to Computerworld UK, the consumerisation of IT is an unavoidable phenomenon that will force businesses to rethink their security policies. What about all that stuff some poor soul will have to back up? CIOs need to deal with consumerisation. For many years we were able to say to our end users “no, you can’t have that”, or “no, we don’t support this”. But those days are over. The security of this phenomenon is certainly a concern, but IDC reckon that 80% of data created by individual consumers will end up on corporate networks. This will inevitably cause an overload on our already overloaded systems – at least until we have the management capabilities for streaming applications to the desktop (oh, sorry, Symantec already does that). OK, so when we get around to migrating to this model where all our data is held inside the network, consumerisation won’t be nearly as scary – until then, data growth will continue on its upward curve.
That’s why deduplication and archiving are key to our backup strategies. We are challenged with managing and protecting ever-increasing amounts of data. Backup Exec offers deduplication across physical and virtual machines to reduce both the length and the size of backups. Deduplication has the power to transform information management: it is great for backup, it is great for archiving, and it can even make virtualised server backup manageable. Symantec believes that deduplication should live in every part of the information architecture.
Much of the content produced now consists of email, documents, presentations, and other types of unstructured information. This explosion of information has a significant impact on storage spending and on IT’s ability to meet the needs of its internal customers and business units. Backup Exec’s integrated archiving option is focused on reducing the amount of information backed up. Together with Enterprise Vault (EV), Backup Exec’s Agent for EV helps organisations unify content sources, apply retention policies, reduce backup windows, shorten recovery times, and optimise storage resources, making it easier for companies of all sizes to store, manage, and protect all their unstructured data.
Microsoft Windows 7 Backup is getting trashed in a Microsoft forum for being unbelievably bad and slow.
Like this comes as such a surprise … Windows Backup? “It is an insult,” said Jon Hell on a Microsoft forum: http://social.technet.microsoft.com/Forums/en-US/windowsbackup/thread/3e08fc65-52f5-48ca-ae13-321cdfc44fbd … “Windows Backup is an embarrassment,” said another.
Why everyone isn’t using Backup Exec System Recovery Desktop Edition 2010 (BESR 2010) for their desktops and laptops beats me. Really simple to use as either a backup or disaster recovery tool, BESR 2010 is a cost-effective solution that helps minimise downtime and avoid disaster – it’s like Zero to DR in 10 minutes. You can recover individual data files/folders or complete Windows desktops or laptops in minutes.
I use it on my machine and mount the backups on a 500 GB USB drive (no power required), taking an off-site copy to our file servers at the same time. I never see any system impact – and there is a slider that allows you to reduce the impact if you do see some degradation. It’s really simple to install, licence and set up. In fact, to be honest, it’s easier to schedule your backups with BESR than it is to set up a meeting in Outlook, and recovery is just as simple.
With an off-site copy I can ensure that even if I lose my external drive I can get my data or systems back, and backing up to a USB drive means I can restore data from anywhere, even if I’m not connected to the Symantec network. Easy, simple, effective. New machine with a Windows 7 O/S? No problem restoring data to different hardware … and you can even store your backups in a virtual environment, which makes sense.
It’s a standalone solution as well as a 1+1=3 component of a larger data protection strategy; in fact, I know of NetBackup customers who use BESR Server Edition to give themselves DR capabilities for their Windows servers as well as using BESR Desktop to back up their employees’ laptops and desktops.
So, there’s no need to rely on Windows 7 Backup … given that Symantec has just won the Best of Tech Ed Award for Backup and Recovery.
Full article in The Register can be found at:
Questions to ask yourself:
- How long does it take you to recover a desktop or laptop?
- How long does it take you to recover a Windows system?
- Can you recover a system to dissimilar hardware?
- Can you recover a complete system?
- How easily can you recover a virtual system?