By Kate Lewis
Agents. Agentless. VADP integration. VSS integration. Image-based backups. File-based backups. Hypervisor-based snapshots. Array-based snapshots. Host-based backups. Guest-based backups. It’s no wonder backup professionals are confused about the best approach for backing up their virtual machines. With a myriad of vendors each positioning their own way as the “best” way, the result is ambiguity and a good dose of confusion about what an agent is, or does.
With the help of my technical experts here at Symantec, this blog cuts through the confusion with unbiased information so you can make the right choice for your environment. By looking at each method and highlighting its pros and cons, you can make informed decisions without the distraction of smoke and mirrors.
Caution: before we dive in, it’s important to mention that the phrases agentless backup and agent-based backup can mean different things to different vendors. To truly determine the best approach for your organization, you need to look under the covers and weigh the pros and cons of each method. Don’t worry; we have done the hard work for you at Symantec. Now let’s jump in and take a look at each one in turn.
1) Traditional Agent-Based Backup (also known as guest-based backup)
Traditional agent-based backup is also known as guest-based backup. An agent is installed in every virtual machine and treats each virtual machine as if it were a physical server. The agent in this scenario reads data from disk and streams it to the backup server. This method must not be confused with the agent-assisted backups that we will cover later.
There are many people today using this approach to protect their virtual machines. According to ESG1, 46% of all environments use guest-based protection methods, with a backup agent running inside the guest OS. Although newer methods are available, you may be asking yourself why so many people still use this one.
Pros:
- Both physical and virtual machines are protected using the same method
- Application owners can manage backups and restores from guest OS
- Time tested and proven solution
- Meets their recovery needs
- This is the only way to protect VMware Fault Tolerant virtual machines
Cons:
- Significantly higher CPU, memory, I/O and network resource utilization on virtual host machines when backups run
- Need to install and manage agents on each virtual machine
- Cost may be high for solutions that license on a per agent basis as opposed to per hypervisor based licensing
- Cannot keep pace with virtual machine sprawl; lacks visibility into a changing virtual infrastructure
- No visibility for backups from VM administrators’ point of view; for example, backups are not visible at vSphere client level
- May need multiple kinds of backup and recovery methods; for example, separate backup policies may be needed for file and folder backups, Microsoft Exchange backups, bare-metal recovery, etc.
- Complex disaster recovery strategies
- No SAN transport backups to offload backup processing from the virtual infrastructure
- No protection for offline virtual machines and virtual machine templates
- Slow file-by-file backups, with the agent sending even unchanged data over and over again
Verdict: A cumbersome, traditional backup and recovery method, but offers flexible recovery features.
2) Agentless backup (also known as host-based backup)
Agentless backup, also known as host-based backup, refers to solutions that do not require an agent to be installed on each VM by the administrator. However, it’s important to note that the software may be injecting an agent onto the guest machines without your knowledge.
These solutions integrate with VMware APIs for Data Protection (VADP) or Microsoft VSS, which create fast, high-performance snapshots of the virtual disks attached to VMs. The backup software communicates with VADP or VSS and tells it what to back up. VADP and VSS carry out a number of steps and in turn prepare the data to be backed up. The VSS/VADP provider snaps the volume and gives the backup solution access to this snapshot by feeding the file to the backup server. The backup solution then backs up the snapshot.
While it provides recovery of full VMs, files and folders, the recovery of applications and application data can be complex and time consuming. This is because it requires additional processing that engages resources external to the virtual machines. Applications on these hypervisors won’t truncate their transaction logs or perform other database maintenance tasks. An Exchange Server is a perfect example: without an agent or agent-like executable in the VM gathering metadata about the Exchange information store, additional processing external to the Exchange VM is needed in order to map mailbox data. If you ignore this process, it can result in unmanaged transactional applications that must be manually managed by the application owner, and data that might only be recovered by first restoring the entire VM and its virtual disks. Therefore, one of the key differences between agentless and agent-assisted backups is how the transactional post-processing happens.
Pros:
- VMs can be backed up online or offline
- Less CPU, memory, I/O and network impact on the virtual host
- An agentless architecture doesn’t require the management of agent software
- No per VM agent licensing fees
Cons:
- Extremely difficult to recover granular object data – you must first restore the entire VM and its virtual disks
- Traditional login techniques to log into the server
- Temporary “injected” drivers can destabilize the system and compromise data integrity
- Troubleshooting is more complex when using injected (temporary) agents
- A centralized controller is a single-point-of-failure
- Requires a fully-virtualized environment. Physical machines still require agent-based backup, so if you have both physical and virtual machines you will need two backup solutions: one for physical and one for virtual
- Additional processing (e.g., post-backup scripts and log truncation) engages resources external to the virtual machines
Verdict: Good method for protecting file and print servers, but not an optimal solution for VMs with applications. Recovery is operationally painful for applications and application data.
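The snapshot hand-off described in this section can be sketched in a few lines. This is a hypothetical illustration only: the function below models the sequence of steps an agentless backup walks through, and every name in it is invented for the sketch, not taken from the real VADP or VSS APIs.

```python
# Hypothetical sketch of the agentless (host-based) snapshot workflow.
# All function and step names here are illustrative stand-ins, not real
# VADP/VSS API calls.

def agentless_backup(vm_disks):
    """Simulate the snapshot-then-copy sequence a VADP/VSS-integrated
    backup application performs for each virtual machine."""
    steps = []
    # 1. The backup software asks the provider (VADP or VSS) for a snapshot.
    steps.append("request snapshot")
    # 2. The provider quiesces I/O and snaps the volume(s); here we simply
    #    take a point-in-time copy of the disk contents.
    snapshot = dict(vm_disks)
    steps.append("quiesce and snap volumes")
    # 3. The backup software reads from the snapshot, not the live disks,
    #    so the running VM is barely touched.
    backup_image = dict(snapshot)
    steps.append("stream snapshot to backup server")
    # 4. The snapshot is released so delta files don't accumulate.
    steps.append("release snapshot")
    return backup_image, steps

image, steps = agentless_backup({"disk1.vmdk": b"os data", "disk2.vmdk": b"app data"})
print(steps)
```

Note that nothing in this sequence talks to the applications inside the guest, which is exactly why log truncation and application-level recovery need the extra handling discussed above.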
3) Agent-Assisted Backup: Next-generation backup (also known as host-based backup)
Agent-assisted backups are also known as host-based backups and integrate with VMware’s VADP and Microsoft VSS to provide fast and efficient online and offline backups of ESX, vSphere and Hyper-V. The primary difference from the agentless design is that it pairs VMware VADP or Microsoft VSS with an agent that gathers application metadata to enable multiple avenues of recovery (full VM, applications, databases, files, folders and granular objects). The agent-like executable in this instance is not carrying out the backup and thus does not impact the performance of the VM. It simply handles metadata and necessary post-backup processing such as log truncation.
- The backup is for the entire virtual machine. This is important because it means the entire VM can be recovered from the image. It also means that products like Backup Exec & NetBackup can offer “any-level” of recovery from the image contents: Files / Folders, Databases and Granular database contents, like email and documents.
- The backup can be offloaded from both VM as well as the hypervisor. This means that Backup Exec & NetBackup have the flexibility to offload VM backup onto an existing backup server, instead of burdening the hypervisor. It also means that users have the option of deploying a dedicated VM, e.g. a virtual appliance, when a physical backup server is not practical.
- Application owner can self serve restore requests: The application owner can request restores directly from the guest operating system.
- Enhanced security: The agent installed for assisting with VM backup can be managed by the application owner, avoiding the need to share guest OS credentials with the backup administrator.
The most resource-efficient backups are those performed at the hypervisor level, which provide the following advantages:
- Backup is still image based. Leverages VADP / VSS.
- Ability to recover files and folders directly back to a virtual machine
- Automatic discovery of applications inside the VM
- Granular Application Recovery
- For VSS-compliant applications, backup is application-consistent via VSS integration
- For non-VSS-compliant applications, backup is crash-consistent
- Less performance and I/O impact on the virtual machines
- Can run over a LAN or SAN interface
Verdict: Excellent method for VMs with applications like AD, Exchange, SQL and SharePoint.
Before I close out this blog, it is very important to understand that backup vendors who utilize VMware VADP or Microsoft VSS to perform backups are not all the same. Some are better than others, and much depends on how well the software validates and verifies what it backs up. So don’t be fooled into thinking that all solutions with VMware VADP or Microsoft VSS integration offer the same functionality and benefits. A quality backup application that integrates with VMware VADP or Microsoft VSS to perform backups should, at the very minimum:
- Validate snapshots prior to backup
- Use VMware’s Changed Block Tracking (CBT) mechanism to reduce the storage footprint by backing up only changed data
- Verify backup data
- Back up VMs directly from the storage location (for example SAN, iSCSI or NAS), without having to install any software (i.e., an agent) inside the VMs
- Offer flexible recovery options (full VM recovery, file/folder-level recovery, application-level recovery and granular object recovery)
- Provide centralized backups for virtual machines
- Dynamically include new VMs in backup policies
- Ability to transport your data offsite for disaster recovery
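To make the Changed Block Tracking item on that checklist concrete, here is a toy sketch of the bookkeeping: an incremental backup transfers only the blocks flagged as changed since the last full backup. The functions are invented for illustration; real CBT change maps come from VMware’s APIs, not from code like this.

```python
# Illustrative sketch of the Changed Block Tracking (CBT) idea: copy only
# the blocks the hypervisor reports as changed since the previous backup.
# This models the bookkeeping only, not VMware's actual CBT interface.

def full_backup(disk_blocks):
    """A full backup copies every block."""
    return dict(disk_blocks)

def incremental_backup(disk_blocks, changed_block_ids):
    """An incremental backup copies only blocks flagged as changed."""
    return {i: disk_blocks[i] for i in changed_block_ids}

disk = {0: b"boot", 1: b"os", 2: b"app", 3: b"logs"}
base = full_backup(disk)
disk[3] = b"new logs"                               # only block 3 changed
incr = incremental_backup(disk, changed_block_ids={3})
print(len(base), len(incr))                         # blocks transferred: full vs. incremental
```

The storage and network savings scale with how little of the disk actually changes between backups, which is why CBT matters so much for large, mostly-static virtual disks.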
As you embrace virtualization or increase your virtual footprint, selecting a backup solution that integrates with VMware VADP and Microsoft VSS will provide fast, snapshot-based image backups of online and offline guest virtual machines. For superior recovery capabilities, a solution that gathers metadata and executes post-processing tasks is a must. So we’ve learnt that the question isn’t whether you have an agent or other differently named binary in the guest; the question is what the agent’s function is. Jason Buffington, a Senior Analyst at ESG, wrote a great blog on “good agents” and “bad agents”. If you would like to learn more, you can check out his blog here: http://www.esg-global.com/briefs/agent-best-practices-for-host-based-backups/.
In summary, an agent-assisted solution that integrates with VMware VADP and Microsoft VSS is the clear winner in today’s environments where physical and virtual machines require a holistic approach.
Finally, I couldn’t close out this blog without mentioning Backup Exec. Yes – I am biased on this one, but Backup Exec provides a perfect solution for virtual environments with technology that was designed for VMware and Hyper-V. Not only does Backup Exec provide superior data protection for virtual environments, it also provides market leading technology for physical environments too. With Backup Exec you get it all in a single solution. So here’s my pitch. Backup Exec 2012 dramatically reduces the time to recover from small or major disasters by protecting all of your virtual machines and/or physical servers through a single pass backup, while still allowing for individual file, folder and granular object level recovery. In short, it’s powerful, efficient, reliable and fast.
If you have any questions or would like to know more, email me at: email@example.com.
1 Source: ESG Research Report, 2012 Trends in Data Protection Modernization, August 2012.
Backup Exec 2012 introduced the first revolutionary change in its administrative paradigm in over nine years by moving to a resource-centric model that enables data protection lifecycle flexibility that was not possible in previous versions. Designed and optimized for physical, virtual, tape, disk, deduplication and the cloud, the new Backup Exec 2012 experience gives users the power to tailor the right protection paradigm in a very stepped, logical fashion.
In order to help our more established customers make the transition to the new paradigm, the Backup Exec team has quickly reacted to customer feedback on how to improve the migration experience. Available now in the Backup Exec 2012 SP1 release, is the Job View button that gives users a dedicated view of all jobs that have been configured in their Backup Exec environment.
This feature is similar to the Job Monitor feature found in previous versions of Backup Exec. The Job View allows users to view – and take action on – all backup jobs managed by their Backup Exec server through a simple “one click” button feature. Going forward, we will focus on various improvements, such as the ability to group servers to provide a “job” analogue, prioritize server backup order, and target multiple server backups to the same tape; the next-generation Job Monitor will also be made available to all Backup Exec 2012 customers.
What does this really mean?
Well, in spite of what I might think, there have been some grumblings on the ground that we have gone “A Bridge Too Far” with the new Backup Exec 2012 User Interface (Marketing would refer to this as the “Exxxxxperrriance!”). I love it. Loads of you do too, but it is a big leap for the hardened amongst us. However, things are not so bad, as we’ve introduced a “Job View” with SP1a … a bit like the Job Monitor. You can already group assets and therefore edit multiple policies (for multiple assets) at the same time – described in an earlier blog. All good stuff!
The bottom line is: when you are migrating from BE 2010 to BE 2012, make sure you read the wording in the wizards, which explains what is going to happen next. Click, click, clickety-clicking straight through will give you a shock or two.
Over the last 12 months it’s been hard to miss all the messages from vendors that promise to modernize your backup infrastructure. I particularly like the messages from vendors that champion solutions addressing only one or two aspects of backup – such as deduplication, snapshots or tools for backing up just VMware and Hyper-V environments. These are quick fixes, not modern data protection. Throwing more solutions at a problem as a quick fix is the cause of backup complexity and cost. A solution that increases complexity, risk or cost is hardly a solution that will modernize your backups. Why? Because the very nature of modernization is moving forward – reducing complexity, risk and cost. Yet these vendors, ironically, still try to position themselves as the answer to backup modernization without really offering a solution that delivers it.
So what does it really mean to modernize your backup infrastructure? How do you know if your backup infrastructure is out of date? Is it the software, hardware or network that needs updating or is it a combination of all three?
To answer these questions and more, the Backup Exec team at Symantec has teamed up with leading analyst firm ESG to bring you the latest backup trends and pragmatic advice on how to apply those ideas to truly modernize your environment. On Wednesday 9th May, during Symantec Vision 2012 at the MGM in Las Vegas (http://www.symantec.com/vision), Jason Buffington from ESG will be presenting How to modernize your backups in 2012 (session number IM B01). In this session Jason will uncover his research on the latest trends in data protection. Not only will he give pragmatic advice on how to apply those ideas to your environment, he’ll also share backup and recovery best practices that will radically improve your backup and recovery performance. From virtualization to cloud, and from dedupe to replication, Jason will squeeze everything into 55 minutes.
If you are unable to attend Vision this year, don’t worry! The session will be recorded, so if you would like to get your hands on the replay, simply reply to this post and we will send you a link to view the recording as soon as it becomes available.
In the meantime, if you have a question in relation to modernizing your backups in 2012 send it via twitter to @JBUFF. Jason will be answering these questions and more during the session.
Before I sign out, I wanted to leave you with one closing thought. While many vendors talk about modernizing backup, there is only one company who is really delivering upon that promise and that’s Symantec. If you are attending Vision, stop by the Backup Exec booth and find out how.
Virtualization: what can I say other than it has “virtually” changed the IT world in which we all work and play? Why is virtualization so attractive to IT administrators? In short, the answer is easy: there are many uses and benefits to be gained through virtualization. For starters, the thought of having a single server’s physical footprint represent many servers on the network has been a boon to administrators looking to consolidate space and reduce operating costs. And the ability to quickly stand up a VM copy of a major application or work server for patch testing, allowing administrators to test during business hours, is simply a game changer.
So, how else can we leverage this exciting technology? Well…how about recovery? What if I said we were talking about both physical and virtual environments?
I often speak with administrators who look for ways simply to protect their virtualized assets for the purpose of full recovery in the event of disaster – i.e. their backup solution only backs up, but doesn’t truly embrace their virtual environment. What if we began taking the approach of having the backup software actually use virtualization as a true extension of the recovery plan? Can we take virtualization from being a resource that is typically only backed up to a platform that can be leveraged for recovery of both physical and virtual servers alike?
We at Symantec say YES! …and Backup Exec 2012 is just the catalyst needed to truly and finally unite virtual and physical environments.
Sure, the world is going virtual in a strong way but this is not something that is going to happen overnight. Although many early adopters have moved forward to become near 100% virtualized, most administrators are still governing environments that are heavily comprised of both physical and virtual server assets. As such, administrators need a solution that is not only purpose-built to work for their entire environment but one that takes advantage of the virtual infrastructure specifically to allow them to further leverage their IT investments.
Backup Exec 2012 is just the solution to deliver the proverbial goods. Not only does Backup Exec 2012 offer a fresh new user experience, it also leverages your existing virtual infrastructure for capabilities including instant recovery of any protected physical or virtual server, and the ability to use the cloud to recover, test or migrate any VMware virtual machine in your environment. And for good measure, imagine that when it is time to migrate a physical server to a new virtual body, you simply power on the virtual copy that was created and maintained as part of that server’s standard backup with Backup Exec 2012. Well, it’s time to stop imagining: the time has come to fully realize and embrace the power and flexibility of your virtual environment, and Backup Exec 2012 is here to help you do just that.
So again I ask….have you hugged your VM lately? If not you should- it can help you in more ways than you know!
Interested yet? If so, please take a few moments to visit us virtually on the web, or in person at Symantec VISION 2012 in May, to learn more about how Backup Exec 2012 can help you truly embrace your virtual infrastructure.
In the meantime, check out this short video on leveraging “No Hardware” recovery with Backup Exec 2012:
Planning on attending Symantec VISION US this year? Come learn more in our BE 2012 session!
Discover the power of your virtual environments with Backup Exec 2012
Session IM B15 @ Symantec Vision 2012
MGM Grand – Las Vegas – May 7-10
Come join us at VISION US and learn more about:
- How to embrace and leverage your virtual infrastructure for near-line recovery, DR and sandbox testing
- Migrating to virtual? Learn how to use your existing and future backups to make the process painless
- Need offsite DR for VMs? Come see how we are leveraging the cloud for DR and much more!
At Lotus F1 Team, information is the true currency of business – but the by-product, data, is gold bullion
Many say that a company’s most valuable asset is its human capital. But a strong workforce without ideas is not really strong, or much of a force. So we would say human capital is nothing without information or, to put it another way, knowledge. And it’s generally accepted that knowledge is power.
Power is the currency traded at Lotus F1 Team - a company that Symantec is proud to have worked in partnership with for over 15 years and an affair set to continue into the foreseeable future.
When we say power, we’re not just referring to the engines in its cars, though for sure they are some of the most powerful in the world. But before an engine makes it onto the track, hundreds of engineers and scientists pour thousands of hours into engineering & testing designs before completing a final blueprint. The blueprint is the real prize. It forms the basis of the most valuable asset a business can possess – Intellectual Property. This is the stuff that sets your business apart from competitors, which can be sold or licensed, providing an important revenue stream.
But the kind of innovation taking place on a daily basis at Lotus F1 Team produces a by-product: data – reams of it. Anything not directly related to racing and winning is going to be a distraction, but get sloppy with how you keep and manage data and you may as well hand the blueprints over to your competitor in person. Lotus F1 Team has, for example, developed wind tunnel technology used by the company in the development and measurement of aerodynamic forces – known as Computational Fluid Dynamics (CFD). This is an area of research important not only to car racing but also to the aerospace, road and wind turbine industries, and one pioneered and perfected by F1 over the years.
As well as building and driving fast cars, there’s also a commercial and business side – all of which produces data and business processing that need to be managed effectively, efficiently and securely. Ultimately, the organisation needs to know its power (information, knowledge, deals and ideas) is safe and subject to the highest integrity.
This is why we work with companies like Lotus F1 Team – because we are in the business of protecting and managing data at every level so that innovators can focus on the business they are in.
Have you ever seen pictures of the control room in a power generation plant? It’s an entire wall of dials, knobs, and gauges, all telling you important bits of information about the system. That’s great – and probably necessary! – if your only job were to manage that power plant. But as an IT professional, you have lots of daily jobs; some of you manage Exchange, SQL Server, firewalls, security, storage, servers, you name it. Backup and recovery might be only one of the many jobs you do on a daily basis. And here at Symantec, we don’t want your backup application to be like that power plant – we want you to be able to sit down, do what you need to do, and get on with your day.
Backup Exec has an entirely new user interface. You’re going to like it – it’s simpler, more intuitive, and much easier to navigate. We’ve also taken a lot of time to keep all those great features you are used to in previous versions of Backup Exec – so this interface is going to appeal to the new Backup Exec user and the seasoned professional alike. At-a-glance status is easily available in Backup Exec 2012, both for the servers you are protecting and for the storage you are using to store your backups. The latest version of Backup Exec is just a cleaner, more intuitive way to manage your backup and recovery environment.
We’ve also included a new way to create and manage backup jobs and policies. No longer do you need an advanced degree in Data Protection to set up disk-to-disk-to-tape backups or replicate data between sites – the new Backup Stages feature shows you, in graphical detail, how your backup data will be transferred, when it will be backed up, and where it’s going to be transferred to.
Speaking of backups, how many of you just want to create backups to protect your critical servers and applications without any headaches? How many of you would rather not pore over every detail of application backups? Well, if that sounds like you, Backup Exec has made it much easier to set up backup jobs, because we have included some seriously intelligent defaults – based on our own expertise in data protection and on the most successful backup configurations from our customers and partners – and built them into Backup Exec. If you want to get into the nuts and bolts of backup job creation, however, Backup Exec has all the same great features and customizability you have used before – so you have the right tools to get the job done.
With Backup Exec, we’ve also stepped up our “telemetry” program – gathering non-personally identifiable information from our customers and partners who choose to participate in the program. This gives us invaluable intelligence about how backups and restores are working in the field, and we have used that information extensively to make Backup Exec the easiest to use, full-featured data protection application for physical or virtual environments on the market.
Be one of the first to find out what other new ground breaking backup and recovery features are coming soon in Backup Exec.
Visit the Countdown to Better Backup web site here: http://bit.ly/yenx3z
By Aidan Finley … Symantec’s Aidan Finley talks about simplifying intelligent backup: http://bit.ly/vHJXoa #BetterBackup [Video]
How to help reduce the amount of data you backup …
BEDAT – What’s that?
The Backup Exec Deduplication Assessment Tool (BEDAT) is a utility designed to help partners demonstrate the value of Backup Exec and its deduplication technology to their customers – without having to have Backup Exec installed on the system! BEDAT scans user-selected data sets on one or more Windows-based systems in a customer’s network environment and estimates the deduplication savings that would be experienced if the same systems were protected using Backup Exec or the Backup Exec 3600 Appliance and deduplication – using the same algorithms used in BE itself. BEDAT returns global deduplication results, per resource deduplication results, and per data type deduplication results. BEDAT does not actually capture or transport any customer data during the assessment process; it only captures deduplication fingerprint information and transmits this data to be included in deduplication results.
How does it work?
The Backup Exec Deduplication Assessment Tool (BEDAT) installs to almost any Windows-based, x86 or x64 computer system. When run, it can calculate deduplication results for the system on which it is installed, as well as other systems available on the network. When capturing deduplication data from remote network systems, a small agent is temporarily installed to the remote servers and removed after deduplication calculations have been completed. BEDAT is designed to be as simple and as easy to use as possible. It is a wizard-driven utility that does not require any specific IT expertise to use successfully.
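The fingerprint-based estimation described above can be illustrated with a toy sketch: hash fixed-size chunks of data and count the unique fingerprints. To be clear, BEDAT’s actual chunking and algorithms are Backup Exec internals not shown here; this only demonstrates the general principle of estimating deduplication savings from fingerprints rather than from the data itself.

```python
# Toy sketch of fingerprint-based deduplication estimation. The chunk size
# and approach are illustrative assumptions, not BEDAT's real algorithm.
import hashlib

def dedup_estimate(data, chunk_size=8):
    """Estimate raw vs. deduplicated size by counting unique chunk hashes."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # Only fingerprints are kept, never the data itself -- this mirrors how
    # an assessment tool can estimate savings without capturing customer data.
    fingerprints = {hashlib.sha256(c).hexdigest() for c in chunks}
    raw = len(chunks) * chunk_size
    deduped = len(fingerprints) * chunk_size
    return raw, deduped

raw, deduped = dedup_estimate(b"ABCDEFGH" * 10)  # highly repetitive data
print(raw, deduped)
```

Highly repetitive data collapses to a handful of unique fingerprints, which is exactly the storage-savings story a deduplication assessment is meant to tell.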
Who is it for?
The Backup Exec Deduplication Assessment Tool (BEDAT) is designed to be used by Backup Exec partners as they help customers understand the storage optimisation benefits of the deduplication technology found in Backup Exec. If you are a customer please feel free to ask your Symantec IT supplier to provide this service.
Where can Partners get it?
The Backup Exec Deduplication Assessment Tool (BEDAT) is available for partners to download at the Symantec PartnerNet site. For end user customers interested in using BEDAT in their environments, please contact a local Symantec partner and talk to them about how Backup Exec can help optimise your backup storage and network utilisation.
What platforms and data types are supported?
The Backup Exec Deduplication Assessment Tool (BEDAT) supports Windows 2003 and Windows 2008 x86 and x64 platforms, including both physical and virtual systems. It supports estimating deduplication results for file system data, Exchange data, and SQL data.
Please note: while designed to be highly accurate, the results offered by the Backup Exec Deduplication Assessment Tool (BEDAT) represent estimates of the storage savings that would be gained by using Backup Exec deduplication technology.
Yes, it’s true – we are becoming a nation of information addicts, at least according to a survey Symantec recently carried out. Symantec wanted to find out more about how the so-called information explosion is affecting the everyday lives of British office workers. What was abundantly clear is that we are all suffering from this 21st-century ailment – Information Overload (sounds like a Tom Cruise film, or an AC/DC album) – and it is overtaking not only our working lives, but our personal ones too.
Accessing work information out of hours, compulsively checking emails, texts and social media and hoarding endless emails and multiple versions of the same file are all symptoms of information overload experienced by those we surveyed. See the stats here.
But whereas the technologies enabling us to do this (fantastic mobile devices and faster connectivity) all purport to make us more productive in the workplace, is our mismanagement of information actually counter-productive?
IDC recently estimated that in 2011 over 1.8 zettabytes of information was created and replicated (IDC, “The 2011 Digital Universe Study: Extracting Value from Chaos”), and if we go by Moore’s Law this will continue to grow almost immeasurably over the coming years. What does this mean for our state of mind and the systems we work with – will we reach a moment when we are essentially ‘drowning’ in information?
Not if the technologies that store and manage information also continue to improve. We are working very hard to make managing information easier, faster and more efficient for businesses of all sizes. This means making sure that what is actually useful and valuable is stored, archived and backed up correctly, while the rest is relegated to permanent deletion.
But technology can only go so far; some of the onus is still on businesses and individuals to moderate their work behaviour to take into account this new work paradigm.
Part II – What can we do about it?
- Unite Virtual and Physical: Powered by Symantec V-Ray technology, Backup Exec 2012 enables visibility across both virtual and physical environments for fast and efficient backup and recovery while eliminating the need for specialised point products.
- Eliminate Backup Complexity with a New Administration Console: A newly redesigned administration console will provide users with fast, concise management and monitoring capabilities.
- Integrated Disaster Recovery: With bare-metal disaster recovery and Backup Exec’s “no hardware DR” built in, organizations will be able to easily recover a failed system to a physical server, or to a Hyper-V or VMware guest.
- Capacity Licensing: A new capacity licensing model for Managed Service Providers (MSPs), mid-sized and lower-enterprise organizations will make purchasing and maintenance by capacity easier, as an alternative to the existing a la carte pricing.
- Small Business Edition: In less than 10 minutes and with just three simple steps, Backup Exec 2012 Small Business Edition will install and configure backups so small businesses with limited IT experience can protect their data with ease. The new Backup Exec Small Business Edition will bundle Symantec’s data and system recovery technology into one affordable solution with a single license that’s designed specifically for a growing business.
We’ve produced a really smart little app – well, it’s not actually an App quite yet, but will be pretty soon – that helps anyone who doesn’t know what all the agents and options for Backup Exec actually do: http://www.symantec.com/redirects/backup_exec/beguide/
I’ve been meaning to do this forever, but now I am delighted to announce the newly available website. Launched at VISION EMEA in Barcelona, 4-6 October, this is, in a nutshell, the Backup Exec Guide “Made Simple”. For customers, partners … anyone interested in backup, frankly, it drills into each Agent or Option and explains what it does.
This publication is designed to give an overview of Backup Exec 2010 R3. It is aimed at all professionals working in or with Symantec Backup Exec and Symantec System Recovery to improve general knowledge of Symantec’s Data Protection strategy and wider Information Management capabilities. Using this document you should be able to build availability solutions and data protection strategies for all types of organisation.
Maximise the potential of Symantec Backup Exec
Whether you want to go straight to Agents & Options or whether you want to search by Backup priority including Virtualization, Storage Management, Disaster Recovery, Application Recovery or Management & Platform Support, there is a route for you.
The portable website has been designed to be accessed in multiple ways. We will shortly be announcing the BE Guide Mobile App available for iPhone, iPad and Android. More-to-follow.
Find out which Backup Exec Agents and Options are right for your business using this new interactive guide – simply click to select a business priority or view our complete list of Agents and Options. Simple, easy and effective – have a play today.
Adding to the Guide all the time
We will be adding content to the guide as we go – this will be of a technical nature, so keep an eye out for updates.