Virtualized Exchange Servers in Distributed Configurations
As previously described, Backup Exec supports modern image-level (“agentless”) protection of VMware and Hyper-V virtual machines, including virtual machines hosting applications such as Exchange. It’s important to note that Backup Exec does not currently support image-level backups of virtualized Exchange servers in a distributed configuration. Only virtual standalone Exchange servers are supported for image-level backup and granular recovery.
In order to achieve granular recovery support of virtualized Exchange servers in a distributed configuration, such as an Exchange 2010 Database Availability Group (DAG), the virtual machines must be protected using agent-based backups, an approach that essentially treats each virtual machine as if it were a standalone, physical system.
Note: Backup Exec does not support granular recovery of Exchange 2013 mailbox objects. This functionality is planned for a later release of Backup Exec.
Protection of Physical Exchange Servers
For physical Exchange servers, the Agent for Windows is installed locally to the Exchange server. The Agent for Windows interacts with the physical Exchange server to prepare the Exchange databases for backup and to transmit backup data to the Backup Exec server over the NDMP protocol.
Backup of Physical Exchange Servers
VSS Integration and Physical Exchange Servers
Backups of physical Exchange servers that are captured by the Agent for Windows are snapshot backups performed using Microsoft’s VSS Writers (the only exception is Exchange 2003, which does not have a VSS writer). In most cases, Backup Exec uses a VSS full backup, which ensures that Exchange is placed into a consistent state at the time of backup and also truncates transaction logs, a key element of maintaining a healthy database application over time.
The Agent for Windows can only protect Exchange components of a server after the Agent for Applications and Databases has been licensed within Backup Exec.
Granular Application Recovery of Physical Exchange Servers
In addition to preparing physical Exchange servers for backup and transmitting Exchange backup data to the Backup Exec server for storage, the Agent for Windows also plays a key role during Exchange recovery. For example, the presence of the Agent for Windows locally installed to a physical Exchange server enables the Backup Exec server to directly transmit and restore granular Exchange objects back to the production Exchange environment of an organization.
Offhost Backups of Physical Exchange Servers
Backup Exec also supports offhost backups of physical Exchange servers. Offhost backups move the processing overhead of backup operations off the physical Exchange server and onto the Backup Exec server.
For more information on Backup Exec and configuring offhost backups of physical Exchange servers, refer to the Backup Exec Administrator’s Guide and the following technote: http://www.symantec.com/docs/HOWTO12231.
Granular Application Recovery of Exchange Virtual Machines
To enhance Backup Exec’s virtual machine protection and recovery capabilities, particularly when the virtual machine is hosting Exchange, the Agent for Windows should be installed into the guest virtual machine itself. In this configuration, Backup Exec can still capture snapshot-based, image-level backups of the destination virtual machine, but can then also offer dynamic application discovery capabilities and granular recovery of Exchange application components, all from a single-pass backup. In other words, even with the Agent for Windows installed to the virtual machine, the backup process remains what is known in the industry as an “agentless” backup; the presence of the Agent for Windows within the virtual machine simply allows for application metadata capture and granular recovery of application objects directly back to the original virtual machine.
Agent for Windows Enables Granular Recovery of Virtualized Exchange Servers
While Backup Exec fully supports protecting virtualized Exchange servers without installing the Agent for Windows to the virtual machine, recovery options are limited in this configuration. When the Agent for Windows is not present on the Exchange virtual machine, Backup Exec has no direct knowledge of Exchange being present on the virtual machine, and recovery options are limited to full virtual machine recovery and file/folder recovery.
Application-specific recovery features are only available when the Agent for Windows is installed to the Exchange virtual machine, which allows Backup Exec to discover the Exchange application and capture the Exchange metadata needed to enable application-specific recovery features.
VSS Integration and Virtualized Exchange Servers
When protecting virtualized Exchange servers, Backup Exec utilizes Microsoft’s VSS service to prepare the Exchange virtual machine for backup and to truncate Exchange transaction logs. For VMware environments, these VSS calls are made to the Agent for Windows on the Exchange virtual machine through interactions with the vStorage API, and involve the VSS writer on the virtual machine; this will be either the VSS writer included with VMware Tools or the Backup Exec VSS writer that is installed with the Agent for Windows. For Hyper-V environments, a similar process happens through interactions with the Hyper-V host via the Agent for Windows installed to the Hyper-V host. The VSS writer used to prepare the virtual machine for backup will be either the VSS writer installed to the virtual machine with Hyper-V Integration Services or the Backup Exec VSS writer that is installed with the Agent for Windows.
With either VMware or Hyper-V environments, Backup Exec invokes a virtual machine-level VSS full backup, which prepares Exchange for the virtual machine snapshot event and truncates Exchange transaction logs. If the Agent for Windows is installed to the Exchange virtual machine, the VSS backup method can be changed to a VSS copy, which will not truncate log files.
For more information, refer to the following technote: http://www.symantec.com/docs/HOWTO74082.
Uniquely Named Mailbox
To enable key features related to the protection and recovery of Exchange servers, such as granular recovery of Exchange objects, Backup Exec must have access to a uniquely named mailbox within the Exchange infrastructure. Access to this mailbox enables Backup Exec to interact with Exchange and important components within the Exchange Information Store. In order to enable granular recovery of Exchange objects, you must use the appropriate Exchange Server management utility to assign the user account to the Exchange Organization Administrators role (Exchange 2007) or the Exchange Organization Management role (Exchange 2010/2013).
The uniquely named mailbox cannot be hidden in the Exchange Global Address List.
For more information about this mailbox and associated requirements, refer to the Backup Exec Administrator’s Guide or the following technote:
- Ensuring Exchange mailbox name is unique http://www.symantec.com/docs/TECH24691.
Exchange Management Tools
You must install the Exchange Management Tools on the Backup Exec server. The management tools on the Backup Exec server must be the same version as, or a later version than, the management tools on the Exchange server. For more information about installing the Exchange Management Tools, refer to your Microsoft Exchange documentation.
Protection of Virtualized Exchange Servers
For virtualized Exchange servers, Backup Exec interacts with the Exchange server through the virtual host, either through software APIs provided by the virtual infrastructure (VMware), or through the Agent for Windows installed to the virtual host (Hyper-V). For virtualized Exchange servers, Backup Exec fully supports what is generally known as “agentless” backup, both for VMware as well as Hyper-V environments.
Backup of Virtualized Exchange Servers
For additional information on requirements for protecting Exchange environments using Backup Exec, refer to the Backup Exec Administrator’s Guide and the following technotes:
- General Exchange protection requirements: http://www.symantec.com/business/support/index?page=content&id=HOWTO24128
- Exchange granular recovery requirements: http://www.symantec.com/docs/TECH51740
Note: “Licensing Backup Exec in Exchange Environments” provides more information about the Agent for Applications and Databases licensing.
We have become a species of information addicts – the “information explosion” is affecting the everyday lives of office workers. What is crystal clear is that we are all suffering from a 21st century ailment – Information Overload – and it is taking over our personal lives, working lives, and our businesses.
Businesses need to protect a broad range of information, generated in a plethora of ways, through multiple applications used by billions of individuals around the world. Organisations not only need to protect their information and IT infrastructure, but also need to understand how that infrastructure facilitates the sharing and use of information – not just among colleagues, but between businesses, from businesses to consumers, and between consumers themselves. In other words, all of the collaborative environments and the movement of data while it is in use.
Accessing work information out of hours, compulsively checking emails, texts and social media, and hoarding endless emails and multiple versions of the same file are all symptoms of information overload. But the very technology that enables us to be more productive – fantastic mobile devices and faster connectivity – becomes counter-productive when combined with the mismanagement of information.
Email is Business Critical
Email has become an indispensable way of communicating and transferring data in the modern electronic age. In the year 2010, it was estimated that almost 300 billion emails were sent each day, and around 90 trillion emails were sent every year. Considering the rate at which data continues to increase year-over-year, the number of emails sent today is likely significantly greater. Email is used for many forms of communication, including business critical communications for companies of all sizes.
Companies rely heavily upon email systems to conduct day-to-day business operations, and any significant loss of access to email is considered intolerable.
All email solutions used by modern businesses are based upon a server infrastructure hosting an email software system. Whether hosted locally on physical or virtualized servers, hosted by a partner, or hosted in the cloud, these email software systems support the incredible amount of email transmissions that happen every day, and can be implemented in many different sizes and configurations. Perhaps the most common and popular email system used in the industry today is Microsoft Exchange.
Because Microsoft Exchange plays such a critical role in the ability for organizations to conduct day-to-day business, it’s equally critical that companies employ protection solutions that enable them to quickly and easily recover their Exchange system should a data loss or disaster event occur. Backup and recovery solutions of the highest value will offer features that enable the following:
- Functionality designed specifically for Microsoft Exchange
- Protection of Exchange while it remains online and functional
- Ability to protect physical Exchange servers as well as virtualized Exchange systems
- Support for highly available Exchange configurations
- Adherence to Microsoft best practices for Exchange backup and recovery
- Optimization of secondary (backup) storage using data deduplication technology
- Support for local as well as offsite storage of backup data
- Multiple levels of recovery from a single-pass backup
Symantec Backup Exec
For many years, the Symantec Backup Exec product family has offered market-leading solutions for the protection of Microsoft Exchange, and solves each of the key problems mentioned above. The Agent for Applications and Databases offers purpose-built functionality to ensure Microsoft Exchange is properly protected against disaster and to help partners and customers quickly and easily perform any level of Exchange recovery, whether it’s bare metal recovery of a physical Exchange server, granular recovery of an individual Exchange email, or anything in between.
Key benefits of Backup Exec include the ability to:
- Reduce business downtime
- Eliminate complexity
- Spend less time managing backups
- Ensure critical information on virtual or physical systems is always protected
- Restore data in seconds
- Reduce storage and management costs
- Optimise network utilisation
- Eliminate redundant additional backup jobs
- Provide granular recovery of data for applications and databases
Underlying Technical Principles
Exchange Mailbox Archiving Option Basic Architecture
The archiving technology embedded within Backup Exec 2012 SP2 is based upon Enterprise Vault and uses the same core archiving storage components found in Enterprise Vault – so it’s proven technology. The storage components shared by Enterprise Vault and Backup Exec include:
- Vault store
- Vault store partitions
- Fingerprint database (Single Instance Storage (SIS) Deduplication)
The Backup Exec 2012 SP2 server manages the vault store and vault store partitions as a storage device and writes data into archives from backup data sources according to the backup jobs configured by the administrator. The administrator defines the archiving policies for Exchange, such as 90-day or 200-day retention.
Figure 2: Archiving Storage Components Diagram
Vault Store
The vault store is the parent container for archived data and is managed as a storage device by the Backup Exec 2012 SP2 server. The vault store is separated into partitions, which contain the actual archive data. Only one partition can be open at any given time to receive new archive data; however, archives can span more than one partition.
Vault Store Partition
A vault store partition is a path to storage (e.g. ‘E:\Archive’). A vault store partition can be in one of two states: open or closed. The open partition is the partition to which new archive data is written and stored. As mentioned previously, only one partition can be open at a time, even though archives can span across partitions.
Vault store partitions with the closed state do not receive new archive data; however, closed partitions can still be read for data recovery purposes and can have data elements deleted according to archive expiration policies configured by the Backup Exec 2012 SP2 Administrator.
Archive
An archive is a collection of archived data. For the purposes of the Backup Exec 2012 SP2 Exchange Mailbox Archiving Option, one archive corresponds to one backed-up user mailbox.
Archives can have new data added to them and can have old data deleted from them according to the settings configured by the Backup Exec 2012 SP2 administrator. An archive can span more than one vault store partition, since the partition that was open at the time the archive was created may not be the same partition that is open and receives new data for the archive at a later time.
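The open/closed partition behaviour described above can be illustrated with a short sketch. This is a toy model under assumed semantics – only one partition is open at a time, closed partitions remain readable, and a mailbox archive may span partitions – and is not the actual Enterprise Vault implementation.

```python
class VaultStore:
    """Toy model of a vault store with open/closed partitions (illustrative only)."""

    def __init__(self):
        self.partitions = []      # each partition: {"state": ..., "items": [...]}
        self.open_partition()

    def open_partition(self):
        # Close the current partition, if any; only one partition may be open.
        for p in self.partitions:
            p["state"] = "closed"
        self.partitions.append({"state": "open", "items": []})

    def archive(self, mailbox: str, item: str):
        """New archive data is always written to the single open partition."""
        open_p = next(p for p in self.partitions if p["state"] == "open")
        open_p["items"].append((mailbox, item))

    def read_archive(self, mailbox: str):
        """An archive can span partitions: read from open and closed alike."""
        return [item for p in self.partitions
                for (mb, item) in p["items"] if mb == mailbox]

store = VaultStore()
store.archive("alice", "msg-1")
store.open_partition()            # roll to a new partition; the old one closes
store.archive("alice", "msg-2")   # alice's archive now spans two partitions
items = store.read_archive("alice")
```

Note how recovery still sees the whole archive even though its items live in different partitions, mirroring how closed partitions remain readable for restores.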
Although archived data is deleted at the source – thereby freeing storage space on the Exchange server – end users can still locate their archived data in the same way that they could using Enterprise Vault.
Unique Value of Backup Exec 2012 SP2 and Integrated Archiving
For small and medium size environments, Backup Exec 2012 SP2 offers a unique approach to archiving through the unification of backup and archiving processes into a single offering. By linking backup and archiving technologies into a single solution, administrators can both protect critical servers and applications for disaster recovery and also archive email and file system data to secondary or tertiary storage. Backup Exec 2012 SP2 enables administrators to realize both backup and archiving benefits while only ‘touching’ critical servers and applications once.
Integration with Enterprise Vault Technology
Backup Exec 2012 SP2 includes integrated archiving solutions for both file system data and Exchange email data. The archiving technology within Backup Exec 2012 SP2 is based on the proven, market-leading Enterprise Vault family of products. By leveraging this proven technology, Backup Exec 2012 SP2 is able to offer the following advantages to administrators:
- A single, integrated solution for both backup and archiving
- Lower total cost of ownership from using a single product to solve two key IT problems
(1) Protection of critical servers and applications for disaster recovery
(2) Controlled archiving of file system and Exchange email data to lower cost storage
- Lower impact on production servers as backup and archiving are achieved from a single ‘touch’
- Compatibility and interoperability assurance from true technology integration
- Reliability from utilizing market-proven archiving technology
Archiving Process Diagram
As organizations grow and expand, upgrade paths are available that enable organizations to transition from the integrated version of Enterprise Vault in Backup Exec 2012 SP2 to the full Enterprise Vault solution.
Exchange Mailbox Archiving Option
Backup Exec 2012 SP2 licenses its integrated archiving technology through two product options: the Exchange Mailbox Archiving Option and the File System Archiving Option. This blog is designed to assist partners and customers as they design and implement Backup Exec 2012 SP2 and the Exchange Mailbox Archiving Option.
Deduplication Methods – Client-side Deduplication
With Backup Exec 2012 SP2, exciting possibilities for remote office protection are available. The concept of client-side deduplication – where the remote system is responsible for deduplication calculations and where backup data is sent over the network in its deduplicated form – can make the process of protecting remote offices a much more streamlined experience. Remote offices can be challenging to protect effectively; WAN links may offer only a fraction of the bandwidth available to a LAN backup, and backups over the WAN can be a challenge to set up as well as to complete. In addition, some environments include backup servers that are less powerful than the application servers they protect – often the SQL server or the Exchange server is the most powerful machine available in terms of processor speed or disk throughput. Where appropriate, why not leverage some of that remote computing power to achieve faster backups? In both of these situations, client-side deduplication can offer a comprehensive solution to the data protection challenges posed by the environment.
Generally, remote office backup strategies have two basic architectures. First, there are remote offices which do not have local storage, and where backup data is sent directly over the LAN or WAN to the central data center for storage. Second, there are remote offices that employ local storage and then “forward” that locally stored backup data to the central data center for protection. Both of these configurations can use the Backup Exec 2012 SP2 Deduplication Option to streamline and improve backup and recovery for remote offices.
Client-side deduplication is the act of skipping redundant data blocks at the backup source before transmitting the backup stream to the Backup Exec server. Data from the source system is refined into smaller deduplication blocks, and only the unique blocks (that is, the data the Backup Exec server doesn’t yet contain) are sent to the Backup Exec server’s deduplication disk storage device.
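The exchange just described – the client fingerprints its blocks, asks the server which ones it lacks, and transmits only those – can be sketched in a few lines. This is a simplified, hypothetical protocol (fixed-size blocks, SHA-256 fingerprints, an in-memory stand-in for the server), not the actual Backup Exec wire format.

```python
import hashlib

BLOCK_SIZE = 4  # tiny block size for illustration; real systems use KB-scale blocks


class DedupServer:
    """Stands in for the Backup Exec server's deduplication disk storage."""

    def __init__(self):
        self.store = {}  # fingerprint -> block data

    def missing(self, fingerprints):
        """Report which fingerprints the server does not yet hold."""
        return {fp for fp in fingerprints if fp not in self.store}

    def receive(self, blocks):
        for block in blocks:
            self.store[hashlib.sha256(block).hexdigest()] = block


def client_side_backup(data: bytes, server: DedupServer) -> int:
    """Deduplicate at the source: send only blocks the server lacks.
    Returns the number of blocks actually transmitted."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    fingerprints = [hashlib.sha256(b).hexdigest() for b in blocks]
    needed = server.missing(fingerprints)
    to_send = {}  # send each unique missing block only once
    for block, fp in zip(blocks, fingerprints):
        if fp in needed and fp not in to_send:
            to_send[fp] = block
    server.receive(to_send.values())
    return len(to_send)


server = DedupServer()
sent_first = client_side_backup(b"AAAABBBBAAAACCCC", server)   # "AAAA" repeats
sent_second = client_side_backup(b"AAAABBBBDDDD", server)      # only "DDDD" is new
```

The first backup transmits three unique blocks rather than four; the second transmits one, since the server already holds the rest – exactly the network saving the section describes.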
A deduplication disk storage device is a special type of disk storage configured by Backup Exec where all deduplication data blocks are stored. With the client-side deduplication method, the majority of the processing necessary for deduplication is done on the remote system rather than as the data arrives at the Backup Exec server. Client-side deduplication is the default deduplication method Symantec recommends for several reasons:
Greater Scalability
Client-side deduplication enables greater scalability by spreading processor usage out across all clients running backups, enabling the Backup Exec server to process more concurrent backups.
Reduced Network Data Transfers
Client-side deduplication minimizes network data transfers as only unique data blocks – not yet stored by the Backup Exec server – are transferred. Most environments – either LAN or WAN environments – can benefit from less data being sent across the network.
Each Backup Exec Agent for Windows and Agent for Linux has the built-in capability to perform client-side deduplication calculations. Note that all deduplication operations require the Deduplication Option to be licensed on the Backup Exec server.
| Backup Exec 2012 SP2 Agent | Client Deduplication Support |
| --- | --- |
| Agent for Windows | Yes |
| Agent for Linux | Yes |
| Agent for Mac | No |
| Agent for Applications and Databases | Yes |
| Agent for VMware and Hyper-V (VMware) | No* |
| Agent for VMware and Hyper-V (Hyper-V) | Yes** |

*While it is possible to utilize client-side deduplication when protecting VMware virtual machines, this configuration requires that backups be processed by locally installed agents within the virtual machines themselves (the Agent for Windows or the Agent for Linux). This configuration bypasses the optimized, image-level backup capabilities of the Agent for VMware and Hyper-V in VMware environments. For these reasons, using client-side deduplication in VMware environments is generally not recommended; Backup Exec server-side deduplication is usually optimal.

**Client-side deduplication can be used when protecting Hyper-V environments using the Agent for VMware and Hyper-V. In this configuration, optimized, image-level backups of virtual machines are captured and deduplicated through the Backup Exec Agent for Windows installed locally to the Hyper-V host. It is not necessary to install an individual agent into each Hyper-V virtual machine in order to realize client-side deduplication in Hyper-V environments.
Modern Data Management and Protection Challenges
Customers of all types and sizes are seeking new and innovative ways to overcome challenges associated with data growth and storage management. While these challenges are not necessarily new, they continue to become more complex and more difficult to overcome due to the following:
- Pace of data growth has accelerated
- Location of data has become more dispersed
- Linkages between data sets have become more complex
Data and storage management challenges are compounded by the need for companies to protect critical data assets against disaster through backup and recovery solutions. In order to maintain backups of critical data assets, additional secondary storage resources are required. This additional layer of backup storage must be implemented wherever backups occur, including central data centers and remote offices.
Storage Efficiencies through Data Deduplication
Backup Exec 2012 includes advanced data deduplication technology that allows companies to dramatically reduce the amount of storage required for backups, and to more efficiently centralize backup data from multiple sites for assured disaster recovery. These data deduplication capabilities are available in the Backup Exec 2012 Deduplication Option.
Backup Exec 2012 Data Deduplication Technology
The data deduplication technology within Backup Exec 2012 breaks down streams of backup data into “blocks.” Each data block is identified as either unique or non-unique, and a tracking database is used to ensure that only a single copy of a data block is saved to storage by that Backup Exec server. For subsequent backups, the tracking database identifies which blocks have been protected and only stores the blocks that are new or unique. For example, if five different client systems are sending backup data to a Backup Exec server and a data block is found in backup streams from all five of those client systems, only a single copy of the data block is actually stored by the Backup Exec server. This process of reducing redundant data blocks that are saved to backup storage leads to significant reduction in storage space needed for backups.
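The block-and-tracking-database process described above can be made concrete with a short sketch. This is an assumed design for illustration – fixed-size blocks, SHA-256 fingerprints as the tracking database keys, and a per-backup "recipe" of fingerprints for restore – not the actual Backup Exec implementation.

```python
import hashlib


class DedupStore:
    """Toy fingerprint-tracking deduplication store (illustrative only)."""

    def __init__(self, block_size=4):
        self.block_size = block_size
        self.blocks = {}    # fingerprint -> block data, each stored exactly once
        self.backups = {}   # backup name -> ordered list of fingerprints

    def backup(self, name: str, data: bytes) -> int:
        """Ingest a backup stream; return how many new unique blocks were stored."""
        new = 0
        recipe = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            fp = hashlib.sha256(block).hexdigest()
            if fp not in self.blocks:   # tracking-database lookup
                self.blocks[fp] = block
                new += 1
            recipe.append(fp)
        self.backups[name] = recipe
        return new

    def restore(self, name: str) -> bytes:
        """Rebuild the original stream from the stored unique blocks."""
        return b"".join(self.blocks[fp] for fp in self.backups[name])


store = DedupStore()
store.backup("client1", b"AAAABBBBCCCC")            # three unique blocks stored
added = store.backup("client2", b"AAAACCCCDDDD")    # only "DDDD" is new
restored = store.restore("client2")
```

Even though the second client's stream overlaps the first, only the one genuinely new block is written to storage, yet each backup can still be restored in full from its recipe of fingerprints.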
Figure 1: Deduplication Process
The deduplication technology within Backup Exec is applied across all backups managed by a deduplication-enabled Backup Exec server.
Deduplication Methods within Backup Exec 2012
The Backup Exec 2012 Deduplication Option gives backup administrators the flexibility to choose when and where deduplication calculations take place. Three deduplication methods are supported by Backup Exec 2012. These are as follows:
Client-side Deduplication
The client-side deduplication method is a software-driven process. Deduplication takes place at the source or protected client, and backup data is sent over the network in deduplicated form to the Backup Exec server. Only unique blocks of backup data are sent to the backup server and saved to backup storage; non-unique blocks are skipped.
Backup Exec Server-side Deduplication
The server-side deduplication method is also a software-driven process. Deduplication takes place after backup data has arrived at the Backup Exec server and just before data is stored to disk (also known as inline deduplication). Only unique blocks of backup data are stored; non-unique blocks are skipped.
Third-party Appliance Deduplication
The third-party appliance deduplication method is a hardware-driven process enabled through Symantec OpenStorage (OST) APIs. Deduplication takes place on the third-party appliance itself, using either inline or post-process deduplication (for example, ExaGrid or Quantum appliances). Third-party appliance deduplication devices handle all aspects of deduplication.
Administrators can mix and match deduplication methods to fit their unique needs. For example, a single Backup Exec server enabled for deduplication can simultaneously use client-side deduplication for some jobs, server-side deduplication for other jobs, and third-party appliance deduplication for yet another set of jobs.
Figure 2: Deduplication Methods
The different deduplication methods supported by Backup Exec 2012 have various configurations for which they are best suited. The benefits of each method, as well as the configurations for which each method is best suited, will be detailed in the following weeks.
Very loosely, we were instructed to delete everything from before the dot-com bubble burst (2000) and keep everything after it – and now we are fast running out of data centre disk space.
In fact, it’s a wonder we manage to do anything given the amount of information we need to process. As a consequence we are now facing a greater threat: too much information. Somewhere between 60 and 160 billion emails are sent around the world every single day, including attachments such as reports, presentations, letters and pictures. In spite of limitations such as privacy concerns and unwanted mail, email remains the most efficient, quick and cheap way to communicate. The danger with email, as with any other way of sharing information, is that too much information simply clogs the system up and becomes a bottleneck to productivity.
Here are some useful top tips that may help:
- Understand the new business user – organisations must better understand the challenges employees are facing when navigating the world of information management. Look at when and how employees are accessing their information, make sure that data is indexed and categorised, and that intelligent archiving and search tools are available
- Prepare the infrastructure – with the relentless flow of information only set to continue, IT infrastructure must be able to cost effectively manage the increasing requirements for storage by implementing solutions able to dedupe and archive appropriately, automate processes and monitor and report on system status across all different devices and environments
- Prepare people – create IT policies that educate employees on how to manage their information – from email practices like limiting the ‘CC’ and ‘reply-to-all’ culture, to saving only the latest document version and overcoming the fear of the delete button. Help employees understand the company’s information retention strategy so they know what information is recoverable. This will empower them to take charge of information control and maintain productivity and efficiency
- Keep security front of mind – it seems like an obvious statement, but reinforcing company security policies around mobile devices could protect against significant and damaging data loss. Make sure employees know the company processes and take advantage of technologies that enable the IT department to see where the most important information is, at all times
- Encourage staff to switch off – with the information era in full swing and with more and more opportunity for employees to stay connected at all times, it’s important that organisations support staff welfare and encourage them to switch off every once in a while
Seriously consider optimising your storage to reduce overall front end storage usage. Improving capacity can be done through integrated archiving and deduplication as well as tiering your storage. Archiving moves old data to a separate store so you don’t have to backup the same data day-in, day-out – forever. Deduplication only backs up data (at a block level) once, using a pointer to the unique data. So you can both reduce the amount you backup as well as dramatically reducing your backup window with archiving and data deduplication.
But, I hear you say, if I implement deduplication technology what are the benefits? Well, Backup Exec can help with that too. Read all about the Backup Exec Deduplication Assessment Tool in Part III.
Keep your business up and running – by discovering backup and storage management inefficiencies you can cut costs while making sure that your data is fully protected, which is highly beneficial at a time when budgets are under strain.
It is really useful to go through the process of trying to find out:
- How well your data is protected
- If you are missing backing up critical data
- How prepared you are for increasing data volumes
- Whether your strategy supports business growth or lowers performance
- If you are taking advantage of the most cost-effective solutions available
It never ceases to amaze me how well we don’t know ourselves. To quote Polonius, “to thine own self be true”; the more honest you are with yourself, the more accurate and useful the results will be. We know our business, don’t we? We know that we are doing the best we can, aren’t we? It’s not as though someone is trying to catch you out – give it a go. There are some pretty simple questions you can ask yourself just to get going, because the world has moved on and the drivers for improving backup and recovery operations are ever stricter:
- How can you keep business-critical applications running, delivering improved ROI, while complying with regulations?
- How can you justify spending in times of budgetary constraint by demonstrating the quality and effectiveness of your systems?
- What is the best way to convince business users of the importance of investing in backup solutions, before data is lost, while also establishing what should be backed up – and why?
- How confident are you that you can cover all your IT service requirements? If you are not very confident, how confident are you – 25 percent, 50 percent – and can you still meet all of your business service agreements?
- What level of backup reporting do you have that allows you to justify future IT investment to optimise your recovery time objectives? Is there a requirement for reporting metrics, occasionally, or more regularly?
- How confident are you that your main business managers understand the importance of backup? Most of us take backup for granted but how confident are you that your backup policy covers all areas of the business?
Are you confident that you have the right backup and recovery systems in place and are getting the most out of them? A backup and recovery, or storage, assessment will highlight areas of weakness but also help to identify where Backup Exec can improve efficiencies and save you money.