Very loosely, we were instructed to delete everything pre dot-com bubble (2000) and keep everything post – and now we are fast running out of data centre disk allocation space. Err?
In fact it’s a wonder we manage to do anything given the amount of information we need to process. As a consequence we are now facing a greater threat – too much information. Somewhere between 60 and 160 billion emails are sent around the world every single day, many with attachments such as reports, presentations, letters and pictures. In spite of limitations such as privacy concerns and unwanted mail, email remains the most efficient, quick and cheap way to communicate. The danger with email, as with any other way of sharing information, is that too much of it simply clogs the system up and becomes a bottleneck to productivity.
Here are some useful top tips that may help:
- Understand the new business user – organisations must better understand the challenges employees are facing when navigating the world of information management. Look at when and how employees are accessing their information, make sure that data is indexed and categorised, and that intelligent archiving and search tools are available
- Prepare the infrastructure – with the relentless flow of information only set to continue, IT infrastructure must be able to cost effectively manage the increasing requirements for storage by implementing solutions able to dedupe and archive appropriately, automate processes and monitor and report on system status across all different devices and environments
- Prepare people – create IT policies that educate employees on how to manage their information – from email practices like limiting the ‘CC’ and ‘reply to all’ culture, to saving only the latest document version and overcoming the fear of the delete button. Help employees understand the company’s information retention strategy so they know what information is recoverable. This will empower them to take charge of information control and maintain productivity and efficiency
- Keep security front of mind – it seems like an obvious statement, but reinforcing company security policies around mobile devices could protect against significant and damaging data loss. Make sure employees know the company processes and take advantage of technologies that enable the IT department to see where the most important information is, at all times
- Encourage staff to switch off – with the information era in full swing and with more and more opportunity for employees to stay connected at all times, it’s important that organisations support staff welfare and encourage them to switch off every once in a while
Seriously consider optimising your storage to reduce overall front-end storage usage. Capacity can be improved through integrated archiving and deduplication, as well as by tiering your storage. Archiving moves old data to a separate store so you don’t have to back up the same data day-in, day-out – forever. Deduplication backs up data (at a block level) only once, using a pointer to the unique data. So with archiving and data deduplication you can both reduce the amount you back up and dramatically shrink your backup window.
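The block-level idea can be sketched in a few lines: split each file into fixed-size blocks, hash each block, and keep only blocks whose hash has not been seen before, storing a pointer (the hash) for every block of every file. This is a minimal illustration of the general technique, not how Backup Exec implements it:

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for simplicity; real products often use variable-size chunking

def dedupe_backup(files, store=None):
    """Back up files into a content-addressed store, keeping each unique block once.

    files: dict of filename -> bytes.
    Returns (store, manifests): store maps block-hash -> block bytes;
    manifests maps filename -> ordered list of block hashes (the 'pointers').
    """
    store = {} if store is None else store
    manifests = {}
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            h = hashlib.sha256(block).hexdigest()
            store.setdefault(h, block)   # store the block only the first time it is seen
            hashes.append(h)
        manifests[name] = hashes
    return store, manifests

def restore(name, store, manifests):
    """Rebuild a file from its manifest by following the pointers back to the unique blocks."""
    return b"".join(store[h] for h in manifests[name])
```

Two files that share content consume storage only once – the second copy is just a list of pointers – which is exactly why repeated full backups of largely unchanged data shrink so dramatically.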
But, I hear you say, if I implement deduplication technology what are the benefits? Well, Backup Exec can help with that too. Read all about the Backup Exec Deduplication Assessment Tool in Part III.
Yes, it’s true – we are becoming a nation of information addicts – at least according to a survey Symantec recently carried out. Symantec wanted to find out more about how the so-called information explosion is affecting the everyday lives of British office workers. What was abundantly clear is that we are all suffering from this 21st century ailment – Information Overload – which sounds like a Tom Cruise film or an AC/DC album – and it is overtaking not only our working lives, but our personal ones too.
Accessing work information out of hours, compulsively checking emails, texts and social media and hoarding endless emails and multiple versions of the same file are all symptoms of information overload experienced by those we surveyed. See the stats here.
But while the technologies enabling us to do this (fantastic mobile devices and faster connectivity) all purport to make us more productive in the workplace, is our mismanagement of information actually counter-productive?
IDC recently estimated that over 1.8 zettabytes of information was created and replicated in 2011 (IDC, “The 2011 Digital Universe Study: Extracting Value from Chaos”), and if that exponential growth continues it will become almost immeasurable over the coming years. What does this mean for our state of mind and the systems we work with – will we reach a moment when we are essentially ‘drowning’ in information?
Not if the technologies that store and manage information also continue to improve. We are working very hard to make managing information easier, faster and more efficient for businesses of all sizes. This means making sure that what is actually useful and valuable is stored, archived and backed up correctly, while the rest is relegated to permanent deletion.
But technology can only go so far; some of the onus is still on businesses and individuals to moderate their work behaviour to take account of this new work paradigm.
Part II – What can we do about it?
I need a new service, so I need an application, and a new server, and perhaps some storage … and if we’re lucky we ask ourselves “oh, yes, what about the backup?” Have you noticed how we really never turn IT off – we just add to it? So we end up with a backup strategy that encompassed everything 3 or 4 years ago, but one that falls pretty short today; that’s how it really works.
Even though we know that we really should back up all our data – just in case – are we absolutely convinced we actually do? Backup is our critical data protection solution, and yet we rarely review our backup strategy.
Server virtualisation, the need for fast, reliable application recovery, the exponential growth of unstructured data and poor data lifecycle management are some of the root causes of operational inefficiency in IT – and they are why we are changing the way we approach our backup strategies.
With more and more companies adopting virtualisation technologies to improve efficiency and reduce CAPEX, organisations are looking for ways of protecting both virtual and physical environments with a single backup tool. It makes sense to use a solution that gives you granular recovery from a single-pass backup, saving time, money and no end of effort. Don’t use separate tools and end up backing up the backup – it turns the recovery process into a nightmare!
The backup and recovery of Microsoft applications is an inherently challenging process that becomes more difficult as databases grow and the demands on their online availability increase, further limiting the time available for backup and recovery operations. Granular recovery of Exchange, SQL and Active Directory from a single-pass backup makes it easy and efficient to identify and recover only those objects needed.
Optimising storage to reduce overall storage capacity can be done through integrated archiving and deduplication. Archiving moves old data to a separate store so you don’t have to back up the same data day-in, day-out – forever. Deduplication backs up data (at a block level) only once, using a pointer to the unique data. With both archiving and data deduplication you can reduce the backup window dramatically.
Backup Exec 2010
Backup Exec Agents and Options enhance and extend platform and feature support for your backup environments for Microsoft applications, virtual environments (VMware and Microsoft Server 2008 R2 Hyper-V) as well as storage reduction or optimisation technologies.
… or so Computerworld UK said on Monday. Something I’ve been saying for 3 years – but, hey, what do I know? According to Computerworld UK the consumerisation of IT is an unavoidable phenomenon that will force businesses to rethink their security policies. What about all that stuff some poor soul will have to back up? CIOs need to deal with consumerisation. For many years we were able to say to our end users “no, you can’t have that”, or “no, we don’t support this”. But those days are over. The security of this phenomenon is certainly a concern, but IDC reckons that 80% of data created by individual consumers will end up on corporate networks. This will inevitably cause an overload on our already overloaded systems – at least until we have the management capabilities for streaming applications to the desktop (oh, sorry, Symantec already does that). OK, so when we get around to migrating to this model, where all our data is held inside the network, consumerisation won’t be nearly as scary – until then data growth will continue on its upward curve.
That’s why deduplication and archiving are key to our backup strategies. We are challenged with managing and protecting ever-increasing amounts of data. Backup Exec offers deduplication across physical and virtual machines to reduce the length and size of backups. Deduplication has the power to transform information management: it is great for backup, it is great for archiving, and it can even make virtualised server backup manageable. Symantec believes that deduplication should live in every part of the information architecture.
Much of the content produced now consists of email, documents, presentations, and other types of unstructured information. This explosion of information has a significant impact on storage spending and IT’s ability to meet the needs of its internal customers and business units. Backup Exec’s integrated archiving option is focused on reducing the amount of information backed up. Together with Enterprise Vault (EV), Backup Exec’s Agent for EV helps organisations to unify content sources, apply retention policies, reduce backup windows, shorten recovery times, and optimise storage resources making it easier for companies of all sizes to store, manage, and protect all unstructured data.
Actually, I’ve discovered it’s not just me – thank goodness. When you go through a product launch process there is always a chance that the general “noise” is just you banging on about stuff, and that it is limited to the inside of your own head.
Not so in BE 2010’s case … I’ve been running around Europe over the last month or so, and the feedback I’ve been getting is that there is an awful lot of interest in deduplication. Even smaller companies who thought that deduplication was probably too much for their needs are seriously looking at getting rid of some of the duplicate data on their primary storage, and even more are looking at deduplication as a way of improving backup and restore times.
I was at an event a couple of months back where every conversation I had was about the length of time it takes to back up. Data continues to grow everywhere across the IT infrastructure – laptops, disparate storage devices, remote offices, as well as the good old data centre – and this has created a fundamental shift in the way organisations need to manage information. Keeping information on disk for faster DR restores is all fine and dandy, but there is simply too much data around. Disk-based backup is now getting as tricky to manage, and as cumbersome, as tape-based backup.
A number of customers are turning to deduplication technologies to facilitate faster backups, reduce primary storage, and not only cut the amount of disk being used up but also improve tape media rotation and management.
Deduplication gives you the ability to take a strategic approach to storage and backups. Organisations now have the ability to deploy an integrated platform that is easy to manage and supports both source- and target-based deduplication.
Primary storage deduplication will become widely deployed in the next 6 to 12 months. Most organisations have not yet gone down this route; however, with the option now built into Backup Exec 2010 this is all the more likely, because it is simple to get hold of, really easy to install, and the benefits are huge.
Today is a big day for us at Symantec – huge launch of our backup products, NetBackup 7 and Backup Exec 2010, offering a unified backup and recovery portfolio that reaches from the smallest businesses to the largest enterprises.
We are really excited about this launch which must be one of the most significant in the last few years, and there is a considerable amount of interest in the industry.
Businesses today rely on information technology and systems to run their operations, drive new opportunities, operate efficiently and comply appropriately with governance. Most organisations are organised around servers, storage and applications, with islands of static information. The sheer volume of data and its continued growth mean that IT is struggling to keep up, with the added pressure of having to do more with less.
The new products are really impressive, with integrated deduplication and archiving everywhere, reducing the complexity of storage management, as well as centralised information management and enhanced virtualisation capabilities.
Small Business: Symantec Backup Exec 2010 and Backup Exec System Recovery 2010 provide a simple, cost-effective backup and recovery solution that helps minimise downtime and avoid disaster by easily recovering individual data files/folders or complete Windows systems in minutes even to different hardware, virtual environments, or remote locations – for a multitude of SB environments.
Small and Medium Businesses: Symantec delivers reliable backup and recovery designed for growing businesses. Backup Exec 2010 helps protect more, store less and save more by reducing storage and management costs through integrated deduplication and archiving technology on both virtual and physical systems.
Enterprise: Symantec NetBackup 7 simplifies the protection of heterogeneous enterprise information by automating advanced technologies across applications, platforms, and virtual environments. Integrated deduplication, replication, and virtual machine protection improve storage efficiency, infrastructure use, and recovery times through one console.
Benefits of Next Generation Information Management from Symantec
- Reduce Costs: Gain 10-20 percent net savings from a single platform
- Recover data up to 5 times faster for less downtime
- Reduce unstructured data storage 40-60%
- Compress remote office backups up to 95%
- Protect virtual and physical machines
It doesn’t matter what is the issue – the answer is Symantec!
General availability: 1st February 2010
I don’t know if any of you have noticed, but the first fiscal quarter of this year has been pretty bad for server sales. Whatever the pundits say, it is important for all of us to save money and cut costs – but absolutely not at the expense of our backup.
Whatever else you gamble with, backup shouldn’t be part of it. IDC recently reported that server sales from January to March were the worst quarterly figures in the dozen years that they have been releasing them. The current economic crisis has injected a disconcerting amount of uncertainty into the business climate, and organisations are loath to spend any more money than is absolutely necessary.
The silver lining here is that IT departments can use this opportunity to consolidate existing projects and focus on optimising existing backup systems. Backup Exec 12.5 and Backup Exec System Recovery 8.5 are built for complete protection and recovery, providing centrally managed protection for both virtual and physical systems – everything from multiple virtual servers to individual directories and files – along with a new generation of data protection management tools, powered by Altiris technology, for both Backup Exec and Backup Exec System Recovery.
BE delivers unmatched granular recovery capabilities for Exchange, Active Directory and SharePoint environments, reducing the overheads associated with managing Exchange mailbox backups, restoring Active Directory user preferences and attributes without multiple reboots, and generally simplifying the recovery process for critical Microsoft applications.
With BE you can ensure fast, efficient recovery of individual emails or documents from a full or incremental backup. Why take the time to recover an entire database when all you need is an individual document or email?
Backup Exec System Recovery has offered the capability to quickly convert physical systems to virtual environments for several years now. BESR enables immediate system recovery to virtual systems by allowing IT administrators to schedule physical-to-virtual conversions. Through a virtual conversion wizard an IT administrator can schedule P2V conversions to occur daily, weekly, monthly or even hourly if desired, so that in the event of a failure you have a virtual system ready to go. In addition to dramatically reducing system downtime, this reduces management and set-up time for IT organisations as well.
Also new to this release is support for the latest virtual environments including VMware ESX 3.5, Microsoft Hyper-V and Citrix XenServer 4.x (when using VMDK or VHD file types). When you add this functionality to the off-site copy capability it really helps organisations address their disaster recovery needs.
If someone wants a high availability solution without investing in clustering or replication software that is outside their budget, BESR technology is a great way to copy these images to other locations, convert them on a schedule, and, if the original server goes down, bring those images up immediately for high availability purposes.
IT has to make do with what it has – what it needs to do is make sure it is “making do” as efficiently as possible: tweaking here and there, centralising management, adding agents and options to optimise the backup and recovery of critical data and applications, and streamlining processes to improve backup efficiency.
You might want to take another look at Backup Exec and Backup Exec System Recovery …
You might have had a really (really) good backup strategy 2 or 3 years ago, but does it stack up today? Every vendor you’ve ever heard of is always banging on about backup and recovery being even trickier than it was yesterday, and more challenging than it ever was – and, frankly, it was pretty tricky then … what with data growth going crazy, sprawling physical and virtualised environments, shrinking – nay, shrunk – backup windows (what backup window?), and escalating storage management costs.
Gosh, haven’t we heard it all before? But it is certainly true that age-old conventional backup solutions have not kept up with data protection requirements, nor with the growth in data itself, forcing substantial investments in hardware and dramatically increasing administrator workload.
You might already have a good data backup strategy in place. But if your server hardware were lost or your building suddenly off limits, could you access your data and maintain business continuity? It doesn’t take much to wipe out a critical database of information. Any backup is only as reliable as its ability to restore business data, applications and systems when they are needed most.
A data protection strategy sets out how we go about minimising data protection risk. It has to be concerned with ensuring maximum effectiveness or it remains a pointless exercise. But end users I speak to in organisations large and small are still not convinced that they have reliable backup solutions in their environments – or perhaps more accurately that the backup solution isn’t quite covering everything …
There are also analyst rumours that virtual machines will outnumber physical servers in 2009. The adoption of x86 servers is making virtualisation a crucial factor, but at the same time it is making traditional backup solutions redundant.
We’ve put together a backup strategy assessment tool – why not have a go at it to find out how robust your Data Protection is?
Links to assessment tool: http://www.emea.symantec.com/mybackupexec/assessment/
Backup Exec Sites:
France: www.symantec.fr/mybackupexec
Italy: www.symantec.it/mybackupexec
I saw an analyst report the other day that predicts that, in spite of the economic downturn, companies large and small will be spending the same, or even increased, amounts on backup and recovery in fiscal year 2010.
To make matters more “huh-like”, this study found that the adoption of disk based technologies is accelerating. Actually, when you think about it this makes sense – disk based backup improves recovery capabilities, backs up virtual environments more effectively and eliminates or reduces the physical requirement (and security hazard) for tape transport.
Actually, it makes more sense the more you think about it. Older backup solutions and older hardware are less effective, administratively heavier and more time consuming, costing money and effort; new hardware, on the other hand, is more efficient, and with new software there is more opportunity to automate. The automation of IT processes can improve overall IT performance, as well as storage, server and application performance specifically. It also gives IT a fighting chance of managing this unruly thing we call IT – saving time, effort and money through better staff productivity, better utilisation rates of servers and storage, and better efficiency and understanding of all the assets in the IT environment.
So, we all need to be focused on solving the fundamental challenges of backup: improving recovery objectives, improving success rates, and backing up virtual servers (by the way, if you’ve not sussed this out yet, server virtualisation will have a significant impact on backup and data protection strategies in the next year). Backup Exec provides automated backup and recovery that easily integrates into your existing environment. BE’s Agents and Options enhance and extend platform and feature support for BE environments.
The BE 12.5 Media Server and Agent for Windows Systems licenses include the Advanced Open File Option (AOFO) and the Intelligent Disaster Recovery Option (IDRO), providing complete out-of-the-box data protection with each core Backup Exec 12.5 license and each Agent for Windows Systems. Comprehensive data protection for VMware infrastructures and Microsoft Hyper-V servers provides fast, efficient protection from a single console. One agent efficiently backs up unlimited virtual guest machines to disk or tape, and backup administrators can easily restore an individual file or folder, saving management time and resources.
BE’s agents and options can help provide efficient, flexible database recovery and granular data recovery down to individual emails or documents with Granular Recovery Technology (GRT) for Exchange, SharePoint, Active Directory and virtual servers. BE provides CDP for Microsoft Exchange, SQL, file servers, and desktops/laptops, almost eliminating backup windows – without disrupting user productivity or application usage.
While the core product is pretty powerful, why not take a look at how you can enhance your backup strategy with the agents and options available – and get more backup for your bucks.
It really shouldn’t come as a surprise that a backup redesign is sweeping through IT departments. I think everyone is aware that backup has been identified as a primary IT pain point. And the reason is pretty simple actually – disk. There is so much of it. You know that most companies approach the protection of their data in a very dated manner. Although backup and recovery has been around for years, data protection solutions of old are failing to keep pace with today’s overwhelming data growth and complexity.
Although many companies have started to deploy disk-based data protection in the form of virtual tape libraries (VTLs), much of the thinking in data centres and by vendors is still tape-centric. And not just tape-centric but, more to the point, NOT application centric, NOT site centric, NOT virtual machine centric. Protection is getting harder because data centres and remote offices have evolved. Server consolidation through virtualisation technologies has changed the way companies need to think about protecting their data. How long is the backup window if there are 10 virtual machines per physical machine?
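That question can be answered with a back-of-envelope calculation. The sketch below assumes (hypothetically – these are illustrative figures, not measurements) that all VMs on a host share the host’s backup I/O path, so their data streams out serially through the same pipe:

```python
def backup_window_hours(vms_per_host, gb_per_vm, host_throughput_gb_per_hour):
    """Naive backup-window estimate for a consolidated host.

    Assumes every VM on the host streams its data serially through the
    host's single backup I/O path (no parallelism, no deduplication).
    """
    return vms_per_host * gb_per_vm / host_throughput_gb_per_hour

# Hypothetical figures: 10 VMs of 200 GB each, host sustaining 250 GB/hour.
print(backup_window_hours(10, 200, 250))  # 8.0 hours for one physical host
```

A window that fitted one physical server comfortably balloons tenfold after consolidation – which is exactly why block-level deduplication and incremental approaches matter so much in virtualised environments.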
Much more is possible now with disk-based data protection: shared disk pools for protection and recovery, continuous data protection for remote office data as well as consolidation or recovery of data centre applications. (Backup Exec provides granular recovery technology and continuous data protection and Backup Exec System Recovery … does what it says on the tin.) Centralised management capabilities have improved to automate global data protection from remote offices to core data centres.
Next generation data protection with Backup Exec provides comprehensive management of enterprises’ complex environments. That means a single data protection solution that covers all databases and applications. It also means supporting the broadest set of disk (SAN, NAS, and DAS), tape, and virtual tape library vendors, and delivering the latest data protection technologies such as virtualisation, granular recovery, and continuous data protection.
Disk-based data protection is the future of backup and recovery for reasons of performance, reliability, and cost. Next generation data protection therefore means support for basic commodity disk, deduplication, replication, and continuous data protection.
Take the Backup Exec challenge – does your backup stack up?