In information technology, a backup, or data backup, or the process of backing up, refers to copying computer data that is already in secondary storage into an archive file, so that it may be used to restore the original after a data loss event. The verb form is "back up" (a phrasal verb), whereas the noun and adjective form is "backup".[1]
Backups primarily serve to restore the previous state after the loss, inadvertent deletion or corruption of data, and secondarily to recover data from an earlier time, based on a user-defined data retention policy.[3] Though backups represent a simple form of disaster recovery and should be part of any disaster recovery plan, backups by themselves should not be considered a complete disaster recovery plan (DRP). One reason for this is that not all backup systems are able to reconstitute a computer system or other complex configuration such as a computer cluster, active directory server, or database server by simply restoring data from a backup.[2]
A backup is not only worthwhile for restoration in case of data loss; it also lets you rest assured that your files are safe, much as wearing a helmet while riding a bicycle reduces worry about injury. Data loss might cause uncertainty about which memorable data was lost, if the date and/or date range of the lost data is unknown. Murphy's law states that "anything that can go wrong will go wrong". As such, any data one does not wish to lose should be stored on more than one device.
Data may be lost as a result of:
For a more detailed explanation, read the sub page Backup/Causes of data loss.
Consider classifying your data by their degree of importance. Data with a higher class should be backed up more frequently, and/or more redundantly.
First-class data is data that you never want to lose. Second-class data is data you would prefer not to lose, but whose loss is not critical, or data that is relevant in the short term but that you are unlikely to need in a decade. Third-class data consists of files that can easily be recreated or replaced, or that have no long-term value to you.
First-class data might be writings and scripts and projects you worked a long time on, photographs and home video recordings such as family-related and honeymoon, stuff you want to show to your future grandchildren, important emails, paid software license keys that are still redeemable, and helpful web resources that have been or are likely to be taken down.
Second-class data might be a full backup of your operating system that is periodically overwritten with a new backup anyway, photographs and video recordings that are not critical, and downloaded web pages or online videos that are unlikely to be taken down, which you might have saved to read or watch them in places with no internet connection.
Third-class data consists of computer programs that can easily be downloaded again, temporary files such as archive files (zip, 7z, tar) which were only created for sending the files contained within, and cached data. Third-class data also includes popular physical books and movies that can easily be re-purchased anytime, or that are stored on optical discs (CD, DVD, Blu-ray), which are highly resistant to failure.
Collections of data such as a digital music collection might be considered first- or second-class even though the individual files are replaceable, because a collection might have taken lots of time and effort to create.
First-class data should be backed up at least twice, meaning the data then exists in three places.
If feasible, consider creating an off-site backup of any data you consider first-class. This can be a friend's house or a cloud storage service. This protects the data from natural disasters.
Second-class data should be backed up at least once, though a backup is not urgent if the data is stored on new media that is highly unlikely to fail anytime soon or has fewer points of failure, such as optical discs. Second-class data on storage media expected to fail soon should be moved onto new storage media.
Third-class data requires no backups since losing that data is not critical. Third-class data may also be only stored on a cloud storage service without a local copy.
Various types of data storage have different benefits and disadvantages: optical discs offer the greatest reliability for archival storage, while hard disks are the most suitable for routine short-term backups.
Optical and tape storage are not vulnerable to data loss from mechanical drive failure, because they are modular: the controller is external rather than tied to the component that holds the data. If the motor or loading mechanism of an optical drive fails, inserting a pin into the emergency eject pinhole can force the tray open to retrieve the media. Slot-loading drives may require opening the case should the loading mechanism fail.
For routine short-term backups, especially disk image backups, external mechanical hard drives are the most suitable, since such backups involve much sequential writing, which wears down hard disks far less than flash storage. Hard drives merely alter magnetic fields when writing data, and the head moves very little when writing sequentially, whereas flash storage has to erase and reprogram memory cells, which wears them down. In addition, hard disks cost less than flash storage.
Linear tape storage could also be used for routine short-term backups, but it is expensive and not user-friendly, with a high learning curve and thus a high barrier to entry.
Optical media does not share the vulnerabilities of other storage media, such as head crashes and controller failures, making it ideal for the sporadic, low-maintenance, long-term archival of moderately sized chunks of data such as video recordings from an important event. As of 2023, capacities range from 25 GB per single-layer Blu-ray disc (lowest cost) up to 128 GB per multi-layer disc.
Higher-capacity optical formats such as the Sony Archival Disc exist, however, they are proprietary and vendor-exclusive, causing vendor lock-in.
Since optical discs are water-resistant and non-magnetic, they are the likeliest to survive a flood disaster or solar storm.
In addition, optical media can, if supported by the drive model, be scanned for impending integrity errors using software such as QpxTool and Nero DiscSpeed, before any data loss occurs. A higher rate of still correctible errors suggests sooner data corruption, and/or media of lower quality.
Hard drive failure is mechanical and sudden, whereas optical media deterioration ("disc rot") happens slowly over time. Optical media is best stored in a cold and dry environment, whereas hard disk drives are less sensitive to heat and humidity while not in operation.[3][4] However, humidity should be avoided during a hard drive's operation, as it increases the likelihood of failure.[5] Hard drive motors' lubrication also slowly loses effectiveness.[6] Hard drives' magnetic fields fade slowly over time, which can lead to logical data errors before the drive mechanically fails; to prevent this, the data would have to be rewritten once every few years. Susceptibility to such fading increases with a drive's information density. The magnetic signal on magnetic tapes also fades over time, though manufacturers claim a shelf life of several decades for tapes.[7]
On hard disk drives, a slight possibility of manufacturing error leading to rapid failure exists.[8] If a hard drive is intended to be used for long-term storage, it is recommended to only operate it infrequently once written to, to minimize mechanical wear.
Humidity in an enclosed storage location can be reduced using silica gel baglets.
Optical media is the most likely to survive environmental disasters such as a flood, ionizing radiation from nuclear disaster, or the marginal possibility of an electromagnetic/solar radiation storm,[9][10][11] as it is water-resistant and has no internal electrical components vulnerable to single effect events.
There exists optical media that is dedicated to archival, using silver, gold, or rock-like layers rather than organic dye, making it less sensitive to environmental conditions.[12]
Flash memory (solid-state drives, USB sticks, memory cards), while physically the most durable and usually fast, tends to be expensive per unit of capacity, and might not retain full data integrity over long periods (i.e. years), as the transistors which hold data in the form of electrical charge lose that charge over time.[13] This is also referred to as "bit fading".
Flash memory can be used for supplementary backups and short-term storage. However, the price per capacity is higher on flash storage than magnetic and optical storage. The main use of flash storage is for portable electronics and operating systems due to their physical sturdiness and immediate random access.
The retention duration of flash storage tends to be shorter with increasing data density, and deteriorates with wear caused by repeated program/erase cycles. Life expectancy (the number of program/erase cycles) also decreases with data density. However, flash memory does not rely on moving mechanical parts that can fail like hard disk drives, hence the name "solid-state drive". While USB sticks and memory cards are by definition also "solid-state drives", the term is established to refer to larger units for clarity. The lack of mechanical parts means that flash memory, unlike a spinning hard drive motor, does not wear down while idle.
While powered on and idle, the flash storage's control firmware usually refreshes the information stored inside the sectors routinely.[14]
Loss of data integrity is indicated by sharp dips in transfer rates, caused by the flash memory controller attempting to correct errors.
Flash storage can fail abruptly due to power surges and voltage spikes. Hard disk drive controllers can also fail this way, but the data remains stored on the magnetic platters, so a costly recovery may still be possible; optical discs can simply be retrieved from a failed drive as described earlier.
As flash storage is typically not intended for archival, and a possibility of component failure exists, anything stored on such that one does not wish to lose should be mirrored to at least one other location. For example, a large hard disk drive can be purposed to store corresponding disk images of flash media that are refreshed once in a while. This is especially recommended when storing a collection of frequently needed files or even an operating system on a flash drive, where scraping the files together or installing an operating system took time and effort, as a disk image allows sequentially reinstating the data onto the same or a different flash drive in case of error.
Mobile phone and tablet PC users may back up files short-term onto a removable memory card as insurance against a technical defect that denies access to the device's non-removable internal storage.
This can be useful for photography and filming during a trip, where cloud storage would be impractical due to possibly limited transfer rates and data plans unable to handle high-resolution imagery, and where the protrusion of a flash drive connected through USB On-The-Go would compromise ergonomics.
Online services like cloud storage are an extension of, not a substitute for, local data storage. Since cloud storage is a remote service rather than a personally owned product, the end user lacks technical control. Services might have varying retention spans and inactivity policies,[15] and technical difficulties are not predictable to end users. Access requires an internet connection, and transfer rates are limited by it. The slight possibility of erroneous account termination by a service provider exists as well.[16][17] However, cloud storage can act as a supplementary, short-term off-site backup, such as during a vacation.
The same principles apply to other online services such as email and social media platforms. Make sure services you utilize are equipped with proper export functionality.[18][19][20]
Cloud storage may be less vulnerable to human error due to typically being managed by trained professionals, and takes the burden of maintenance away from the user. This is especially helpful to people with little file management knowledge.[21]
For people who wish to create an off-site backup but have no separate location other than their residence, cloud storage is an option. However, as described in the section below, there are privacy risks.
If you use a cloud storage service, you need to expect that the people operating the service reserve the ability to access anything you upload to their computers. It does not necessarily mean they will, but that they technically are able to.
Cloud storage operators tend to be reluctant to store any data on their computers that they can not view themselves. Even if they claim to encrypt your data, they might reserve a master decryption key.
If you are lucky, no human employee at the cloud service provider stumbles upon your files. Popular services have millions of users with a sheer volume of data that cannot be reviewed by human staff. However, it is not guaranteed never to occur: some artificial intelligence system might flag your files as "suspicious", for example due to suspected copyright violations, putting them in a queue for human review.[22]
All data storage devices have a longer shelf life if stored in a cold and dry place such as a basement. For optical discs, the recommended storage temperature is between 15°C and 20°C (59°F and 68°F). Additionally, write-once optical media uses a light-sensitive dye and is therefore best stored in a dark location.[23][24]
If there is no spare space for optical discs, such as a separate spindle, the cover sheet that is typically included at the top of a disc spindle, usually for labeling, can instead be used in the meantime to separate blank and written-to discs.
Hard drives are less sensitive to heat, but more sensitive to humidity, and last longer if stored and operated in dry air.[25][26]
Flash storage is not subject to mechanical wear down, but can retain data for a longer time at lower temperatures. Should it lose data due to bit fading as described in § Flash storage, it is not physically damaged and can still be re-used for new data. As already mentioned, long-term archival is not the design purpose of flash storage. The primary uses for flash storage are in portable devices and for hosting operating systems.[13]
In a risky environment where there is an increased likelihood of data loss, such as a vacation or trip with the possibility of losing equipment, a higher backup frequency, such as daily, is recommended; backups can be made at the base (hotel, holiday apartment, etc.) onto a portable hard disk drive or solid-state drive.
For dedicated cameras and camcorders, memory cards can be cycled through.
For users who temporarily lack storage space for full backups, an image of merely the file system structure, which contains information about file names, paths, fragment locations, and time attributes, can significantly facilitate later data recovery in case of damage. Without this information, any damage affecting the file system header could leave files orphaned and detectable only by forensic software through file headers and footers, and fragmented files would need to be pieced together.
The file system structure (or header) is usually stored in the first 50 to 200 Megabytes, which can be captured using disk imaging software within seconds.
While such a backup does not contain file contents (except possibly those located at the earliest logical block addresses (LBAs) shortly after the file system header itself), it is a fallback solution which is better than nothing.
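As an illustration, such a partial image can be captured with a few lines of Python. The function below is a sketch; the device path in the usage comment and the default size are examples, and reading a raw device typically requires administrator rights.

```python
def capture_fs_header(source_path, image_path, size_mb=200):
    """Copy the first `size_mb` megabytes of a device or file into an
    image file. The 200 MB default is illustrative; the region actually
    occupied by the file system structure depends on the file system."""
    remaining = size_mb * 1024 * 1024
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while remaining > 0:
            chunk = src.read(min(1024 * 1024, remaining))
            if not chunk:  # source smaller than requested region
                break
            dst.write(chunk)
            remaining -= len(chunk)

# Hypothetical usage (device path is an example):
# capture_fs_header("/dev/sdb", "sdb-header.img")
```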
Depending on the importance of data, backups should be verified once every few years. If at least three copies exist, one verification per decade should suffice. If one of the backups is no longer properly readable, a new one on fresh storage media can be created.
The easiest way to verify backups and archives is to test whether a sample of randomly picked files is readable.
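This spot check can be automated. The following Python sketch picks a random sample of files under a directory and reads each one fully, returning any that fail; the default sample size is an arbitrary illustrative choice.

```python
import os
import random

def verify_sample(root, sample_size=20, chunk=1 << 20):
    """Fully read a random sample of files under `root` and return
    the paths that raised read errors (an empty list means the
    sampled files were all readable)."""
    paths = [os.path.join(dirpath, name)
             for dirpath, _, names in os.walk(root)
             for name in names]
    sample = random.sample(paths, min(sample_size, len(paths)))
    unreadable = []
    for path in sample:
        try:
            with open(path, "rb") as f:
                while f.read(chunk):  # read to the end, discarding data
                    pass
        except OSError:
            unreadable.append(path)
    return unreadable
```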
For archive file formats such as ZIP, 7z, and gzip, archive managers such as 7-Zip on Windows and File Roller on Linux have a feature for verifying data integrity and listing corrupt files. For entire devices such as hard drives and optical discs, tools such as the badblocks command on Linux and the freemium shareware "IsoBuster" on Windows can be used to scan for unreadable sectors; on IsoBuster, sector scanning is among the free features. These tools list unreadable sectors by number.
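For ZIP archives specifically, the same checksum test that these archive managers offer can be scripted; Python's standard zipfile module exposes it directly:

```python
import zipfile

def check_zip(path):
    """Verify every member of a ZIP archive against its stored CRC.
    Returns the name of the first corrupt member, or None if all
    members pass, mirroring the "test" feature of archive managers."""
    with zipfile.ZipFile(path) as zf:
        return zf.testzip()
```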
Another way to verify data integrity is by measuring the sequential reading speed. On Windows, the freeware tool HDDScan can generate a line graph of the reading speed. Ideally, the line is smooth: on flash storage it should be straight and flat with minimal dips at most, while on spinning media such as hard drives and optical discs the speed gradually decreases or increases due to the changing circumference per rotation, with higher speeds at the outer edge. Significant dips in speed hint at damage, since the storage controller has to spend effort correcting errors. For optical media, error rate scanning is available as described in § Magnetic and optical.
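The idea of watching for dips can be approximated in software. This Python sketch records per-chunk throughput while sequentially reading a file (or, with sufficient rights, a raw device); what counts as a worrying dip is left to the reader, and operating system caching can mask real device speeds.

```python
import time

def read_speed_profile(path, chunk=1 << 20):
    """Read `path` sequentially and return a list of per-chunk
    throughput figures in MB/s. Pronounced dips may hint at regions
    the storage controller had to retry or error-correct."""
    rates = []
    with open(path, "rb") as f:
        while True:
            start = time.perf_counter()
            data = f.read(chunk)
            if not data:
                break
            elapsed = time.perf_counter() - start
            # Guard against a zero-length interval on cached reads.
            rates.append((len(data) / (1024 * 1024)) / max(elapsed, 1e-9))
    return rates
```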
Unused flash storage would have to be verified at least yearly to prevent losing data, since the process of reading causes the flash memory controller to refresh the data.[13] However, as stated earlier (§ Flash storage), flash memory is not ideal for archival to begin with due to bit fading and higher price per capacity.
Cloning is the practice of sequentially copying each bit from one data storage device to another. On Windows, this task can be performed with third-party software; on Linux, with the dd command-line tool. Special caution should be taken not to select the wrong target device.
If your computer has a modular, replaceable data storage device, it can be replaced with a cloned data storage device in case of failure, allowing for resuming work with only minimal time loss.[27]
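As a rough sketch of what dd and dedicated cloning tools do, the following Python function copies a source to a target byte for byte. The paths in the usage comment are examples; as noted above, double-check the target, since its previous contents are overwritten.

```python
def clone_image(source, target, chunk=4 * 1024 * 1024):
    """Copy every byte of `source` sequentially into `target` and
    return the number of bytes written. A simplified sketch; real
    cloning tools also handle read errors and report progress."""
    copied = 0
    with open(source, "rb") as src, open(target, "wb") as dst:
        while True:
            data = src.read(chunk)
            if not data:
                break
            dst.write(data)
            copied += len(data)
    return copied

# Hypothetical usage (paths are examples):
# clone_image("/dev/sda", "/mnt/backup/sda.img")
```

On Linux, a typical equivalent dd invocation looks like dd if=SOURCE of=TARGET bs=4M status=progress.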
Consider creating a text file named lastbackup.txt or backupinfo.txt, containing the time of the last backup and optionally additional information such as the name of the backup device.
Proprietary backup formats (such as .tib) should be used with caution due to the possibility of vendor lock-in. Should said vendor cease operations, it could become even more difficult to access archives in proprietary formats in the long term.
Compressed archives may be used where efficient, such as for text documents and code, where strong compression formats such as 7-Zip (LZMA) and XZ can reduce size by a factor of 100 or more. Uncompressed bitmap images can achieve compression ratios of around 10, more or less depending on content.
Binary data such as multimedia (pictures, video, audio) that has already been internally compressed cannot be shrunk effectively by applying additional compression, as most redundancy has already been eliminated by the internal compression.
Which level of compression one should use depends on the contents. For example, for creating an archive file containing pre-compressed data such as PNG files, compression can be deactivated entirely for the highest performance, since data does not have to be encoded and later decoded during extraction. For text-based data, a high compression level is highly effective. For archiving disk image files and mixed data with varying levels of compressibility, weak compression is recommended for a minimal impact on performance while still significantly compressing blank parts ("air") of the image and text-based files. Performance matters on disk image files due to their typically large size.
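The effect of the content on compressibility can be demonstrated with Python's zlib module: repetitive, text-like data shrinks dramatically even at the weakest level, while random data (standing in here for pre-compressed media) barely shrinks at the strongest.

```python
import os
import zlib

def ratio(data, level):
    """Compression ratio (original size / compressed size) at a
    given zlib level (1 = fastest/weakest, 9 = slowest/strongest)."""
    return len(data) / len(zlib.compress(data, level))

text = b"backup early, backup often. " * 5000   # repetitive, text-like
noise = os.urandom(len(text))                   # incompressible, like media
```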
It should, however, be considered that the slightest damage to a compressed archive file can magnify enormously, possibly rendering the rest of a contained file or even the entire archive unusable. The scope of the damage depends on the compression method: solid compression causes the latter, as it deduplicates information across contained files to maximize compression effectiveness. As such, it is recommended to store compressed archives on no fewer than two devices.
If a separate device is unavailable, storing a duplicate of the archive file on the same device for redundancy can still allow repairing errors by merging the intact parts using a byte editor (or "hex editor"), provided the two copies are not damaged in the same locations. Errors are difficult to locate, however, if the storage device controller's firmware returns damaged data to the computer without reporting it as such. Some flash storage devices may return sectors (512 bytes each) with damaged data as null bytes; in that case, a multiple of 512 sequential null bytes amid binary data suggests a logical error in the file. On optical discs, a sector typically holds 2048 bytes.
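Locating such sector-sized null runs can be scripted. The sketch below flags sector-aligned runs of null bytes as candidates for inspection; a flagged run may of course also be legitimate file content, so it only narrows down where to look.

```python
def null_runs(data, sector=512):
    """Return the offsets of sector-aligned runs of `sector` null
    bytes, which on some flash devices may indicate unreadable
    sectors silently returned as zeros."""
    zero = bytes(sector)
    return [off for off in range(0, len(data) - sector + 1, sector)
            if data[off:off + sector] == zero]
```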
To get a visual idea of how fragile compressed archives can be, try opening a PNG or TIFF image in a byte editor (or "hex editor"), edit only a few bytes somewhere near the middle, and then try the same on an uncompressed bitmap (BMP) for comparison. The PNG and TIFF images will be completely demolished and glitched from the damaged point onward, while the effect on the uncompressed bitmap is only pinhole-sized.
This experiment might not be as effective on a JPEG image as on PNG and TIFF, as its compression algorithm is more robust against damage. It may cause some glitching and hue alterations on ordinary JPEG, and digital stains on progressive JPEG, but nothing that demolishes the entire image beyond repair. For reference, see the Commons: category: Bit-blending experiment.
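The experiment can also be performed programmatically instead of with a hex editor. This small helper flips a few bytes near the middle of some file contents, mimicking a localized storage error; apply it only to a copy of the image.

```python
def corrupt(data, offset=None, count=4):
    """Return a copy of `data` with `count` bytes inverted, starting
    at `offset` (defaulting to the middle), mimicking a small,
    localized storage error."""
    buf = bytearray(data)
    if offset is None:
        offset = len(buf) // 2
    for i in range(offset, min(offset + count, len(buf))):
        buf[i] ^= 0xFF  # invert every bit of the byte
    return bytes(buf)

# Hypothetical usage: write corrupt(open("copy.png","rb").read())
# back to disk and compare how a viewer renders PNG vs. BMP.
```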
Compression formats that have a weaker ratio, but in return require far less computing efforts, might use block-based storage, where information is stored in compressed blocks of a fixed size. Data blocks after a damaged one might be recoverable. A popular block-based compression format is Gzip; see Gzip § Damage recovery.
Human-readable text and code is highly compressible, and archive formats' internal file systems typically handle a high number of small files more efficiently than data storage file systems.
On FAT32/16/12 and exFAT, for example, any non-empty file reserves at least one entire cluster (space allocation unit), which may be preformatted to around 16 to 256 KB depending on total storage size. Too many small files cause space to be wasted through cluster overhead, whereas archive formats handle many small files efficiently, even with compression deactivated.
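The overhead is easy to quantify: each non-empty file wastes the distance from its size up to the next cluster boundary. A small Python sketch (the 32 KB default is one plausible FAT32 cluster size, not a universal value):

```python
def cluster_overhead(file_sizes, cluster=32 * 1024):
    """Total bytes wasted when each non-empty file must occupy whole
    clusters of `cluster` bytes on the file system."""
    waste = 0
    for size in file_sizes:
        if size > 0:
            waste += -size % cluster  # distance to next cluster boundary
    return waste
```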
However, digital photographs and video are internally compressed to a degree where additional compression through an archive format such as ZIP, RAR, or 7z would not shrink the size effectively, while significantly slowing down extraction (rather than allowing direct playback) and making the data more vulnerable to damage.
Note that a program-stream video (most mobile video) with an end-of-file moov atom depends on file completeness; otherwise it is unplayable. This is not the case for transport-stream video, frequently used by dedicated camcorders such as those of Panasonic and Sony (AVCHD).
Wherever you store your data, verify in advance that proper export functionality exists.
For example, as of 2022, the email service "ProtonMail" lacks support for email protocols like POP3 or IMAP. The only way to back up messages is individually downloading them, which is impractical, since individually downloading one thousand emails could take several hours of repetitive clicking.
On the Android mobile operating system, applications might store user data inside the /data/ folder, which can include the web browsing session, history, and saved pages, depending on the browser. This makes the data accessible only to the application itself; for any other application to access it, root access must be unlocked.[28]
There is a slight likelihood that formats used today will become unsupported in future.
A past example is HD DVD, an optical disc format which did not succeed in the market. Unlike the DVD plus and minus formats, HD DVD could not successfully co-exist with its competing format, the Blu-ray disc, and production of HD DVD media and hardware was halted. As a result, people might have recorded data on HD DVD discs but no longer own a drive that reads them. A Blu-ray drive could technically read HD DVDs, since the formats are very similar and use the same laser wavelength, but optical drive manufacturers see no need to implement HD DVD support.
Develop the habit of exporting your browsing session into a text file regularly, which can be done through a browser extension. The automatic session restoration might fail, and the session database might have a proprietary format only readable or decodeable through difficult-to-use tools rather than a simple text editor.
Some browser extensions allow exporting both page title and URL, facilitating later searching. Some extensions have an option to limit the export to only tabs of the current window, which could be of use if tabs in other windows have not been changed.
Some browser extensions have an internal session manager, while others export the session into a file in the download folder; some offer both. Some extensions for this purpose were abandoned when Mozilla Firefox, one of the most popular web browsers, deprecated support for extensions in a legacy format with the transition to version 57 "Quantum", though functionally similar replacements are presumed to be released.
Web form data may get lost as a result of browser crashes caused by RAM exhaustion, operating system crashes or power outages without uninterrupted power supply (external unit or laptop battery), or a failed form submission.[29]
Browser extensions such as Textarea Cache prevent loss of form data by backing it up automatically.[30]
Sites for which this is not wanted can be added to an exclusion list in the extension's settings.
Consider drafting any text that is presumed to take minutes or longer to write into an offline text file, from which it can be copied and pasted into a web form.
The same also applies to other areas, such as programs' internal text areas like the WinRAR archive comment field,[31] since the contents may be discarded in case of an error or cancellation. Regarding WinRAR, using proprietary formats is not recommended anyway due to potential incompatibility across systems.
Web browsers might automatically delete history under certain conditions, such as storage space exhaustion on the partition that stores the user data folder (or "profile folder"), or history older than a time threshold such as three months. An update might unexpectedly shorten the retention duration, discarding any history that falls outside the new window.[32]
If you wish to retain history beyond the browser's retention span, consider routinely creating copies of the history database. Firefox's live database file is named places.sqlite and located in the profile folder. For Google Chrome/Chromium, extensions exist that allow exporting it into the download folder.[33]
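Routine copies of the history database can be scripted. This sketch copies the database into a backup directory under a timestamped name; the file and directory names are examples, and the browser should be closed first so the live database is in a consistent state.

```python
import shutil
import time

def snapshot_history(db_path, backup_dir):
    """Copy a browser history database (e.g. Firefox's places.sqlite)
    into `backup_dir` under a timestamped name and return the new
    path. Close the browser beforehand so the file is consistent."""
    stamp = time.strftime("%Y-%m-%d_%H%M%S")
    dest = f"{backup_dir}/places_{stamp}.sqlite"
    shutil.copy2(db_path, dest)  # copy2 preserves file timestamps
    return dest

# Hypothetical usage (profile path varies per installation):
# snapshot_history("/home/user/.mozilla/firefox/abc.default/places.sqlite",
#                  "/home/user/backups")
```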
Note that on mobile phones, restrictions by operating system could deny file-level access to browsers' user data. Then, the only way to export browsing data is using a mobile browser which supports third-party extensions, such as the Chromium fork "Kiwi browser", which is compatible with many desktop Google Chrome extensions.[34]
Reliable long-term storage of relatively low amounts of data onto paper is possible using software such as PaperBak (open-source), which can encode digital data into printable machine-readable matrix bar code.
Paper books can not be duplicated as simply as digital computer files, but have supreme longevity and reliability due to immunity from technical failure. Abstractly, paper is a form of optical media.
Backing up paper books themselves may be desirable to protect against short-term losses such as forgetting to take them along after use at a location such as school, university, or the workplace. Pages can be individually digitized in high quality using optical flatbed scanners, though the process is slow and demands patience. It is faster and more convenient to use a digital camera, preferably mounted to a tripod pointing down for a fixed position of the pages within the images, and in good lighting. Pairs of pages can be photographed individually, though filming while quickly skimming through the pages is the fastest method. The necessary resolution for clear readability depends on the size and clarity of the font, but at least 1080p is recommended, where each still frame of the video acts as a 2.1-megapixel photograph, with only slight quality losses from video compression.
Various commercial products provide backup and recovery capabilities, such as Commvault, Veritas NetBackup, Veritas Backup Exec, Veeam Backup & Replication, and Arcserve.
In case of an internal hard drive failure, nothing gets a system back to full operation more quickly than having a drive clone at hand: either boot from it directly, or install it in place of the failed drive within about 20 minutes. No software needs to be reinstalled, allowing a speedy return to productivity.