Contemporary computer systems produce enormous amounts of data, mostly logs. Zemanta, for example, generates 10GB of logs per day, which amounts to roughly 4TB per year. Logs are usually only written and rarely read, hence the moniker "write-only memory" for storage dedicated to holding historic logs. Unfortunately, when problems arise and logs are needed for debugging, every piece of lost or omitted data severely limits forensics. Developers therefore prefer logging everything, to the chagrin of sysadmins who are left with terabytes of logs to handle.

Fortunately, Amazon has recently introduced Glacier, which provides low-cost storage of large amounts of data with minimal involvement from sysadmins. Instead of the hassle of failing disks and data migration, all a sysadmin must do is define a lifecycle rule that triggers automatic migration of data from S3 to Glacier after a specified amount of time has passed. For just EUR100 per TB per year it's really not worth spending much thought on which data to delete and which to keep. Just keep everything for a year and save disputes between developers and sysadmins for more important issues.
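As a sketch, such a lifecycle rule can be expressed as an S3 lifecycle configuration like the one below. The bucket layout, the `logs/` prefix, and the 30-day threshold are illustrative assumptions, not values from this post:

```json
{
  "Rules": [
    {
      "ID": "archive-old-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

Saved as `lifecycle.json`, this could be applied with the AWS CLI, e.g. `aws s3api put-bucket-lifecycle-configuration --bucket my-log-bucket --lifecycle-configuration file://lifecycle.json` (bucket name hypothetical). Objects under `logs/` would then move to Glacier after 30 days and be deleted after a year, matching the keep-everything-for-a-year approach.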