[Dovecot] Backing Up
mikkel at euro123.dk
Fri Oct 31 01:49:01 EET 2008
> mikkel at euro123.dk wrote:
>> Just imagine backing the thing up, exporting 60.000.000 SQL queries.
>> Not to mention importing them again if something should go really wrong.
>> Actually I'm not even sure it would be faster. When the index files grow
>> to several gigabytes they kind of lose their purpose.
>
> There are many businesses backing up way more data than that, and it
> isn't 60,000,000 queries -- it is one command. But if you use serious
> hardware "backing up" isn't really needed. RAID, redundant/hot-swap
> servers, etc. make backing up /extra redundancy/. :-)
>
Why make things complicated and expensive when you can make them cheap and
simple?
Anything is possible if you want to pay for it (in terms of hardware,
administration and licenses).
I have focused primarily on making it as simple as possible.
And while running a 400 GB database with 60.000.000 records isn't
impossible, it would be if it had to run on the same hardware that the
system consists of today.
Roughly 1000 IOPS is plenty to handle all mail operations.
I seriously doubt that would be enough to supply even one lookup a
second on that huge database (and even less over NFS, as is used now).
And I assume that hundreds of lookups a second would be required to
handle the load.
So it would require a lot more resources and still give nothing but
trouble (risk of a crashed database and backup issues that aren't
there now).
Even with the data stored on a SAN, it still needs to be backed up.
500 GB SATA disks take a day to resynchronize if one breaks down, and we
can't really take that chance (yes, I will eventually move the data to
smaller 15.000 RPM disks, but there is no need to pay for them before it's
necessary). Also there is the risk of data being deleted by mistake,
hacker attacks or software malfunctioning.
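For the resync figure, a rough calculation (the sustained rebuild rate is
an assumption; real arrays throttle the rebuild so they can keep serving
live traffic at the same time):

# Rough estimate of the rebuild time for a failed 500 GB SATA disk.
# The effective rebuild rate is an assumed figure, kept low because the
# array still has to serve live mail traffic while it resynchronizes.

DISK_GB = 500
REBUILD_MB_PER_S = 6   # assumed effective rate under production load

hours = DISK_GB * 1024 / REBUILD_MB_PER_S / 3600
print(f"~{hours:.0f} hours to resynchronize")   # roughly a day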
But we really are moving off-topic here.
Regards, Mikkel