I'm a bit obsessive when it comes to data redundancy, but this is my process.
2 computers (my desktop and a file server)
Folders are replicated back and forth between the two daily
2 partitions (one for music, one for data). The working copy of each volume lives in a different place (music on my desktop, data on the server), which forces me to go to a different machine depending on what I am working with. That extra step before I can change anything means I can't screw things up if I am drunk. Also, an emailed log of all changes between the two copies is sent to me every morning.
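SmartSync Pro generates that morning report for me, but the idea is easy to sketch in Perl. This is a minimal illustration, not my actual setup: the paths and snapshot file below are made up, and you would pipe the output into your mailer from the scheduler.

    #!/usr/bin/perl
    # change_report.pl -- sketch of a nightly change report (hypothetical paths).
    # Compares today's file sizes/mtimes against yesterday's snapshot and
    # prints the differences; pipe the output into your mailer from cron
    # or Task Scheduler.
    use strict;
    use warnings;
    use File::Find;
    use Storable qw(store retrieve);

    my $root     = 'D:/data';             # tree to watch (made up)
    my $snapshot = 'D:/state/files.db';   # yesterday's state (made up)

    # Build today's snapshot: path => "size,mtime".
    my %now;
    find(sub {
        return unless -f;
        my ($size, $mtime) = (stat(_))[7, 9];
        $now{$File::Find::name} = "$size,$mtime";
    }, $root);

    # Load yesterday's snapshot, if any, and report differences.
    my $then = -e $snapshot ? retrieve($snapshot) : {};
    for my $f (sort keys %now) {
        if    (!exists $then->{$f})     { print "ADDED    $f\n" }
        elsif ($then->{$f} ne $now{$f}) { print "CHANGED  $f\n" }
    }
    for my $f (sort keys %$then) {
        print "REMOVED  $f\n" unless exists $now{$f};
    }

    # Save today's state for tomorrow's comparison.
    store(\%now, $snapshot);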
Both machines have external drives. If I lose a drive I just pick up a replacement, verify the original contents, and force a replication; this has happened once, after a head crash (damn cats). A full 100GB replication takes an hour or so over the 100Mbit full-duplex LAN.
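The verification pass is just a checksum sweep over a tree, which is easy to approximate in Perl (a sketch, not my actual script; Digest::MD5 ships with Perl):

    #!/usr/bin/perl
    # checksum_tree.pl -- print "md5  path" for every file under a root.
    use strict;
    use warnings;
    use File::Find;
    use Digest::MD5;

    my $root = shift @ARGV or die "usage: $0 <root>\n";

    find(sub {
        return unless -f;
        open my $fh, '<:raw', $_ or do {
            warn "skipping $File::Find::name: $!\n";
            return;
        };
        my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
        close $fh;
        print "$digest  $File::Find::name\n";
    }, $root);

Run it against the original and the replacement drive, redirect each run to a file, and diff the two listings; any mismatch means the replication needs another pass.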
Also, I have an external USB drive enclosure. Every few months I replicate everything over to that drive and put it in a safe place. This guards against data consistency problems: if you back up every day, you will inadvertently replicate corruption into the backups. By keeping a spare drive and only pushing data to it when I am sure everything is 100%, I always have a known-good copy that goes back 3-6 months. I adopted this draconian measure after a major problem with audio clipping crept into a large portion of my music library.
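The quarterly push itself is just a dated mirror onto the USB drive. A sketch of the idea, assuming a Windows box with xcopy available and made-up drive letters:

    #!/usr/bin/perl
    # cold_copy.pl -- push a dated, known-good snapshot to the offline drive.
    # Only run this after you've verified the live data is clean.
    use strict;
    use warnings;
    use POSIX qw(strftime);

    my $stamp  = strftime('%Y-%m-%d', localtime);
    my $source = 'D:\\data';                 # made-up source tree
    my $target = "E:\\snapshots\\$stamp";    # made-up USB enclosure path

    # /E = include subdirectories (even empty ones), /H = hidden files,
    # /K = keep attributes, /I = treat the target as a directory.
    system('xcopy', $source, $target, '/E', '/H', '/K', '/I') == 0
        or die "xcopy failed: $?\n";

    print "Snapshot written to $target\n";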
My drives are 250-400GB externals, mostly Western Digital. I am only using about 100GB total.
Also, I back up email once a day and do a historical email backup (about 1GB) once a week. Every few months I will manually drop the old data.
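The weekly historical pass boils down to copying the mail store to a dated file and flagging stale archives for the manual cleanup. A sketch with hypothetical paths (my real mail store and retention window may differ):

    #!/usr/bin/perl
    # mail_archive.pl -- weekly mail snapshot plus a stale-archive report.
    use strict;
    use warnings;
    use File::Copy qw(copy);
    use POSIX qw(strftime);

    my $store   = 'C:/mail/mailbox.pst';   # made-up mail store
    my $archdir = 'D:/backup/mail';        # made-up archive folder
    my $stamp   = strftime('%Y-%m-%d', localtime);

    copy($store, "$archdir/mail-$stamp.pst")
        or die "copy failed: $!\n";

    # Flag anything older than ~180 days for the manual cleanup pass.
    for my $old (glob "$archdir/mail-*.pst") {
        print "stale, consider deleting: $old\n" if -M $old > 180;
    }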
Everything is done with Perl or WSH scripts, except for the synchronization, which is done with SmartSync Pro. It is extremely granular and lets you configure lots of different scheduled profiles. I highly recommend this program; it's great!