Here is a short best practices post for Time Machine on network storage.
Make sure your Macs are running Mac OS X 10.5.6 and your Time Capsules are running firmware 7.4.1.
Prior versions had issues that could reduce the reliability of your backups, so you really should upgrade.
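If you want to check the OS version from a script rather than by clicking through Software Update, Python's `platform.mac_ver()` reports it. A minimal sketch, assuming a simple numeric dotted-version comparison is all you need (the `meets_minimum` helper is a name I made up for illustration):

```python
import platform

def meets_minimum(version, minimum="10.5.6"):
    """Compare dotted version strings numerically (so 10.5.10 > 10.5.6)."""
    def parts(v):
        return [int(p) for p in v.split(".") if p.isdigit()]
    return parts(version) >= parts(minimum)

mac_version = platform.mac_ver()[0]  # empty string on non-Mac systems
if mac_version:
    print("OK" if meets_minimum(mac_version) else "please upgrade")
```

A plain string comparison would get this wrong once Apple ships a two-digit point release, which is why the sketch compares the components as integers.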
Use a Time Capsule or Mac OS X Leopard as your AFP server
Time Machine depends on at least two undocumented extensions to AFP 3.2. While the netatalk developers appear to have reverse engineered them, the changes are not in a stable netatalk branch yet. Even once they ship in a release they may not be 100% reliable, because the operating systems netatalk runs on may not support some of the functionality those AFP commands require. Finally, netatalk 2.0.3 was released in 2005, 2.0.4 has been in beta for months, and these patches are not planned until 2.1, which may be quite a while away.
Apple sells 500GB and 1TB Time Capsules. If you need more space than that, your options are a bit limited. Ideally you would buy a Mac Pro or an Xserve, but that is probably cost prohibitive for a personal backup server. My best advice would be to get a Mac Mini and an external drive.

As I mentioned in several of my other posts, the bridge chips in external drive enclosures are often a problem because they do not pass sync commands through to the drive. Realistically, for that to be an issue you would need to lose power in the middle of a backup and be a bit unlucky, but your backups are your last resort, so taking chances with them is not a good idea. The sync issue can be somewhat mitigated by placing the Mac Mini and the drive on a UPS. If the computer is set to shut down immediately on power loss, the drive's track cache should be flushed long before the battery runs out. It is less than ideal, but short of buying a Mac Pro there are not many options.
If someone actually knows of a bridge chip that pushes syncs (and any enclosure vendors using it) please let me know and I will update this post.
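For context on why those sync commands matter: on Mac OS X a plain fsync() only guarantees the data has been handed to the drive, while the F_FULLFSYNC fcntl additionally asks the drive to flush its own track cache. Here is a minimal sketch of the pattern an application uses (the file name and payload are made up for illustration; on non-Mac systems the code falls back to a plain fsync):

```python
import fcntl
import os

def durable_write(path, data):
    """Write data and try to push it all the way through the drive's cache."""
    fd = os.open(path, os.O_WRONLY | os.O_CREAT, 0o644)
    try:
        os.write(fd, data)
        if hasattr(fcntl, "F_FULLFSYNC"):
            # Mac OS X: fsync() only sends data to the drive; F_FULLFSYNC
            # also asks the drive to flush its track cache to the platters.
            fcntl.fcntl(fd, fcntl.F_FULLFSYNC)
        else:
            # Elsewhere: fall back to fsync(), which may stop at the cache.
            os.fsync(fd)
    finally:
        os.close(fd)

durable_write("journal.dat", b"critical backup metadata\n")
```

Even when software does everything right here, a bridge chip that silently drops the flush request defeats it, which is exactly the failure mode described above.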
Don't interrupt backups if you can avoid it
While interrupting a backup should not cause data loss, it can substantially increase the amount of time your next backup will take.
Exclude directories with lots of little files if they change frequently
Scanning a directory with lots of little files to determine what to back up can be very slow. Directories that have lots of files but never change are fine, since a directory generally does not need to be rescanned unless something in it changes. In my experience the most common culprits are IMAP mailboxes in ~/Library/Mail.
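To find candidates for exclusion, you can look for directories that hold many files and have changed recently. A rough sketch; `exclusion_candidates` is a name I made up, and the thresholds are arbitrary starting points, not anything Time Machine itself uses:

```python
import os
import time

def exclusion_candidates(root, min_files=500, recent_days=7):
    """Report directories with many files where something changed recently,
    since those are the ones a backup scan will have to revisit each run."""
    cutoff = time.time() - recent_days * 86400
    found = []
    for dirpath, dirnames, filenames in os.walk(root):
        if len(filenames) < min_files:
            continue
        try:
            changed = any(
                os.path.getmtime(os.path.join(dirpath, name)) > cutoff
                for name in filenames
            )
        except OSError:
            continue  # a file vanished mid-scan; skip this directory
        if changed:
            found.append((dirpath, len(filenames)))
    return found

# Example: check the usual suspect. Prints nothing if the path is absent.
for path, count in exclusion_candidates(os.path.expanduser("~/Library/Mail")):
    print(count, path)
```

Anything this turns up is worth considering for the exclusion list in Time Machine's Options pane, provided you have another way to back it up (IMAP mail, for instance, still lives on the server).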