Need help backing up my server...
I wanted to create a backup of my server, so I found out how to use SSH to create a compressed file of the entire server and split that file into 200 MB pieces. Now I've downloaded it and want to join the pieces together again to check that the file unzips correctly and isn't corrupted. How would you do that? Can you use Command Prompt (on Vista), and if so, does anyone know the commands?
Also, how do you back up MySQL? Is there a quicker way to back up all the databases on the server without using phpMyAdmin to export every single table? Thanks |
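For anyone landing here with the same question: on Linux/macOS the standard way to rejoin pieces made by `split` is `cat`. The filenames below are just examples, and the sample file stands in for the real backup, so treat this as a sketch rather than the exact commands for your server:

```shell
# create a sample file and split it into pieces (stand-in for the real backup)
head -c 1048576 /dev/urandom > backup.tar.gz     # 1 MB of sample data
split -b 300k backup.tar.gz backup.tar.gz_       # pieces of 300 KB each

# rejoin the pieces in order (split names them _aa, _ab, ... so the glob sorts correctly)
cat backup.tar.gz_* > rejoined.tar.gz

# verify the rejoined file matches the original byte for byte
cmp backup.tar.gz rejoined.tar.gz && echo "OK: pieces rejoined cleanly"
```

On Vista's Command Prompt, the rough equivalent is binary concatenation with `copy /b`, e.g. `copy /b backup.tar.gz_aa + backup.tar.gz_ab + backup.tar.gz_ac rejoined.tar.gz` (list every piece, in order).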
Well, I usually use tar for archiving/extracting, and I also export databases over SSH using the MySQL command-line tools, since phpMyAdmin doesn't support exporting large databases.
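A minimal sketch of the tar archive/extract round trip mentioned above (the directory and file names are made up for the example):

```shell
# build a tiny sample site directory (stand-in for the real document root)
mkdir -p site/sub && echo "hello" > site/index.html && echo "x" > site/sub/a.txt

# archive and gzip-compress the directory
tar -czf site_backup.tar.gz site

# extract it somewhere else to confirm the archive is good
mkdir -p restore && tar -xzf site_backup.tar.gz -C restore

# the restored tree should be identical to the original
diff -r site restore/site && echo "archive verified"
```

For the database side, one common pattern is to run the dump remotely and pipe it home over SSH, e.g. `ssh user@server 'mysqldump -u root -p --all-databases' > alldb.sql` — substitute your own host and credentials.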
I know that you are one of my customers, but I also know that you host things other places as well, so I will answer this as if you were not my customer. My suggestion is to contact your host and ask about options. I know that some hosts will allow you to send them an external hard drive; they will back up the entire thing onto that drive and ship it back to you (although it may come at a minimal fee). Another option is to tar it up and download the entire thing (this of course burns bandwidth, which the hosting company won't love). There are even more options than these two, but they all have a common theme: contact your host and let the System Administrators help you. That is what they are there for ;)
--T |
The following is based on a server-to-server move.
Linux / Unix system with file chunks of 500MB each, using SSH...
---------------------------
Moving a huge site....

tar and compress the site with:
nohup nice tar -czf /foo.tar.gz /fooSource &

split the file into 500MB chunks with:
nohup nice split --bytes=500m foo.tar.gz foo_ &
(note: --bytes, not --line-bytes, since a tarball is binary data with no line structure)

command to move, for example:
rsync -a -e ssh /home/temp1/#of-file-here easy1@INSERT-IP-ADDRESS-HERE:/home/easy1/temp

On the new server, go to /home/easy1/temp (the files' location) and you'll see the chunk files. Move them all into one folder with an FTP client or however you like.

rejoin the file with:
nohup nice cat foo_a* > foo_FULL.tar.gz &

THEN UNZIP / UNTAR / RESTORE
---------------------------
Good luck with it |
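To answer the original question about checking that the rejoined file isn't corrupted: after `cat`-ing the pieces back together, compare the result against the original and let gzip/tar test it. A hedged sketch, using a tiny sample tarball and the `foo_` naming from the commands above (chunk sizes shrunk so the example runs anywhere):

```shell
# create a sample tarball standing in for the real backup
mkdir -p fooSource && echo "data" > fooSource/file.txt
tar -czf foo.tar.gz fooSource

# split into chunks (tiny here; 500m in the real case), then rejoin
split -b 100 foo.tar.gz foo_
cat foo_a* > foo_FULL.tar.gz

# the rejoined file must be byte-identical to the original
cmp foo.tar.gz foo_FULL.tar.gz && echo "rejoin OK"

# and gzip/tar must be able to read it without errors
gzip -t foo_FULL.tar.gz && tar -tzf foo_FULL.tar.gz > /dev/null && echo "archive readable"
```

When the original stays on the server and you only have the downloaded pieces, record `md5sum foo.tar.gz` on the server before downloading and compare it against the checksum of the rejoined file locally.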
mysqldump is made for backing up and moving MySQL databases.
The thing is, you really should be backing up daily, especially as your database is likely to change often, so doing this all manually isn't ideal. Check out Clonebox, which takes daily backups of the whole server for a very low price. The Clonebox clones can actually be BOOTED, too, so if your drive fails or your host disappears or whatever, your site is still up, running from the clone. http://bettercgi.com/clonebox/ |
This isn't a move, just a backup. So far I've learnt to tar the whole thing, but the file size was huge, so I figured out how to split it, and it downloaded by FTP a lot quicker that way. I was going to bring it across to a virtual account once per week, but I noticed when they moved me onto this server I used 65 Mbps, and I doubt any host will let me do that without my own server. Now the only thing I need to learn is how to put those pieces together again. Nat Net kindly gave me a root username and pass for MySQL, but the dump-all-databases option isn't dumping anything. I'm now reading various sites which all say the same thing! Eventually I'll ask the techs how to do it, but I like to try to learn as much as I can myself. I learnt PHP this way and it came in very useful :) |
Oh I've done it! If anyone else needs this it's:
mysqldump -u root -pPASSWORDHERE --all-databases > alldb_backup.sql (no space between -p and your root pass - that was the mistake I was making) |
| All times are GMT -7. The time now is 01:38 AM. |
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.
©2000-, AI Media Network Inc.