Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.

Discuss what's fucking going on, and which programs are best and worst. One-time "program" announcements from "established" webmasters are allowed.

 
Old 07-12-2004, 11:05 AM   #1
PeekHoles
Registered User
 
Industry Role:
Join Date: Jan 2002
Posts: 1,151
Linux Tar Gurus

I've got a question. I have two HDs on my server, and I want to back my sites up to the other HD once a week. Is there a way to tar/gzip a domain with all its subfolders and files so the archive keeps the chmod and permissions on every file and subfolder? If so, can someone please show me the command? Say I want to back up my-domain-name.com with all its permissions. That way, if I ever had a problem, I could just untar the good copy and be back in business.
Old 07-12-2004, 11:18 AM   #2
Radik
Confirmed User
 
Join Date: Sep 2003
Location: Vancouver ICQ: 3588423
Posts: 808
Take a look at rsync.
__________________

100% Exclusive, Check Us Out!
Old 07-12-2004, 11:23 AM   #3
PeekHoles
Registered User
 
Industry Role:
Join Date: Jan 2002
Posts: 1,151
Yeah, rsync works well. One of my issues is that I've got a lot of data to move between servers, and I wondered if there was a way to tar up the domains that were rsynced on the old server. Once all the data is moved over, rsync will be set up on the new server the same way it is on the old one.
PeekHoles is offline   Share thread on Digg Share thread on Twitter Share thread on Reddit Share thread on Facebook Reply With Quote
Old 07-12-2004, 11:29 AM   #4
vending_machine
Confirmed User
 
Join Date: Jun 2002
Location: Seattle
Posts: 1,062
This should do it:

Code:
tar czpf domains.tgz <paths>
snip from tar manual:

Code:
     -p
     --preserve-permissions  Extract all protection information.
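For the weekly second-disk backup the thread is about, that command slots into a short cron-able script. A sketch using temp directories as stand-ins (your real paths would be something like the docroot and a mount point on the second HD):

```shell
#!/bin/sh
set -e
# Stand-ins for the real locations -- adjust for your own layout.
SRC=$(mktemp -d)     # pretend: /home/my-domain-name.com
DEST=$(mktemp -d)    # pretend: the second HD, e.g. /backup
mkdir -p "$SRC/cgi-bin"
echo 'hello' > "$SRC/index.html"
chmod 755 "$SRC/cgi-bin"
STAMP=$(date +%Y%m%d)
# -c create, -z gzip, -p permission info, -f write to this archive file;
# -C changes into SRC so the archive holds relative paths
tar czpf "$DEST/site-$STAMP.tgz" -C "$SRC" .
ls -l "$DEST"
```

Dating the filename means a bad week's backup doesn't overwrite the last good one.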
Old 07-12-2004, 11:34 AM   #5
PeekHoles
Registered User
 
Industry Role:
Join Date: Jan 2002
Posts: 1,151
Thanks I will give it a shot.
Old 07-12-2004, 12:04 PM   #6
dgraf
Confirmed User
 
Join Date: Jun 2004
Posts: 133
Quote:
Originally posted by vending_machine
This should do it:

Code:
tar zcfp domains.tgz <paths>
snip from tar manual:

Code:
     -p
     --preserve-permissions  Extract all protection information.
So there is no reason to use that option while making the archive; it matters while extracting. tar stores the permissions by default, but if you unpack without the -p option, the extracted permissions are filtered by your umask setting. (When you extract as root, GNU tar behaves as if -p were given by default.)
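You can see the umask effect in a couple of lines. A sketch in a throwaway directory:

```shell
#!/bin/sh
set -e
WORK=$(mktemp -d); cd "$WORK"
mkdir site
echo x > site/script.cgi
chmod 751 site/script.cgi     # a distinctive mode we can spot later
tar czf site.tgz site         # note: no -p needed at create time
rm -r site
umask 077                     # a restrictive umask
tar xzf site.tgz              # without -p: umask filters the mode
stat -c '%a' site/script.cgi  # typically 700 for a non-root user
rm -r site
tar xzpf site.tgz             # with -p: the original mode comes back
stat -c '%a' site/script.cgi  # 751 again
```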
Old 07-12-2004, 12:31 PM   #7
PeekHoles
Registered User
 
Industry Role:
Join Date: Jan 2002
Posts: 1,151
OK, so say this is my tar file after it's been tarred up:

domains.tar

What would be the command line to untar it with the -p option?
Old 07-12-2004, 12:43 PM   #8
dgraf
Confirmed User
 
Join Date: Jun 2004
Posts: 133
If you used the "z" option before (while packing), you have to use it again while unpacking, because "z" means "use gzip compression". In that case you should use the extension .tgz or .tar.gz, not .tar, next time.

Finally, to extract with -p so the permissions are restored, use either

Code:
tar xpvf domains.tar
or

Code:
tar xzpvf domains.tgz
depending on whether you used "z" while packing.
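One aside worth knowing: recent GNU tar auto-detects gzip when reading an archive, so the "z" is only strictly required at create time. A quick sketch:

```shell
#!/bin/sh
set -e
WORK=$(mktemp -d); cd "$WORK"
mkdir domains
echo ok > domains/index.html
tar czf domains.tgz domains   # created with gzip ("z")
rm -r domains
tar xf domains.tgz            # no "z": GNU tar sniffs the gzip header
cat domains/index.html
```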
Old 07-12-2004, 01:25 PM   #9
princess
Confirmed User
 
princess's Avatar
 
Join Date: Feb 2001
Location: Ringgold, Georgia
Posts: 1,939
all this linux talk is going to excite us all.. ;)
__________________
*HUGS*!
Marsha
Old 07-12-2004, 01:33 PM   #10
dgraf
Confirmed User
 
Join Date: Jun 2004
Posts: 133
Yeah, but it's still far better to describe parameters as alphanumeric characters than as places to click on.

Last edited by dgraf; 07-12-2004 at 01:36 PM..
Old 07-12-2004, 01:34 PM   #11
Serge Litehead
Confirmed User
 
Serge Litehead's Avatar
 
Industry Role:
Join Date: Dec 2002
Location: Behind the scenes
Posts: 5,190
$ man tar


For huge amounts of data, use bzip2: better compression, a bit slower.

Something like:
tar cvjf archive.tar.bz2 domain_dir/

to untar it:
tar xvjf archive.tar.bz2


The v flag is verbose; you don't need it in a script. (On GNU tar the bzip2 flag is "j"; some older BSD tars spelled it "y".)
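A quick way to compare the two compressors side by side on the same data (a sketch; the sample data is just something compressible):

```shell
#!/bin/sh
set -e
WORK=$(mktemp -d); cd "$WORK"
mkdir domain_dir
seq 1 20000 > domain_dir/data.txt      # compressible sample data
tar czf archive.tar.gz  domain_dir     # -z: gzip
tar cjf archive.tar.bz2 domain_dir     # -j: bzip2 (GNU tar)
ls -l archive.tar.gz archive.tar.bz2   # compare the sizes yourself
```

On text-heavy data bzip2 usually wins on size; gzip wins on speed.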
Old 07-12-2004, 04:12 PM   #12
PeekHoles
Registered User
 
Industry Role:
Join Date: Jan 2002
Posts: 1,151
Quote:
Originally posted by holograph
$ man tar


for huge amounts use bzip2, better compression, a bit slower.

something like:
tar cvjf archive.tar.bz2 domain_dir/

to untar it
tar xvjf archive.tar.bz2


flag v is verbose, you don't need that in script.


So if I untar it the way you show, will it keep the same permissions and such on the new server? I just hate having to re-chmod all the files for my scripts. Once I tar the domain I will use wget to fetch it, and I want to untar it with all the same permissions already in place. I use the same file structure on both servers, so as long as the permissions are the same when I untar, I should be fine once I update my MySQL DBs.
Old 07-12-2004, 04:14 PM   #13
PeekHoles
Registered User
 
Industry Role:
Join Date: Jan 2002
Posts: 1,151
Quote:
Originally posted by vending_machine
This should do it:

Code:
tar czpf domains.tgz <paths>
snip from tar manual:

Code:
     -p
     --preserve-permissions  Extract all protection information.

Yeah, the -p was throwing me off; that's why I didn't know the command to untar and keep all the permissions.

©2000-, AI Media Network Inc



Powered by vBulletin
Copyright © 2000- Jelsoft Enterprises Limited.