GoFuckYourself.com - Adult Webmaster Forum
Discuss what's fucking going on, and which programs are best and worst. One-time "program" announcements from "established" webmasters are allowed.

#1
Confirmed User
Join Date: Jul 2002
Posts: 1,556
tar - multivolume???
I have to send about 40 GB of data as tar files, but Linux will only handle files up to 2 GB. What should the tar command look like to compress a folder recursively and start a new file every 2 GB?
Any help will be appreciated
__________________
Zappu (ICQ: 23141467) European Erotik Content Archive CONTENT4FREE = CONTENT4CLICKS X JOIN NOW!
#2
Confirmed User
Join Date: Jul 2002
Posts: 1,556
I have found a solution, just in case somebody else needs it:

tar czf - datainput-directory | split -b2000m - name.tgz.

and to restore, concatenate the pieces back into one stream (a plain `tar xzf - < name.tgz*` will not work, since a shell redirection takes only a single file):

cat name.tgz.* | tar xzf -

Do not use 2048, because the actual limit is 2 GB minus 1 byte.
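A minimal round-trip sketch of the same pipeline. The `photos` directory, the file inside it, and the `restore` target here are made-up example names, not from the post; the `-C` flag assumes GNU tar:

```shell
#!/bin/sh
set -e

# Example input (hypothetical): a directory with one file in it.
mkdir -p photos
echo hello > photos/a.txt

# Archive to stdout and split into name.tgz.aa, name.tgz.ab, ...
# each at most 2000 MB (stays safely under the 2 GB file limit).
tar czf - photos | split -b 2000m - name.tgz.

# Restore: concatenate the pieces in order and extract the stream.
mkdir -p restore
cat name.tgz.* | tar xzf - -C restore

# Verify the round trip preserved the data.
cmp photos/a.txt restore/photos/a.txt && echo "round trip OK"
```

With input this small, `split` emits a single piece (`name.tgz.aa`); with 40 GB of data the same command would produce about twenty pieces that `cat name.tgz.*` reassembles in lexical order.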