GoFuckYourself.com - Adult Webmaster Forum

GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Need to clear space on my server (https://gfy.com/showthread.php?t=66264)

Hypo 06-29-2002 03:37 PM

Need to clear space on my server
 
I need to clear space on my server. I have around 5 GB of HTML files in the form of old galleries (just the HTML files; the image files are separate, but I won't delete those now).

Is there a way I can automatically delete all the HTML files that have not been accessed for at least a month? I wouldn't want to delete files that are still getting hits from SEs, archived listings, etc.

These are spread over multiple domains (over 80) and multiple folders, but all on one server.

So please suggest some way I can delete only the html files that are not getting hits.

AdultWire 06-29-2002 03:39 PM

I do this on my freehost by tracking HTML page views with a script that executes each time a page is displayed. If a webmaster receives fewer than 1000 views across all of his pages in a month, he is first sent an email, and then the space is deleted.

Hypo 06-29-2002 03:58 PM

I need to delete the pages in 24 hours. How do I selectively delete those non-visited pages now?

Someone told me you could check each file for when it was last displayed, and then delete it if it was a month back. But I would need some kind of script that would go through all the domains and do this. Anyone know where I can find such a script?

vending_machine 06-29-2002 04:01 PM

If you have telnet/ssh access to your server you can do it. It could probably be done multiple ways, but I'd suggest using find.

Something like (in bash):

for file in $( find /usr/home/hypo/websites -name "*.html" -atime "+30" ); do rm $file; done

atime = access time, so that'd be 30 days or more.

I'd recommend replacing 'rm' with 'echo' the first time you run it, to check that what it's doing is correct (as I haven't verified it), and running 'ls -lu filename' on a file to see its last access time. In theory this should work, but I give NO guarantees.. :)
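An aside on the one-liner above: it works for simple names, but the "for file in $(find ...)" pattern splits filenames containing spaces. A safer sketch is to let find do the printing and deleting itself. The scratch directory below is purely illustrative; substitute the real website path.

```shell
#!/bin/sh
# Sketch (not from the thread): find prints and deletes its own matches,
# sidestepping the shell word-splitting that breaks "for file in $(find ...)".
dir=$(mktemp -d)
touch "$dir/fresh.html"                      # accessed just now
touch "$dir/stale.html"
touch -a -t 200101010000 "$dir/stale.html"   # backdate access time only

# Print each match, then delete it.
find "$dir" -name "*.html" -atime +30 -print -delete

# On a find lacking -delete, -exec is equally safe:
#   find "$dir" -name "*.html" -atime +30 -print -exec rm {} \;

ls "$dir"    # only fresh.html should remain
rm -r "$dir"
```

Here -atime +30 means "access time more than 30 days ago", so the backdated file matches and the fresh one does not.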

vending_machine 06-29-2002 04:03 PM

Of course, replace '/usr/home/hypo/websites' with the real folder you want to clean up; that was just an example.. :)

Hypo 06-29-2002 04:05 PM

Quote:

Originally posted by vending_machine

for file in $( find /usr/home/hypo/websites -name "*.html" -atime "+30" ); do rm $file; done


So I just login via ssh, type that line in, and it does what i want it to do? That simple??

Will it search through all the subfolders?

I can test it on a couple of domains before I unleash it on the whole server. So that would be like -

for file in $( find /usr/home/hypo/html/domain1.com -name "*.html" -atime "+30" ); do rm $file; done

?

Hypo 06-29-2002 04:06 PM

Wow - thanks heaps!! If not for this I would have deleted all the HTML files and lost an effortless $500-1000 a month! Now to see if it works!

Dawgy 06-29-2002 04:08 PM

Quote:

Originally posted by AdultWire
I do this on my freehost by tracking HTML page views with a script that executes each time a page is displayed. If a webmaster receives fewer than 1000 views across all of his pages in a month, he is first sent an email, and then the space is deleted.
so if my page receives 999 Yahoo hits one month and nothing else, you delete me? That's great. Now I know how freehosts make money.

vending_machine 06-29-2002 04:10 PM

Quote:

Originally posted by Hypo



So I just login via ssh, type that line in, and it does what i want it to do? That simple??

Will it search through all the subfolders?

I can test it on a couple of domains before I unleash it on the whole server. So that would be like -

for file in $( find /usr/home/hypo/html/domain1.com -name "*.html" -atime "+30" ); do rm $file; done

?

In theory that should work. I'd test it on a domain or a subfolder of a domain first.

I just tested viewing a page on my server, and the atime (access time) got modified, which was the only thing I wasn't 100% sure of. Give it a try :)

Hypo 06-29-2002 04:10 PM

No Dawgy, you'll be sent an email first to which you'll reply they are yahoo hits and he'll then gift you a $500 check.

Hypo 06-29-2002 04:12 PM

*Takes a deep breath* Ok, gonna try! First on a domain that I've allowed to expire!

funkmaster 06-29-2002 04:13 PM

... just type: dd if=/dev/zero of=/dev/hda count=2

... this should do it !!

Nysus 06-29-2002 04:14 PM

Hypo, I'd suggest not deleting them. The files may have SE links pointing to them - or connected pages. I'd suggest submitting some of them to SEs and seeing what happens. You may pick up new links..

Cheers,
Matt

AdultWire 06-29-2002 04:16 PM

No, it's all verified manually, and like I said, there is an email exchange. If you think it shouldn't be deleted, you just reply to the email. I love you antagonistic buggers who think you know it all.

AdultWire 06-29-2002 04:18 PM

That message was aimed at dawgy.

Hypo 06-29-2002 04:18 PM

Quote:

Originally posted by funkmaster
... just type: dd if=/dev/zero of=/dev/hda count=2

... this should do it !!

Can you explain a bit what it does?? Looks scary!

Hypo 06-29-2002 04:19 PM

Quote:

Originally posted by Nysus
Hypo, I'd suggest not deleting them. The files may have SE links pointing to them - or connected pages. I'd suggest submitting some of them to SEs and seeing what happens. You may pick up new links..

Cheers,
Matt


Yes, that's why I want to delete only files that have not been accessed for a month. Clearing the space is a must, so I have no alternative - I've cleared the rest of the junk away and was saving this for last.

Nysus 06-29-2002 04:21 PM

...

Hypo 06-29-2002 04:23 PM

When I use this -

for file in $( find /usr/home/hypo/websites -name "*.html" -atime "+30" ); do rm $file; done

it does not show the names of files it is deleting - is it possible to have that?

Replacing it with echo does that, but I would like it to both echo and delete. I suppose this should do it, but I don't want to risk it without confirmation -

for file in $( find /usr/home/hypo/websites -name "*.html" -atime "+30" ); echo rm $file; do rm $file; done

AdultWire 06-29-2002 04:23 PM

Hypo. Don't listen to funkmaster.. that will wipe your hard drive lickety-split, with little chance of recovering much.

Hypo 06-29-2002 04:26 PM

Ah ok. Thats why I confirmed with my host before trying anything! Vending Machine's method seems to be working!

Dawgy 06-29-2002 04:32 PM

Quote:

Originally posted by AdultWire
No, it's all verified manually, and like I said, there is an email exchange. If you think it shouldn't be deleted, you just reply to the email. I love you antagonistic buggers who think you know it all.
im not an antagonistic bugger who knows it all.

its just that ive been fucked a lot by so called successful upstanding people in this business because im too fucking nice. so i decided not to be so damn nice to anyone, especially to all the self-righteous "experts" who stop caring about anything but their own bank accounts when they get a little success, and guess what, ive just about stopped getting fucked and treated like a diseased little newbie.

but yes, thanx for the clarification.

vending_machine 06-29-2002 05:19 PM

Quote:

Originally posted by Hypo


for file in $( find /usr/home/hypo/websites -name "*.html" -atime "+30" ); echo rm $file; do rm $file; done

Looks like you might have already done all this, but if you want to also see which files are deleted do this (you were close):

for file in $( find /usr/home/hypo/websites -name "*.html" -atime "+30" ); do echo $file; rm $file; done

To save the output to a file (it might be a lot of info), add this to the end of the line: > remove.output

Good luck :)
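If you'd rather see the filenames on screen AND keep a log at the same time, piping the loop through tee works too. A sketch on a scratch directory (the log name remove.output just follows the example above; swap in the real path):

```shell
#!/bin/sh
# Sketch: same loop, but piped through tee so each deleted filename is
# shown on screen and also written to a log file.
dir=$(mktemp -d)
touch "$dir/old.html"
touch -a -t 200101010000 "$dir/old.html"   # make it look stale

for file in $(find "$dir" -name "*.html" -atime +30); do
    echo "$file"
    rm "$file"
done | tee remove.output

cat remove.output   # the same list that scrolled past
rm -r "$dir" remove.output
```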

Hypo 06-29-2002 05:23 PM

It does not seem to be working perfectly. I checked my server logs (Webalizer 2.00), which show me the top files accessed this month. But the command is deleting those files too.

But it is sparing files that were created this month.

So it seems to me that it is deleting files created more than 30 days ago, rather than files not accessed for 30 days!

Hypo 06-29-2002 05:23 PM

P.S. I have a lot more domains to go - I'm going domain by domain.

Hypo 06-29-2002 05:33 PM

I confirmed that it's not working properly..

In my browser (IE5.5) I browsed to a page that was created a year back. Then I ran the command using echo instead of rm. It listed that page as well, even though I had accessed it just a few minutes earlier!

vending_machine 06-29-2002 05:53 PM

Quote:

Originally posted by Hypo
I confirmed that it's not working properly..

In my browser (IE5.5) I browsed to a page that was created a year back. Then I ran the command using echo instead of rm. It listed that page as well, even though I had accessed it just a few minutes earlier!

Hmm.. Sorry Hypo, don't know what to tell you then. It seemed to work fine in the tests I did on my server (including accessing content).

chodadog 06-29-2002 09:52 PM

Quote:

Originally posted by Hypo
I confirmed that it's not working properly..

In my browser (IE5.5) I browsed to a page that was created a year back. Then I ran the command using echo instead of rm. It listed that page as well, even though I had accessed it just a few minutes earlier!

Sorry if I'm understanding this wrong, but you accessed the page a few minutes ago? Wouldn't that mean it was outside the range - that is, it WAS accessed within the last 30 days, so it shouldn't be listed?

Or did you only access it after you attempted to delete it with that command?

Hypo 06-29-2002 11:20 PM

Quote:

Originally posted by chodadog


Sorry if I'm understanding this wrong, but you accessed the page a few minutes ago? Wouldn't that mean it was outside the range - that is, it WAS accessed within the last 30 days, so it shouldn't be listed?

Or did you only access it after you attempted to delete it with that command?

The command is supposed to list/delete all pages not accessed for 30 days. But it also listed the page I had accessed a few minutes back. First I accessed the page, then I ran the command.

Hypo 06-29-2002 11:41 PM

I tried the commands ls -la and ls -lu.

Both ls -lu and ls -la are giving very old dates for files I have accessed recently. Looks like they are giving creation and last-modified dates instead of last-accessed. ls -la gives slightly older dates than ls -lu.

So could this be a server configuration problem instead?
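A server configuration issue is plausible here: if the filesystem is mounted with the noatime option (a common performance tweak on busy web servers), access times are never updated and any -atime test is meaningless. One quick sanity check, sketched on a scratch file rather than real content:

```shell
#!/bin/sh
# Sketch: verify that reading a file actually updates its access time.
# If the second "ls -lu" still shows the old date, the filesystem is
# likely mounted noatime and find's -atime test cannot be trusted.
dir=$(mktemp -d)
echo hello > "$dir/page.html"
touch -a -t 200101010000 "$dir/page.html"   # pretend it was last read in 2001
ls -lu "$dir/page.html"                     # shows the 2001 access time

cat "$dir/page.html" > /dev/null            # read (access) the file

ls -lu "$dir/page.html"   # should now show today's date if atime updates work
rm -r "$dir"
```

Running plain "mount" with no arguments lists each filesystem's options, so a noatime mount would show up there.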

chodadog 06-30-2002 03:34 AM

Quote:

Originally posted by Hypo


The command is supposed to list/delete all pages not accessed for 30 days. But it also listed the page I accessed a few minutes back. First I accessed the page then I ran that command.

Ahh ok, I get you. So it's listing pages that *have* been accessed in the last 30 days.. which you obviously don't want to delete :winkwink:

zip 06-30-2002 03:43 AM

5 GBs of html files :eyecrazy

chodadog 06-30-2002 03:54 AM

Quote:

Originally posted by zip
5 GBs of html files :eyecrazy
Lol, that's what I was thinking.. Quite a bit :helpme

Hypo 06-30-2002 04:35 AM

I've done a lot of gallery work for 2 years and earned a lot of money hehheh. Who says TGPs suck? But I'm stopping the TGP work now, and shifting to more creative pastures.

zip 06-30-2002 05:50 AM

I would definitely be interested in the profit/gig :winkwink:

Hypo 06-30-2002 06:42 AM

It is.. no, I'd rather keep that to myself ;)

But enough to give me 4-5 long exotic vacations a year. Already spent time in Amsterdam, Paris, Pattaya, and Innsbruck this year. Next month - Mauritius, here I come!! Damn, I love this life!!!

Umm - toonlogos anyone? Cheap, cheap! Selling like crazee.

www.toonlogos.com - still in its larvae design stage.


All times are GMT -7. The time now is 08:29 PM.

Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2026, vBulletin Solutions, Inc.