09-06-2004, 12:49 PM
JulianSosa
Confirmed User
 
Join Date: Aug 2003
Posts: 3,042
Quote:
Originally posted by justsexxx
I have two huge txt files with galleries: one with valid URLs, and one with URLs which don't work anymore, cheated, etc.

Since they contain over 1000 URLs each, it is a lot of work to find the duplicates by hand.

Is there an easy way to find which URLs appear in both? Those should be removed from the "good" list.

So: good.txt and bad.txt, and URLs listed in bad.txt should be removed from good.txt in case they are listed there.

How to do this quickly and easily?

Andre

I think this works for the duplicates within one file (uniq only drops adjacent repeats, so the file has to be sorted first; sort -u does both in one go):

sort -u old.txt > new.txt
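
That only dedupes a single file, though. To actually strip the bad.txt URLs out of good.txt, here is a minimal sketch, assuming both files hold one URL per line (filenames are from the question; clean.txt is just a name I picked):

# Keep only the lines of good.txt that do NOT appear in bad.txt:
#   -F          treat the bad.txt entries as fixed strings, not regexes
#   -x          match whole lines only
#   -v          invert the match, i.e. keep the non-matching lines
#   -f bad.txt  read the match list from bad.txt
grep -Fxvf bad.txt good.txt > clean.txt

If both files are sorted first, comm -23 good.txt bad.txt does the same job and tends to be faster on very large lists.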