Quote:
Originally posted by justsexxx
I have two huge txt files with galleries: one with valid URLs, and one with URLs that no longer work, were cheated, etc.
Since they each contain over 1000 URLs, it is a lot of work to find the duplicates by hand.
Is there an easy way to find which URLs appear in both lists? They should be removed from the "good" list.
So: good.txt and bad.txt, and any URL listed in bad.txt should be removed from good.txt if it appears there.
How to do this quickly and easily?
Andre
uniq on its own won't do this: it only collapses *adjacent* duplicate lines within a single file (and only after sorting), and it never looks at a second file at all. What you want is a set difference: remove every line of bad.txt from good.txt. grep can do that directly:

grep -vxFf bad.txt good.txt > new.txt

-v inverts the match, -x matches whole lines only, -F treats the patterns as fixed strings (so the dots in URLs aren't regex wildcards), and -f bad.txt reads the patterns from your bad list.
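For the actual set difference the question asks for (dropping every bad.txt line from good.txt), here is a minimal runnable sketch with made-up sample URLs, assuming one URL per line and exact matches:

```shell
#!/bin/sh
# Sample data: three "good" URLs, one of which also appears in the bad list.
printf '%s\n' http://a.example http://b.example http://c.example > good.txt
printf '%s\n' http://b.example > bad.txt

# Keep only good.txt lines that do NOT appear (as whole fixed-string lines)
# anywhere in bad.txt:
#   -v  invert the match
#   -x  match entire lines
#   -F  patterns are fixed strings, not regexes
#   -f  read patterns from a file
grep -vxFf bad.txt good.txt > new.txt

cat new.txt
# http://a.example
# http://c.example
```

The original line order of good.txt is preserved, which grep gives you for free; a comm/sort pipeline would also work but reorders the output.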