2 txt files, how to remove doubles?
I have 2 huge txt files with galleries: one with valid URLs, and one with URLs which don't work anymore, are cheats, etc.
Since they contain over 1000 URLs each, it is a lot of work to find the duplicates by hand. Is there an easy way to find which URLs appear in both files? They should be removed from the "good" list: so there is good.txt and bad.txt, and any URL listed in bad.txt should be removed from good.txt in case it is listed there. How to do this quickly and easily?

Andre
Any basic programmer out there can do this very quickly!
I need one that is ready to use. Money is not an issue. :)
I'll help ya, check ICQ.
I think this works: sort old.txt | uniq > new.txt

(uniq only removes adjacent duplicate lines, so the input has to be sorted first.)
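As an aside, a minimal sketch of the same dedupe step, assuming one URL per line (old.txt and new.txt are just placeholder names): sort -u does the sorting and the duplicate removal in one command.

# sort the lines and keep only one copy of each (-u = unique)
sort -u old.txt > new.txt

Note this only removes duplicates within a single file; it does not yet subtract bad.txt from good.txt.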
I have a macro in Excel to do that for me.
To remove the entries in bad.txt from good.txt:

fgrep -vf bad.txt good.txt > newgood.txt

I think that should do the trick.
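One caveat, as a hedged footnote: fgrep matches fixed strings anywhere in a line, so a URL in bad.txt that happens to be a substring of a longer good URL would knock that line out too. Assuming both files hold exactly one URL per line, adding -x restricts matching to whole lines (grep -F is the modern spelling of fgrep):

# -F fixed strings, -x match whole lines only,
# -v keep lines that do NOT match, -f read patterns from bad.txt
grep -Fxvf bad.txt good.txt > newgood.txt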