Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.
Discuss what's fucking going on, and which programs are best and worst. One-time "program" announcements from "established" webmasters are allowed.
#1
Too lazy to set a custom title
Join Date: Aug 2001
Location: The Netherlands
Posts: 13,723
2 txt files, how to remove doubles?
I have 2 huge txt files with galleries. One with valid URLs, and one with URLs which don't work anymore, cheated, etc.
Since they each contain over 1000 URLs, it is a lot of work to find the duplicates by hand. Is there an easy way to find which URLs appear in both? They should be removed from the "good" list: so there's good.txt and bad.txt, and URLs listed in bad.txt should be removed from good.txt in case they are listed there. How to do this quickly and easily? Andre
__________________
Questions? ICQ: 125184542
#2
Programming King Pin
Join Date: Oct 2003
Location: Montreal
Posts: 27,360
Any basic programmer out there can do this very quickly!
__________________
UUGallery Builder - automated photo/video gallery plugin for Wordpress!
#3
Too lazy to set a custom title
Join Date: Aug 2001
Location: The Netherlands
Posts: 13,723
I need one that is ready to use. Money is not an issue
__________________
Questions? ICQ: 125184542
#4
Too lazy to set a custom title
Join Date: Jul 2001
Posts: 59,204
I'll help ya, check icq.
#5
Too lazy to set a custom title
Join Date: Aug 2001
Location: The Netherlands
Posts: 13,723
Quote:
Andre
__________________
Questions? ICQ: 125184542
#6
Confirmed User
Join Date: Aug 2003
Posts: 3,042
Quote:
I think this works: sort old.txt | uniq > new.txt (uniq only drops adjacent duplicates, so the input has to be sorted first)
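Worth spelling out: `uniq` only collapses *adjacent* duplicate lines, so a plain `cat file | uniq` will miss duplicates that aren't next to each other; sorting first fixes that, and `sort -u` does both in one pass. A quick sketch, using the old.txt/new.txt names from the post above:

```shell
# uniq only removes adjacent duplicates, so sort first
sort old.txt | uniq > new.txt

# equivalent one-step form
sort -u old.txt > new.txt
```

Note this only de-duplicates a single file; it doesn't subtract bad.txt from good.txt, which is what the thread is actually after.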
#7
Confirmed User
Join Date: Aug 2004
Location: Montreal, Canada
Posts: 5,600
Quote:
I have a macro in Excel to do that for me.
#8
Too lazy to set a custom title
Join Date: Mar 2002
Location: Australia
Posts: 17,393
To remove the entries in bad.txt from good.txt:
cat good.txt | fgrep -vf bad.txt > newgood.txt

I think that should do the trick.
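One caveat with `fgrep -vf`: it matches fixed *substrings*, so a bad URL that happens to be a prefix of a good one would knock the good one out too. Adding `-x` restricts the match to whole lines; `comm` is another option if sorted output is acceptable. A sketch, assuming one URL per line in each file (the `comm` variant uses bash process substitution):

```shell
# whole-line fixed-string match: only exact URL matches get removed
grep -Fxvf bad.txt good.txt > newgood.txt

# alternative: comm -23 keeps lines that appear only in the first file
# (both inputs must be sorted; the output comes out sorted too)
comm -23 <(sort good.txt) <(sort bad.txt) > newgood2.txt
```

For example, with http://a/1 in bad.txt, the `-x` version keeps http://a/12 in the good list, while plain `fgrep -vf` would drop it as a substring match.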