Any perl people in the house?
I'm trying to develop a script that searches through a simple .txt file and returns all the records matching a search query.
This is the general format of the .txt file:

xxx----somecategory----123Mb
yyy----someothercategory----456Mb
zzz----onemorecategory----789Mb

Then someone submits a query through my site, and I need to check whether any of the records in the .txt file match it by scanning through the xxx, yyy, zzz parts. Here's the snag: I have 50,000 entries in the .txt file, so a simple foreach loop takes way too long. Any ideas?
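For context, here's a rough sketch of the kind of foreach scan I mean; the filename records.txt and the field names are just placeholders, not my actual site code:

[CODE]
#!/usr/bin/perl
use strict;
use warnings;

# Naive linear scan: read every line, split on the ---- delimiter,
# and test the first field against the query.
my $query = shift @ARGV or die "usage: $0 QUERY\n";

open my $fh, '<', 'records.txt' or die "Cannot open records.txt: $!";
my @lines = <$fh>;
close $fh;

foreach my $line (@lines) {
    chomp $line;
    my ($name, $category, $size) = split /-{3,}/, $line;
    next unless defined $name;
    print "$line\n" if $name =~ /\Q$query\E/i;
}
[/CODE]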
The simplest solution would be to stop using a txt file and throw the data in a database.
[QUOTE=gezzed]Here's the snag: I have 50,000 entries in the .txt file, so a simple foreach loop takes way too long.[/QUOTE]
It's not the foreach that's taking the time. It's the pattern matching on 50,000 entries. The easiest way would be with a database, properly indexed. If you're stuck on storing your data in text files, you could create a new set of text files to index what's in the main text file.
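For illustration, here's a rough sketch of the database route using DBI with DBD::SQLite (assuming both modules are installed; the filename, table, and column names are made up):

[CODE]
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# One-time import: load records.txt into an indexed SQLite table.
my $dbh = DBI->connect('dbi:SQLite:dbname=records.db', '', '',
                       { RaiseError => 1, AutoCommit => 0 });

$dbh->do('CREATE TABLE IF NOT EXISTS records (name TEXT, category TEXT, size TEXT)');
$dbh->do('CREATE INDEX IF NOT EXISTS idx_name ON records (name)');

my $ins = $dbh->prepare('INSERT INTO records (name, category, size) VALUES (?, ?, ?)');

open my $fh, '<', 'records.txt' or die "Cannot open records.txt: $!";
while (my $line = <$fh>) {
    chomp $line;
    my ($name, $category, $size) = split /-{3,}/, $line;
    $ins->execute($name, $category, $size) if defined $size;
}
close $fh;
$dbh->commit;

# Each search is now a single indexed lookup instead of a 50,000-line scan.
my $rows = $dbh->selectall_arrayref(
    'SELECT name, category, size FROM records WHERE name = ?',
    undef, 'xxx');
print join('----', @$_), "\n" for @$rows;
[/CODE]

Note the index only speeds up exact (or prefix) matches on the name column; substring searches would still need LIKE or a full-text index.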
Probably with grep. It's inherently line-based and fast with regex. I don't have a sample, but that should get you on the right path, I think.
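If that means the command-line grep, a minimal sketch of shelling out to it from Perl (the filename and flags are just one way to do it; -F does a fixed-string search, -i ignores case):

[CODE]
#!/usr/bin/perl
use strict;
use warnings;

my $query = shift @ARGV or die "usage: $0 QUERY\n";

# The list form of open avoids the shell, so the query needs no quoting.
open my $grep, '-|', 'grep', '-i', '-F', '--', $query, 'records.txt'
    or die "Cannot run grep: $!";
print while <$grep>;
close $grep;
[/CODE]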
Looks like grep is the answer. Thanks again Tom.
Are the xxx, yyy, zzz parts a set length (like always 3 chars long)?
Using a %hash you could search a 50k-line text file easily.
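Something along these lines, as a rough sketch; the filename and the exact-match lookup are just for illustration:

[CODE]
#!/usr/bin/perl
use strict;
use warnings;

# Build the index once: key each record by its first field (lowercased),
# then every lookup is a hash access instead of a full rescan.
my %by_name;

open my $fh, '<', 'records.txt' or die "Cannot open records.txt: $!";
while (my $line = <$fh>) {
    chomp $line;
    my ($name) = split /-{3,}/, $line, 2;
    push @{ $by_name{lc $name} }, $line if defined $name;
}
close $fh;

my $query = 'xxx';    # placeholder query
if (my $hits = $by_name{lc $query}) {
    print "$_\n" for @$hits;
} else {
    print "No matches for $query\n";
}
[/CODE]

This only gives exact matches on the name field, though; partial matches still need a scan or a smarter index.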
Use regular expressions.
Hit me up on AIM if you want: freebsdteks