Subject : Re: Does cheating produce faster searches?
From : sdeacon (at) *nospam* us.socionext.com (Shaun Deacon)
Newsgroups : comp.lang.tcl
Date : 27. Sep 2024, 02:46:36
Organisation : A noiseless patient Spider
Message-ID : <vd52pu$ibfo$1@dont-email.me>
References : 1
User-Agent : Mozilla/5.0 (Windows NT 10.0; WOW64; rv:91.0) Gecko/20100101 Firefox/91.0 SeaMonkey/2.53.19
Luc wrote:
Suppose I have a large list. A very large list. Then I want to search
for an item in that list:
% lsearch $list "string"
Now, suppose I have many lists instead. One list contains all the items
that begin with the letter a, another list contains all the items that
begin with the letter b, another list contains all the items that begin
with the letter c, and so on. Then I see what the first character in
"string" is and only search for it in the one corresponding list.
Would that be faster? I somehow suspect the answer is 'no.'
Bonus question: what about sqlite operations? Would they be faster if
I had one separate table for each initial letter/character?
TIA
As you've probably discovered, lsearch can be slow for huge lists, and if you're doing basic searches on a big list a lot, the cost can be noticeable.
Depending very much on what you want to do with the result of your search, and if speed is your primary concern, there may be another way to approach this...
If you're just checking whether a word exists in some word list, have you considered using an array variable?
# build the lookup table once: each word becomes an array key
foreach word $bigWordList {
    set words($word) 1
}
...
set string "foo"
# array element lookup is a hash lookup, not a linear scan
if { [info exists words($string)] } {
    puts "$string is in my list"
} else {
    puts "$string is not in my list"
}
The above test is quick, and you only build the initial 'words' array once (adding new words later is just a matter of setting new elements). Of course, in "set words($word) 1" you could store something more useful than 1 as the value...
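
If you want to see the difference on your own data, Tcl's [time] command makes a rough comparison easy. A quick sketch (the iteration count and the word being looked up, "zygote", are just placeholders; it assumes $bigWordList and the words() array are built as above):

set needle "zygote"   ;# hypothetical word to search for

# linear scan of the whole list
puts "lsearch: [time { lsearch -exact $bigWordList $needle } 100]"

# hash lookup in the array built above
puts "array:   [time { info exists words($needle) } 100]"

Both lines print something like "N microseconds per iteration", so you can judge whether the array build is worth it for your list size and how often you search.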