More than 15 years ago, I made a quick'n'dirty text editor prototype that opened text files like this:
----
if {$_fileOpen != ""} {
    $::t1::text delete 1.0 end
    set _channel [open $_fileOpen r]
    set _slurp [read $_channel]
    close $_channel
    $::t1::text insert end $_slurp
    focus $::t1::text
}
----
And it worked, but when I tested it on a large file (I don't remember what
my idea of "large" was at the time), the application froze for a while
before it displayed the file's contents. It froze for long enough to be
annoying.
I asked for help and someone taught me this:
----
proc p.openstream {argTextWidget argChannel {argStreamSize 10000}} {
    $argTextWidget insert end [read $argChannel $argStreamSize]
    if {[eof $argChannel]} {
        close $argChannel
    } else {
        # [info level 0] is this same call; rescheduling it through the
        # event loop lets pending GUI events run between chunks
        after idle [list after 0 [info level 0]]
    }
}
if {$_fileOpen != ""} {
    $::t1::text delete 1.0 end
    set _channel [open $_fileOpen r]
    p.openstream $::t1::text $_channel 1000
    focus $::t1::text
}
----
Not only did it work, but the application no longer froze: the content loaded instantly.
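Since each chunk goes back through the event loop, the GUI stays live while the
file is loading; I suppose one could even show rough progress with a small
variation like this (just an untested sketch, and it assumes the main toplevel
is "."):
----
proc p.openstream {argTextWidget argChannel {argStreamSize 10000}} {
    $argTextWidget insert end [read $argChannel $argStreamSize]
    # rough progress in the window title while loading (assumes toplevel ".")
    wm title . "loading... [$argTextWidget count -chars 1.0 end] chars"
    if {[eof $argChannel]} {
        close $argChannel
        wm title . "done"
    } else {
        after idle [list after 0 [info level 0]]
    }
}
----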
So many years later, I dug up that openstream code and decided to build upon it.
It still works, but I noticed that the "openstream" method adds a bunch of empty lines to the content, which is, of course, not desired.
I don't know how to fix that. In any case, I decided to go back to the older and dumber "slurp" method, and then I tested it on a 200K-line, 16MB file.
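In case anyone wants to try the same test, a stand-in file of roughly that size
can be generated with something like this (not my actual data, just filler;
adjust the path as needed):
----
# roughly 200,000 lines of ~80 characters each, about 16 MB total
set _fh [open /tmp/bigtest.txt w]
for {set _i 0} {$_i < 200000} {incr _i} {
    puts $_fh [format "line %06d: %s" $_i [string repeat x 64]]
}
close $_fh
----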
It loaded instantly. Well, that's good, but more than 15 years later, I have
a much more powerful machine than I did in the early to mid oughts.
My question is: has anything changed in Tcl -internally- since then
that makes the "slurp" method inherently more efficient?
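I haven't measured anything properly, just eyeballed it, but I suppose a rough
number on any given machine could be had with the [time] command, e.g. for the
slurp version:
----
# rough wall-clock number for the slurp version, single run
puts [time {
    set _channel [open $_fileOpen r]
    set _slurp [read $_channel]
    close $_channel
    $::t1::text delete 1.0 end
    $::t1::text insert end $_slurp
} 1]
----
(The "openstream" version returns right away and finishes inside the event
loop, so [time] wouldn't capture its full run.)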
What is going to happen if someone tries to load a very large file on a
much less powerful machine?
I don't want to leave anyone behind. I refuse to be the jerk who just says "upgrade your gear and shut up" and shrugs it off.
TIA
--
Luc