Hello,
Regarding my previous post about the memory leak:
It's in rl_json. The current version 0.11.0 obviously has a memory leak. I've run my critical loop quite a few times, with various statements commented out, and watched the memory usage in the MSVC analyzer. It constantly increases by about 186 kB with every iteration, until I specifically disable all calls to rl_json; then it stops growing between iterations. More concretely, I have one call to [rl_json::json get], which must be the troublemaker.
It's even more complicated than that: rl_json by itself has no problem. When I read a JSON file and call [rl_json::json get] very often, there is no increase in memory.
But my flow involves http. I use [http::geturl] to retrieve the data from a web service and then [rl_json::json get] to turn it into Tcl data. And yes, I do call [http::cleanup] after the http::geturl.
When I use http::geturl *only*, without json get afterwards, the memory usage is not constant, but rises and drops again. I don't know whether this is good or bad...
As soon as I use http::geturl to retrieve the data and then convert it
to a Tcl list with [rl_json::json get], there is a leak. Here is a
script to reproduce:
package require http 2.9.5
package require tls 1.7.22
package require rl_json 0.11.0
namespace import ::rl_json::*
http::register https 443 [list ::tls::socket -autoservername yes]
for {set i 0} {$i < 100000} {incr i} {
    puts "iteration $i ..."
    set tk [http::geturl https://api.kraken.com/0/public/Assets]
    set data [http::data $tk]
    rl_json::json get $data
    http::cleanup $tk
    after 3000
}
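A side note on the loop above: if [rl_json::json get] ever throws on an unexpected payload, the [http::cleanup] line is skipped and the token is leaked by design. A defensive variant (just a sketch, and not a fix for the leak described here) would be:

```tcl
for {set i 0} {$i < 100000} {incr i} {
    puts "iteration $i ..."
    set tk [http::geturl https://api.kraken.com/0/public/Assets]
    try {
        set data [http::data $tk]
        rl_json::json get $data
    } finally {
        ;# release the token even if json get errors out
        http::cleanup $tk
    }
    after 3000
}
```

This requires Tcl 8.6 or later for [try]; it rules out the "cleanup never ran" class of leak before blaming the libraries.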
When I run "tclsh", attach MSVC to it via the debugger, and then source the above script, one can see the memory usage increase by a more or less constant amount with every iteration.(*)(**)
If I instead substitute [exec curl] for the [http::geturl] calls, the memory does not change between iterations:
for {set i 0} {$i < 100000} {incr i} {
    puts "iteration $i ..."
    set data [exec curl -s https://api.kraken.com/0/public/Assets]
    rl_json::json get $data
    after 3000
}
Could it be that [http::cleanup] has a bug and does not clean up properly? How could one work around this issue (except by using curl, which is an option that is looking more attractive now...)?
Or does it maybe have to do with tls?
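One way to test the tls hypothesis would be to run the same loop over plain http, so that tls is out of the picture entirely. This is only a sketch: $url below is a hypothetical placeholder for any JSON endpoint reachable over plain http (the Kraken endpoint used above is https).

```tcl
package require http 2.9.5
package require rl_json 0.11.0

;# hypothetical plain-http JSON endpoint -- substitute your own
set url http://localhost:8080/assets.json

;# same shape as the leaking loop, but without http::register/tls:
;# if the growth disappears here, tls is the prime suspect;
;# if it persists, tls is exonerated
for {set i 0} {$i < 1000} {incr i} {
    puts "iteration $i ..."
    set tk [http::geturl $url]
    set data [http::data $tk]
    rl_json::json get $data
    http::cleanup $tk
    after 3000
}
```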
(*) The memory analyzer in MSVC is in the window that opens on the right when you attach the debugger to a running process. You can click on "Memory Usage" and then snapshot the memory at different times. If you snapshot after each iteration, i.e. every 3 seconds, the differences become obvious. They are shown as red up-arrows or green down-arrows below the usage graph (with the first version of the test script there are only red up-arrows).
(**) I guess it would also be possible to use valgrind on Linux.
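For the record, a minimal valgrind invocation might look like this (assuming the first reproduction script is saved as leaktest.tcl; the filename is my own choice):

```shell
# Run the reproduction script under valgrind's leak checker.
# --leak-check=full reports each leaked allocation with a backtrace;
# --show-leak-kinds=definite keeps the output focused on real leaks.
valgrind --leak-check=full --show-leak-kinds=definite \
    tclsh leaktest.tcl
```

One caveat: stock Tcl pools memory through its internal allocator, which can mask leaks from valgrind; a Tcl built with the PURIFY define (which bypasses the internal allocator) gives far more useful reports.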