I have a gawk program that takes an event stream as input:
$CURL -s https://domain.com/path-to-stream | prog.awk
prog.awk
----
BEGIN {
    while ((getline line < "/dev/stdin") > 0) {
        # <does a lot of stuff>
    }
}
So far it is working great (yay awk), but I wonder: "does a lot of stuff"
might take a few seconds per event, and I don't want to drop any of the
incoming stream, which can be a lot. I'm not even sure what is doing the
buffering - the OS via the pipe? curl? awk? Can data be lost without my
knowing? Thanks for any insight.
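A minimal sketch of what I mean (the busy loop stands in for the slow per-event work; the writer and line counts are made up for the demo). If the pipe silently dropped data when the reader fell behind, the count at the end would come up short:

```shell
# A fast writer piped into a deliberately slow awk reader.
# The kernel blocks the writer when the pipe buffer fills (backpressure),
# so no lines are lost even though the reader lags.
seq 1 1000 | awk '{ for (i = 0; i < 5000; i++) x = i * i; n++ } END { print n }'
# prints 1000
```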