Avoid buffering when parsing stdout with Perl
I want to parse the output of an external program (some shell command) line by line using Perl. The command runs continuously, so I put the parsing in a thread and use shared variables to communicate with the main routine.
So far my code looks similar to this:
    #!/usr/bin/perl
    use warnings;
    use strict;
    use threads;
    use threads::shared;

    my $var :shared;
    $var = "";

    threads->create( sub {
        # the command writes to stdout every ~100ms
        my $cmd = "<long running command> |";
        open(my $readme, $cmd) or die "Can't run program: $!\n";
        while (<$readme>) {
            my $line = $_;
            # extract information from the line
            $var = <some value>;
            print "debug\n";
        }
        close($readme);
    } );

    while (1) {
        # evaluate the variable every ~second
        print "$var\n";
        sleep 1;
    }
For some commands this works fine, and lines are processed as they come in. The output is similar to:
    ...
    debug
    debug
    ...
    <value 1>
    ...
    debug
    debug
    ...
    <value 2>
    ...
However, with other commands it behaves strangely and the lines are processed block-wise: $var doesn't get updated and debug isn't printed for some time either. Then the output looks similar to this:

    ...
    <value 1>
    <value 1>
    <value 1>
    ...
    debug
    debug
    debug
    ...
    <value 20>

and $var is set to the last/current value. This then repeats: the parsing is delayed and done in blocks, while $var is not updated in between.
First of all: is there a better/proper way to parse the output of an external program (line by line!) besides using a pipe?
If not, how can I avoid this behaviour?
I've read that using autoflush(1); or $| = 1; on the "currently selected output channel" might be a solution, but how do I use that in this context?
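As far as I understand it, the usage would be something like the following sketch, though I'm not sure it helps here since it only touches my own script's output handle:

    use IO::Handle;        # provides the autoflush() method on handles
    STDOUT->autoflush(1);  # flush STDOUT after every print
    # equivalently, for the currently selected output channel:
    $| = 1;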
Thanks in advance.
In the general case, your script cannot change the buffering of a child process's output. In specific cases you may be able to do so by starting the program with the appropriate switches, but that's it.
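For example, if the child buffers via C stdio, the GNU coreutils stdbuf tool can sometimes force line-buffered output. A minimal sketch, assuming stdbuf is available and using a placeholder command name:

    # Assumes GNU coreutils' stdbuf and a child that buffers via C stdio;
    # "long_running_command" stands in for the real program.
    my $cmd = "stdbuf -oL long_running_command |";
    open(my $readme, $cmd) or die "Can't run program: $!\n";
    while (my $line = <$readme>) {
        # lines now arrive as soon as the child emits them
        print "got: $line";
    }

This only works when the child's output goes through stdio's default buffering; programs that manage their own buffers are unaffected.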
Instead of writing your own code for running and reading the program, I recommend re-writing your script to use the IPC::Run module. It exists to solve exactly this sort of problem. The documentation isn't the best ever, but the module is well-tested and solid.
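A minimal sketch of what that re-write might look like; long_running_command is a placeholder, and new_chunker makes IPC::Run hand the callback whole lines:

    #!/usr/bin/perl
    use warnings;
    use strict;
    use IPC::Run qw( start new_chunker );

    my $var = "";

    my $h = start(
        [ 'long_running_command' ],   # placeholder for the real program
        # new_chunker splits the child's output stream into lines,
        # so the callback fires once per line
        '>', new_chunker, sub {
            my ($line) = @_;
            # extract information from the line here
            $var = $line;
            print "debug\n";
        },
    );

    $h->pump while $h->pumpable;  # drive the harness until the child is done
    $h->finish;

The pump-driven design replaces the thread from the original script; the once-a-second evaluation of $var could be layered on top with IPC::Run's timer support, omitted here for brevity.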