Python logging: redirecting stdout from multiple processes


I am trying to capture the stderr and stdout of a number of processes and write their outputs to a log file using the Python logging module. The code below seems to achieve this. At present I poll each process's stdout and write to the logger if there is data. Is there a better way of doing this?

I would also like a master log of every individual process's activity; in other words, I want the stdout/stderr of each process to be written to a master logger automatically (without polling). Is this possible?

Thanks

    import os
    import fcntl
    import logging
    import logging.handlers
    from subprocess import Popen, PIPE, STDOUT
    from time import sleep

    # logs_dir and max_log_file_size are assumed to be defined elsewhere

    class MyProcess:
        def __init__(self, process_name, param):
            self.param = param
            self.logfile = logs_dir + "display_" + str(param) + ".log"
            self.args = [process_name, str(param)]
            self.logger_name = process_name + str(param)
            self.start()
            self.logger = self.initLogger()

        def start(self):
            self.process = Popen(self.args, bufsize=1, stdout=PIPE, stderr=STDOUT)  # line buffered
            # make each process's stdout non-blocking
            fd = self.process.stdout
            fl = fcntl.fcntl(fd, fcntl.F_GETFL)
            fcntl.fcntl(fd, fcntl.F_SETFL, fl | os.O_NONBLOCK)

        def initLogger(self):
            f = logging.Formatter("%(levelname)s - %(name)s - %(asctime)s - %(message)s")
            fh = logging.handlers.RotatingFileHandler(self.logfile, maxBytes=max_log_file_size, backupCount=10)
            fh.setFormatter(f)

            logger = logging.getLogger(self.logger_name)
            logger.setLevel(logging.DEBUG)
            logger.addHandler(fh)  # file handler
            return logger

        def getOutput(self):  # non-blocking read of stdout
            try:
                return self.process.stdout.readline()
            except:
                pass

        def writeLog(self):
            line = self.getOutput()
            if line:
                self.logger.debug(line.strip())
                #print line.strip()

    process_name = 'my_prog'
    num_processes = 10
    processes = []

    for param in range(num_processes):
        processes.append(MyProcess(process_name, param))

    while(1):
        for p in processes:
            p.writeLog()
        sleep(0.001)

Your options here are:

  • Non-blocking I/O: that is what you have already done :)

  • The select module: you can use either poll() or select() to dispatch reads across the different inputs.

  • Threads: create one thread per file descriptor you want to monitor and use blocking I/O. This is not advisable for large numbers of file descriptors, but at least it works on Windows (see the sketch after this list).

  • Third-party libraries: apparently you can also use Twisted or pyevent for asynchronous file access, but I have never done that...
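For the thread-per-pipe option, a minimal sketch might look like the following; reader_thread is a hypothetical helper, not part of your code, and with this approach the fcntl non-blocking setup would no longer be needed.

    import threading

    def reader_thread(pipe, logger):
        # Blocking reads are fine here because each pipe gets its own thread;
        # readline() returns an empty string once the child closes its stdout.
        for line in iter(pipe.readline, ''):
            logger.debug(line.strip())
        pipe.close()

    # One daemon thread per child process, e.g. right after Popen():
    #   t = threading.Thread(target=reader_thread,
    #                        args=(self.process.stdout, self.logger))
    #   t.daemon = True
    #   t.start()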

For more information, watch the video on non-blocking I/O with Python.

Since your approach seems to work, I would stick with it if the processor load it imposes does not bother you. If it does, I would go with select.select() on Unix.
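A rough sketch of what that could look like, assuming the same processes list and MyProcess class from your code (fd_map is just an illustrative name):

    import select

    # Map each child's stdout file descriptor back to its MyProcess instance.
    fd_map = dict((p.process.stdout.fileno(), p) for p in processes)

    while fd_map:
        # Block until at least one child has produced output (1 second timeout).
        readable, _, _ = select.select(list(fd_map.keys()), [], [], 1.0)
        for fd in readable:
            p = fd_map[fd]
            line = p.process.stdout.readline()
            if line:
                p.logger.debug(line.strip())
            else:
                # An empty read means the child closed its stdout; stop watching it.
                del fd_map[fd]

This removes the sleep()-based busy loop: the program sleeps inside select() until one of the children actually writes something.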

As for your question about the master logger: since you want to tee off the individual outputs, you cannot simply redirect everything to the master logger; you have to do it manually.
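For example, writeLog() could feed both loggers; master_logger is a hypothetical shared logger set up with its own handler (not shown):

    # A module-level logger shared by all MyProcess instances,
    # configured with its own RotatingFileHandler elsewhere.
    master_logger = logging.getLogger('master')

    # Inside MyProcess:
    def writeLog(self):
        line = self.getOutput()
        if line:
            text = line.strip()
            self.logger.debug(text)                                 # per-process log
            master_logger.debug('%s: %s', self.logger_name, text)   # combined master log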

