[python] [subprocess] [capstone]

subprocess.Popen eats memory while in a loop


I wrote a Python script to help me automate a project I'm working on. Since I couldn't get the Capstone binding to install at the time, it runs a Capstone subprocess (cstool) in a loop, reading 4 bytes at a time from a binary file and passing them as the Capstone input.

The problem is, the binary file was over 30 MB, so the script ran for a while. I left it alone for a few minutes, and when I got back, my RAM usage was at 98%.

I've since moved on to using the Capstone binding, but I'm curious what went wrong here. My code is below. Thanks!

import binascii
import subprocess
import sys

def main():
    if len(sys.argv) != 3:
        print(sys.argv[0] + " [binary file] [output file]")
        sys.exit(1)
    hexdata = ""
    f = open(sys.argv[1], "rb")
    if f == None:
        print("Could not open file!")
        sys.exit(1)
    data = f.read()
    f.close()
    x = 0
    out = open(sys.argv[2], "a+")
    while x < len(data):
        for y in xrange(4):
            hexdata += binascii.hexlify(data[x])
            x += 1
        popen = subprocess.Popen(["cstool", "arm64", hexdata], stdout=subprocess.PIPE, universal_newlines=True)
        for line in iter(popen.stdout.readline, ""):
            out.write(line.rstrip()[13:])
        popen.stdout.close()
        hexdata = ""
    out.close()

if __name__ == "__main__":
    main()

EDIT: Fixed the code. I had written it back from memory, so a few details were slightly off at first.


Solution

  • You forgot to call popen.kill() or popen.terminate() on the processes you keep starting over and over again, so each finished cstool process lingers.

    Adding popen.kill() or popen.terminate() at the end of the while loop body should kill the leftover process each iteration (calling popen.wait() there also reaps the finished child).
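A minimal sketch of the corrected loop, with a placeholder command standing in for ["cstool", "arm64", hexdata] so it runs anywhere; the key change is reaping each child after its output is drained:

```python
import subprocess
import sys

def run_once(hexdata):
    # Placeholder command that just echoes its argument back, standing in
    # for ["cstool", "arm64", hexdata] in the original script.
    popen = subprocess.Popen(
        [sys.executable, "-c", "import sys; print(sys.argv[1])", hexdata],
        stdout=subprocess.PIPE,
        universal_newlines=True,
    )
    lines = [line.rstrip() for line in iter(popen.stdout.readline, "")]
    popen.stdout.close()
    # Reap the child so it doesn't linger after it exits; popen.kill()
    # or popen.terminate() before this would stop a still-running process.
    popen.wait()
    return lines

out = []
for chunk in ("aabbccdd", "11223344"):  # illustrative hex chunks
    out.extend(run_once(chunk))
print(out)
```

Without that cleanup, each loop iteration leaves behind a finished-but-unreaped process, and over thousands of iterations those accumulate.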