
I have a Python script that runs forever, but I'd like to restart it every day just in case it gets into a bad state, gets shut down, etc. I'm scheduling it daily with Task Scheduler, which is fine.

The problem is that my script spawns other processes, and those child processes don't get killed when the task is stopped (e.g. when I manually end it, or when a new instance of the task runs). The main script is killed, but the child processes are not; I have to kill them manually through Task Manager. If I run my main script from the command line instead of through Task Scheduler, closing the console kills all child processes, as expected.

I tried running the task with and without "highest privileges", "if the task does not end when requested, force it to stop", "if the task is already running, stop the existing instance", etc. I've tried running taskkill as the first action of the task. I've tried everything I can think of. I'm running the task as myself, and I'm an administrator.

The problem seems to be that the child processes spawned by tasks are access protected. I've tried taskkill, pskill, and Python scripts like this:

import psutil

# find the forever-running script among all processes and terminate it
for process in psutil.process_iter():
    cmdline = process.cmdline()  # list of command-line arguments
    if any("myscript.py" in arg for arg in cmdline):
        process.terminate()

I always get an access denied error. Example from taskkill:

c:\>taskkill /f /t /im python.exe
ERROR: The process with PID 14436 (child process of PID 7928) could not be terminated.
Reason: Access is denied.
ERROR: The process with PID 7928 (child process of PID 14324) could not be terminated.
Reason: Access is denied.
...

Hell, I even tried having my main Python script kill its own child processes when a termination event comes in, but I don't even get that event with Task Scheduler, with or without force kill! Again, on a normal command-line run, I get termination events as expected. But thwarted by Task Scheduler once again.
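For reference, the handler I register looks roughly like this (the signal names are the standard library's; the cleanup helper is my own and uses psutil):

import signal
import sys
import psutil

def _cleanup(signum, frame):
    # terminate every child this script spawned, then exit
    for child in psutil.Process().children(recursive=True):
        child.terminate()
    sys.exit(0)

# SIGBREAK is the Windows Ctrl+Break event; SIGTERM for good measure
signal.signal(signal.SIGBREAK, _cleanup)
signal.signal(signal.SIGTERM, _cleanup)

It runs when I close the console by hand, but never fires when Task Scheduler ends the task.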

This is also very easy to reproduce with a one-line batch file that runs a Python script. If Task Scheduler runs the batch file, and you end the task, the Python script will not be killed when the task ends.
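If you want to reproduce it, the batch file and script can be as trivial as this (the path and file names are just placeholders):

run_myscript.bat:

python C:\scripts\myscript.py

myscript.py:

# spawn one child process, then idle forever
import subprocess
import time

subprocess.Popen(["python", "-c", "import time; time.sleep(86400)"])
while True:
    time.sleep(60)

Schedule the batch file, end the task, and the python.exe processes are still sitting there in Task Manager.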

Any idea why child processes spawned by Task Scheduler tasks are not killed, and how to work around it?

Thanks!

1 Answer


If you can edit the Python script itself, maybe you can:

  • Set it to check for a signal (e.g. have the scheduler create a file named signal.txt); if the file exists, delete it and have the script kill the processes itself (i.e. the script that runs forever gets a 'Medea' function that kills all of its children before committing suicide); see the sketch after this list
  • Assign the launched processes to a job object (see the Stack Overflow answer by Nathaniel J. Smith); see the pywin32 sketch after this list
  • Give each process a 23.9-hour timer (similar to this other Stack Overflow answer) so they 'die' on their own before the scheduler relaunches them; this is folded into the first sketch below
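For the first and third bullets, a rough sketch of what the forever-loop could look like (the file name, poll interval and the 23.9-hour cutoff are arbitrary choices; adapt them to your script):

import os
import time
import psutil

SIGNAL_FILE = "signal.txt"     # the scheduler (or you) creates this to request shutdown
MAX_RUNTIME = 23.9 * 3600      # exit shortly before the daily relaunch
start = time.time()

def medea():
    # terminate every child of this process, forcefully if needed, then exit
    children = psutil.Process().children(recursive=True)
    for child in children:
        child.terminate()
    gone, alive = psutil.wait_procs(children, timeout=5)
    for child in alive:
        child.kill()
    raise SystemExit(0)

while True:
    # ... the script's normal work goes here ...
    if os.path.exists(SIGNAL_FILE):
        os.remove(SIGNAL_FILE)
        medea()
    if time.time() - start > MAX_RUNTIME:
        medea()
    time.sleep(10)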

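For the second bullet, a minimal pywin32 sketch of that job-object answer (worker.py is a placeholder for whatever you actually launch): put each child into a kill-on-close job, so that when the parent dies and its last handle to the job is closed, Windows tears down everything inside the job:

import subprocess
import win32api
import win32con
import win32job

# create a job object whose processes are killed when its last handle closes
job = win32job.CreateJobObject(None, "")
info = win32job.QueryInformationJobObject(job, win32job.JobObjectExtendedLimitInformation)
info["BasicLimitInformation"]["LimitFlags"] = win32job.JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE
win32job.SetInformationJobObject(job, win32job.JobObjectExtendedLimitInformation, info)

child = subprocess.Popen(["python", "worker.py"])

# assign the child to the job; when this parent process exits, the job handle
# is closed and Windows terminates every process in the job
handle = win32api.OpenProcess(
    win32con.PROCESS_TERMINATE | win32con.PROCESS_SET_QUOTA, False, child.pid
)
win32job.AssignProcessToJobObject(job, handle)

One caveat: if the parent is itself already inside a job that disallows breakaway, AssignProcessToJobObject can fail; in that case you may need to start the child with subprocess.CREATE_BREAKAWAY_FROM_JOB, or rely on nested jobs on Windows 8 and later.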
Hope it helps...

  • Yes, this was the solution I came up with as well. But it seems crazy that there's no direct solution.
    – Steve
    Commented Oct 26, 2019 at 20:16
