
I'd like to be able to kick off a big batch of jobs in the background, without using bash scripts, and keep working in the same kernel. Is this possible? I am open to architecture changes, but the end users of my library are likely not very sophisticated.

[1] create_batch = my_batch.create(**batch_input)

[2] run_batch = start_async_process(
        # real python... not bash
        sleep(9999)
        #my_batch.execute_jobs()
    )

[3] print("I can still do stuff while that runs!")

[4] my_batch.get_status()
  • The [n] markers represent IPython cells.
  • Python 3.7.6, running inside JupyterLab.
Kalanos

1 Answer


You could just create a thread for each job and start each of them. You will need a data structure to keep track of everything, but you can probably do this with basic `threading` from the standard library.
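A minimal sketch of the thread-per-job idea, assuming the batch can hand back an iterable of jobs that each expose a callable: only `my_batch` and `get_status()` come from the question; `my_batch.jobs` and `job.run()` are hypothetical names.

    import threading

    # One background thread per job, so the notebook kernel stays free.
    threads = []
    for job in my_batch.jobs:                      # hypothetical attribute
        t = threading.Thread(target=job.run, daemon=True)  # hypothetical run()
        t.start()
        threads.append(t)

    # The kernel is immediately usable again; poll progress from any later cell.
    my_batch.get_status()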

iHowell
  • Tested out some threading! The data is the easy part; each thread will just get its own sqlite connection. Given that a single job involves training an ML algorithm that already maxes out the cores, I fear that introducing multithreading on top of that will melt the CPU =\. – Kalanos Oct 30 '20 at 22:00
  • Hmm. I think I will create one separate thread that works its way through a queue. Thanks for the inspiration. – Kalanos Oct 30 '20 at 22:51
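For completeness, the single-worker-plus-queue approach from the last comment might look roughly like this. `run_job` and `jobs_to_run` are hypothetical stand-ins for whatever executes one job and for the collection of jobs to process.

    import queue
    import threading

    # One background thread drains a job queue, so only one CPU-heavy
    # training job runs at a time and the notebook kernel stays responsive.
    job_queue = queue.Queue()

    def worker():
        while True:
            job = job_queue.get()
            if job is None:              # sentinel: shut the worker down
                break
            run_job(job)                 # hypothetical single-job runner
            job_queue.task_done()

    threading.Thread(target=worker, daemon=True).start()

    # Enqueue work from any cell and keep using the kernel.
    for job in jobs_to_run:              # hypothetical iterable of jobs
        job_queue.put(job)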