#capjamesg[d]sknebel I want to run my web crawler across multiple threads. concurrent.futures is working well (it doesn't crash, it's super fast, and it's indexing content), but I don't know how to add another job for it to work on after all the initial jobs are complete.
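One possible approach (a minimal sketch, assuming a ThreadPoolExecutor and a hypothetical crawl() helper standing in for the crawler's fetch/index step): as long as the executor hasn't been shut down, i.e. you're still inside the `with` block, you can call executor.submit() again after concurrent.futures.wait() says the first batch is done.

```python
import concurrent.futures

def crawl(url):
    # Hypothetical placeholder for the real fetch-and-index step.
    print(f"indexing {url}")

# Assumed seed URLs, purely for illustration.
seed_urls = ["https://example.com/", "https://example.org/"]

with concurrent.futures.ThreadPoolExecutor(max_workers=8) as executor:
    # Submit the initial batch of jobs.
    futures = [executor.submit(crawl, url) for url in seed_urls]

    # Block until every initial job has finished.
    concurrent.futures.wait(futures)

    # The executor is still usable here, so follow-up work can be
    # queued once the first batch is complete.
    followup = executor.submit(crawl, "https://example.net/sitemap.xml")
    followup.result()
```

The same idea works with as_completed() instead of wait(); the key point is only to avoid submitting after the `with` block exits, because the executor is shut down at that point.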