My understanding is that whenever pipe.put(something) is executed, the item is placed into a queue (or something similar).
If we only want the workers to process the somethings, without caring about any return value, we can use disable_result, but there seems to be no limit on how much data can be put into the pipeline. If a large number of large items is put into the pipeline, will this cause problems (memory use, for example)? Is it possible to have only a certain maximum number of items waiting for processing, so that put(something) blocks once that limit is reached?
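To illustrate the behavior I have in mind (this is just the standard library, not this project's API): a bounded queue where put() blocks whenever maxsize items are already waiting, so the producer is throttled to the workers' pace.

```python
# Sketch of the desired backpressure using multiprocessing.Queue,
# not the pipeline's own API.
import multiprocessing as mp

def worker(q):
    while True:
        item = q.get()
        if item is None:        # sentinel: stop the worker
            break
        # ... process item; result is discarded ...

if __name__ == "__main__":
    queue = mp.Queue(maxsize=100)   # at most 100 items buffered
    procs = [mp.Process(target=worker, args=(queue,)) for _ in range(4)]
    for p in procs:
        p.start()

    for item in range(10_000):
        queue.put(item)             # blocks while the queue is full

    for _ in procs:
        queue.put(None)             # tell each worker to exit
    for p in procs:
        p.join()
```

Is there a way to get the same bounded/blocking behavior for pipe.put(something)?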