Managing small batches and continuous data streams

Hello, I am looking for recommendations on how best to handle a workflow where images to be annotated arrive in a regular, continuous stream rather than in convenient chunks.

In our application, new images that need to be annotated are generated around the clock, 24x7. As a result, there are no natural checkpoints or signals for when to group things into batches. Instead, we currently upload new images to our dataset individually as they come into our app.

The trick is that we want these new images to enter the annotation workflow as soon as they are uploaded, but we are unsure about creating a batch per single image. It feels inefficient and hard to manage.

On the other hand, we don’t want to wait until the end of the hour/day to create batches either, since we need to maintain a decent turnaround time on the annotation workflow and don’t want any image to sit for very long before being annotated.
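To make the question concrete, the kind of in-between approach I have in mind would look roughly like the sketch below: buffer incoming image ids and flush a batch when either a size threshold or a maximum-age threshold is hit, whichever comes first. Everything here (the thresholds, the create_annotation_batch() call, the class name) is just a placeholder for illustration, not any particular platform's real API.

```python
import time
from threading import Lock, Thread

# Hedged sketch of a hybrid batching buffer: flush when either a size
# threshold or a max-age threshold is reached. These values and the
# create_annotation_batch() function are placeholders, not a real API.
BATCH_SIZE = 25          # flush as soon as this many images are queued
MAX_AGE_SECONDS = 300    # ...or when the oldest queued image is this old

def create_annotation_batch(image_ids):
    # Placeholder: in a real setup, call your annotation tool's
    # batch/task creation endpoint here.
    print(f"Creating annotation batch with {len(image_ids)} images")

class MicroBatcher:
    def __init__(self):
        self._lock = Lock()
        self._pending = []      # image ids waiting to be batched
        self._oldest_ts = None  # upload time of the oldest pending image

    def add(self, image_id):
        """Called once per uploaded image."""
        with self._lock:
            if not self._pending:
                self._oldest_ts = time.time()
            self._pending.append(image_id)
            if len(self._pending) >= BATCH_SIZE:
                self._flush_locked()

    def flush_if_stale(self):
        """Called periodically (e.g., every few seconds) by a timer."""
        with self._lock:
            if self._pending and time.time() - self._oldest_ts >= MAX_AGE_SECONDS:
                self._flush_locked()

    def _flush_locked(self):
        create_annotation_batch(self._pending)
        self._pending = []
        self._oldest_ts = None

if __name__ == "__main__":
    batcher = MicroBatcher()

    # Background timer that enforces the max-age flush.
    def _age_loop():
        while True:
            time.sleep(5)
            batcher.flush_if_stale()

    Thread(target=_age_loop, daemon=True).start()

    # Simulate a trickle of uploads; in practice add() would be called
    # from whatever handles each individual image upload.
    for i in range(60):
        batcher.add(f"image-{i:04d}")
        time.sleep(0.1)
```

This kind of thing caps how long any single image waits while still avoiding one-image batches, but I'm not sure whether it's the right pattern or what sensible threshold values would be, hence the questions below.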

Has anyone built a workflow for this situation, or have ideas on the best way to batch this sort of streaming data upload? What is the minimum “practical” size of a batch? What happens if I end up with many small batches (on the order of 1-10 items each)?