Implement mini-batching
We need to implement mini-batching and analyse whether it speeds up training.
Reaching the minimum seems to require more than 100,000 events, yet a single epoch can already take more than 5 hours in this setup. Mini-batching was not implemented in the original repository. Although each event already contains several edge examples, mini-batching across events could still speed up training.
However, implementing mini-batching without an inner loop over the mini-batches may be technically involved.
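As a starting point, a minimal sketch of the batching side (independent of the model) could look like the following. The `events` objects and the idea that each event carries several edge examples come from this note; everything else (function name, parameters) is a hypothetical illustration, not the repository's actual API:

```python
import random


def iter_minibatches(events, batch_size, shuffle=True, seed=0):
    """Yield lists of at most `batch_size` events per iteration.

    `events` stands in for the repository's event objects (hypothetical
    name). Since each event may itself contain several edge examples,
    the effective number of edges per batch is larger than `batch_size`.
    """
    indices = list(range(len(events)))
    if shuffle:
        # Deterministic shuffle so runs are reproducible for comparison.
        random.Random(seed).shuffle(indices)
    for start in range(0, len(indices), batch_size):
        yield [events[i] for i in indices[start : start + batch_size]]
```

In a training loop this would replace per-event updates with per-batch updates; avoiding the inner Python loop over batch members (i.e. vectorising the per-batch computation) is the technically involved part mentioned above.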