fix: Change batch splitting in the filter module

Benjamin Huth requested to merge bhuth/commonframework:fix/filter-split into dev

I ran into some trouble with the filter module today. The implemented memory_robust_eval escalated into a long sequence of RuntimeErrors and finally ended up failing to allocate as little as 5 MiB of memory...

I noticed that, as currently implemented, the batch object is cloned many times. Every time a memory error occurs, the batch is cloned twice, but those copies go unused if a memory error occurs again. I think this leads to an exponentially increasing number of batch copies.
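To illustrate the problem, here is a minimal, hypothetical sketch of the clone-then-recurse pattern described above (not the actual filter-module code): on every memory error the batch is cloned twice before recursing, so the clones made at one level are immediately re-cloned if the halves fail again. A `FakeBatch` class stands in for the real batch object and counts clones.

```python
class FakeBatch:
    """Stand-in for a real batch object; counts how many clones are made."""
    clone_count = 0

    def __init__(self, size):
        self.size = size

    def clone(self):
        FakeBatch.clone_count += 1
        return FakeBatch(self.size)


def evaluate(batch, max_size):
    """Pretend evaluation that 'runs out of memory' above max_size."""
    if batch.size > max_size:
        raise MemoryError


def memory_robust_eval(batch, max_size):
    try:
        evaluate(batch, max_size)
    except MemoryError:
        # Two clones per failure; both are re-cloned if the halves fail again.
        left, right = batch.clone(), batch.clone()
        left.size //= 2
        right.size = batch.size - left.size
        memory_robust_eval(left, max_size)
        memory_robust_eval(right, max_size)


memory_robust_eval(FakeBatch(16), max_size=1)
# 30 clones are made to run just 16 small evaluations:
# every non-leaf level of the split tree clones the batch twice.
print(FakeBatch.clone_count)
```

Splitting a batch of 16 down to single items makes 30 clones here, and the count grows with every extra level of splitting, which matches the behaviour described above.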

I tried out a more sequential approach that does only one .clone() at a time, and also caches the number of splits between evaluations. That worked well for me.
