Our batch script updates custom block entities (about 4000 items or more).
First, the script gets the list of block UUIDs; then, in the batch_op function, it loads each block:
\Drupal::service('entity.repository')->loadEntityByUuid('block_content', $block_id);
updates the field values, and then saves the block:
$block->save();
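The batch operation looks roughly like this (simplified; the module name, callback name, and field_example are placeholders, not the real ones):

    /**
     * Batch operation: update one custom block, identified by its UUID.
     */
    function mymodule_update_block_batch_op($block_id, array &$context) {
      /** @var \Drupal\block_content\Entity\BlockContent|null $block */
      $block = \Drupal::service('entity.repository')
        ->loadEntityByUuid('block_content', $block_id);

      if ($block) {
        // Placeholder field update; the real script sets several field values.
        $block->set('field_example', 'updated value');
        $block->save();
      }

      $context['results'][] = $block_id;
    }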
When the script has processed all the items and should display the result message, I get:
"Allowed memory size of 536870912 bytes exhausted"
As far as I have figured out, the script loads each block again once the batch has processed all the items (ContentEntityBase->__construct is called for every block).
I'm wondering why this happens and how to prevent it.
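Would, for example, resetting the entity storage's static cache inside the batch operation be enough to keep the memory usage down? A rough, untested sketch of what I mean:

    // Inside the batch operation, right after $block->save(): drop the block
    // from the block_content storage's static/memory cache so PHP can free it
    // before the next item is processed.
    \Drupal::entityTypeManager()
      ->getStorage('block_content')
      ->resetCache([$block->id()]);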