I’m trying to optimise my setup of a Drupal Batch process.
I am reading in thousands of rows of data from an Excel spreadsheet, across multiple worksheets, and creating nodes from the rows. My import functions work fine, and I can create a batch, but at present I am creating one batch operation per row. This makes the batch very slow, because Drupal is bootstrapped on every row.
What I can’t figure out is how to set up my batch to process ‘chunks’ rather than individual rows.
An example of one of my import functions (there are 18 of them):
```php
function governbim_coordinate($value) {
  module_load_include('inc', 'phpexcel');
  global $user;
  $project = $_SESSION['governbim']['projnid'];

  // Load the required worksheet from the Excel file passed to the function as $value.
  $result = phpexcel_import($value, TRUE, TRUE, array('setLoadSheetsOnly' => array('Coordinate')));

  // Define some common values for our entity.
  $values = array(
    'type' => 'coordinate',
    'uid' => $user->uid,
    'status' => 1,
    'comment' => 0,
    'promote' => 0,
  );

  // Step through each row in the worksheet.
  foreach ($result['Coordinate'] as $coordinate) {
    $title = $coordinate['Name'];
    // Check whether this is a new row or an update to an existing entity.
    $action = governbim_node_actions_node_check('assembly', $title);
    switch ($action['op']) {
      case 'new':
        $e = entity_create('node', $values);
        $entity = entity_metadata_wrapper('node', $e);
        break;

      case 'update':
        // Wrap the existing node by ID (no need to create a new entity here).
        $entity = entity_metadata_wrapper('node', $action['nid']);
        break;
    }
    // *** Here is where we write everything to the database with the entity values and $entity->save() ***
  }
}
```
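For reference, the Batch API’s multi-pass pattern works like this: if an operation callback accepts `&$context` as its final parameter and sets `$context['finished']` to a value below 1, Drupal calls the same operation again on the next request, so you can process `$limit` rows per pass instead of one operation per row. A sketch of the coordinate importer restructured this way (caching the parsed rows in `$context['sandbox']` is an assumption on my part; the sandbox is serialized between requests, so for very large sheets you may prefer to store only a row offset and re-read the file):

```php
function governbim_coordinate($value, &$context) {
  module_load_include('inc', 'phpexcel');
  global $user;
  $limit = 10;

  // First pass only: parse the worksheet once and stash the rows in the sandbox.
  if (!isset($context['sandbox']['progress'])) {
    $result = phpexcel_import($value, TRUE, TRUE, array('setLoadSheetsOnly' => array('Coordinate')));
    $context['sandbox']['rows'] = $result['Coordinate'];
    $context['sandbox']['progress'] = 0;
    $context['sandbox']['max'] = count($result['Coordinate']);
  }

  $values = array(
    'type' => 'coordinate',
    'uid' => $user->uid,
    'status' => 1,
    'comment' => 0,
    'promote' => 0,
  );

  // Process at most $limit rows on this pass.
  $rows = array_slice($context['sandbox']['rows'], $context['sandbox']['progress'], $limit);
  foreach ($rows as $coordinate) {
    // ... existing per-row create/update logic from the function above ...
    $context['sandbox']['progress']++;
    // Anything pushed into $context['results'] is handed to the 'finished' callback.
    $context['results'][] = $coordinate['Name'];
  }

  // Message shown in the progress bar while this operation runs.
  $context['message'] = t('Coordinate: processed @progress of @total rows.', array(
    '@progress' => $context['sandbox']['progress'],
    '@total' => $context['sandbox']['max'],
  ));

  // Anything below 1 tells the Batch API to call this operation again.
  $context['finished'] = $context['sandbox']['max']
    ? $context['sandbox']['progress'] / $context['sandbox']['max']
    : 1;
}
```

The operation is registered exactly as before (`$operations[] = array('governbim_coordinate', array($datafile));`); the Batch API supplies `$context` automatically.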
My batch is setup (currently) as:
```php
function governbim_excel_import_batch() {
  drupal_set_message("Importing Data...");
  $path = 'public://imports/';
  $file = 'COBie-UK-2012-example1.xls';
  $datafile = $path . $file;

  $operations[] = array('governbim_coordinate', array($datafile));
  // ...another 17 $operations[] defined as above...

  $batch = array(
    'operations' => $operations,
    'init_message' => t('Constructing batch operation...'),
    'progress_message' => t('Importing Excel data'),
    'finished' => 'governbim_excel_import_batch_finished',
    'error_message' => t('Importing of Excel data encountered an error.'),
  );
  return $batch;
}
```
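To show overall progress across the 18 operations, the batch array also accepts a `'title'` key, and `'progress_message'` supports placeholders such as `@current`, `@total` and `@percentage`, which the Batch API fills in as operations complete. The batch is then launched with `batch_set()` and `batch_process()`. A sketch (the wrapper callback name and redirect path are assumptions):

```php
function governbim_excel_import_page() {
  // Hypothetical menu callback that launches the import.
  $batch = governbim_excel_import_batch();
  $batch['title'] = t('Importing Excel data');
  // @current/@total are replaced by the Batch API as each operation finishes.
  $batch['progress_message'] = t('Processed @current of @total worksheets.');
  batch_set($batch);
  batch_process('admin/content');  // Path to redirect to when the batch completes (assumption).
}
```

With one operation per worksheet and the chunked callbacks above, the progress bar advances per worksheet while each worksheet internally processes its rows in groups of `$limit`.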
I realise I need to get some `$context` handling into the import function, but I can’t figure out how to set that aspect up properly.
Can anyone help me retune things so that the batch (with its 18 function calls) displays progress of the overall batch and executes iterations in groups (say, `$limit = 10`)?
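For completeness, the `'finished'` callback named in the batch array has the signature below; anything the operations accumulate in `$context['results']` arrives here as `$results`, so it is a natural place to summarise the import (the messages are illustrative):

```php
function governbim_excel_import_batch_finished($success, $results, $operations) {
  if ($success) {
    // $results is whatever the operations pushed into $context['results'].
    drupal_set_message(t('Import complete: @count rows processed.', array('@count' => count($results))));
  }
  else {
    // $operations contains the operations that had not run when the error occurred.
    $error_operation = reset($operations);
    drupal_set_message(t('An error occurred while processing @op.', array('@op' => $error_operation[0])), 'error');
  }
}
```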