Profiling and Optimizing Migrations with Blackfire

Just a few weeks ago, we at Evolving Web finished migrating the Princeton University Press website to Drupal 8. The project was over 70% migrations. In this article, we'll see how Blackfire helped us optimize our migrations by changing around two lines of code.

Before we start

- This article is mainly for PHP / Drupal 8 back-end developers.
- It is assumed that you already know about the Drupal 8 Migrate API.
- Code performance is analyzed with a tool named Blackfire.
- Front-end performance analysis is not in the scope of this article.

The Problem

Here are some of the project requirements related to the problem, to give you a better picture of what's going on:

- A PowerShell script exports a bunch of data into CSV files on the client's server.
- A custom migration source plugin, PUPCSV, reads the CSV files over SFTP.
- Using hook_cron() in Drupal 8, we check a hash for each CSV file. If a file's MD5 hash has changed, the migration is queued for import using the Drupal 8 Queue API (a rough sketch of this follows below).
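For context, here is a minimal sketch of what such a cron hook could look like. Only the overall flow (compare the file hash, then queue the migration) comes from this article; the module name pup_migrate, the queue name, the migration-to-file map, the state keys, and the pup_migrate_fetch_csv() helper are assumptions made for the example, not the project's actual code.

```php
<?php

/**
 * Implements hook_cron().
 *
 * Sketch only: queue a migration for import when its CSV file has changed.
 */
function pup_migrate_cron() {
  // Map each migration ID to the CSV file it imports (hypothetical names).
  $csv_map = [
    'pup_subjects' => 'subjects.csv',
    'pup_books' => 'books.csv',
  ];

  $queue = \Drupal::queue('pup_migrate_import');
  $state = \Drupal::state();

  foreach ($csv_map as $migration_id => $filename) {
    // pup_migrate_fetch_csv() stands in for the code that downloads the CSV
    // over SFTP and returns its contents.
    $contents = pup_migrate_fetch_csv($filename);
    $hash = md5($contents);

    // Compare with the hash recorded after the last complete import. In the
    // project described here, that hash is written by the source plugin once
    // the last row has been migrated, not in this hook.
    $previous = $state->get('pup_migrate.hash.' . $migration_id);
    if ($previous !== $hash) {
      // The file changed (or was never imported): queue the migration so a
      // queue worker can run the import during cron.
      $queue->createItem(['migration' => $migration_id]);
    }
  }
}
```

A QueueWorker plugin would then pick up each item and run the actual import, which is why each individual migration has to fit within the PHP execution time limit mentioned below.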
The CSV files usually have two types of changes:

- Certain records are updated here and there.
- Certain records are added to the end of the file.

When a migration is executed, the Migrate API goes line by line, doing the following things for every record:

- Read a record from the data source.
- Merge data related to the record from the other CSV files (a kind of inner join between the CSVs).
- Compute a hash of the record and compare it with the hash stored in the database.
- If the hash is not found in the database, the record is created.
- If the hash is found and has changed, the record is updated.
- If the hash is unchanged, no action is taken.

While running migrations, we found that it was taking too much time for migrations to go through the CSV files, simply checking for changes in the row hashes. For big migrations with over 40,000 records, migrate was taking several minutes to reach the end of the file, even on a high-end server. Since we were running migrate during cron (with queue workers), we had to ensure that any individual migration could be processed within the 3-minute PHP maximum execution time limit available on the server.

Analyzing migrations with Blackfire

At Evolving Web, we usually analyze performance with Blackfire before any major site launch. Usually, we run Blackfire with the Blackfire Companion, which is currently available for Google Chrome and Firefox. However, since migrations are executed with Drush, a command-line tool, we had to use the Blackfire CLI tool instead, like this:

```
$ blackfire run /opt/vendor/bin/drush.launcher migrate-import pup_subjects
Processed 0 items (0 created, 0 updated, 0 failed, 0 ignored) - done with 'pup_subjects'
Blackfire Run completed
```

Upon analyzing the Blackfire reports, we found some 50 unexpected SQL queries being triggered from somewhere inside the PUPCSV::fetchNextRow() method. Quite surprising! PUPCSV refers to a migrate source plugin we wrote for fetching CSV files over FTP / SFTP. This plugin also tracks a hash of each CSV file, which allows us to skip a migration entirely if the source file hasn't changed. If the source hash changes, the migration updates all rows, and once the last row has been migrated, we store the file's hash in the database from PUPCSV::fetchNextRow(). As a matter of fact, we're preparing another article about creating custom migrate source plugins, so stay tuned.
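The hash bookkeeping itself isn't shown in this article, so here is a minimal sketch of what it could look like inside the source plugin, assuming the hash is kept in Drupal's key-value store. Only the method name saveCachedFileHash() appears above; the collection name, the $fileHash property, and the getCachedFileHash() companion are assumptions for the example.

```php
/**
 * Records the current file's hash once the last row has been imported.
 *
 * Sketch only: assumes the plugin keeps the computed hash in a $fileHash
 * property and stores it in a key-value collection keyed by migration ID.
 */
protected function saveCachedFileHash() {
  \Drupal::keyValue('pup_migrate.file_hashes')
    ->set($this->migration->id(), $this->fileHash);
}

/**
 * Returns the hash saved after the last complete import, if any.
 */
protected function getCachedFileHash() {
  return \Drupal::keyValue('pup_migrate.file_hashes')
    ->get($this->migration->id());
}
```

With something like this in place, the plugin can compare getCachedFileHash() against the freshly computed hash and skip the migration when nothing has changed.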
We found one database query per row even though no record was being created or updated. That didn't seem too bad until we saw the Blackfire report.

Code before Blackfire

Taking a closer look at the RemoteCSV::fetchNextRow() method, we found a call to MigrateSourceBase::count(), and the count() method turned out to be taking 40% of the processing time! That's because it was being called for every row in the CSV. Since the source cache_counts parameter was not set to TRUE in the migration YAML files, the count() method was iterating over all items to get a fresh count on each call! Thus, for a migration with 40,000 records, we were going through 40,000 x 40,000 records, and the PHP maximum execution time was being reached even before migrate could get to the last row! Here's a look at the code.

```php
protected function fetchNextRow() {
  // If the migration is being imported...
  if (MigrationInterface::STATUS_IMPORTING === $this->migration->getStatus()) {
    // If we're at the last row in the CSV...
    if ($this->getIterator()->key() === $this->count()) {
      // Store source hash to remember the file as "imported".
      $this->saveCachedFileHash();
    }
  }
  return parent::fetchNextRow();
}
```

Code after Blackfire

We could have added the cache_counts parameter to our migration YAML files, but any change in the source configuration of the migrations would have made the Migrate API update all records in all migrations. This is because a row's hash is computed as something like hash($row + $source). We didn't want migrate to update all records, because we had certain migrations which sometimes took around 7 hours to complete. Hence, we decided to statically cache the total record count to get things back on track:

```php
protected function fetchNextRow() {
  // If the migration is being imported...
  if (MigrationInterface::STATUS_IMPORTING === $this->migration->getStatus()) {
    // Get the total source record count and cache it statically.
    static $count;
    if (is_null($count)) {
      $count = $this->doCount();
    }
    // If we're at the last row in the CSV...
    if ($this->getIterator()->key() === $count) {
      // Store source hash to remember the file as "imported".
      $this->saveCachedFileHash();
    }
  }
  return parent::fetchNextRow();
}
```

Problem Solved. Merci Blackfire!

After the changes, we ran Blackfire again and found things to be 52% faster for a small migration with 50 records. For a bigger migration with 4,359 records, the import time dropped from 1m 47s to only 12s, a 98% improvement. Wondering why we didn't include a screenshot for the bigger migration? We didn't (or rather couldn't) generate a report for it, for two reasons:

- While working, Blackfire stores function calls and other information in memory, so running a huge migration with Blackfire might be a bit slow. Besides, our aim was to find the problem, and we could do that more easily while looking at smaller figures.
- When running a migration with thousands of rows, the migration functions are called thousands of times! Blackfire collects data for each of these function calls, so the collected data often becomes too heavy and Blackfire rejects the payload with an error message like this:

```
The Blackfire API answered with a 413 HTTP error ()
Error detected during upload: The Blackfire API rejected your payload because it's too big.
```

Which makes a lot of sense. As a matter of fact, for the other case study given below, we used the --limit=1 parameter to profile code performance for a single row.

A quick brag about another 50% improvement?

Apart from this jackpot, we also found room for another 50% improvement (from 7h to 3h 32m) in one of our migrations, which was using the Touki FTP library. This migration was doing the following:

- Going through around 11,000 records in a CSV file.
- Downloading files over FTP when required.

A Blackfire analysis of this migration revealed something strange. For every row, the following was happening behind the scenes:

- If a file download was required, we were calling FTP::findFileByName($name).
- To get the file, Touki was:
  - getting a listing of all files in the directory;
  - creating File objects for every file;
  - creating various permission, owner and other objects for every File object;
  - passing all the files through a callback to see if a file's name was $name;
  - if the name matched, returning that file and discarding all the other File objects.

Hence, for every file download, Touki FTP was creating 11,000 File objects of which it was using only one! To resolve this, we decided to use the lower-level FTP::get($source, $destination) method, which helped us bypass the 50,000 or more objects being created per record (roughly 11,000 * 50,000 or more objects for all the records). This almost halved the import time for that migration when working with all 11,000 records! Here's a screenshot of Blackfire's report for a single row.
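Expressed as code, the change looks roughly like the sketch below. $ftp stands for an already-configured Touki FTP client; only findFileByName() and get($source, $destination) are taken from the description above, and the variable names and surrounding wiring are assumptions rather than verified Touki API usage.

```php
<?php

// Before: locate the remote file by name. Behind the scenes, Touki lists the
// whole directory and builds a File object (plus permission, owner and other
// objects) for each of the ~11,000 entries, only to keep the single match.
$file = $ftp->findFileByName($filename);

// After: skip the directory scan entirely and fetch the file directly with
// the lower-level get() method described above.
$ftp->get($source, $destination);
```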
So the next time you think something fishy is going on with code you wrote, remember to use Blackfire! And don't forget to leave your feedback, questions and even article suggestions in the comments section below.

More about Blackfire

Blackfire is a code profiling tool for PHP which gives you nice-looking reports about your code's performance. With the help of these reports, you can analyze the memory, time and other resources consumed by various functions and optimize your code where necessary. If you're new to Blackfire, you can try these links:

- Read a quick introduction to Blackfire.
- Read the documentation on installing Blackfire.
- Read the 24 Days of Blackfire tutorial.

Apart from all this, the paid version of Blackfire lets you set up automated tests and gives you various recommendations, not only for Drupal but for various other PHP frameworks.

Next Steps

- Try Blackfire for free on a sample project of your choice to see what you can find.
- Watch video tutorials on Blackfire's YouTube channel.
- Read the tutorial on creating custom migration source plugins written by my colleague (coming soon).
- More awesome articles by Evolving Web.

