Lullabot: Decoupled Drupal Hard Problems: Schemas

Texas Born in 2000

Pixeldust Drupal Developers

Pixeldust is an expert software development agency and trusted Pantheon development partner specializing in Drupal development, security, and support. In business since 2000, we have completed over 500 projects.

Pixeldust is Trusted by the World's Greatest Brands

Pixeldust offers premium Drupal development services.

We are committed to developing under Drupal best-practices, ensuring our clients have a stable, maintainable codebase.

Pixeldust is an expert software agency specializing in responsive frameworks, mobile applications, and online marketing services. Using the latest technologies, coupled with a healthy dose of imagination and expertise, we work closely with you to identify your needs and wants and provide a comprehensive, integrated solution to your online communication requirements.

Our Drupal developers are focused on quality, not quantity.

Our aim is to provide a responsive and personal approach to each project to ensure that our clients can benefit from their investment. We see each Drupal development project as an opportunity to grow your business—we aim to help you grow sales and improve retention while offering leading, aesthetically pleasing, and functional designs that suit your needs faultlessly.

What differentiates Pixeldust from other Drupal development agencies is our effort to create a positive return on clients’ investments.

We have devoted years of effort to understanding the variables involved in user experience as well as online marketing strategies. Our team of inspired web designers, developers, and marketing specialists helps to increase the exposure of your website, as well as provide unique user engagement. Pixeldust is focused on creative, results-oriented solutions developed to maximize your website’s true earning and traffic potential.

Professional class Drupal Development is absolutely integral to the legitimacy and effectiveness of your online presence.

Our developers boast superior technical know-how. In business since 2000, we have completed over 500 projects, giving us plenty of experience in developing beautiful, tailored websites while keeping your business interests in mind.

 

How does on-site Drupal SEO work?

Drupal 8 is the CMS of choice for many top enterprise websites because it was built from the ground up with the extensibility required to optimize every node, view, and code snippet for search engines. However, you have to know how to configure it.

Search Engine Optimization (on-page Drupal SEO) has been around as long as search engines. If you run a website, you probably have at least a basic understanding of SEO. Drupal is phenomenal for SEO. We’ve worked in Drupal for 14 years and experienced firsthand how positively search engines respond to correctly configured Drupal sites. 

We have helped clients double their traffic just by switching platforms. Drupal has distinct technical advantages with on-site optimizations like RDF and AMP. The results are higher-ranking, faster, and more heavily trafficked sites. And since Drupal 8 has scheduled feature releases every six months, you can expect to adopt new standards within months, not years.

FREE Drupal Security Audit

Why a Drupal site audit?

  • Security – Discover weaknesses in your Drupal implementation.
  • Performance – Identify areas where performance improvements can be made.
  • Site Acquisition – Do this before you buy a business as part of due diligence.
  • Implementation Verification – Check your site before it goes live to avoid critical issues that may appear under load.
  • Vendor Management – Make sure your current developer is doing a good job.
  • Support Transition – When moving to a new developer both sides need to know what they are working with.

Case Study: Mahindra USA Inc

Mahindra USA, Inc. manufactures agricultural machinery and equipment. They are the world’s largest-selling tractor brand by volume, and have been the world’s number-one tractor maker for over three decades.

Drupal Requirements

  • Support for a company-wide rebranding
  • Migration to a more robust and flexible platform in Drupal 8
  • Integration with third-party customer relationship management applications
  • Internationalization
  • Efficient scalability
  • Integration with sales SaaS

Read Case Study

Inquiry

Call (512) 730-0999 or submit an inquiry.

30-Day Drupal SEO Blast – On-page and Off-page SEO Overhaul

We start every project off with an introductory discovery call with key stakeholders to create a project plan, establish key contacts, and plan credential transfers. 

Pixeldust’s 30-Day SEO Blast is a campaign to overhaul your Drupal site for immediate improvement in search engine rankings. Think of it as a 30-day boot camp for your website. The process is divided into two areas of focus: on-site and off-site.

On-Site Drupal SEO
On-site (or On-page) SEO:  Changes are made to your site’s structure and performance to make it easier for search engines to see and understand your site’s content. Search engines use algorithms to rank sites by degrees of relevance. On-site optimization ensures your site is configured to provide information in a way that meets Search engine standards for optimal indexing.  

Off-site White Hat Drupal SEO
Off-site (or Off-page) Drupal SEO is the process of making your site more visible to humans across the internet and increasing its relevance. 

Vigilant Drupal Support Plans

When you subscribe to our Unlimited Drupal Support Plan in Houston or Austin, you have the comfort of knowing our trusted team of Drupal admins is at the ready, waiting to fix errors, broken functionality, layout issues, and anything else that pops up. Some of our Unlimited clients don’t even have to log in to their sites anymore. Even if you just need an article posted or a new product added to your shop, no worries, we’ve got you covered. If a task takes less than 30 minutes per issue, we will take care of it.


  • Unlimited Repairs & Fixes
  • Unlimited Update Tasks
  • FREE Set-up
  • Same-Day Security Updates
  • Monthly Module Updates
  • Monthly Broken Link Scan
  • Monthly Security Scan
  • Monthly Manual Site Check
  • Monthly Speed Test
  • Offline Updating
  • Git Version Control
  • Detailed Work Notes
  • Testing After All Updates
  • Security Guarantee
  • Hack/Malware/Down Recovery
  • Uptime Monitoring
  • Daily Offsite Backups
  • Free Basic Website Hosting & SSL
  • Helpdesk Support
REQUEST FOR PROPOSAL

Need a custom quote?

Submit the RFP form below and we will send you a project proposal within 48 hours. If you like what you see, we can schedule a call to discuss the project in greater detail.

Step 1 of 2

  • Contact Information

Lullabot: Decoupled Drupal Hard Problems: Schemas

Published on January 26, 2019

The Schemata module is our best approach so far to provide schemas for our API resources. Unfortunately, this solution is often not good enough. That is because the serialization component in Drupal 8 is so flexible that we can’t anticipate the final form our API responses will take, meaning the schema that our consumers depend on might be inaccurate. How can we improve this situation?

This article is part of the Decoupled Hard Problems series. In past articles, we talked about request aggregation solutions for performance reasons and how to leverage image styles in decoupled architectures.

TL;DR

  • Schemas are key for an API’s self-generated documentation.
  • Schemas are key for the maintainability of the consumer’s data model.
  • Schemas are generated from Typed Data definitions using the Schemata module and expressed in the JSON Schema format.
  • Schemas are statically generated, but normalizers are determined at runtime.

Why Do We Need Schemas?

A database schema is a description of the data a particular table can hold. Similarly, an API resource schema is a description of the data a particular resource can hold. In other words, a schema describes the shape of a resource and the data type of each particular property.

Consumers of data need schemas in order to set their expectations. For instance, the schema tells the consumer that the body property is a JSON object that contains a value that is a string. A schema also tells us that the mail property in the user resource is a string in the e-mail format. This knowledge empowers consumers to add client-side form validation for the mail property. In general, a schema will help consumers to have prior understanding of the data they will be fetching from the API, and what data objects they can write to the API.
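
As a sketch of the idea, here is a hypothetical JSON Schema fragment and a tiny structural check in Python; the property names and shapes are illustrative simplifications, not Drupal’s exact output:

```python
# A minimal, hypothetical schema fragment for an article resource, showing
# how a schema sets consumer expectations (shapes are illustrative).
article_schema = {
    "type": "object",
    "properties": {
        "body": {
            "type": "object",
            "properties": {"value": {"type": "string"}},
        },
        "mail": {"type": "string", "format": "email"},
    },
}

def matches_type(value, schema):
    """Tiny structural check: does `value` match the schema's `type`?"""
    expected = {"object": dict, "string": str}[schema["type"]]
    return isinstance(value, expected)

resource = {"body": {"value": "<p>Hello</p>"}, "mail": "editor@example.com"}
print(matches_type(resource["body"], article_schema["properties"]["body"]))  # True
print(matches_type(resource["mail"], article_schema["properties"]["mail"]))  # True
```

A real consumer would delegate this to a full JSON Schema validator; the point is only that the schema tells it what shape to expect before any data arrives.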

We are using the resource schemas in Docson and Open API to generate automatic documentation. When you enable JSON API and Open API, you get a fully functional and accurately documented HTTP API for your data model. Whenever we make changes to a content type, those changes are reflected in the HTTP API and the documentation automatically. All thanks to the schemas.

A consumer could fetch the schemas for all the resources it needs at compile time, or fetch them once and cache them for a long time. With that information, the consumer can generate its models automatically without developer intervention. That means that with a single implementation, all of our consumers’ models are done forever. There is probably already a library for our consumer’s framework that does this.
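
To make the model-generation idea concrete, here is a minimal sketch in Python; the schema dict and resource names are hypothetical, and a real consumer would fetch the schema over HTTP once and cache it:

```python
# Sketch: auto-generating a consumer-side model class from a fetched schema.
def make_model(name, schema):
    """Build a simple model class whose attributes mirror the schema's properties."""
    props = list(schema["properties"])

    def __init__(self, **kwargs):
        for prop in props:
            setattr(self, prop, kwargs.get(prop))

    return type(name, (), {"__init__": __init__, "properties": props})

# Hypothetical user resource schema, trimmed to two properties.
user_schema = {"properties": {"name": {"type": "string"},
                              "mail": {"type": "string"}}}

User = make_model("User", user_schema)
u = User(name="Ada", mail="ada@example.com")
print(u.name, u.mail)  # Ada ada@example.com
```

When the schema gains a property, regenerating the model picks it up with no hand-written code, which is the "done forever" property described above.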

More interestingly, since our schemas come with type information, they can be type safe. That is important to many languages like Swift, Java, TypeScript, Flow, Elm, etc. Moreover, if the model in the consumer is auto-generated from the schema (one model per resource), then minor updates to the resource are automatically reflected in the model. We can start to use the new model properties in Angular, iOS, Android, etc.

In summary, having schemas for our resources is a huge improvement for the developer experience. This is because they provide auto-generated documentation of the API, and auto-generated models for the consumer application.

How Are We Generating Schemas in Drupal 8?

One of Drupal 8’s API improvements was the introduction of the Typed Data API. We use this API to declare the data types for a particular content structure. For instance, there is a data type for a Timestamp that extends an Integer. The Entity and Field APIs combine these into more complex structures, like a Node.

JSON API and REST in core can expose entity types as resources out of the box. When these modules expose an entity type they do it based on typed data and field API. Since the process to expose entities is known, we can anticipate schemas for those resources.

In fact, assuming resources are a serialization of the field API and typed data is the only thing we can do. The base for JSON API and REST in core is Symfony’s serialization component. This component is broken into normalizers, as explained in my previous series. These normalizers transform Drupal’s inner data structures into other, simpler structures. After this transformation, all knowledge of the data type or structure is lost. This happens because the normalizer classes do not return the new types and new shapes the typed data has been transformed to. This loss of information is where the big problem lies with the current state of schemas.

The Schemata module provides schemas for JSON API and core REST. It does so by serializing the entity and typed data. It is only able to do this because it knows about the implementation details of these two modules. It knows that the nid property is an integer and that it has to be nested under data.attributes in JSON API, but not for core REST. If we were to support another format in Schemata, we would need to add an ad-hoc implementation for it.
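
The per-format knowledge can be illustrated with a small sketch; the document shapes below are simplifications of real Drupal output, and the unhandled-format error mirrors why each new format needs its own ad-hoc implementation:

```python
# Sketch: where the same nid integer lands in each format's document.
def place_nid(nid, fmt):
    if fmt == "jsonapi":
        # JSON API nests attributes under data.attributes (simplified shape).
        return {"data": {"attributes": {"nid": nid}}}
    if fmt == "core_rest":
        # Core REST exposes fields as lists of value items (simplified shape).
        return {"nid": [{"value": nid}]}
    # Any other format would need its own ad-hoc mapping.
    raise ValueError(f"no ad-hoc mapping for format: {fmt}")

print(place_nid(42, "jsonapi"))
print(place_nid(42, "core_rest"))
```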

The big problem is that schemas are static information. That means that they can’t change during the execution of the program. However, the serialization process (which transforms Drupal entities into JSON objects) is a runtime operation. It is possible to write a normalizer that turns the number four into 4 or “four” depending on whether the minute of execution is even or odd. Even though this example is bizarre, it shows that determining the schema upfront without other considerations can lead to errors. Unfortunately, we can’t assume anything about the data after it’s serialized.
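
The contrived normalizer from the paragraph above can be written in a few lines; its output type depends on the clock, so no static schema can describe it accurately:

```python
import datetime

def normalize_four(now=None):
    """A deliberately bizarre normalizer: the output *type* depends on
    the minute of execution, defeating any statically generated schema."""
    now = now or datetime.datetime.now()
    return 4 if now.minute % 2 == 0 else "four"

print(normalize_four(datetime.datetime(2019, 1, 26, 12, 10)))  # 4 (even minute)
print(normalize_four(datetime.datetime(2019, 1, 26, 12, 11)))  # 'four' (odd minute)
```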

We can either make normalization less flexible—forcing data types to stay true to the pre-generated schemas—or we can allow the schemas to change during runtime. The second option clearly defeats the purpose of setting expectations, because it would allow a resource to potentially differ from the original data type specified by the schema.

The GraphQL community is opinionated on this and drives the web service from their schema. Thus, they ensure that the web service and schema are always in sync.

How Do We Go Forward From Here?

Happily, we are already trying to come up with a better way to normalize our data and infer the schema transformations along the way. Nevertheless, whenever a normalizer is swapped in by a third-party contrib module, or a normalization is improved in a backwards-compatible way, the Schemata module cannot anticipate it. Schemata will potentially provide the wrong schema in those scenarios. If we are to base the consumer models on our schemas, then they need to be reliable. At the moment, they are reliable in JSON API, but only at the cost of losing flexibility with third-party normalizers.

One of the attempts to support data transformations, and the impact they have on the schemas, is Field Enhancers in JSON API Extras. They represent simple transformations via plugins. Each plugin defines how the data is transformed and how the schema is affected. This happens in both directions: when the data goes out, and when the consumers write back to the API and the transformation needs to be reversed. Whenever we need a custom transformation for a field, we can write a field enhancer instead of a normalizer. That way, schemas remain correct even if the data change implies a change in the schema.
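
A field-enhancer-style plugin can be sketched as one object that declares the outbound transform, its inverse, and the schema change together, so the schema can never drift from the data. The class and method names below are illustrative, not the module’s actual PHP API:

```python
import datetime

class TimestampToIsoEnhancer:
    """Sketch of a field enhancer: transform, reverse, and schema effect
    are declared side by side (names are hypothetical)."""

    def transform(self, value):
        # Outbound: integer timestamp -> ISO 8601 string.
        return datetime.datetime.utcfromtimestamp(value).isoformat() + "Z"

    def reverse(self, value):
        # Inbound: ISO 8601 string -> integer timestamp.
        parsed = datetime.datetime.fromisoformat(value.rstrip("Z"))
        return int(parsed.replace(tzinfo=datetime.timezone.utc).timestamp())

    def enhance_schema(self, schema):
        # The schema change is declared alongside the transform.
        return {**schema, "type": "string", "format": "date-time"}

enhancer = TimestampToIsoEnhancer()
iso = enhancer.transform(1548460800)
print(iso)                    # 2019-01-26T00:00:00Z
print(enhancer.reverse(iso))  # 1548460800
```

Because `enhance_schema` lives next to `transform` and `reverse`, any change to the data shape necessarily updates the advertised schema at the same time.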


We are very close to being able to validate responses in JSON API against schemas when Schemata is present. It will only happen in development environments (where PHP’s asserts are enabled). Site owners will be able to validate that schemas are correct for their site, with all their custom normalizers. That way, when a site owner builds an API or makes changes they’ll be able to validate the normalized resource against the purported schema. If there is any misalignment, a log message will be recorded.
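
The development-only check described above can be sketched as follows; `validate()` here is a hypothetical stand-in for a real JSON Schema validator, reduced to a required-properties check:

```python
import logging

def validate(document, schema):
    """Deliberately tiny subset of JSON Schema validation:
    check that every required property is present."""
    return all(prop in document for prop in schema.get("required", []))

def check_response(document, schema, debug=True):
    """In development (debug=True), log a message when the normalized
    resource does not match its purported schema."""
    if debug and not validate(document, schema):
        logging.warning("Normalized resource does not match its schema")
        return False
    return True

schema = {"required": ["data"]}
print(check_response({"data": {"type": "node--article", "id": "1"}}, schema))  # True
print(check_response({"errors": []}, schema))                                  # False
```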

Ideally, we want the certainty that schemas are correct all the time. While the community agrees on the best solution, we have these intermediate measures to have reasonable certainty that your schemas are in sync with your responses.

Join the discussion in the #contenta Slack channel or come to the next API-First Meeting and show your interest there!

Hero photo by Oliver Thomas Klein on Unsplash.

