Today, we are going to talk about how to manage migrations as configuration entities. This functionality is provided by the Migrate Plus module. First, we will explain the difference between managing migrations as code or configuration. Then, we will show how to convert existing migrations. Finally, we will talk about some important options to include in migration configuration entities. Let’s get started.

So far, we have been managing migrations as code. This is functionality provided out of the box. You write the migration definition file in YAML format. Then, you place it in the migrations directory of your module. If you need to update the migration, you make the modifications to the files and then rebuild caches. More details on the workflow for migrations managed in code can be found in this article.
Migrate Plus offers an alternative to this approach. It allows you to manage migrations as configuration entities. You still use YAML files to write the migration definition files, but their location and workflow are different. They need to be placed in a config/install directory. If you need to update the migration, you make the modifications to the files and then sync the configuration again. More details on this workflow can be found in this article.
There is one thing worth emphasizing. When managing migrations as code you need access to the file system to update and deploy the changes to the file. This is usually done by developers. When managing migrations as configuration, you can make updates via the user interface as long as you have permissions to sync the site’s configuration. This is usually done by site administrators. You might still have to modify files depending on how you manage your configuration. But the point is that file system access to update migrations is optional. Although not recommended, you can write, modify, and execute the migrations entirely via the user interface.
To demonstrate how to transition from code to configuration entities, we are going to convert the JSON migration example. You can get the full code example at https://github.com/dinarcon/ud_migrations. The module to enable is UD config JSON source migration, whose machine name is udm_config_json_source. It comes with four migrations: udm_config_json_source_paragraph, udm_config_json_source_image, udm_config_json_source_node_local, and udm_config_json_source_node_remote.
The transition to configuration entities is a two-step process. First, move the migration definition files from the migrations folder to a config/install folder. Second, rename the files so that they follow this pattern: migrate_plus.migration.[migration_id].yml. For example: migrate_plus.migration.udm_config_json_source_node_local.yml. And that’s it! Files placed in that directory following that pattern will be synced into Drupal’s active configuration only when the module is installed for the first time. Note that subsequent changes to the files require a new synchronization operation to take effect. Changing the files and rebuilding caches does not update the configuration, as was the case with migrations managed in code.
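The two steps above can be sketched as a small script. The following Python sketch is purely illustrative (the directory layout is an assumption, and in practice you would likely just move and rename the files by hand or with shell commands):

```python
from pathlib import Path

# Hypothetical module layout; adjust the paths to your own module.
module = Path("udm_config_json_source")
source_dir = module / "migrations"
target_dir = module / "config" / "install"
source_dir.mkdir(parents=True, exist_ok=True)
target_dir.mkdir(parents=True, exist_ok=True)

# Simulate an existing migration managed in code.
(source_dir / "udm_config_json_source_node_local.yml").write_text(
    "id: udm_config_json_source_node_local\n"
)

# Steps 1 and 2: move each file into config/install and rename it
# to the migrate_plus.migration.[migration_id].yml pattern.
for yml in source_dir.glob("*.yml"):
    yml.rename(target_dir / f"migrate_plus.migration.{yml.stem}.yml")
```

After running this, installing the module would import the renamed files into Drupal’s active configuration.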
If you have the Migrate Plus module enabled, it will detect the migrations and you will be able to execute them. You can continue using the Drush commands provided by the Migrate Run module. Alternatively, you can install the Migrate Tools module which provides Drush commands for running both types of migrations: code and configuration. Migrate Tools also offers a user interface for executing migrations. This user interface is only for migrations defined as configuration though. It is available at /admin/structure/migrate. For now, you can run the migrations using the following Drush command: drush migrate:import udm_config_json_source_node_local --execute-dependencies.
Note: For executing migrations on the command line, choose between Migrate Run and Migrate Tools. You pick one or the other, but not both, because the commands provided by the two modules have the same names. Another thing to note is that the example uses Drush 9. There were major refactorings between versions 8 and 9 which included changes to the names of the commands.
When managing migrations as configuration, you can set extra options. Some are exposed by Migrate Plus while others come from Drupal’s configuration management system. Let’s see some examples.
The most important new option is defining a UUID for the migration definition file. This is optional, but adding one will greatly simplify the workflow to update migrations. The UUID is used to keep track of every piece of configuration in the system. When you add new configuration, Drupal will read the UUID value if provided and update that particular piece of configuration. Otherwise, it will create a UUID on the fly, attach it to the configuration definition, and then import it. That is why you want to set a UUID value manually. If changes need to be made, you want to update the same configuration, not create a new one. If no UUID was originally set, you can get the automatically created value by exporting the migration definition. The workflow for this is a bit complicated and error prone, so always include a UUID with your migrations. The following snippet shows an example UUID:
uuid: b744190e-3a48-45c7-97a4-093099ba0547
id: udm_config_json_source_node_local
label: 'UD migrations configuration example'
The UUID is a string of 32 hexadecimal digits displayed in five groups. Each group is separated by hyphens, following this pattern: 8-4-4-4-12. In Drupal, two or more pieces of configuration cannot share the same UUID value. Drupal will check both the UUID and the type of configuration during sync operations. In this case, the type is signaled by the migrate_plus.migration. prefix in the name of the migration definition file.
When using configuration entities, a single migration is identified by two different options. The uuid is used by Drupal’s configuration system and the id is used by the Migrate API. Always make sure that this combination is kept the same when updating the files and syncing the configuration. Otherwise you might get hard-to-debug errors. Also, make sure you are importing the proper configuration type. The latter should not be something to worry about unless you use the user interface to export or import single configuration items.
If you do not have a UUID in advance for your migration, you can try one of these commands to generate it:
# Use Drupal's UUID service.
$ drush php:eval "echo \Drupal::service('uuid')->generate(). PHP_EOL;"
# Use a Drush command provided by the Devel module, if enabled.
$ drush devel:uuid
# Use a tool provided by your operating system, if available.
$ uuidgen
Alternatively, you can search online for UUID v4 generators. There are many available.
Technical note: Drupal uses UUID v4 (RFC 4122 section 4.4) values which are generated by the `uuid` service. There is a separate class for validation purposes. Drupal might override the UUID service to use the most efficient generation method available. This could be using a PECL extension or a COM implementation for Windows.
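If none of those tools are at hand, any language with a UUID library will do. Here is a minimal Python sketch, assuming the standard library's uuid module, that generates a v4 value and checks it against the 8-4-4-4-12 format described above:

```python
import re
import uuid

# Generate a UUID v4 value suitable for a migration definition file.
new_uuid = str(uuid.uuid4())

# 32 hexadecimal digits in five hyphen-separated groups: 8-4-4-4-12.
UUID_PATTERN = re.compile(
    r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$"
)

assert UUID_PATTERN.match(new_uuid)
print(f"uuid: {new_uuid}")
```

The printed line can be pasted directly at the top of the migration definition file.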
By default, configuration remains in the system even if the module that added it gets uninstalled. This can cause problems if your migration depends on custom migration plugins provided by your module. It is possible to enforce that migration entities get removed when your custom module is uninstalled. To do this, you leverage the dependencies option provided by Drupal’s configuration management system. The following snippet shows how to do it:
uuid: b744190e-3a48-45c7-97a4-093099ba0547
id: udm_config_json_source_node_local
label: 'UD migrations configuration example'
dependencies:
enforced:
module:
- ud_migrations_config_json_source
You add the machine name of your module to the dependencies > enforced > module array. This adds an enforced dependency on your own module. The effect is that the migration will be removed from Drupal’s active configuration when your custom module is uninstalled. Note that the top-level dependencies array can have other keys in addition to enforced. For example: config and module. Learning more about them is left as an exercise for the curious reader.
It is important not to confuse the dependencies and migration_dependencies options. The former is provided by Drupal’s configuration management system and was just explained. The latter is provided by the Migrate API and is used to declare migrations that need to be imported in advance. Read this article to know more about this feature. The following snippet shows an example:
uuid: b744190e-3a48-45c7-97a4-093099ba0547
id: udm_config_json_source_node_local
label: 'UD migrations configuration example'
dependencies:
enforced:
module:
- ud_migrations_config_json_source
migration_dependencies:
required:
- udm_config_json_source_image
- udm_config_json_source_paragraph
optional: []
What did you learn in today’s blog post? Did you know that you can manage migrations in two ways: code or configuration? Did you know that file name and location as well as workflows need to be adjusted depending on which approach you follow? Share your answers in the comments. Also, I would be grateful if you shared this blog post with others.
Next: Workflows and benefits of managing Drupal migrations as configuration entities
This blog post series, cross-posted at UnderstandDrupal.com as well as here on Agaric.coop, is made possible thanks to these generous sponsors. Contact Understand Drupal if your organization would like to support this documentation project, whether it is the migration series or other topics.
Hi friends and collaborators, join us today at 3pm ET (or any subsequent Thursday at 3) as we kick off a series of research, planning, discussion, and building sessions for Visions Unite.
As our primary pro bono project, Agaric is working on Visions Unite, "where people seeking to make the world more whole can share ideas and information and gather the commitment and resources to build power to be the change we need." A dozen projects have tried to do this; what makes Visions Unite different is sharing power via democratic mass communication.
Here are some initial user stories for Visions Unite.
Help plan and build the interface and underlying technology! (Drupal friends, we have been leaning against Drupal but might use it for the MVP. We would love to hear your thoughts for or against.)
Connection info will always be up-to-date at agaric.coop/show (for these sessions we are taking over most of our Show & Tell hour, which is weekly on Thursdays 3pm Eastern).
TL;DR: In PHP, hexadecimals, decimals, and octals are all integers, so they must be declared as @param integer.
While I was working on a patch, I had to write the docblock of a function which received a hexadecimal number, and I wasn't sure what type I was supposed to put in the @param tag.
I went to Drupal's API documentation and comments standards page to see which is the best type for this param and I found the following:
Data types can be primitive types (int, string, etc.), complex PHP built-in types (array, object, resource), or PHP classes.
Alright, a hexadecimal number is not a complex PHP built-in type nor a PHP class, so it must be a primitive type. I went to the PHP documentation page to see which primitives PHP has and found the following:
So there wasn't a specific reference for a Hexadecimal number...
The solution:
In the end, Pieter Frenssen helped me with this (thanks!). He showed me that in PHP it doesn't matter what the base of the number is: it can be octal, hexadecimal, or decimal, and PHP treats them all as integers (which makes sense, but I wanted to be sure). He shared this small snippet where we can see that PHP sees the numbers as integers and the base doesn't matter:
$ php -a
Interactive shell
php > var_dump(gettype(0x0f));
string(7) "integer"
php > var_dump(0x08 === 8);
bool(true)
So if you are writing the documentation of a function in which one of its params is a hexadecimal number, you must declare it as an integer.
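The same property is easy to verify outside PHP. As a point of comparison, here is a quick Python analogue (not PHP) showing that integer literals written in different bases all produce the same built-in integer type:

```python
# Integer literals in different bases all produce values of the
# single built-in int type; the base only affects the source notation.
print(type(0x0F))  # hexadecimal literal -> <class 'int'>
print(type(0o17))  # octal literal       -> <class 'int'>
print(type(15))    # decimal literal     -> <class 'int'>

# All three notations denote the very same value.
assert 0x0F == 0o17 == 15
assert isinstance(0x0F, int)
```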
The Climate Justice Action Map (CJAM) is a custom mapping tool that pulls 350.org events and groups from multiple data sources (e.g. ActionKit, EveryAction, CiviCRM) and displays an interactive map supporters can use to get involved.

It can be embedded within websites with many customization options (e.g. preset the map center to a location, show the map’s text and buttons in a different language, show only events related to a particular campaign, etc.).
It uses Mapbox for the map, OpenStreetMap for the tileset, and Google Maps for the search lookup.
The CJAM Extract, Transform, Load (ETL) application is a data processor written in Python that runs every 15 minutes and pulls in data from those many sources (e.g. EveryAction, CiviCRM) via APIs and direct SQL queries. It writes the combined event and group data to a JSON data file hosted on Amazon S3, which is then consumed by the CJAM JavaScript.
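The overall shape of such an ETL run can be sketched as follows. This is a hypothetical Python sketch, not the real CJAM code: the fetch functions, field names, and local output file stand in for the actual API clients, SQL queries, and S3 upload:

```python
import json

# Hypothetical extractors; the real ETL calls the EveryAction,
# CiviCRM, etc. APIs or runs direct SQL queries.
def fetch_everyaction_events():
    return [{"title": "Climate Strike Boston", "lat": 42.36, "lon": -71.06}]

def fetch_civicrm_events():
    return [{"title": "Grève pour le climat Paris", "lat": 48.86, "lon": 2.35}]

# Transform: combine every source into one normalized list.
events = fetch_everyaction_events() + fetch_civicrm_events()

# Load: serialize to the JSON data file the map's JavaScript consumes
# (the real application uploads this file to Amazon S3).
with open("events.json", "w") as handle:
    json.dump({"events": events}, handle)
```

Running this on a schedule (every 15 minutes in CJAM's case) keeps the published data file fresh without the front end ever touching the source systems directly.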
We met with 350 in mid-June, with the strikes set for September 20th and organizing pushes in July and August. With tight deadlines, a new team and a new codebase, we quickly got to work understanding the goals of the map, its current implementation and what needed to be done for each milestone.
On projects demanding quick turnarounds, it's tempting to dive head first into the issue queue. We know, though, that a project is only successful if everyone is aligned on its overall goals. Luckily, the product team already had excellent documentation (they even had a slideshow!) on the purpose of the climate action map and its key audiences.
350.org had a slideshow detailing the goals and audiences, helping us gain the background knowledge needed to effectively collaborate.
Goals
Key Audiences
These documents were great to have coming into our kickoff call.
Getting familiar with the inner workings of the climate action map was particularly challenging because the code was essentially in two states: the main branch with the original custom JavaScript and a refactor branch where the transition to React.js was happening. React is one of the most popular and widely used frameworks. Converting the application to React made the code easier to maintain and build upon. The original volunteer developer had begun this process of conversion and there were new features written in the new React way, unavailable until the refactoring was complete.
Mauricio and Chris met with him to get clear on how to see the transition through to the end. They then set to familiarizing themselves with the codebase, refactoring along the way. By understanding, for example, a long complex function, and then rewriting it into smaller discrete functions, we were able to simplify the code, wrap our heads around its inner workings, and make it easier to work with for the next developer to join the project.
When first working with a codebase, it takes time to understand why a new change isn't sticking or why an error is occurring. Logs are a developer's best friend when it comes to debugging. Unfortunately, the logging available was sparse. The ETL had a running log, but it wasn't saving to a file for future reference or easy retrieval. Chris made the error log easy to reference and even added Slack integration, sending a message to the team whenever an error occurred, helping people quickly respond to issues.
350.org has hundreds of chapters, spread across seven continents, with members speaking dozens of languages. Their mapping tool was built with this diversity in mind. It serves as a powerful storytelling device (goal number one), with a single map conveying the impressive reach of the movement, and not making assumptions as to where a visitor is or what they're looking for.
On the other hand, mobilizing is most effective when it comes from people we know, from communities we're part of. As such, the map can live in more localized contexts, showing just events and groups relevant to a particular scenario. For example, the 350 Colorado chapter can display a map zoomed into the Mountain West, while 350 France can show a map with just events in French.
These custom maps are created using embed parameters. To do this, a 350.org organizer would paste the map onto a page using an iframe, passing in parameters such as language, location, and data source as query parameters in the URL.
However, this approach was cumbersome, technically prohibitive, and error prone. We dropped the iframe approach and replaced it with a series of shortcodes, a more intuitive method that makes direct calls to the Climate Action Map API to render a map specific to an organizer's needs.
We added support for the following shortcodes:
Now organizers can create any number of maps with criteria meeting their campaign or community's specific needs.
With so many different events happening at any given time, the map risked overwhelming visitors looking to get involved. 350.org's designer Matthew Hinders-Anderson came up with the solution of applying different map pin styles to events depending on when they were happening. Past events have a subdued teal, while current and future events have a strong teal. To emphasize the storytelling purpose of the map (goal number one), current events throb.
To accomplish this, we needed to calculate an event's date and time relative to the current time. Unfortunately, many of the events had no timezone associated with them. However, they did all have some form of location available. Chris found a handy Python tool called timezonefinder that calculates an event's timezone based on latitude and longitude.
With the timezone in hand, Mauricio could then apply the different colors (and flashing) based on the event's time relative to now.
We used Python to calculate an event's timezone based on its latitude and longitude.
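Once timezonefinder has resolved an event's latitude and longitude to a timezone name, the pin style can be derived from the event's time relative to now. The following Python sketch is illustrative rather than the actual CJAM code: the function name, the style labels, and the use of the standard library's zoneinfo module are all assumptions:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

def pin_style(start_naive, tz_name, duration_hours=2):
    """Pick a pin style from an event's start time in its local timezone.

    tz_name is what a timezonefinder lookup on the event's latitude
    and longitude would return. The style names are illustrative.
    """
    start = start_naive.replace(tzinfo=ZoneInfo(tz_name))
    end = start + timedelta(hours=duration_hours)
    now = datetime.now(ZoneInfo(tz_name))
    if end < now:
        return "past"     # subdued teal pin
    if start <= now:
        return "current"  # strong teal pin, throbbing
    return "future"       # strong teal pin
```

Classifying each event this way lets the front end pick a pin color (and the throbbing animation for current events) without any per-event timezone data from the source systems.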
With so many events organized, we wanted potential strikers to quickly find an event to attend. Map embedders found, though, that sometimes searches would result in an empty map despite events being nearby. This is one of the many challenges of designing interactive maps. One example was a visitor living in a nearby suburb of Boston. A search for Allston would turn up nothing, despite there being multiple events within a 5-mile radius. We fine-tuned the zoom behavior to better show nearby events.
There were still edge cases though. We addressed this by showing a "Zoom out" button if a visitor came up empty. Clicking that zooms a user out to the nearest result.
If a visitor gets no results from their search, they can zoom out to the nearest event or group.
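A minimal sketch of the "zoom out to the nearest result" logic, assuming events carry latitude and longitude (the function names and the haversine approach are illustrative, not necessarily what CJAM uses):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def nearest_event(search_lat, search_lon, events):
    """Return the event closest to the searched location."""
    return min(
        events,
        key=lambda e: haversine_km(search_lat, search_lon, e["lat"], e["lon"]),
    )
```

When a search yields no visible pins, the map can center and zoom on the result of nearest_event instead of leaving the visitor with an empty viewport.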
The mobilization plan was to push activists and organizers to plan events from June through August. Then rally as many people to RSVP to the newly created events from August up until the big days: September 20th and September 27th. We rolled out the embed code functionality in August which organizers put to good use, embedding local and regional specific maps on local 350 group pages and climate strike specific websites they had built.
The map was so popular that other organizations asked if they could embed it on their own sites, increasing the mobilization points and audiences reached. That we were able to do this speaks to the importance of defending the open web and the free and open source software that allows for the decentralized sharing and use of tools.
On the first day of the strikes the pin styles came to life, lighting up the many walkouts, rallies, and protests happening that day. It was a go-to graphic for journalists and supporters on social media to share when reporting on the unprecedented participation.
Ultimately, the numbers we saw were a testament to the long, hard work organizers constantly engage in and the urgency of the moment we are in. However, with tools like the Climate Justice Action Map, built by technology activists alongside the organizers using them, we deepen and widen the mobilizing possible. And in these times of massive wealth inequality, deep political corruption, and a closing window of time for the bold action we need, disrupting the status quo is more important than ever before.
Special thanks to the 350.org product team members Kimani Ndegwa, Matthew Hinders-Anderson, Nadia Gorchakova, and suzi grishpul for their vision, management of the project and design and development leadership.
No previous experience with React.js is needed. Familiarity with JavaScript syntax is expected.
A web browser is all that is needed to take the workshop. Installing the React DevTools is highly recommended. They are available for Firefox, Chrome, and (Chromium) Edge. The examples can be executed using a local web server; PHP, Python, and Node.js all provide one out of the box. It is also possible to run the examples on https://codesandbox.io/
This training will be provided over Zoom. You can ask questions via text chat or audio.

Attendees will receive detailed instructions on how to set up their development environment. In addition, they will be able to join a support video call days before the training event to make sure the local development environment is ready. This prevents losing time fixing environment setup problems during the training.
Modern Drupal is Drupal 8, 9, 10, 11 or above. It is the Drupal of today, with the ability to upgrade in place and powered in part by the Symfony framework. Modern Drupal uses the Twig templating engine. Modern Drupal will not leave your content behind.
Modern Drupal adds major improvements and deprecates old code gracefully.
Modern Drupal is what Agaric does.