Running a content migration

Before diving into the migration, we want to do one more thing: make the Event format field required. The primary reason is that every event should always have this information, but it will also let us verify later that the content migration was successful.
Add the `rule.required()` validation rule to the Event format field:

```typescript
defineField({
  name: 'format',
  type: 'string',
  title: 'Event format',
  options: {
    list: ['in-person', 'virtual'],
    layout: 'radio',
  },
  validation: (rule) => rule.required(),
}),
```
If you save this change and run the `sanity documents validate -y` command again, you should get a lot of errors like this on the event document type:

```sh
# in apps/studio
npx sanity@latest documents validate -y
```
You should get errors on all event documents like:
```
ERROR event AUoLUkEDo6CVeRx5svBpXH
└─ format ........................ ✖ Required
```
When you finish the content migration, this validation error should disappear. You are now ready to prepare the content migration that moves the values over from the old field to the new one in all the existing documents.
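The migration script itself isn't shown in this section; in Sanity's migration tooling it is typically defined with `defineMigration` from `sanity/migrate`, patching each event document with a `setIfMissing` on the new field and an `unset` on the old one. To make the intent concrete, here is a self-contained sketch of that per-document logic. The `EventDoc` type and `migrateEvent` function are hypothetical names for illustration, not part of Sanity's API:

```typescript
// Hypothetical model of what the migration does to each event document:
// copy the old field into the new one (only if the new field is still
// empty, mirroring setIfMissing), then remove the old field (mirroring unset).
type EventDoc = {
  _id: string
  _type: 'event'
  eventType?: string
  format?: string
}

function migrateEvent(doc: EventDoc): EventDoc {
  const {eventType, ...rest} = doc
  return {
    ...rest,
    // setIfMissing semantics: keep an existing `format` value,
    // otherwise fall back to the old `eventType` value.
    format: doc.format ?? eventType,
  }
}

// Example: an event that still uses the old field name.
const migrated = migrateEvent({
  _id: 'AUoLUkEDo6CVeRx5svBpBB',
  _type: 'event',
  eventType: 'in-person',
})
console.log(migrated)
// → { _id: 'AUoLUkEDo6CVeRx5svBpBB', _type: 'event', format: 'in-person' }
```

Note that the real migration runs server-side as patches, so documents are never deleted and recreated; only the two fields are touched.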
You are now ready to do a dry run to check how the migration script will affect the content in your dataset. You won't need to add a flag to dry run it; migrations run as dry runs by default.
```sh
# in apps/studio
pnpm dlx sanity@latest migration run replace-event-type-with-event-format
```
If run successfully, this command should output a list of patches for each event document, looking like this:
```
patch event AUoLUkEDo6CVeRx5svBpBB
└─ eventType ..................... unset()
patch event AUoLUkEDo6CVeRx5svBiyh
└─ format ........................ setIfMissing("in-person")
```
It’s a good habit to export the dataset before running a content migration, especially if you do it on production content. This way, you have a backup that you can import if something goes wrong or you make an error (it happens to us all).
```sh
# in apps/studio
pnpm dlx sanity@latest dataset export production
```
This will export all your documents into a newline-delimited JSON file (`.ndjson`) and your assets into their own folder, bundled together as a `production.tar.gz` file. If you want to “reset” your dataset to its state before the content migration, you can run the following command:
```sh
# in apps/studio
pnpm dlx sanity@latest dataset import production.tar.gz production --replace
```
Note that the `--replace` flag will overwrite documents with the same `_id`.
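As an aside, the `.ndjson` format in the export is easy to inspect yourself: each line of the file is one complete JSON document. A minimal parsing sketch, where the sample documents are made up for illustration:

```typescript
// Parse newline-delimited JSON: one JSON document per non-empty line.
function parseNdjson(text: string): Array<{_type?: string}> {
  return text
    .split('\n')
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line))
}

// Sample data standing in for an exported dataset.
const sample = [
  '{"_id":"a1","_type":"event","format":"virtual"}',
  '{"_id":"a2","_type":"event","format":"in-person"}',
  '{"_id":"b1","_type":"artist"}',
].join('\n')

// Example: count documents per _type.
const counts = new Map<string, number>()
for (const doc of parseNdjson(sample)) {
  const type = doc._type ?? 'unknown'
  counts.set(type, (counts.get(type) ?? 0) + 1)
}
console.log(Object.fromEntries(counts))
// → { event: 2, artist: 1 }
```

This can be handy for sanity-checking a backup before you rely on it.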
With this backup, you can execute the content migration knowing that you have a way to restore the dataset if something goes wrong.
With the dataset backup in place, and having verified that the content migration script outputs the changes you want, you are now ready to move a lot of data in one swoop.
Run the content migration with the `--no-dry-run` flag:

```sh
# in apps/studio
pnpm dlx sanity@latest migration run replace-event-type-with-event-format --no-dry-run
```
If everything went well, then your terminal should have an output like this:
```
❯ sanity migration run replace-event-type-with-event-format --no-dry-run
? This migration will run on the production dataset in your-project-id project. Are you sure? Yes
✔ Migration "replace-event-type-with-event-format" completed.

Project id: uxqfcg2d  Dataset: production
179 documents processed. 179 mutations generated. 1 transactions committed.
```
This tells you that you have migrated 179 documents with 179 mutations, all committed in a single transaction (that is, one API request). In other words, the change happened almost instantly across all these documents, including for anyone who was working in the Studio while you ran it.
Run the validation command again to see if there is content for the Event format field.
```sh
# in apps/studio
pnpm dlx sanity@latest documents validate -y
```
You should see no more validation errors:
```
❯ sanity documents validate -y
✔ Loaded workspace 'default' using project 'uxqfcg2d' and dataset 'production' (1.8s)
✔ Downloaded 457 documents (0.3s)
✔ Checked all references (0.0s)
✔ Validated 444 documents (3.4s)

Validation results:
✔ Valid: 444 documents
✖ Errors: 0 documents, 0 errors
⚠ Warnings: 0 documents, 0 warnings
```