
Exporting data and backups

Exporting your DatoCMS data or making offline backups is easy with our Content Management API. We'll show a few simple scripts below.

Before diving in: if you just need a quick way to export content directly from the dashboard, you may want to consider a plugin like Project Exporter first.

For a more programmatic approach, here's a quick example script that dumps every record into a records.json file:

import { buildClient } from '@datocms/cma-client-node';
import fs from 'fs/promises';

async function main() {
  const client = buildClient({
    apiToken: 'YOUR-FULL-ACCESS-API-KEY',
    environment: 'YOUR-ENVIRONMENT-NAME',
  });

  // Fetch every model in the project, excluding modular blocks:
  // block records are embedded inside regular records, so the
  // `nested: true` option below already covers them
  const itemTypes = await client.itemTypes.list();
  const models = itemTypes.filter((itemType) => !itemType.modular_block);
  const modelIds = models.map((model) => model.id);

  // Iterate over every record of every model; the paged iterator
  // transparently handles pagination for us
  const records = [];

  for await (const record of client.items.listPagedIterator({
    nested: true,
    filter: { type: modelIds.join(',') },
  })) {
    records.push(record);
  }

  const jsonContent = JSON.stringify(records, null, 2);
  await fs.writeFile('records.json', jsonContent, 'utf8');
}

main();
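To run it, save the script as a file like export-records.mjs (the .mjs extension tells Node to treat the import statements as ES modules), fill in your API token and environment name, and execute it with node export-records.mjs.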

And here is a simple script that lists all assets and downloads them locally:

import { buildClient } from '@datocms/cma-client-node';
import fetch from 'node-fetch';
import { writeFile } from 'fs/promises';

async function downloadImage(url) {
  const response = await fetch(url);
  // node-fetch v3 deprecates response.buffer(), so convert the body ourselves
  const buffer = Buffer.from(await response.arrayBuffer());
  // Use the last segment of the URL path as the local file name
  const fileName = new URL(url).pathname.split('/').pop();
  await writeFile('./' + fileName, buffer);
}

async function main() {
  const client = buildClient({
    apiToken: 'YOUR-FULL-ACCESS-API-KEY',
    environment: 'YOUR-ENVIRONMENT-NAME',
  });

  // The project settings contain the Imgix hostname that serves the uploads
  const site = await client.site.find();

  // Iterate over every upload; again, the paged iterator handles pagination
  for await (const upload of client.uploads.listPagedIterator()) {
    const imageUrl = 'https://' + site.imgix_host + upload.path;
    console.log(`Downloading ${imageUrl}...`);
    // Await each download so we don't fire off hundreds of parallel requests
    await downloadImage(imageUrl);
  }
}

main();
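Despite the downloadImage name, this works for any kind of upload (PDFs, videos, and so on), since it simply saves the raw response body to disk. Files are written to whatever directory you run the script from.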

You can then schedule these scripts with a cron job and store the results in an S3 bucket (sketched below), upload them to another system, or simply keep the backups locally.
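As a sketch of the S3 idea, a script along these lines could push the generated records.json to a bucket using the official AWS SDK v3. The bucket name, key prefix, and region here are placeholders, not anything DatoCMS-specific, and credentials are assumed to come from the standard AWS environment variables:

import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { readFile } from 'fs/promises';

async function uploadBackup() {
  // Region and bucket are hypothetical; adjust to your own AWS setup
  const s3 = new S3Client({ region: 'us-east-1' });

  await s3.send(
    new PutObjectCommand({
      Bucket: 'my-datocms-backups',
      // Prefix each backup with the date so old ones aren't overwritten
      Key: `backups/records-${new Date().toISOString().slice(0, 10)}.json`,
      Body: await readFile('records.json'),
    }),
  );
}

uploadBackup();

A crontab entry invoking node on the export script followed by this upload script (for example, once per night) would give you a rolling set of offsite backups.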