Bulk Import Images

How would one import a collection of images stored locally?
Due to the number of images, importing them individually is not an option.

Hi @Dee

Is this for a once-off import of something like Master Data, or part of an app workflow that users will need to interact with?

Hi @tielman

For now it is a once-off thing, but it may be required again in the near future.

The source of the imagery is SQL (and an export thereof).
I’ve explored the following options, both of which failed:

  1. CSV import to the Journey Backend, with the images stored as Base64 strings in the CSV

  2. Connecting to the DB via mssql and querying the images directly

The return from mssql is of an unfamiliar type: {"type":"Buffer","data":[255,216,255,…]}, and I am unable to access the array and/or convert it to Base64.


  1. You cannot create Images via CSV import (as you’ve seen), but you can use the Backend API to create the images in bulk. I would use the Batch API - just check the Photo field representation to see how to supply the Base64 string.
  2. If you are connecting to your SQL database via CloudCode, you should be able to create Attachments from the Buffer data you get from SQL. Try the following:
var sqlData = {"type": "Buffer", "data": [255, 216, 255, …]};

var newAttachment = await Attachment.create({
    filename: "example.png",
    mediaType: "image/png",
    data: sqlData.data
});

var newObject = DB.some_object.create();
newObject.photo_field = newAttachment;
await newObject.save();
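As an aside: if you ever need the Base64 string itself (e.g. for the Backend API route in option 1), Node’s built-in Buffer can produce it from the same data array. A minimal sketch, using just the first few bytes of a JPEG header as sample data:

```javascript
// The mssql driver returns binary columns as Node Buffers, which
// serialize to { type: 'Buffer', data: [...] } when stringified.
// (The byte values below are just the start of a JPEG header.)
var sqlData = { type: 'Buffer', data: [255, 216, 255, 224] };

// Rebuild a Buffer from the plain byte array, then encode it as Base64
var base64String = Buffer.from(sqlData.data).toString('base64');
console.log(base64String); // '/9j/4A=='
```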

Depending on how many images you are creating, you should also consider saving them in a batch rather than one-by-one.

Let me know if this worked

Thanks Tielman,

It seems to have worked with the data defined as the entire object, i.e.:
data: sqlData

Glad to hear it.

If possible, would you be willing to share your exact code for the benefit of our community, especially the bits that deal with the connection to SQL, the query, and the subsequent writing to the JourneyApps DB? Remove all sensitive/proprietary bits, of course.

Hi @tielman,

While not specific to images, the below is generic enough to be reused if tweaked to suit. I’m yet to apply it to bulk imports of imagery from SQL, and will update accordingly. It combines Journey’s Batch() functions and mssql’s streaming for large sets of data.

"mssql": "^7.3.0"

const sql = require("mssql")

const config = {
    server: '',
    port: '',
    user: '',
    password: '',
    database: '',
    trustServerCertificate: true,
    options: {
        trustedConnection: true,
        encrypt: true,
        enableArithAbort: true,
        trustServerCertificate: true,
        cryptoCredentialsDetails: {
            minVersion: 'TLSv1'
        }
    }
}

export async function run(params) {
    var batch, jobsInBatch, tableObject
    const batchLimit = 50

    sql.connect(config, err => {
        if (err) throw err

        const request = new sql.Request()
        request.stream = true // You can set streaming differently for each request
        request.query('select * from **sqlTableName**') // or request.execute(procedure)

        request.on('recordset', columns => {
            // Emitted once per recordset, before any rows
            batch = new DB.Batch();
            jobsInBatch = 0;
        })

        request.on('row', row => {
            tableObject = DB.**tableName**.create();
            // Map the columns from `row` onto tableObject here
            batch.save(tableObject);
            jobsInBatch++;
            if (jobsInBatch >= batchLimit) {
                batch = processBatch()
            }
        })

        request.on('done', result => {
            // Always emitted as the last one
            if (jobsInBatch > 0) {
                processBatch()
            }
        })

        function processBatch() {
            batch.execute(); // Persist the current batch to the JourneyApps DB
            jobsInBatch = 0;
            return new DB.Batch();
        }
    })
}



Epic, thanks!