Change DB Storage Type

How can an object be moved from LocalDB to OnlineDB?

The current workflow requires an object to be stored locally and passed through several views.
At the end of this workflow, the object needs to be saved online.

@Dee I would recommend looking at this post Steven made yesterday:

Specifically the persistToDB function. You could use that same function and just replace DB with OnlineDB. However, I would suggest you use DB rather than OnlineDB, as it will be more robust: it syncs back to the cloud DB when you have connectivity and won't require an immediate internet connection.
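
As a minimal sketch of that approach (persistToDB here is a placeholder name, and draft is a hypothetical model; adjust both to your data model):

    // Minimal sketch: copy a LocalDB object's data into a new DB object.
    // 'draft' is a hypothetical model name for illustration.
    function persistToDB(local_object) {
        var data = local_object.toData();
        var online_object = DB.draft.create();
        Object.keys(data).forEach(function(field) {
            online_object[field] = data[field];
        });
        // Note: the new object gets its own id; DB syncs the save to the
        // cloud DB automatically once connectivity is available.
        online_object.save();
        return online_object;
    }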

If you potentially have more than one object and need to preserve ids and belongs_to ids, then objects can be created in the OnlineDB via backend API requests routed through CloudCode.

The LocalDB, DB and OnlineDB namespaces cannot set the id fields of objects that are created. The backend API can set these id fields, which allows a direct mirror of object ids and foreign belongs_to keys.
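
To make the format concrete, a batch payload that mirrors ids would look something like this (the values in angle brackets are placeholders; the field names match the task further down):

    {
        "operations": [{
            "method": "put",
            "object": {
                "id": "<id of the LocalDB object>",
                "type": "<model name>",
                "<belongs_to field>_id": "<id of the related object>"
            }
        }]
    }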

Interacting with the backend API requires authentication and network connectivity. The auth credentials relating to the deployed environment are easily accessible via CloudCode tasks. Passing the payload of the source object to a CloudCode task can create an exact mirror of a LocalDB object in OnlineDB.

In the view JavaScript:

    // Send the local object's data to the CloudCode task and wait for the result
    var result = CloudCode.callTask('copier', view.local_object.toData());
    if (!result.success) {
        dialog(result.error);
    }

Create a CloudCode task called copier (or any other name), set it to be triggerable from the app, and install node-fetch@2.0.0 as an npm dependency.
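
As a sketch, the npm dependency in the task's package.json would then look something like this:

    {
        "dependencies": {
            "node-fetch": "2.0.0"
        }
    }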

In copier/index.js:

    const fetch = require('node-fetch'); // use node-fetch@2.0.0

    export async function run(event) {
        // Copy the incoming object data into the backend API batch format
        const payload = {operations: []};
        const single_object_payload = Object.assign({}, event.attributes);
        single_object_payload.id = event.id;
        single_object_payload.type = event.type;
        // Map each belongs_to relationship to its foreign key field
        Object.keys(event.belongs_to).forEach(field => single_object_payload[`${field}_id`] = event.belongs_to[field]);

        payload.operations.push({
            method: 'put',
            object: single_object_payload
        });

        console.log(payload);

        const url = `${this.backend.url}/batch.json`;
        console.log(`Posting to ${url}`);

        const copy_response = await fetch(url, {
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
                'Authorization': `Bearer ${this.backend.token}`
            },
            body: JSON.stringify(payload)
        });

        console.log(`Status ${copy_response.status}`);

        if (copy_response.ok) {
            console.log('Copied object successfully');
            return {
                success: true
            };
        } else {
            return {
                success: false,
                error: `Received code ${copy_response.status}`
            };
        }
    }

As David mentioned, the above solutions are limited, and using DB would make for a better codebase. A properly configured set of sync rules, and possibly a scheduled CloudCode task to delete any stale draft objects in the OnlineDB, would provide significant advantages over manually attempting to sync objects.
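
As a rough sketch, such a scheduled cleanup task could look like the following; the draft model name, the updated_at field and the 30-day cutoff are all assumptions for illustration:

    // Minimal sketch of a scheduled CloudCode cleanup task.
    // Assumes a 'draft' model with an 'updated_at' datetime field.
    export async function run() {
        const cutoff = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000); // 30 days ago
        const stale_drafts = DB.draft.where('updated_at < ?', cutoff).toArray();
        stale_drafts.forEach(draft => draft.destroy());
        console.log(`Deleted ${stale_drafts.length} stale draft objects`);
    }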