Editing data in the object-table in memory only, without syncing to the DB.

Is there a way to load data into the object-table from the DB and let the user edit it without syncing to the DB, and then do a bulk execute once the user is done working on the screen?
This would mean that if the container is closed unexpectedly or forcefully, the data should still remain on the device.

We are trying to resolve an issue where typing in a cell of the object-table means waiting for it to sync/reload. We want to be able to continuously edit fields/cells in the object-table without waiting.

This is possible by making use of LocalDB. You can make a local mirror of the objects and bind them to the object-table instead for editing.

In the view xml:

<var name="local_data_set" type="array:data_object" />
<var name="db_data_set" type="array:data_object" />

<object-table label="Nice Table" query="local_data_set" empty-message="Your items will appear here">
    <column heading="Name" display="{name}">
        <edit-text value="$object.name" on-change="$: updateValue($object, newValue, 'name')" />
    </column>
</object-table>

In the JavaScript:

function init() {
    view.db_data_set = DB.data_object.toArray();
    view.local_data_set = view.db_data_set.map(function(item) {
        var local_copy = LocalDB.data_object.create(item); // copies all the enumerable fields with matching names

        // belongs-to IDs are not copied by create(), so copy them explicitly
        Object.keys(LocalDB.data_object.type.belongsTo).forEach(function(relationship_name) {
            local_copy[relationship_name + '_id'] = item[relationship_name + '_id'];
        });

        return local_copy;
    });
}

function updateValue(record, new_value, field_name) {
    record[field_name] = new_value;
    // record.save() here is optional, depending on the use case
    // (e.g. saving the LocalDB copy keeps the edits on the device if the container is closed)
}

function persistToDB() {
    var batch = new DB.Batch();

    view.local_data_set.forEach(function(local_item, index) {
        var db_item = view.db_data_set[index];
        db_item.setAll(local_item); // copies all the matching enumerable fields

        // belongs-to IDs are not copied by setAll(), so copy them explicitly
        Object.keys(LocalDB.data_object.type.belongsTo).forEach(function(relationship_name) {
            db_item[relationship_name + '_id'] = local_item[relationship_name + '_id'];
        });

        batch.save(db_item);
    });

    batch.execute();
}

Be sure to clear the LocalDB periodically to avoid a large build up of objects.
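
A minimal sketch of such a cleanup, reusing the toArray() pattern from init() and assuming LocalDB objects support a destroy() call (the function name is arbitrary):

function clearLocalMirror() {
    // remove all local copies so stale objects do not accumulate in LocalDB
    LocalDB.data_object.toArray().forEach(function(local_item) {
        local_item.destroy(); // assumption: destroy() removes the object from LocalDB
    });
}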

I have been using a local mirror in LocalDB to structure the data for reporting or reading purposes.
For var local_copy = LocalDB.data_object.create(item) to work, do the field names need to match exactly across the entire data model, or will it match the corresponding fields and ignore those that don't?

With LocalDB, the below (as per the attached image) won't happen when editing?

The .create and .setAll functions copy only the matching field names. Any fields present in the source but not present in the target are ignored. Belongs-to IDs are also not copied.
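
For illustration, using hypothetical fields (name exists on both models, legacy_code only on the DB model, customer is a belongs-to relationship):

var db_item = view.db_data_set[0];
var local_copy = LocalDB.data_object.create(db_item);
// local_copy.name is copied because both models define it
// db_item.legacy_code is ignored because the LocalDB model does not define it
// local_copy.customer_id is not set automatically; belongs-to IDs must be copied by hand:
local_copy.customer_id = db_item.customer_id;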

The saving indicator is an indication of a slow on-change callback being executed. The performance depends greatly on the logic in the on-change callback. If most of the logic is stripped out after switching to LocalDB copies and only performed in bulk later on, then there should be a performance increase.

Thanks, got it.
Most of our on-change callback validates whether the new value is the same as the old value: we skip the object.save() if it is the same and only save to the DB once we get something different from the old value. But the saving indicator will still show (we are using the DB and not LocalDB, as the LocalDB logic still needs to be implemented).
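
Roughly what the callback does (simplified sketch, reusing the updateValue signature from above):

function updateValue(record, new_value, field_name) {
    if (record[field_name] === new_value) {
        return; // same as the old value - skip the save
    }
    record[field_name] = new_value;
    record.save(); // still a DB save for now, so the saving indicator appears
}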

The most common cause of a slow callback is performing many DB queries. OnlineDB queries are particularly slow in some circumstances and should be batched whenever possible.
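
For example (the customer model and its fields are hypothetical), rather than running an OnlineDB query inside the callback for every row, fetch once up front and reuse an in-memory lookup:

var customers_by_id = {};
OnlineDB.customer.toArray().forEach(function(customer) { // assumption: OnlineDB follows the same toArray() pattern as DB above
    customers_by_id[customer.id] = customer; // look up customers_by_id[some_id] later instead of querying per row
});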

A useful tool to investigate the queries being executed is to enable verbose logging in the dev tools.

Examining the number of queries and the time taken to execute each one can help pinpoint potentially unoptimised code.