Refresh dev data from production… with a large amount of container data

Following the guide here: Refresh Dev | OttoFMS

I have a question about container data when doing a “dev refresh”.
The goal is to do a deployment from dev to prod and then, as soon as it finishes, refresh the dev data with fresh production data. Since we will be doing this right away, we are using the “simple” Solution 1 from the linked guide and just replacing the dev files with a copy of the production files.

But… we also have a fair amount of container data that we want to refresh. Let’s assume that 95% or more of the container data does not change between deployments. To clarify: I believe that none of the existing container files will change, but there will be some new container data that we want to make sure we get as part of the refresh.
It won’t hurt anything to replace it all… but transferring all of that data takes a long time. We’ve seen OttoDeploy container data transfers run extremely slowly (read: days to complete).
Is there a way to tell it to only copy “new” container data? In the past, we’ve used CyberDuck’s “Compare” option to copy the container data instead… but it would be nice to do this right in the refresh deployment.
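For context, what we do with CyberDuck amounts to a one-way copy of anything dev is missing. Roughly this sketch (Python instead of CyberDuck; the paths are placeholders for our prod backup and dev container folders):

```python
# One-way "compare and copy": bring over only the container files that exist
# in the source folder but not yet in the destination.
# Paths are placeholders; point them at your prod container backup and dev server.
import shutil
from pathlib import Path

SRC = Path("/mnt/prod-backup/RC_Data_FMS")                                 # copy of production container data
DST = Path("/opt/FileMaker/FileMaker Server/Data/Databases/RC_Data_FMS")  # dev server container folder

for src_file in SRC.rglob("*"):
    if not src_file.is_file():
        continue
    dst_file = DST / src_file.relative_to(SRC)
    if dst_file.exists():          # skip files dev already has
        continue
    dst_file.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src_file, dst_file)
    print(f"copied new container file: {dst_file}")
```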
Thanks!

Hey Shawn,

OttoFMS does not have a way to do this right now. I’ll think on whether we want to add something like this or not; comparing the two file structures adds a fair bit of overhead to the build process, especially when there is a significant quantity of container files. It would also make the build dependent on a specific target server, which is not something builds account for at all right now.

Given that, I would recommend something like a CyberDuck compare. You could also use offsite backups of the external container folders and compare against S3 to do something similar.
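If you go the S3 route, the compare itself is easy to script. Here’s a rough sketch (not an OttoFMS feature) using Python and boto3 — the bucket, prefix, and local container path are placeholders you’d swap for your own setup. It lists the offsite backup and downloads only the objects that don’t already exist on the dev machine:

```python
# Compare an S3 offsite backup of the container folders against the local dev
# copy and fetch only the objects that are missing locally.
import boto3
from pathlib import Path

s3 = boto3.client("s3")
BUCKET = "my-offsite-backups"            # placeholder bucket name
PREFIX = "prod/RC_Data_FMS/"             # placeholder key prefix for the container data
LOCAL_ROOT = Path("/opt/FileMaker/FileMaker Server/Data/Databases/RC_Data_FMS")  # dev server

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        rel_key = obj["Key"][len(PREFIX):]
        if not rel_key or rel_key.endswith("/"):   # skip folder marker keys
            continue
        local_path = LOCAL_ROOT / rel_key
        if local_path.exists():                    # skip files dev already has
            continue
        local_path.parent.mkdir(parents=True, exist_ok=True)
        s3.download_file(BUCKET, obj["Key"], str(local_path))
        print(f"fetched new container file: {rel_key}")
```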

-Kyle