Copy a space - workflows, datasets, exports, snapshots within the same server
At the moment I have set up a space for a single view of customer. The business would like to copy the full space to another space on the same server and make one Dev and the other Test.
- How do you go about copying a space to another space?
- Are there any issues or concerns with doing this copy? Do I lose the datasets and data and have to reload it all?
- If we then make changes in the Dev space, how do you go about updating the Test space? Is it a copy and override? Are the steps the same as for the original copy? Do you lose datasets and data if you do this?
Download .dmx file (export Space)
Select all the metadata objects in the Space and click on Download in browser:
This will download a .dmx file to your computer. Note that datasets are NOT downloaded, but the schemas are. You will need to manually re-import your datasets into the new environment.
Create a new environment:
Click on “manage environments” (you’ll need to be logged in as the admin to do this)
Click on Create Environment
Give it a name.
Click into the new Environment.
Go to System > Create a new space or use the default one and click on Synchronize:
NOTE: You must use Synchronize when you first do this if you want to keep your Spaces synchronized going forward:
Browse to the .dmx file we just exported and open it. It should import all the metadata; you just need to hook up the datasets again and you are good to go.
You can then see in the screenshot above the menu option to "Synchronise". If you make changes in your dev environment, download the .dmx file again and you can synchronise it into your new environment (don't use Import).
The current version that you are on does not support import/export of data along with the workflows etc., although it does export the schemas. Once you have imported the .dmx file you will need to hook up all your datasets again. Data Studio v2.2, due in October, will support the export/import of data as part of the .dmx file.
Synchronising a new version of the space will not overwrite the data once you have hooked it up (unless you specify it to do so in v2.2), so it is a one-off job.
Hope that helps.
@Carolyn I had to edit the above to make it clearer that you need to Synchronize the Spaces from the very beginning. If you Import to the Space at the beginning you do not get the necessary link between the two Spaces.
@Chris Hope you are well? Do I need to create another environment if the space will be on the same server? Is the reason for making another environment on the same server to separate Dev and Test? Do I lose any functionality if I create another environment? Please advise.
Hi, also, does this copy all the alerts, notifications, workflows, snapshots and export-to-database steps? Will they be copied too, with just the data needing to be reloaded?
Hey Carolyn. Very well thanks. Melting in the heat over here. We're not used to it!
You cannot sync a space in the same environment so you will need to create a new one.
You shouldn't lose any functionality (although I'm not sure I understand the question fully).
Remember that your license restrictions span environments so you share users, row counts, Dataset limits across environments.
This has worked well, thank you. Very excited. I will test synchronising next week when I have time. Just a question about environment naming: as we now have a Test environment, will changing the name of the original one from Default cause me any issues? I'd like to rename it so I know it's Dev.
@Carolyn There shouldn't be any issue changing the name of your environments.
@Chris Downer Hi, I have a problem with the export and synchronising.
I completed this task with no issues, as mentioned above. Once I loaded all the datasets, it started to play up, creating snapshots multiple times, etc.
I did some final work in DEV and tested syncing again to TEST to see how this would work, and I got a blank white page and couldn't get into Experian. So I stopped and started the server and got a disk space error.
We decided this morning, as TEST was playing up, to delete the space in TEST and redo the export and import. It looks great. I reloaded the data and fixed the notifications and schedules that rely on the dataset.
When I go to add the dataset to the workflow (as they had dropped off), it says it's done, and the snapshot on screen looks okay, but when I run the job I get an error as the dataset has dropped out again. We looked at the Data Studio logs and can see this:
Seeing some errors in the datastudio.log like the following "2020-08-14 15:16:27,703 ERROR c.e.d.r.RepositoryService [qtp1368949449-186] Error running action Update Workflowcom.experian.datastudio.repository.exception.ActionValidationFailure: 2097: Version id 0 cannot be edited..."
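If you need to pull errors like this out of a large log, a quick `grep` does the job. This is just a sketch: the filename `datastudio.log` comes from the post above, but the path is an assumption, so point `LOG` at wherever your Data Studio install writes its logs (the sample line below is only there so the command has something to match).

```shell
# Assumed log location -- adjust to your Data Studio install's log directory.
LOG=./datastudio.log

# Sample line copied from the error reported above, so the grep has input.
printf '%s\n' '2020-08-14 15:16:27,703 ERROR c.e.d.r.RepositoryService [qtp1368949449-186] Error running action Update Workflow com.experian.datastudio.repository.exception.ActionValidationFailure: 2097: Version id 0 cannot be edited' > "$LOG"

# Show only ERROR lines about version validation failures, with line numbers.
grep -n 'ERROR.*Version id' "$LOG"
```

Searching for the error code (`2097`) or the `ActionValidationFailure` class name works the same way and is handy when pasting findings into a support ticket.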
I looked at the workflow and it seems I have only a published version, no draft, and I think this might be the issue, which I vaguely saw at the last step of synchronising.
Did I do something wrong when synchronising this?
Any assistance on this ASAP would be helpful.
Hi Carolyn - sorry to hear you are having these issues!
The first thing I need to check is that when you first do the export and import you are NOT using "Import". You need to use "Synchronize" the first time and every subsequent time. I know I called this out above, but given it's the simplest known issue it is worth checking.
If you are confident you did do this, then this issue looks too complicated to comment on here. I think you are working with a fairly complex Space, and we would need our support team to try to replicate the issue and then look at the logs.
Sorry I can't be of more immediate assistance.
No, I used Export, and then in the Test environment used Synchronise, which then asks for the file.
OK, thanks. There was a step just before it completed that asked about the version; it was quick and had something like v1.0 for the workflows. Just wondering if this should have been changed.
I will see if I can do a small space and see if I get the step that I think might have caused it.
Oh sorry, you are not going to get the picture. I will upload it.
Hi, I just found on export a tick box that asks whether you want the draft.
Just trying this.
OK, here's what I found:
I think because we selected Sync, it assumes no change should happen unless it's made in the environment it was sourced from.
It's not allowing any changes to the workflow, as the version is greyed out. This is an issue because, when this happens, the workflow is disconnected from its dataset and you need to reconnect the dataset but are not able to. This is what's causing the issue.
When I did a sync and then did another sync to push a change to the workflow, it didn't update the workflow. It will for a dataset, but I think it's stopping the version from being updated and therefore doesn't change it.
Hi @Carolyn, you have lost me a bit. Some more things to note... You can't use draft versions when synchronizing...
All the objects you are synchronizing need to be published, and they need to remain published versions when you sync or they will be lost. This is because we expect the published versions to be approved for production, and we consider that drafts have not been explicitly signed off by the user for synchronization.
Sorry, I'm confusing you. I can see that's what you need to do, but when you sync the workflows, the dataset drops off because you haven't loaded the dataset yet. I load the dataset and then try to update the source for the workflow, but it won't allow me to.
I think this is due to the inability to create another version.
Please see attached
You shouldn't be losing the source when you sync. Check out this video -
So it sounds like something is going awry on the initial sync, if you are going into those workflows and the source is no longer mapped?
Thank you. What you have done is exactly what I have done.
I have lost the source; it was there as the header line like yours.
I uploaded the data and then lost the source. Having said this, I didn't do an upload for this one.
It's a new test data source, so I had to create a new dataset from the test data source. In the DEV environment it's connected to the DEV data source. Maybe this is the issue.
It's the same server, just a different database, so we could have a limited number of records.
Thanks for helping me with this.