This is a prototype of how to share a digital twin (DT). The diagram below shows the component pipeline:
```mermaid
graph LR
    ExternalDatabase[External Database] --> SQLDataloader[SQL Dataloader]
    ExternalDatabase --> PostGisDataloader[Post-GIS Dataloader]
    subgraph ODTP
        SQLDataloader --> ODTPMobilitySimulation[ODTP Mobility Simulation]
        PostGisDataloader --> ODTPMobilitySimulation
        ODTPMobilitySimulation --> ODTpMetrics[ODTP Mobility Metrics]
        ODTPMobilitySimulation --> ODTpNextLocationPrediction[ODTP Next Location Prediction]
        ODTpNextLocationPrediction --> PYGWalker[pyGWalker CSV Visualization]
        ODTpMetrics --> PYGWalker
    end
    Model[Model] --> ODTpNextLocationPrediction
```
To adapt this to the current ODTP pipeline, the workflow executes the components in the following order (a minimal sketch of this sequence follows the list):
- SQL-Dataloader
- Post-GIS Dataloader
- Mobility Simulation
- Mobility Metrics
- Next Location Prediction
- pyGWalker
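
The order matters because each step consumes the output of the previous ones, as shown in the diagram above. The bash sketch below is illustrative only: the step names come from the list above, while the actual orchestration is done by `dt-mobility-causal-intervention.sh` (see the setup steps below).

```bash
#!/usr/bin/env bash
# Illustrative sketch of the execution order; not the real orchestration script.
set -e

steps=(
  "SQL-Dataloader"            # loads tabular data from the external database
  "Post-GIS Dataloader"       # loads geospatial data from the external database
  "Mobility Simulation"       # consumes both data loads
  "Mobility Metrics"          # computed from the simulation output
  "Next Location Prediction"  # uses the simulation output and a trained model
  "pyGWalker"                 # CSV visualization of metrics and predictions
)

for step in "${steps[@]}"; do
  echo "ODTP step: ${step}"
done
```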
To set up and run this digital twin (the steps are combined into a shell sketch after this list):

- Clone this repository.
- Edit `dt-mobility-causal-intervention.sh` with the ODTP user email and the desired digital twin and execution name.
- Configure the parameters in `parameters`.
- Configure the secrets in `secrets`: rename `001.secrets.dist` and `002.secrets.dist` to `001.secrets` and `002.secrets`, then add your credentials.
- Run the bash script in your ODTP instance:

```bash
sh dt-mobility-causal-intervention.sh
```
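
The steps above can be combined into a single shell session. This is a minimal sketch, assuming the `.dist` templates live in the `secrets` folder mentioned above; the clone URL is a placeholder, not taken from this document.

```bash
# Minimal setup sketch; the clone URL below is a placeholder.
git clone https://github.com/<org>/dt-mobility-causal-intervention.git
cd dt-mobility-causal-intervention

# Rename the secrets templates, then add your credentials to the renamed files
mv secrets/001.secrets.dist secrets/001.secrets
mv secrets/002.secrets.dist secrets/002.secrets

# Adjust the parameters and the wrapper script (ODTP user email,
# digital twin name, execution name) before starting the run
sh dt-mobility-causal-intervention.sh
```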
How to remove the execution? The following command deletes the MongoDB entries and the content of the execution folder, allowing for a fresh execution:

```bash
odtp execution delete --execution-name execution --project-path $(pwd)/dt-mobility-causal-intervention/execution
```
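
After the cleanup, a fresh execution can be started by running the wrapper script again:

```bash
# Start a fresh execution after deleting the previous one
sh dt-mobility-causal-intervention.sh
```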
Changelog:

- v0.2.0
  - Updated components
  - Added docker-compose for testing purposes
- v0.1.0
  - Initial implementation