ArsenyFinkelsteinLab DataJoint workflow - cloud deployment
This workflow is fully containerized using Docker and is currently set up for automatic deployment to an AWS EC2 instance that is already provisioned. The credentials to access this EC2 instance are set as GitHub secrets (see GitHub Actions secrets for details).
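The workflow definition itself is not reproduced in this document, but as a rough illustration of how such credentials might be consumed, the sketch below starts a pre-provisioned instance with boto3. This is only an assumption about the mechanism (the actual Action may instead reach the instance over SSH), and the instance ID, region, and variable names are hypothetical placeholders.

```python
# Illustrative sketch only -- the actual GitHub Action may start the instance
# differently (e.g. over SSH). Instance ID, region, and secret names below are
# hypothetical placeholders, not values taken from this repository.
import os
import boto3

ec2 = boto3.client(
    "ec2",
    region_name=os.environ.get("AWS_REGION", "us-east-1"),
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],          # from GitHub secrets
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],  # from GitHub secrets
)

# Start the already-provisioned instance and wait until it is running
instance_id = os.environ["EC2_INSTANCE_ID"]  # hypothetical secret name
ec2.start_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
```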
Once the required credentials are added, this workflow is initialized by manually triggering a predefined GitHub Action named `start_process` (see code here).
Your GitHub username needs to be on the allow-list for you to be able to trigger the GitHub Action. If you are not on the list yet, make changes to the list and issue a PR for Arseny's approval.
- From any browser, navigate to the GitHub repository for this workflow - make sure to log in to your GitHub account
- Click on the Actions tab; the left panel lists the available "GitHub workflows". There is only one for this repo, named `start_process`
- Click on the GitHub workflow `start_process`, then on the right side, open the dropdown menu "Run Workflow" and click the green button "Run Workflow" to trigger one round of operation (an API alternative is sketched below)
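If you prefer not to use the browser, the same `start_process` workflow can in principle be triggered through GitHub's REST API (`workflow_dispatch` endpoint). A minimal sketch is below; the repository owner/name, workflow file name, and branch are assumptions and should be adjusted to match this repository.

```python
# Minimal sketch of triggering the workflow via the GitHub REST API.
# Owner, repo, workflow file name, and branch are assumptions.
import os
import requests

OWNER = "ArsenyFinkelsteinLab"        # assumed organization/user name
REPO = "workflow-cloud-deployment"    # hypothetical repository name
WORKFLOW_FILE = "start_process.yaml"  # assumed workflow file name

resp = requests.post(
    f"https://api.github.com/repos/{OWNER}/{REPO}/actions/workflows/{WORKFLOW_FILE}/dispatches",
    headers={
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",  # a personal access token
    },
    json={"ref": "main"},  # branch to run the workflow on (assumed)
)
resp.raise_for_status()  # GitHub returns 204 No Content on success
```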
Once a new round of operation is triggered with the GitHub Action above, the pre-provisioned EC2 instance will start, and the following steps are performed:
- Clone this repository - the repository is re-cloned every time the workflow is triggered, ensuring the latest version of the code is used
- Rebuild the Docker image - likewise, the image is rebuilt every time, ensuring the latest version of the workflow is deployed
- Launch the Docker container with the newly built image - this runs all of the DataJoint computation, essentially all `.populate()` calls defined here (an illustrative sketch follows this list)
- Once finished (successfully or not), the Docker container terminates itself, which in turn triggers the EC2 instance to shut down
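The actual populate script is the one linked above ("defined here"); the block below is only a schematic illustration of what such a driver typically looks like in DataJoint, with made-up module and table names.

```python
# Schematic illustration of a DataJoint populate driver. The module and table
# names are hypothetical -- see the script linked above for the real calls.
from my_workflow import pipeline  # hypothetical module exposing the schema tables

POPULATE_SETTINGS = dict(
    reserve_jobs=True,      # let multiple workers share the job queue safely
    suppress_errors=True,   # continue past failing keys; errors go to the jobs table
    display_progress=True,  # print a progress bar per table
)

# Populate each computed/imported table in dependency order
for table in (pipeline.Ephys(), pipeline.Tracking(), pipeline.Analysis()):
    table.populate(**POPULATE_SETTINGS)
```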
For installation and deployment on other resources (e.g. a local machine or another cloud provider), see local deployment.