scrape_all_data
Timeline
- Created by the backend:
- Moved to the pending or running state on the backend:
- Failed running on the backend:
- Time spent running on the backend: 00:00:11
- Last update by the backend:
These timestamps are generated and stored in the UTC timezone on the TPP backend.
Job information
- Status: Failed
  Job exited with an error.
  If you have VPN access you can view the log output in Airlock, in the workspace file:
  metadata/scrape_all_data.log
- Job identifier: vwyxbwmrpgg7tlgf
- Job request: 20443
- Requested by: Alasdair Henderson
- Branch: main
- Backend: TPP
- Action: scrape_all_data
- Run command: ehrql:v0 generate-dataset analysis/scrape_data_response_dates.py --output output/openprompt_all.csv --dummy-tables output/dummydata/dummy_edited
- Git commit hash: 543eb33
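
In an OpenSAFELY project, an action like this is defined in the repository's project.yaml, which the job runner reads to construct the run command above. A minimal sketch of what the scrape_all_data action might look like is below; the action name and run command come from this page, while the outputs stanza (including the privacy level and output key name) is an assumption for illustration:

```yaml
# Hypothetical project.yaml fragment; only the action name and
# run command are taken from this job page.
actions:
  scrape_all_data:
    run: >
      ehrql:v0 generate-dataset analysis/scrape_data_response_dates.py
      --output output/openprompt_all.csv
      --dummy-tables output/dummydata/dummy_edited
    outputs:
      highly_sensitive:
        # Assumed output key; must match the --output path above.
        dataset: output/openprompt_all.csv
```

Outputs generated on the backend are written at the declared path; patient-level datasets such as this CSV are typically declared highly_sensitive so they never leave the secure environment without review.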