Job request: 13855
- Organisation: Bennett Institute
- Workspace: ethnicity-short-data-report-notebook
- ID: zwyqugbap42gdgnq
This page shows the technical details of what happened when the authorised researcher Colm Andrews requested one or more actions to be run against real patient data within a secure environment.
By cross-referencing the list of jobs with the pipeline section below, you can infer the security level at which each job's outputs were written.
The output security levels are:
- highly_sensitive
  - Researchers can never directly view these outputs
  - Researchers can only request that code be run against them
- moderately_sensitive
  - Can be viewed by an approved researcher by logging into a highly secure environment
  - These are the only outputs that can be requested for public release via a controlled output review service.
Jobs
- Job identifier: c5wi6p5qpi23lyxl
- Job identifier: hn27mknowucqsyd6
  - Error: nonzero_exit: Job exited with an error: Ran out of memory (limit for this job was 128.00GB)
- Job identifier: t34lrwnjdsjuvxey
  - Error: dependency_failed: Not starting as dependency failed
- Job identifier: irlcvucpnz33kpv7
  - Error: dependency_failed: Not starting as dependency failed
- Job identifier: iwe52prisuua7j3g
  - Error: dependency_failed: Not starting as dependency failed
- Job identifier: xnq6goxm6kk6y6q3
  - Error: dependency_failed: Not starting as dependency failed
- Job identifier: aktxrkwczx5ujuci
  - Error: dependency_failed: Not starting as dependency failed
- Job identifier: vlxwplctqtyuza2f
  - Error: dependency_failed: Not starting as dependency failed
Pipeline
project.yaml:
### CTV3, PRIMIS, SNOMED analysis
version: '3.0'

expectations:
  population_size: 1000

actions:

  split_codelist:
    run: r:latest analysis/00_trim_snomed_codelist.r
    outputs:
      highly_sensitive:
        data: codelists/ethnicity_*.csv

  generate_study_population:
    run: cohortextractor:latest generate_cohort --study-definition study_definition --output-file output/extract/input.feather --output-format feather
    needs: [split_codelist]
    outputs:
      highly_sensitive:
        cohort: output/extract/input.feather

  generate_dataset_report:
    run: >
      dataset-report:v0.0.9
      --input-files output/extract/input.feather
      --output-dir output/extract
    needs: [generate_study_population]
    outputs:
      moderately_sensitive:
        dataset_report: output/extract/input.html

  execute_simple_validation_analyses:
    run: python:latest python analysis/simple_validation_script.py
    needs: [generate_study_population]
    outputs:
      moderately_sensitive:
        tables: output/simplified_output/5_group/tables/*.csv

  execute_simple_validation_analyses_16:
    run: python:latest python analysis/simple_validation_script_16.py
    needs: [generate_study_population]
    outputs:
      moderately_sensitive:
        tables: output/simplified_output/16_group/tables/simpl*.csv

  combine_ons:
    run: r:latest analysis/combine_ons.R
    needs: [generate_study_population]
    outputs:
      moderately_sensitive:
        data: output/ons/ethnic_group*.csv
        data2: output/ons/ethnic_group_NA*.csv
        data3: output/ons/data_check*.csv

  ons_plot:
    run: r:latest analysis/ons_plots.R
    needs: [combine_ons]
    outputs:
      moderately_sensitive:
        plot_na: output/ons/na_removed/ethnicity_*.png

  # ## Only to be run locally
  # generate_report:
  #   run: python:latest jupyter nbconvert /workspace/notebooks_jupyter/draft_report_ethnicity.ipynb --execute --to html --template basic --output-dir=/workspace/output --ExecutePreprocessor.timeout=86400 --no-input
  #   needs: [execute_simple_validation_analyses, execute_simple_validation_analyses_16]
  #   outputs:
  #     moderately_sensitive:
  #       notebook: output/draft_report_ethnicity.html

  # local_simple_run:
  #   run: python:latest python analysis/local_simple_run.py
  #   outputs:
  #     moderately_sensitive:
  #       tables: output/from_jobserver/release_2022_11_18/made_locally/*.csv

  # ######## SUS
  # execute_simple_validation_analyses_sus:
  #   run: python:latest python analysis/sus/simple_validation_script_sus.py
  #   needs: [generate_study_population]
  #   outputs:
  #     moderately_sensitive:
  #       tables: output/sus/simplified_output/5_group/tables/*.csv
  #       # figures: output/sus/simplified_output/5_group/figures/*.png

  # execute_simple_validation_analyses_16_sus:
  #   run: python:latest python analysis/sus/simple_validation_script_16_sus.py
  #   needs: [generate_study_population]
  #   outputs:
  #     moderately_sensitive:
  #       tables: output/sus/simplified_output/16_group/tables/simpl*.csv

  # combine_ons:
  #   run: r:latest analysis/combine_ons.R
  #   needs: [generate_study_population]
  #   outputs:
  #     moderately_sensitive:
  #       data: output/ons/ethnic_group*.csv
  #       data2: output/ons/ethnic_group_NA*.csv
  #       data3: output/ons/data_check*.csv

  # ons_plot:
  #   run: r:latest analysis/ons_plots.R
  #   needs: [combine_ons]
  #   outputs:
  #     moderately_sensitive:
  #       plot_na: output/ons/na_removed/ethnicity_*.png

  # iqr:
  #   run: r:latest analysis/iqr.R
  #   needs: [generate_study_population]
  #   outputs:
  #     moderately_sensitive:
  #       full: output/simplified_output/5_group/tables/range_fullset.csv
  #       registered: output/simplified_output/5_group/tables/range_registered.csv
  #       figures: output/simplified_output/5_group/figures/density*.svg
  ################## over time
  time_tables:
    run: r:latest analysis/time/times.R
    needs: [generate_study_population]
    outputs:
      moderately_sensitive:
        data: output/time/*.csv
        plots: output/time/*.png
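When run_all is requested, the actions must be executed in an order that respects every needs declaration above. A rough sketch of deriving such an order with Python's standard-library graphlib (the graph lists the active actions from project.yaml; the scheduling code is illustrative, not OpenSAFELY's implementation):

```python
from graphlib import TopologicalSorter

# needs-graph of the active (uncommented) actions in project.yaml:
# each action maps to the set of actions it depends on.
NEEDS = {
    "split_codelist": set(),
    "generate_study_population": {"split_codelist"},
    "generate_dataset_report": {"generate_study_population"},
    "execute_simple_validation_analyses": {"generate_study_population"},
    "execute_simple_validation_analyses_16": {"generate_study_population"},
    "combine_ons": {"generate_study_population"},
    "ons_plot": {"combine_ons"},
    "time_tables": {"generate_study_population"},
}

# static_order() yields the nodes in a dependency-respecting order.
order = list(TopologicalSorter(NEEDS).static_order())

# split_codelist must come first, then generate_study_population;
# ons_plot can only appear after combine_ons.
print(order)
```

Anything after generate_study_population in this order is independent of its siblings, which is why the five actions that need only the cohort extract can in principle run in parallel once it completes.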
Timeline
- Created:
- Started:
- Finished:
- Runtime: 12:43:21
These timestamps are generated and stored using the UTC timezone on the TPP backend.
Job request
- Status: Failed
- Backend: TPP
- Workspace: ethnicity-short-data-report-notebook
- Requested by: Colm Andrews
- Branch: notebook
- Force run dependencies: Yes
- Git commit hash: fc51411
- Requested actions:
  - run_all