
Job request: 4233

Organisation: Bennett Institute
Workspace: hba1c-levels
ID: 7mpjnzpeuaujq3ts

This page shows the technical details of what happened when the authorised researcher Robin Park requested one or more actions to be run against real patient data within a secure environment.

By cross-referencing the list of jobs with the pipeline section below, you can infer what security level the outputs were written to.

The output security levels are:

  • highly_sensitive
    • Researchers can never directly view these outputs
    • Researchers can only request that code be run against them
  • moderately_sensitive
    • Can be viewed by an approved researcher after logging into a highly secure environment
    • These are the only outputs that can be requested for public release via a controlled output review service.

Jobs

Pipeline

project.yaml
version: '3.0'

expectations:
  population_size: 10000

actions:

  generate_study_population:
    run: cohortextractor:latest generate_cohort --study-definition study_definition --index-date-range "2019-01-01 to 2021-06-01 by month" --output-dir=output/data #--output-format=feather
    outputs:
      highly_sensitive:
        cohort: output/data/input_*.csv
        
  generate_study_population_ethnicity:
    run: cohortextractor:latest generate_cohort --study-definition study_definition_ethnicity --output-dir=output/data #--output-format=feather
    outputs:
      highly_sensitive:
        cohort: output/data/input_ethnicity.csv
        
  join_ethnicity:
    run: python:latest python analysis/join_ethnicity.py #--output-format=feather
    needs: [generate_study_population, generate_study_population_ethnicity]
    outputs:
      highly_sensitive:
        cohort: output/data/input*.csv

  generate_data_description:
    run: jupyter:latest jupyter nbconvert /workspace/notebooks/data_description.ipynb --execute --to html --template basic --output-dir=/workspace/output --ExecutePreprocessor.timeout=86400 --no-input
    needs: [join_ethnicity]
    outputs:
      moderately_sensitive:
        notebook: output/data_description.html

  generate_charts:
    run: jupyter:latest jupyter nbconvert /workspace/notebooks/charts.ipynb --execute --to html --template basic --output-dir=/workspace/output --ExecutePreprocessor.timeout=86400 --no-input
    needs: [join_ethnicity]
    outputs:
      moderately_sensitive:
        notebook: output/charts.html

  # cohort_report:
  #   run: cohort-report:v2.0.1 output/data/input_joined.feather
  #   needs: [join_ethnicity]
  #   config:
  #     output_path: output/data
  #   outputs:
  #     moderately_sensitive:
  #       cohort_report: output/data/descriptives_input_joined.html
        
  # calculate_measures:
  #   run: cohortextractor:latest generate_measures --study-definition study_definition --output-dir=output/data
  #   needs: [join_ethnicity]
  #   outputs:
  #     moderately_sensitive:
  #       measure_csv: output/data/measure_*.csv
        
  # generate_abnormal_nb:
  #   run: jupyter:latest jupyter nbconvert /workspace/notebooks/abnormal_results.ipynb --execute --to html --template basic --output-dir=/workspace/output --ExecutePreprocessor.timeout=86400 --no-input
  #   needs: [calculate_measures]
  #   outputs:
  #     moderately_sensitive:
  #       notebook: output/abnormal_results.html
        
  # generate_levels_nb:
  #   run: jupyter:latest jupyter nbconvert /workspace/notebooks/hba1c_levels.ipynb --execute --to html --template basic --output-dir=/workspace/output --ExecutePreprocessor.timeout=86400 --no-input
  #   needs: [join_ethnicity]
  #   outputs:
  #     moderately_sensitive:
  #       notebook: output/hba1c_levels.html
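
The run commands above reference a study definition (analysis/study_definition.py by cohortextractor convention) that is not reproduced on this page. As a rough illustration of what generate_study_population drives, a minimal sketch is given below; the variable names, codelist and expectations are assumptions for illustration, not the study's real definitions. generate_study_population_ethnicity would use a similar one-off definition that extracts ethnicity a single time rather than once per month.

# Hypothetical sketch of analysis/study_definition.py -- not the study's actual code
from cohortextractor import StudyDefinition, codelist, patients

# Placeholder codelist; a real study would normally load its codelists from CSV files
hba1c_codes = codelist(["XaPbt"], system="ctv3")

study = StudyDefinition(
    default_expectations={
        "date": {"earliest": "2019-01-01", "latest": "2021-06-01"},
        "rate": "uniform",
        "incidence": 0.5,
    },
    # The fixed index_date is overridden by --index-date-range in the run
    # command, producing one input_<index-date>.csv file per month
    index_date="2019-01-01",
    population=patients.registered_as_of("index_date"),
    age=patients.age_as_of(
        "index_date",
        return_expectations={"int": {"distribution": "population_ages"}},
    ),
    hba1c=patients.mean_recorded_value(
        hba1c_codes,
        on_most_recent_day_of_measurement=True,
        between=["index_date", "index_date + 1 month"],
        return_expectations={
            "float": {"distribution": "normal", "mean": 40.0, "stddev": 10.0},
            "incidence": 0.3,
        },
    ),
)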
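
Similarly, analysis/join_ethnicity.py, run by the join_ethnicity action, is not shown here. A minimal sketch of what such a join script typically does, assuming the standard patient_id column and an ethnicity column in the one-off extract, is:

# Hypothetical sketch of analysis/join_ethnicity.py -- merge the one-off
# ethnicity extract into each monthly cohort file, overwriting it in place
# so the results still match the output/data/input*.csv pattern declared above.
from pathlib import Path

import pandas as pd

ethnicity = pd.read_csv("output/data/input_ethnicity.csv")

for path in Path("output/data").glob("input_*.csv"):
    if path.name == "input_ethnicity.csv":
        continue  # skip the ethnicity extract itself
    monthly = pd.read_csv(path)
    merged = monthly.merge(
        ethnicity[["patient_id", "ethnicity"]], on="patient_id", how="left"
    )
    merged.to_csv(path, index=False)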
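
The generate_data_description and generate_charts actions execute the two notebooks headlessly with nbconvert and publish only the rendered HTML as moderately_sensitive output. The notebooks themselves are likewise not part of this page; a minimal sketch of the kind of cell notebooks/charts.ipynb might contain, assuming the joined monthly extracts carry an hba1c column, is:

# Hypothetical sketch of a charts.ipynb cell -- summarise each monthly extract
# and plot the series; file naming and column names are assumptions.
import re
from pathlib import Path

import matplotlib.pyplot as plt
import pandas as pd

rows = []
for path in sorted(Path("output/data").glob("input_2*.csv")):
    # Monthly files follow input_<index-date>.csv, e.g. input_2019-01-01.csv
    month = re.search(r"\d{4}-\d{2}-\d{2}", path.name).group()
    df = pd.read_csv(path)
    rows.append({"month": month, "mean_hba1c": df["hba1c"].mean()})

summary = pd.DataFrame(rows)
ax = summary.plot(x="month", y="mean_hba1c", marker="o", legend=False)
ax.set_ylabel("Mean HbA1c")
plt.tight_layout()
plt.show()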

Timeline

  • Created:

  • Finished:

  • Runtime:

These timestamps are generated and stored in UTC on the TPP backend.

Job request

Status: Succeeded
Backend: TPP
Workspace: hba1c-levels
Requested by: Robin Park
Branch: study-def
Force run dependencies: No
Git commit hash: 7744ae7
Requested actions:
  • generate_charts
