Job request: 10414

Organisation: NHSE/I
Workspace: nhs_at_home_main
ID: 5og7hvx5wbnxjgiz

This page shows the technical details of what happened when the authorised researcher Alexandra Benson requested one or more actions to be run against real patient data in the project, within a secure environment.

By cross-referencing the list of jobs with the pipeline section below, you can infer which security level each output was written to. Researchers can never directly view outputs marked as highly_sensitive; they can only request that code runs against them. Outputs marked as moderately_sensitive can be viewed by an approved researcher who logs into a highly secure environment, and only these outputs can be requested for release to the public via a controlled output review service.

Jobs

  • Action: generate_study_definition_static
    Status: Succeeded
    Job identifier: ew5ewclov4yautjd
  • Action: generate_study_population_bp
    Status: Failed
    Job identifier: 6req5bm5wt5m7doh
    Error: nonzero_exit: Job exited with error code 5: Ran out of memory (limit for this job was 128.00GB)
  • Action: generate_study_population_proactive
    Status: Failed
    Job identifier: o6xys5kw2qr2rmue
    Error: nonzero_exit: Job exited with error code 5: Ran out of memory (limit for this job was 128.00GB)
  • Action: generate_study_population_oximetry
    Status: Failed
    Job identifier: i2qxdn7ubnjnrufh
    Error: nonzero_exit: Job exited with error code 5: Ran out of memory (limit for this job was 128.00GB)
  • Action: generate_oximetry_regional_timeseries
    Status: Failed
    Job identifier: scsnhupl5tdyrk3s
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_oximetry_timeseries
    Status: Failed
    Job identifier: ochvboxfh3ffa7ri
    Error: dependency_failed: Not starting as dependency failed
  • Action: join_cohorts_bp
    Status: Failed
    Job identifier: uwsh6e2rkund325g
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_proactive_codes_analysis
    Status: Failed
    Job identifier: lnjxhze25jdxqveu
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_proactive_regional_timeseries
    Status: Failed
    Job identifier: ru2ofg37uitczeoz
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_bp_timeseries
    Status: Failed
    Job identifier: xplkevwxxyq6h4df
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_bp_regional_timeseries
    Status: Failed
    Job identifier: nvtjh4xqhd4nejj5
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_oximetry_breakdowns
    Status: Failed
    Job identifier: i2neqw3njovthooj
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_proactive_breakdowns
    Status: Failed
    Job identifier: r2onzb3xtsb2ah6b
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_proactive_timeseries
    Status: Failed
    Job identifier: x2duil6er7veeg2d
    Error: dependency_failed: Not starting as dependency failed
  • Action: join_cohorts_oximetry
    Status: Failed
    Job identifier: ybekzvhrzxkjlpl6
    Error: dependency_failed: Not starting as dependency failed
  • Action: join_cohorts_proactive
    Status: Failed
    Job identifier: rrt4mpzckq7zjgrn
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_bp_breakdowns
    Status: Failed
    Job identifier: vm2owxrvyp4zub54
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_bp_codes_analysis
    Status: Failed
    Job identifier: nfxur2zregsyaptb
    Error: dependency_failed: Not starting as dependency failed
  • Action: generate_oximetry_codes_analysis
    Status: Failed
    Job identifier: qf2qtlvfj4zpznuf
    Error: dependency_failed: Not starting as dependency failed

Pipeline

project.yaml:
version: "3.0"

expectations:
  population_size: 1000

actions:
  generate_study_definition_static:
    run: cohortextractor:latest generate_cohort
      --study-definition study_definition_static
      --with-end-date-fix
    outputs:
      highly_sensitive:
        cohort: output/input_static.csv

  # Oximetry
  generate_study_population_oximetry:
    run: cohortextractor:latest generate_cohort
      --study-definition study_definition_oximetry
      --index-date-range "2019-04-01 to 2022-05-30 by week"
      --with-end-date-fix
    outputs:
      highly_sensitive:
        oximetry_cohort: output/oximetry/0.1_generate_study_population/input_oximetry*.csv

  join_cohorts_oximetry:
    run: cohort-joiner:v0.0.30
      --lhs output/oximetry/0.1_generate_study_population/input_oximetry*.csv
      --rhs output/input_static.csv 
      --output-dir output/oximetry/0.2_join_cohorts/
    needs:
      [generate_study_population_oximetry, generate_study_definition_static]
    outputs:
      highly_sensitive:
        cohort: output/oximetry/0.2_join_cohorts/input_oximetry*.csv

  # BP
  generate_study_population_bp:
    run: cohortextractor:latest generate_cohort
      --study-definition study_definition_bp
      --index-date-range "2019-04-01 to 2022-05-30 by week"
      --with-end-date-fix
    outputs:
      highly_sensitive:
        bp_cohort: output/bp/0.1_generate_study_population/input_bp*.csv

  join_cohorts_bp:
    run: cohort-joiner:v0.0.30
      --lhs output/bp/0.1_generate_study_population/input_bp*.csv
      --rhs output/input_static.csv 
      --output-dir output/bp/0.2_join_cohorts/
    needs: [generate_study_population_bp, generate_study_definition_static]
    outputs:
      highly_sensitive:
        cohort: output/bp/0.2_join_cohorts/input_bp*.csv

  # Proactive
  generate_study_population_proactive:
    run: cohortextractor:latest generate_cohort
      --study-definition study_definition_proactive
      --index-date-range "2019-04-01 to 2022-05-30 by week"
      --with-end-date-fix
    outputs:
      highly_sensitive:
        proactive_cohort: output/proactive/0.1_generate_study_population/input_proactive*.csv

  join_cohorts_proactive:
    run: cohort-joiner:v0.0.30
      --lhs output/proactive/0.1_generate_study_population/input_proactive*.csv
      --rhs output/input_static.csv 
      --output-dir output/proactive/0.2_join_cohorts/
    needs:
      [generate_study_population_proactive, generate_study_definition_static]
    outputs:
      highly_sensitive:
        cohort: output/proactive/0.2_join_cohorts/input_proactive*.csv

  # Analysis

  # Oximetry
  generate_oximetry_timeseries:
    run: python:latest python analysis/analysis_oximetry_timeseries.py
    needs: [join_cohorts_oximetry]
    outputs:
      moderately_sensitive:
        oximetry_table_counts: output/oximetry/0.3_analysis_outputs/oximetry_table_counts.csv
        oximetry_plot_timeseries: output/oximetry/0.3_analysis_outputs/oximetry_plot_timeseries.png

  generate_oximetry_regional_timeseries:
    run: python:latest python analysis/analysis_oximetry_region.py
    needs: [join_cohorts_oximetry]
    outputs:
      moderately_sensitive:
        oximetry_table_counts_region: output/oximetry/0.3_analysis_outputs/oximetry_table_counts_*.csv
        oximetry_plot_timeseries_region: output/oximetry/0.3_analysis_outputs/oximetry_plot_timeseries_region_*.png

  generate_oximetry_breakdowns:
    run: python:latest python analysis/analysis_oximetry_breakdowns.py
    needs: [join_cohorts_oximetry]
    outputs:
      moderately_sensitive:
        oximetry_table_breakdowns: output/oximetry/0.3_analysis_outputs/oximetry_table_code_*.csv
        oximetry_plot_breakdowns: output/oximetry/0.3_analysis_outputs/oximetry_plot_code_*.png

  generate_oximetry_codes_analysis:
    run: python:latest python analysis/analysis_oximetry_codes.py
    needs: [join_cohorts_oximetry]
    outputs:
      moderately_sensitive:
        oximetry_table_code_counts: output/oximetry/0.3_analysis_outputs/oximetry_table_code_counts_*.csv
        oximetry_table_code_combinations: output/oximetry/0.3_analysis_outputs/oximetry_table_code_combinations.csv
        oximetry_table_code_populations: output/oximetry/0.3_analysis_outputs/oximetry_patient_id_total_*.csv

  # Blood pressure
  generate_bp_timeseries:
    run: python:latest python analysis/analysis_bp_timeseries.py
    needs: [join_cohorts_bp]
    outputs:
      moderately_sensitive:
        bp_table_counts: output/bp/0.3_analysis_outputs/bp_table_counts.csv
        bp_plot_timeseries: output/bp/0.3_analysis_outputs/bp_plot_timeseries.png

  generate_bp_regional_timeseries:
    run: python:latest python analysis/analysis_bp_region.py
    needs: [join_cohorts_bp]
    outputs:
      moderately_sensitive:
        bp_table_counts_region: output/bp/0.3_analysis_outputs/bp_table_counts_*.csv
        bp_plot_timeseries_region: output/bp/0.3_analysis_outputs/bp_plot_timeseries_region_*.png

  generate_bp_breakdowns:
    run: python:latest python analysis/analysis_bp_breakdowns.py
    needs: [join_cohorts_bp]
    outputs:
      moderately_sensitive:
        bp_table_breakdowns: output/bp/0.3_analysis_outputs/bp_table_code_*.csv
        bp_plot_breakdowns: output/bp/0.3_analysis_outputs/bp_plot_code_*.png

  generate_bp_codes_analysis:
    run: python:latest python analysis/analysis_bp_codes.py
    needs: [join_cohorts_bp]
    outputs:
      moderately_sensitive:
        bp_table_code_counts: output/bp/0.3_analysis_outputs/bp_table_code_counts_*.csv
        bp_table_code_combinations: output/bp/0.3_analysis_outputs/bp_table_code_combinations.csv
        bp_table_code_populations: output/bp/0.3_analysis_outputs/bp_patient_id_total_*.csv

  # Proactive Care
  generate_proactive_timeseries:
    run: python:latest python analysis/analysis_proactive_timeseries.py
    needs: [join_cohorts_proactive]
    outputs:
      moderately_sensitive:
        proactive_table_counts: output/proactive/0.3_analysis_outputs/proactive_table_counts.csv
        proactive_plot_timeseries: output/proactive/0.3_analysis_outputs/proactive_plot_timeseries.png

  generate_proactive_regional_timeseries:
    run: python:latest python analysis/analysis_proactive_region.py
    needs: [join_cohorts_proactive]
    outputs:
      moderately_sensitive:
        proactive_table_counts_region: output/proactive/0.3_analysis_outputs/proactive_table_counts_*.csv
        proactive_plot_timeseries_region: output/proactive/0.3_analysis_outputs/proactive_plot_timeseries_region_*.png

  generate_proactive_breakdowns:
    run: python:latest python analysis/analysis_proactive_breakdowns.py
    needs: [join_cohorts_proactive]
    outputs:
      moderately_sensitive:
        proactive_table_breakdowns: output/proactive/0.3_analysis_outputs/proactive_table_code_*.csv
        proactive_plot_breakdowns: output/proactive/0.3_analysis_outputs/proactive_plot_code_*.png

  generate_proactive_codes_analysis:
    run: python:latest python analysis/analysis_proactive_codes.py
    needs: [join_cohorts_proactive]
    outputs:
      moderately_sensitive:
        proactive_table_code_counts: output/proactive/0.3_analysis_outputs/proactive_table_code_counts_*.csv
        proactive_table_code_combinations: output/proactive/0.3_analysis_outputs/proactive_table_code_combinations.csv
        proactive_table_code_populations: output/proactive/0.3_analysis_outputs/proactive_patient_id_total_*.csv
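
None of the analysis actions ran in this job request, so their scripts are not shown here. As a rough illustration of how an action such as generate_oximetry_timeseries relates to its declared outputs, the sketch below shows what analysis/analysis_oximetry_timeseries.py might plausibly look like: it reads the weekly joined cohorts produced by join_cohorts_oximetry and writes the two moderately_sensitive files named in the pipeline. The per-file date naming, the oximetry_code column and the weekly counting logic are assumptions for illustration only, not taken from the study repository; the bp and proactive timeseries actions presumably follow the same pattern against their own joined cohorts.

# Hypothetical sketch of analysis/analysis_oximetry_timeseries.py (not the study's actual code).
# Reads the weekly joined cohorts written by join_cohorts_oximetry and writes the two
# moderately_sensitive outputs declared for generate_oximetry_timeseries in project.yaml.
import pathlib
import re

import pandas as pd
import matplotlib.pyplot as plt

INPUT_DIR = pathlib.Path("output/oximetry/0.2_join_cohorts")
OUTPUT_DIR = pathlib.Path("output/oximetry/0.3_analysis_outputs")
OUTPUT_DIR.mkdir(parents=True, exist_ok=True)

rows = []
for path in sorted(INPUT_DIR.glob("input_oximetry*.csv")):
    # Each file is assumed to be one weekly extract with the index date embedded in the
    # file name, e.g. input_oximetry_2019-04-01.csv.
    match = re.search(r"(\d{4}-\d{2}-\d{2})", path.name)
    if not match:
        continue
    df = pd.read_csv(path)
    rows.append(
        {
            "week": match.group(1),
            "population": len(df),
            # 'oximetry_code' is an assumed binary column indicating that an oximetry
            # monitoring code was recorded for the patient in that week.
            "oximetry_events": int(df.get("oximetry_code", pd.Series(dtype=float)).sum()),
        }
    )

counts = pd.DataFrame(rows).sort_values("week")
counts.to_csv(OUTPUT_DIR / "oximetry_table_counts.csv", index=False)

fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(pd.to_datetime(counts["week"]), counts["oximetry_events"])
ax.set_xlabel("Week")
ax.set_ylabel("Recorded oximetry events")
ax.set_title("Oximetry @home monitoring over time")
fig.tight_layout()
fig.savefig(OUTPUT_DIR / "oximetry_plot_timeseries.png")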

Timeline

  • Created:

  • Started:

  • Finished:

  • Runtime: 56:12:14

These timestamps are generated and stored using the UTC timezone on the TPP backend.

Job information

Status: Failed
Backend: TPP
Workspace: nhs_at_home_main
Requested by: Alexandra Benson
Branch: main
Force run dependencies: Yes
Git commit hash: 354bdd5
Requested actions:
  • run_all
