
Revert "[DEPLOY] Fix for microbes and GSE75083" #2157

Merged 1 commit on Feb 25, 2020
1,660 changes: 0 additions & 1,660 deletions .circleci/codecov.sh

This file was deleted.

19 changes: 8 additions & 11 deletions .circleci/config.yml
@@ -29,6 +29,7 @@ jobs:

# Install Nomad
- run: sudo ./scripts/install_nomad.sh

# Start Nomad and register jobs.
- run:
command: sudo -E ./scripts/run_nomad.sh -e test
@@ -42,21 +43,19 @@
# Running these in the same job as the common tests is good
# because their dockerfiles are very similar so a lot of the
# build time is saved by only building those layers once.
- run: sudo chown -R circleci:circleci workers/test_volume/
- run:
command: .circleci/filter_tests.sh -t downloaders
no_output_timeout: 1h

# Run Foreman Tests
- run: mkdir -p test_volume && chmod -R a+rw test_volume && sudo chown -R circleci:circleci test_volume
- run: mkdir -p test_volume && chmod -R a+rw test_volume

# The foreman includes the end-to-end tests, but some of these
# require docker images which are not built in this
# workflow. Therefore we exclude salmon, affymetrix, and
# transcriptome and let those end-to-end tests get run in the
# workflows that include building those images.
- run: ./foreman/run_tests.sh --exclude-tag=salmon --exclude-tag=transcriptome --exclude-tag=affymetrix
- run: .circleci/upload_test_coverage.sh foreman

# Run NO_OP tests
- run: sudo chown -R circleci:circleci workers/test_volume/
@@ -91,9 +90,7 @@ jobs:
- run: ./scripts/update_models.sh

# Run Common Tests.
- run: mkdir -p test_volume && chmod -R a+rw test_volume && sudo chown -R circleci:circleci test_volume
- run: ./common/run_tests.sh
- run: .circleci/upload_test_coverage.sh common

- run: ./scripts/prepare_image.sh -i smasher -s workers -d localhost:5000

@@ -158,16 +155,15 @@ jobs:
- run: ./scripts/rebuild_es_index.sh

# Run API Tests.
- run: mkdir -p test_volume && chmod -R a+rw test_volume && sudo chown -R circleci:circleci test_volume
- run: ./api/run_tests.sh
- run: .circleci/upload_test_coverage.sh api

- run:
command: .circleci/filter_tests.sh -t salmon
no_output_timeout: 1h

# Install Nomad
- run: sudo ./scripts/install_nomad.sh

# Start Nomad and register jobs.
- run:
command: sudo -E ./scripts/run_nomad.sh -e test
@@ -182,9 +178,9 @@
- run: docker push localhost:5000/dr_salmon

# Containers run as a different user so we need to give them permission to the test directory.
- run: mkdir -p test_volume && chmod -R a+rw test_volume && sudo chown -R circleci:circleci test_volume
- run: mkdir -p test_volume && chmod -R a+rw test_volume

- run: ./foreman/run_tests.sh --tag=salmon --tag=transcriptome
- run: .circleci/upload_test_coverage.sh foreman

tx_illumina_tests:
working_directory: ~/refinebio
@@ -266,6 +262,7 @@ jobs:

# Install Nomad
- run: sudo ./scripts/install_nomad.sh

# Start Nomad and register jobs.
- run:
command: sudo -E ./scripts/run_nomad.sh -e test
@@ -282,12 +279,12 @@
- run: docker push localhost:5000/dr_affymetrix

# Containers run as a different user so we need to give them permission to the test directory.
- run: mkdir -p test_volume && chmod -R a+rw test_volume && sudo chown -R circleci:circleci test_volume
- run: mkdir -p test_volume && chmod -R a+rw test_volume

- run:
command: ./foreman/run_tests.sh --tag=affymetrix
# This takes more than 10 minutes, but not much.
no_output_timeout: 20m
- run: .circleci/upload_test_coverage.sh foreman

deploy:
machine: true
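Across the `.circleci/config.yml` hunks above, the revert keeps the `mkdir -p test_volume && chmod -R a+rw test_volume` step and drops the extra `sudo chown -R circleci:circleci`. The reasoning in the diff's own comment is that containers run as a different user; a world-writable directory already satisfies that without a chown. A minimal local sketch of the kept step, using a scratch directory name (not the repo's actual path):

```shell
#!/bin/sh
set -e
# Mirror the permission step the revert keeps: create the shared test
# directory and make it writable by any UID, so a container running as
# a different user can still write into it.
mkdir -p test_volume_sketch
chmod -R a+rw test_volume_sketch
# Any user able to reach the directory can now create files in it:
touch test_volume_sketch/probe
```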
2 changes: 0 additions & 2 deletions .circleci/filter_tests.sh
@@ -9,5 +9,3 @@ else
echo "Running all tests..";
./workers/run_tests.sh "$@"
fi

./.circleci/upload_test_coverage.sh workers
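The hunk above is the tail of `.circleci/filter_tests.sh`, which the config invokes as `filter_tests.sh -t downloaders` or `-t salmon`; the revert only strips the trailing coverage upload. The overall tag-dispatch shape of such a script can be sketched as follows (the real option handling in the repo may differ):

```shell
#!/bin/sh
# Sketch of the tag-dispatch pattern: "-t <tag>" runs one tagged suite,
# no arguments runs everything, matching the "else" branch shown above.
filter_tests() {
    if [ "$1" = "-t" ] && [ -n "$2" ]; then
        echo "Running tests tagged '$2'.."
        # ./workers/run_tests.sh --tag="$2"   # the real script delegates here
    else
        echo "Running all tests.."
        # ./workers/run_tests.sh "$@"
    fi
}

filter_tests -t downloaders   # prints: Running tests tagged 'downloaders'..
```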
39 changes: 0 additions & 39 deletions .circleci/upload_test_coverage.sh

This file was deleted.

1 change: 0 additions & 1 deletion .gitignore
@@ -29,7 +29,6 @@ infrastructure/.terraform/
# We download a lot of files into the test_volume directory when
# running tests, which we don't need to track.
workers/test_volume/*
!workers/test_volume/test_download_files
# However for a couple tests we do store the data in the repo and need to track them.
!workers/test_volume/raw/TEST/TRANSCRIPTOME_INDEX/aegilops_tauschii_short.gtf.gz
!workers/test_volume/raw/TEST/NO_OP/test.txt
2 changes: 1 addition & 1 deletion README.md
@@ -1,4 +1,4 @@
# Refine.bio [![Build Status](https://circleci.com/gh/AlexsLemonade/refinebio/tree/dev.svg?&style=shield)](https://circleci.com/gh/AlexsLemonade/refinebio/) [![codecov](https://codecov.io/gh/AlexsLemonade/refinebio/branch/master/graph/badge.svg)](https://codecov.io/gh/AlexsLemonade/refinebio)
# Refine.bio [![Build Status](https://circleci.com/gh/AlexsLemonade/refinebio/tree/dev.svg?&style=shield)](https://circleci.com/gh/AlexsLemonade/refinebio/)

<!-- This section needs to be drastically improved -->
Refine.bio harmonizes petabytes of publicly available biological data into
1 change: 1 addition & 0 deletions api/data_refinery_api/views.py
@@ -536,6 +536,7 @@ def perform_update(self, serializer):
requests.post(
settings.ENGAGEMENTBOT_WEBHOOK,
json={
"channel": "ccdl-general", # Move to robots when we get sick of these
"username": "EngagementBot",
"icon_emoji": ":halal:",
"attachments": [
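The `views.py` hunk adds a `"channel"` field to a payload posted to `settings.ENGAGEMENTBOT_WEBHOOK`, whose shape resembles a Slack incoming-webhook message. An equivalent sketch with `curl`; the URL is a placeholder, not the real webhook, so the request itself is left commented out:

```shell
#!/bin/sh
# Build the same style of payload the view sends via requests.post.
payload='{"channel": "ccdl-general", "username": "EngagementBot", "icon_emoji": ":halal:"}'
# Placeholder URL; the real ENGAGEMENTBOT_WEBHOOK comes from Django settings.
webhook_url="https://hooks.example.com/services/PLACEHOLDER"
# curl -X POST -H 'Content-Type: application/json' -d "$payload" "$webhook_url"
echo "$payload"
```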
2 changes: 1 addition & 1 deletion api/requirements.in
@@ -1,5 +1,5 @@
coverage
django==2.2.10
django==2.2.9
psycopg2-binary
boto3
requests>=2.20.0
2 changes: 1 addition & 1 deletion api/requirements.txt
@@ -18,7 +18,7 @@ django-elasticsearch-dsl==6.4.2
django-filter==2.0.0
django-hstore==1.4.2 # via djangorestframework-hstore
django-nine==0.2.2 # via django-elasticsearch-dsl-drf
django==2.2.10
django==2.2.9
djangorestframework-hstore==1.3
djangorestframework==3.9.4
docutils==0.14 # via botocore
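The Django pin is rolled back from 2.2.10 to 2.2.9 in both `api/requirements.in` and the pinned `api/requirements.txt`. The `.in`/`.txt` pair suggests a pip-tools workflow, though the repo's actual tooling is an assumption here; under that assumption, one edits the pin in the `.in` file and recompiles. A sketch against a scratch copy:

```shell
#!/bin/sh
set -e
# Scratch copy of the loose constraints file (not the repo's real one).
printf 'coverage\ndjango==2.2.10\npsycopg2-binary\n' > /tmp/requirements.in
# Roll the pin back, as the revert does.
sed -i.bak 's/^django==2.2.10$/django==2.2.9/' /tmp/requirements.in
# Assuming pip-tools: regenerate the fully pinned requirements.txt.
# pip-compile /tmp/requirements.in --output-file /tmp/requirements.txt
grep '^django==' /tmp/requirements.in   # prints: django==2.2.9
```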
8 changes: 0 additions & 8 deletions api/run_tests.sh
@@ -20,13 +20,6 @@ if ! [ "$(docker ps --filter name=drdb -q)" ]; then
exit 1
fi

project_root=$(pwd) # "cd .." called above
volume_directory="$project_root/test_volume"
if [ ! -d "$volume_directory" ]; then
mkdir "$volume_directory"
chmod -R a+rwX "$volume_directory"
fi

./scripts/prepare_image.sh -i api_local -s api

. ./scripts/common.sh
@@ -39,5 +32,4 @@ docker run \
--add-host=nomad:"$HOST_IP" \
--add-host=elasticsearch:"$ES_HOST_IP" \
--env-file api/environments/test \
--volume "$volume_directory":/home/user/data_store \
-it ccdlstaging/dr_api_local bash -c "$(run_tests_with_coverage "$@")"
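The lines removed from `api/run_tests.sh` followed a common pattern: ensure a host directory exists with open permissions, then bind-mount it into the test container. A runnable sketch of that pattern using a scratch path (the `docker run` itself is commented out since it needs the repo's images):

```shell
#!/bin/sh
set -e
# Create the host-side volume directory if it does not exist yet,
# with read/write/traverse permissions for any user.
volume_directory="$(pwd)/scratch_test_volume"
if [ ! -d "$volume_directory" ]; then
    mkdir "$volume_directory"
    chmod -R a+rwX "$volume_directory"
fi
# The removed script then mounted it into the test container:
# docker run \
#     --volume "$volume_directory":/home/user/data_store \
#     -it ccdlstaging/dr_api_local bash -c "..."
echo "volume ready: $volume_directory"
```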
10 changes: 0 additions & 10 deletions common/data_refinery_common/constants.py

This file was deleted.

68 changes: 25 additions & 43 deletions common/data_refinery_common/models/__init__.py
@@ -1,55 +1,37 @@
from data_refinery_common.models.api_token import APIToken # noqa
from data_refinery_common.models.associations.compendium_result_organism_association import ( # noqa
CompendiumResultOrganismAssociation,
from data_refinery_common.models.command_progress import ( # noqa
CdfCorrectedAccession,
SurveyedAccession,
)
from data_refinery_common.models.associations.downloaderjob_originalfile_association import ( # noqa
DownloaderJobOriginalFileAssociation,
from data_refinery_common.models.jobs import ( # noqa
DownloaderJob,
ProcessorJob,
SurveyJob,
SurveyJobKeyValue,
)
from data_refinery_common.models.associations.experiment_organism_association import ( # noqa
from data_refinery_common.models.models import ( # noqa
APIToken,
CompendiumResult,
CompendiumResultOrganismAssociation,
ComputationalResult,
ComputationalResultAnnotation,
ComputedFile,
Dataset,
DownloaderJobOriginalFileAssociation,
Experiment,
ExperimentAnnotation,
ExperimentOrganismAssociation,
)
from data_refinery_common.models.associations.experiment_result_association import ( # noqa
ExperimentResultAssociation,
)
from data_refinery_common.models.associations.experiment_sample_association import ( # noqa
ExperimentSampleAssociation,
)
from data_refinery_common.models.associations.original_file_sample_association import ( # noqa
OrganismIndex,
OriginalFile,
OriginalFileSampleAssociation,
)
from data_refinery_common.models.associations.processorjob_dataset_association import ( # noqa
Pipeline,
Processor,
ProcessorJobDatasetAssociation,
)
from data_refinery_common.models.associations.processorjob_originalfile_association import ( # noqa
ProcessorJobOriginalFileAssociation,
)
from data_refinery_common.models.associations.sample_computed_file_association import ( # noqa
Sample,
SampleAnnotation,
SampleComputedFileAssociation,
)
from data_refinery_common.models.associations.sample_result_association import ( # noqa
SampleResultAssociation,
)
from data_refinery_common.models.command_progress import ( # noqa
CdfCorrectedAccession,
SurveyedAccession,
)
from data_refinery_common.models.compendium_result import CompendiumResult # noqa
from data_refinery_common.models.computational_result import ComputationalResult # noqa
from data_refinery_common.models.computational_result_annotation import ( # noqa
ComputationalResultAnnotation,
)
from data_refinery_common.models.computed_file import ComputedFile # noqa
from data_refinery_common.models.dataset import Dataset # noqa
from data_refinery_common.models.experiment import Experiment # noqa
from data_refinery_common.models.experiment_annotation import ExperimentAnnotation # noqa
from data_refinery_common.models.jobs.downloader_job import DownloaderJob # noqa
from data_refinery_common.models.jobs.processor_job import ProcessorJob # noqa
from data_refinery_common.models.jobs.survey_job import SurveyJob # noqa
from data_refinery_common.models.jobs.survey_job_key_value import SurveyJobKeyValue # noqa
from data_refinery_common.models.organism import Organism # noqa
from data_refinery_common.models.organism_index import OrganismIndex # noqa
from data_refinery_common.models.original_file import OriginalFile # noqa
from data_refinery_common.models.pipeline import Pipeline # noqa
from data_refinery_common.models.processor import Processor # noqa
from data_refinery_common.models.sample import Sample # noqa
from data_refinery_common.models.sample_annotation import SampleAnnotation # noqa
32 changes: 0 additions & 32 deletions common/data_refinery_common/models/api_token.py

This file was deleted.

Empty file.

This file was deleted.

This file was deleted.

This file was deleted.

This file was deleted.
