Commit

Merge pull request #77 from inspirehep/add-file-name
add filename to workflow extradata
MJedr authored Sep 12, 2023
2 parents dcbe9e0 + 03ea18a commit 459a3be
Showing 2 changed files with 3 additions and 6 deletions.
7 changes: 1 addition & 6 deletions .github/workflows/build-and-release.yml
@@ -41,12 +41,6 @@ jobs:
       matrix:
         python-version: [2.7.17]
     steps:
-      - name: Checkout
-        uses: actions/checkout@v2
-        with:
-          python-version: ${{ matrix.python-version }}
-          fetch-depth: 0
-
       - name: Set up Python ${{ matrix.python-version }}
         uses: actions/setup-python@v2
         with:
@@ -55,6 +49,7 @@ jobs:

       - name: Install python dependencies
         run: |
+          python -m pip install --upgrade pip
           pip install --upgrade pip setuptools
           pip install twine wheel coveralls
           pip install -r requirements.txt
2 changes: 2 additions & 0 deletions inspire_crawler/tasks.py
@@ -145,6 +145,7 @@ def submit_results(job_id, errors, log_file, results_uri, spider_name, results_d

     record = crawl_result.pop('record')
     crawl_errors = crawl_result['errors']
+    file = crawl_result['file_name']

     current_app.logger.debug('Parsing record: {}'.format(record))
     engine = WorkflowEngine.with_name(job.workflow)
@@ -159,6 +160,7 @@ def submit_results(job_id, errors, log_file, results_uri, spider_name, results_d
         extra_data = {
             'crawler_job_id': job_id,
             'crawler_results_path': results_path,
+            'source_file': file
         }
         record_extra = record.pop('extra_data', {})
        if record_extra:
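To illustrate the tasks.py change: after this commit, the `file_name` carried in each crawl result is copied into the workflow's `extra_data` under the key `source_file`. The sketch below is hypothetical (the helper `build_extra_data` and the sample values are invented for illustration and are not part of inspire_crawler); it only shows the shape of the dict assembled in `submit_results` after the change.

```python
# Hypothetical sketch of the extra_data assembly after this PR.
# build_extra_data and all sample values below are invented for illustration.

def build_extra_data(crawl_result, job_id, results_path):
    """Assemble the extra_data dict attached to the workflow object."""
    file_name = crawl_result['file_name']  # newly propagated by this change
    return {
        'crawler_job_id': job_id,
        'crawler_results_path': results_path,
        'source_file': file_name,
    }

# Assumed example crawl result, mirroring the keys read in submit_results.
crawl_result = {
    'record': {'titles': [{'title': 'Example record'}]},
    'errors': [],
    'file_name': 'crawl_2023_09_12.jl',
}
extra_data = build_extra_data(crawl_result, job_id='42', results_path='/tmp/results')
print(extra_data['source_file'])  # prints: crawl_2023_09_12.jl
```

With this in place, downstream workflow steps can recover which harvested file a record came from by reading `extra_data['source_file']`.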
