
DEVOPS-1440 Add Precommit #195

Closed · wants to merge 21 commits
Changes from 6 commits
2 changes: 2 additions & 0 deletions .flake8
@@ -0,0 +1,2 @@
[flake8]
per-file-ignores = tests/test_app.py:F841
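For reference, flake8's F841 check flags a local variable that is assigned but never used — the pattern the per-file ignore above suppresses for `tests/test_app.py`. A minimal illustration (the function names here are hypothetical, not from the repository):

```python
def exercise_parser():
    # Would trigger F841 under flake8: 'result' is assigned but never used.
    # This pattern is common in tests that only verify a call does not raise.
    result = int("42")


def checked_parse():
    # No F841 here: the assigned value is actually used.
    value = int("42")
    return value
```

With the ignore in place, test code like `exercise_parser` passes the flake8 hook without `# noqa` comments.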
20 changes: 20 additions & 0 deletions .github/workflows/precommit.yaml
@@ -0,0 +1,20 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.2.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files

- repo: https://github.com/psf/black
rev: 23.7.0
hooks:
- id: black
language_version: python3.12

- repo: https://github.com/pycqa/flake8
rev: 6.1.0
hooks:
- id: flake8
args: [ --config, .flake8 ]
2 changes: 1 addition & 1 deletion .gitignore
@@ -104,4 +104,4 @@ data/
.DS_Store
.virtualenvs/
test.env
run_tests_single.sh
run_tests_single.sh
43 changes: 43 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,43 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.5.0
hooks:
- id: trailing-whitespace
files: '\.py$'
- id: end-of-file-fixer
files: '\.py$'
- id: check-yaml
- id: check-added-large-files
files: '\.py$'

- repo: https://github.com/psf/black
rev: 24.3.0
hooks:
- id: black
language_version: python3.9
files: '\.py$'
args:
- --line-length=120
- --exclude=staging_service/auth2Client.py
- --exclude=import_specifications/clients/baseclient\.py

- repo: https://github.com/pycqa/flake8
rev: 7.0.0
hooks:
- id: flake8
args:
- --ignore=E203,W503
- --max-line-length=120
- --config=.flake8
files: '\.py$'
additional_dependencies: [ flake8 ]
> Review comment (Collaborator): Ruff replaces flake8 - no need to have both

# - repo: https://github.com/astral-sh/ruff-pre-commit
# # Ruff version.
# rev: v0.4.1
# hooks:
# # Run the linter.
# - id: ruff
# # Run the formatter.
# - id: ruff-format
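pre-commit treats each hook's `files:` value as a Python regular expression searched against repository-relative paths, which is why `'\.py$'` restricts the hooks above to Python sources. A quick sketch of that matching behavior (the path list is illustrative; this assumes standard `re.search` semantics, which is what pre-commit uses):

```python
import re

# The filter used by the hooks in .pre-commit-config.yaml above.
FILES_PATTERN = re.compile(r"\.py$")

paths = [
    "staging_service/app.py",
    "README.md",
    "tests/test_app.py",
    "deployment/conf/deployment.cfg",
]

# pre-commit keeps a path when the regex matches anywhere in it.
selected = [p for p in paths if FILES_PATTERN.search(p)]
```

Only the two `.py` paths survive the filter, so hooks like `black` and `flake8` never see YAML or config files — one reason `check-yaml` in this config deliberately omits a `files:` key.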
2 changes: 1 addition & 1 deletion .travis.yml
@@ -13,4 +13,4 @@ after_success:
env:
global:
- secure: "mH9OrGjZYV021EfQPjx0hnwi7/UlGVNTnE6kC81O1/9/lOWihnDKBb/GXuWyj7hbiCZgA2ofDblkm+bIZQxU/+40k8ZrQyk4xe9Bm3jM9MCZ7y6r3g3HZnKXGbzdQTACLoJwdHRWEnYWcBC+aEnDsJjm/uh+gFE3WFXABHO5nOBVBJ1vnZoqczPnLIqOtTWwybV58eYxXOm7J6GvxrTWxdqoEOKsLcjdbm9dej/i9xsxHr6x0I/fiB8mFsIgl1QS9WobpTjs1AQghShz8oFO4YI70mJF4b9tU0FbfF1Wm6K4fe69ff65RwUhT9o8ZgwBt/mzrDFTLMI9f/3J8v6Bu5UxQ1yEPc9QP78IWW/g2G20WxjLzmbKogsGZkSyPEz1TLq8CgEGvU5IOoDSW1NjQJr9XQaE8A1ILDEfi1xVG/d6o8uhEMEiloveRa8iKdNFoGv5O0fbS8RC6KVs9SPP0nkApmv29MhWf0injEROZNXiDJr6Da4HiU+RkDwdM8rbIOS0586t/czVIRremmon5Xb9NDZapLbqdIX/zd//IC6WGM0ytX36t8FgU0Du+V+KfaAc7XP3ff+GLUL/sxMGGjdl+gCCdUCrTPCl2+D9P54/HRB5Q5xKNnO/OVou9WOe1HanthH5n2QGelzLisU2OxCiWl/iZ15pl+CGntF9+Ek="
- secure: "sHLRR/xLA6fiPkJB6IolHUaJxNJJYcfSKG3MmRAU7v0LxxoqUkneBfk9jyk2GdDdtOR+RocOQmwie0Vf6ik5bIxcs/Bse1tR0+yhmg9EpgClokfgZN8815yNVclGkVFWZS7BhRjm7mt0K3Vo7Of0mKikp5L/tYmwQy1hesJ+mTa83igeQKpFU3QgkRJB2ALgSI54ruhO88x+9HtLqkhN8KqHj5ykGzS7ShagFywflFDVFWJoBK76NideU0gTU7+2dNV0+EPmnRWs9QoAh5LSUDa8v7fo+I2WHFCOLfiKzaNMc85+Gc72qyxvkpWl2UCJpiFVK1X4DFGvHYU9FzM0W9VuhrBAa3h1XFncYk52WzKNGVvQnOZE965iNrl/xSh2oTUOfyVIfX5JuldEWffU/vhAjvnqfsoq17geRxLyqbRx/4ZvNq6jf3Ef6kLwKlnPzcw85zZs2jsy3qC5UGJf0xqchWfsqxACvmGhVyv+KxHJxaFq48UHp6Ezb/X2DweD6/6WPfwXI6cOwXy6Uj3uvkYSsKE7HJu3K9aMeh158ZMrW/JeyH5dCRAv1qeAyLN0yOw/lPUapdrbB9nXkLbHIOhV8b6A2cOtkkuM6HXXBxICJ5BIHmVzO+4W7VXueXfVjAVY2+xs0ouFsPIOINfR5hgsglncDY+r/wU8LrUHxHg="
- secure: "sHLRR/xLA6fiPkJB6IolHUaJxNJJYcfSKG3MmRAU7v0LxxoqUkneBfk9jyk2GdDdtOR+RocOQmwie0Vf6ik5bIxcs/Bse1tR0+yhmg9EpgClokfgZN8815yNVclGkVFWZS7BhRjm7mt0K3Vo7Of0mKikp5L/tYmwQy1hesJ+mTa83igeQKpFU3QgkRJB2ALgSI54ruhO88x+9HtLqkhN8KqHj5ykGzS7ShagFywflFDVFWJoBK76NideU0gTU7+2dNV0+EPmnRWs9QoAh5LSUDa8v7fo+I2WHFCOLfiKzaNMc85+Gc72qyxvkpWl2UCJpiFVK1X4DFGvHYU9FzM0W9VuhrBAa3h1XFncYk52WzKNGVvQnOZE965iNrl/xSh2oTUOfyVIfX5JuldEWffU/vhAjvnqfsoq17geRxLyqbRx/4ZvNq6jf3Ef6kLwKlnPzcw85zZs2jsy3qC5UGJf0xqchWfsqxACvmGhVyv+KxHJxaFq48UHp6Ezb/X2DweD6/6WPfwXI6cOwXy6Uj3uvkYSsKE7HJu3K9aMeh158ZMrW/JeyH5dCRAv1qeAyLN0yOw/lPUapdrbB9nXkLbHIOhV8b6A2cOtkkuM6HXXBxICJ5BIHmVzO+4W7VXueXfVjAVY2+xs0ouFsPIOINfR5hgsglncDY+r/wU8LrUHxHg="
2 changes: 1 addition & 1 deletion .vscode/launch.json
@@ -75,4 +75,4 @@
"host": "localhost"
}
]
}
}
2 changes: 1 addition & 1 deletion .vscode/settings.json
@@ -4,4 +4,4 @@
"files.autoSave": "afterDelay",
"files.autoSaveDelay": 500,
"python.pythonPath": "python3"
}
}
18 changes: 9 additions & 9 deletions README.md
@@ -632,7 +632,7 @@ cannot decompress a <file extension> file

## Add Globus ACL

After authenticating at this endpoint, AUTH is queried to get your filepath and globus id file for
After authenticating at this endpoint, AUTH is queried to get your filepath and globus id file for
linking to globus.

**URL** : `ci.kbase.us/services/staging_service/add-acl`
@@ -675,7 +675,7 @@ Error Connecting to auth service ...
**Content**
```
{
'success': False,
'success': False,
'error_type': 'TransferAPIError',
'error': "Can't create ACL rule; it already exists",
'error_code': 'Exists', 'shared_directory_basename': '/username/'
@@ -684,7 +684,7 @@ Error Connecting to auth service ...

## Remove Globus ACL

After authenticating at this endpoint, AUTH is queried to get your filepath and globus id file for
After authenticating at this endpoint, AUTH is queried to get your filepath and globus id file for
linking to globus.

**URL** : `ci.kbase.us/services/staging_service/remove-acl`
@@ -704,7 +704,7 @@ linking to globus.
```
{
"message": "{\n \"DATA_TYPE\": \"result\",\n \"code\": \"Deleted\",
"message\": \"Access rule 'KBASE-examplex766ada0-x8aa-x1e8-xc7b-xa1d4c5c824a' deleted successfully\",
"message\": \"Access rule 'KBASE-examplex766ada0-x8aa-x1e8-xc7b-xa1d4c5c824a' deleted successfully\",
"request_id\": \"x2KFzfop05\",\n \"resource\": \"/endpoint/KBaseExample2a-5e5b-11e6-8309-22000b97daec/access/KBaseExample-ada0-d8aa-11e8-8c7b-0a1d4c5c824a\"}",
"Success": true
}
@@ -727,7 +727,7 @@ Error Connecting to auth service ...
**Content**
```
{
'success': False,
'success': False,
'error_type': 'TransferAPIError',
'error': "Can't create ACL rule; it already exists",
'error_code': 'Exists', 'shared_directory_basename': '/username/'
@@ -801,7 +801,7 @@ Reponse:
in the data file. Each data file row is provided in order for each type. Each row is
provided in a mapping of `spec.json` ID to the data for the row. Lines > 3 in the templates are
user-provided data, and each line corresponds to a single import or analysis.

### Error Response

Error reponses are of the general form:
@@ -985,7 +985,7 @@ Reponse:
* `files` contains a mapping of each provided data type to the output template file for that type.
In the case of Excel, all the file paths will be the same since the data types are all written
to different tabs in the same file.

### Error Response

Method specific errors have the form:
@@ -1012,7 +1012,7 @@ This endpoint returns:
For example,
* if we pass in nothing we get a response with no mappings
* if we pass in a list of files, such as ["file1.fasta", "file2.fq", "None"], we would get back a
response that maps to Fasta Importers and FastQ Importers, with a weight of 0 to 1
response that maps to Fasta Importers and FastQ Importers, with a weight of 0 to 1
which represents the probability that this is the correct importer for you.
* for files for which there is no predicted app, the return is a null value
* this endpoint is used to power the dropdowns for the staging service window in the Narrative
@@ -1071,7 +1071,7 @@ Response:

**Content**
```
must provide file_list field
must provide file_list field
```

## Get importer filetypes
2 changes: 1 addition & 1 deletion build/push2dockerhub.sh
@@ -1,5 +1,5 @@
#!/bin/bash
#
#
# This script is intended to be run in the deploy stage of a travis build
# It checks to make sure that this is a not a PR, and that we have the secure
# environment variables available and then checks if this is either the master
2 changes: 1 addition & 1 deletion deployment/bin/entrypoint.sh
@@ -16,4 +16,4 @@ if [ -d "kb/deployment/lib/staging_service" ]; then
PYTHONPATH="kb/deployment/lib/staging_service"
# environment variable for KB_DEPLOYMENT_CONFIG set in docker-compose.yml
fi
python3 -m staging_service
python3 -m staging_service
2 changes: 1 addition & 1 deletion deployment/conf/deployment.cfg
@@ -3,4 +3,4 @@ META_DIR = /kb/deployment/lib/src/data/metadata/
DATA_DIR = /kb/deployment/lib/src/data/bulk/
AUTH_URL = https://ci.kbase.us/services/auth/api/V2/token
CONCIERGE_PATH = /kbaseconcierge
FILE_EXTENSION_MAPPINGS = /kb/deployment/conf/supported_apps_w_extensions.json
FILE_EXTENSION_MAPPINGS = /kb/deployment/conf/supported_apps_w_extensions.json
2 changes: 1 addition & 1 deletion deployment/conf/local.cfg
@@ -2,4 +2,4 @@
META_DIR = ./data/metadata/
DATA_DIR = ./data/bulk/
AUTH_URL = https://ci.kbase.us/services/auth/api/V2/token
CONCIERGE_PATH = /kbaseconcierge
CONCIERGE_PATH = /kbaseconcierge
2 changes: 1 addition & 1 deletion deployment/conf/supported_apps_w_extensions.json
@@ -1126,4 +1126,4 @@
]
}
}
}
}
2 changes: 1 addition & 1 deletion deployment/conf/testing.cfg
@@ -4,4 +4,4 @@ DATA_DIR = ./data/bulk/
AUTH_URL = https://ci.kbase.us/services/auth/api/V2/token
CONCIERGE_PATH = /kbaseconcierge
FILE_EXTENSION_MAPPINGS = ./deployment/conf/supported_apps_w_extensions.json
;FILE_EXTENSION_MAPPINGS_PYCHARM = ../deployment/conf/supported_apps_w_extensions.json
;FILE_EXTENSION_MAPPINGS_PYCHARM = ../deployment/conf/supported_apps_w_extensions.json
14 changes: 7 additions & 7 deletions docs/import_specifications.ADR.md
@@ -13,7 +13,7 @@ to import one or more files in the staging area to KBase as KBase data types.

## Front end changes

The design introduces a new StS data type, `import_specification`. The FE's current
The design introduces a new StS data type, `import_specification`. The FE's current
behavior is to display any data types returned from the StS in the file dropdown, but silently
ignore user-selected files for which the selected data type is unknown to the narrative, a bug.
The FE will be updated to ignore unknown data types returned from the StS, allowing for phased,
@@ -47,7 +47,7 @@ The file, by row, is:
* The version allows us to update the file format and increment the version,
allowing backwards compatibility - the staging service can process the file appropriately
depending on the version number.
2. The IDs of the app inputs from the `spec.json` file.
2. The IDs of the app inputs from the `spec.json` file.
3. The corresponding human readable names of the app inputs from the `display.yaml` file.
4. (and beyond) Import specifications. Each line corresponds to a single import.

@@ -89,7 +89,7 @@ The front end will be expected to either
## User operations

* The user uploads the import specification files to the staging area along with all the files
inluded in the specification.
inluded in the specification.
* The user selects the `Import Specification` type for the specification files.
* The user may also select other files in the staging area to include in the import along
with the files listed in the specification.
@@ -277,16 +277,16 @@ Note in this case the service MUST log the stack trace along with the filename f
Dynamic scientific name to taxon lookup may be added to the Genbank (and the currently
out of scope, but trivial to add GFF/FASTA Genome) importer in the near future. If that occurs,
for the purpose of xSV import the user will be expected to provide the entire, correct,
scientific name as returned from the taxon API.
scientific name as returned from the taxon API.

* The user could get this name by starting a genome import and running the query from the
* The user could get this name by starting a genome import and running the query from the
import app cell configuration screen.
* This will be documented in the README.md for the template files.
* As part of the UI work we could theoretically provide a landing page for looking up valid
scientific names.
* Presumably the UI would need to run the dynamic query and report an error to the user if the
dynamic service returns 0 or > 1 entries.
* Providing the scientific name vs. the taxon ID seems simpler because the machinery already
* Providing the scientific name vs. the taxon ID seems simpler because the machinery already
exists to perform the query and is part of the spec.
* Expect these plans to change as it becomes more clear how dynamic fields will work in the
context of bulk import.
context of bulk import.
1 change: 0 additions & 1 deletion globus.cfg
@@ -3,4 +3,3 @@ transfer_token = "Get this refresh token"
auth_token = "Get this refresh token"
endpoint_id = c3c0a65f-5827-4834-b6c9-388b0b19953a
client_id = 26d64c4c-fcc2-4f7c-b056-62f185875af6

51 changes: 26 additions & 25 deletions import_specifications/clients/authclient.py
@@ -1,18 +1,20 @@
'''
"""
Created on Aug 1, 2016

A very basic KBase auth client for the Python server.

@author: [email protected]
'''
"""

import hashlib
import threading as _threading
import time as _time

import requests as _requests
import threading as _threading
import hashlib


class TokenCache(object):
''' A basic cache for tokens. '''
"""A basic cache for tokens."""

_MAX_TIME_SEC = 5 * 60 # 5 min

@@ -24,7 +26,7 @@ def __init__(self, maxsize=2000):
self._halfmax = maxsize / 2 # int division to round down

def get_user(self, token):
token = hashlib.sha256(token.encode('utf-8')).hexdigest()
token = hashlib.sha256(token.encode("utf-8")).hexdigest()
with self._lock:
usertime = self._cache.get(token)
if not usertime:
@@ -37,17 +39,14 @@ def get_user(self, token):

def add_valid_token(self, token, user):
if not token:
raise ValueError('Must supply token')
raise ValueError("Must supply token")
if not user:
raise ValueError('Must supply user')
token = hashlib.sha256(token.encode('utf-8')).hexdigest()
raise ValueError("Must supply user")
token = hashlib.sha256(token.encode("utf-8")).hexdigest()
with self._lock:
self._cache[token] = [user, _time.time()]
if len(self._cache) > self._maxsize:
sorted_items = sorted(
list(self._cache.items()),
key=(lambda v: v[1][1])
)
sorted_items = sorted(list(self._cache.items()), key=(lambda v: v[1][1]))
for i, (t, _) in enumerate(sorted_items):
if i <= self._halfmax:
del self._cache[t]
@@ -56,39 +55,41 @@ def add_valid_token(self, token, user):


class KBaseAuth(object):
'''
"""
A very basic KBase auth client for the Python server.
'''
"""

_LOGIN_URL = 'https://kbase.us/services/auth/api/legacy/KBase/Sessions/Login'
_LOGIN_URL = "https://kbase.us/services/auth/api/legacy/KBase/Sessions/Login"

def __init__(self, auth_url=None):
'''
"""
Constructor
'''
"""
self._authurl = auth_url
if not self._authurl:
self._authurl = self._LOGIN_URL
self._cache = TokenCache()

def get_user(self, token):
if not token:
raise ValueError('Must supply token')
raise ValueError("Must supply token")
user = self._cache.get_user(token)
if user:
return user

d = {'token': token, 'fields': 'user_id'}
d = {"token": token, "fields": "user_id"}
ret = _requests.post(self._authurl, data=d)
if not ret.ok:
try:
err = ret.json()
except Exception as e:
except Exception:
ret.raise_for_status()
raise ValueError('Error connecting to auth service: {} {}\n{}'
.format(ret.status_code, ret.reason,
err['error']['message']))
raise ValueError(
"Error connecting to auth service: {} {}\n{}".format(
ret.status_code, ret.reason, err["error"]["message"]
)
)

user = ret.json()['user_id']
user = ret.json()["user_id"]
self._cache.add_valid_token(token, user)
return user
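The eviction logic in `add_valid_token` above sorts cached entries by timestamp and drops roughly the oldest half once the cache exceeds `maxsize`. A standalone sketch of that strategy (simplified; the class and variable names are illustrative, not from the repository):

```python
import time


class TinyTokenCache:
    """Minimal sketch of the eviction strategy in authclient.TokenCache:
    when the cache grows past maxsize, delete the oldest ~half of the
    entries, keeping the most recently added tokens."""

    def __init__(self, maxsize=4):
        self._cache = {}
        self._maxsize = maxsize
        self._halfmax = maxsize // 2

    def add(self, token, user):
        self._cache[token] = [user, time.time()]
        if len(self._cache) > self._maxsize:
            # Oldest entries first; stable sort preserves insertion
            # order when timestamps tie.
            oldest_first = sorted(self._cache.items(), key=lambda v: v[1][1])
            for i, (t, _) in enumerate(oldest_first):
                if i <= self._halfmax:
                    del self._cache[t]
```

With `maxsize=4`, adding a fifth token evicts the three oldest, leaving the two most recent — a coarse but lock-friendly alternative to a true LRU.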