#425: fixed type errors found by mypy #426

Merged
60 commits merged on Nov 25, 2024

Changes from 21 commits

Commits (60)
505c9f2
#425: Fixed type errors found by MyPy
tomuben Nov 19, 2024
cc1bd69
Activated CI type checks
tomuben Nov 19, 2024
072f394
Updated changelog
tomuben Nov 19, 2024
c807369
Fixed build_config.py
tomuben Nov 19, 2024
e44b134
Fixed spawn_test_container.py
tomuben Nov 19, 2024
ec7117e
Fixed type error for protocol class
tomuben Nov 20, 2024
73cc903
Fixed imports
tomuben Nov 20, 2024
b928e9e
Fixed import
tomuben Nov 20, 2024
2a397a3
[run all tests]
tomuben Nov 20, 2024
97e8340
Fixed docker_image_analyze_task.py
tomuben Nov 20, 2024
a33a9ad
[run all tests]
tomuben Nov 20, 2024
576ce4e
Changed order in common.py
tomuben Nov 20, 2024
7a05968
[run all tests]
tomuben Nov 20, 2024
28e994c
Fixed conversion of ints in shell_variables.py
tomuben Nov 20, 2024
2ce0296
[run all tests]
tomuben Nov 20, 2024
d7f85e0
Fixed import in test_doctor.py
tomuben Nov 20, 2024
55f3d6f
[run all tests]
tomuben Nov 20, 2024
7ff8ffd
Added type hints to luigi config classes
tomuben Nov 20, 2024
6090b92
[run all tests]
tomuben Nov 20, 2024
f2b18e7
Integrated suggestions from review
tomuben Nov 20, 2024
7bc8cc3
[run all tests]
tomuben Nov 20, 2024
ff0f0d2
Fixes from review
tomuben Nov 20, 2024
c9761fe
[run all tests]
tomuben Nov 20, 2024
e76c3ac
Fixes from review
tomuben Nov 20, 2024
79e533e
[run all tests]
tomuben Nov 20, 2024
c735fd4
[run all tests]
tomuben Nov 21, 2024
40697ae
Fixed AbstractSpawnTestEnvironment._default_bridge_ip_address()
tomuben Nov 21, 2024
bf0aa0b
[run all tests]
tomuben Nov 21, 2024
4a3e4fd
Fixed Generator return types
tomuben Nov 21, 2024
e8a4e81
Fixes from review findings
tomuben Nov 21, 2024
989ea22
Fixed return types in abstract_spawn_test_environment.py
tomuben Nov 21, 2024
fcd71bb
Fixed types in test_container_parameter.py
tomuben Nov 21, 2024
82af67b
Removed type ignore in test_api_logging.py
tomuben Nov 21, 2024
9b500ed
Changed type hints for ssh access
tomuben Nov 21, 2024
99064ab
Fixed wrong parameter in exaslct_test_environment.py
tomuben Nov 21, 2024
40c8980
Removed unnecessary type ignore in spawn_test_container.py
tomuben Nov 21, 2024
e5f57f8
1. Fixed return type in spawn_test_database.py
tomuben Nov 21, 2024
fc52074
[run all tests]
tomuben Nov 21, 2024
2724a72
Fixed missing import in spawn_test_database.py
tomuben Nov 21, 2024
cba4a36
[run all tests]
tomuben Nov 21, 2024
19e16ad
Fixed type hint which was incompatible with Python3.9
tomuben Nov 21, 2024
a1b081f
[run all tests]
tomuben Nov 21, 2024
659ee73
Fixed type hint which was incompatible with Python3.9
tomuben Nov 21, 2024
02bbb59
[run all tests]
tomuben Nov 21, 2024
5342429
Fixed missing import in spawn_test_database.py
tomuben Nov 21, 2024
b064b96
[run all tests]
tomuben Nov 21, 2024
e25dce2
Use Union for type in spawn_test_container.py as it was incompatible …
tomuben Nov 21, 2024
e864f88
[run all tests]
tomuben Nov 21, 2024
8481cb4
Removed assertion which broke test: test_test_env_reuse.TestContainer…
tomuben Nov 21, 2024
95cdb49
[run all tests]
tomuben Nov 21, 2024
40604ab
Fixed variable name in AbstractSpawnTestEnvironment._create_ssl_certi…
tomuben Nov 22, 2024
ee5cf20
Fixed return type in CreateSSLCertificatesTask.build_image()
tomuben Nov 22, 2024
0049e3d
Fixed return type in conftest.py
tomuben Nov 22, 2024
c36d74d
Fixed types for str.join()
tomuben Nov 22, 2024
f1a37e8
Fixed types in run_minimal_tests in noxfile.py
tomuben Nov 22, 2024
5314cad
Added comment in test_populate_data.py for type ignore
tomuben Nov 22, 2024
35946b2
[run all tests]
tomuben Nov 22, 2024
c6a9beb
Added a comment in common.py
tomuben Nov 25, 2024
faca385
Fixed type hints in test_doctor.py
tomuben Nov 25, 2024
b62912a
[run all tests]
tomuben Nov 25, 2024
4 changes: 2 additions & 2 deletions .github/workflows/checks.yml
@@ -88,7 +88,7 @@ jobs:
python-version: ${{ matrix.python-version }}

- name: Run type-check
run: poetry run nox -s lint:typing || echo ignoring...
run: poetry run nox -s lint:typing

Security:
name: Security Checks (Python-${{ matrix.python-version }})
@@ -109,7 +109,7 @@
python-version: ${{ matrix.python-version }}

- name: Run security linter
run: poetry run nox -s lint:security || echo ignoring...
run: poetry run nox -s lint:security

- name: Upload Artifacts
uses: actions/[email protected]
3 changes: 2 additions & 1 deletion doc/changes/unreleased.md
@@ -9,4 +9,5 @@ Code name:
* #119: Refactored `pkg_resources` usage to `importlib.resources`
* #420: Added file `py.typed` to enable mypy to find project specific types
* #418: Use exasol/python-toolbox
* #411: Removed usage of exasol-bucketfs
* #411: Removed usage of exasol-bucketfs
* #425: Fixed type checks found by MyPy
Original file line number Diff line number Diff line change
@@ -48,7 +48,7 @@ def _handle_failure(self, task_error: TaskRuntimeError):
def _print_task_failures(task_error: TaskRuntimeError):
print_err()
print_err("Task failure message: %s" % task_error.msg)
print_err(task_error.__cause__.args[0])
print_err(task_error.__cause__.args[0]) # type: ignore
print_err()

def _handle_success(self):
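
Reviewer note on the hunk above: the `# type: ignore` is needed because typeshed types `BaseException.__cause__` as `Optional[BaseException]`, so indexing `.args` on it directly fails the strict check. A minimal, hypothetical sketch of the narrowing alternative (names are mine, this is not what the PR does):

```python
from typing import Optional


def describe_cause(exc: Exception) -> str:
    # mypy types __cause__ as Optional[BaseException], so dereferencing .args
    # on it directly is flagged. Narrowing first avoids the ignore comment.
    cause: Optional[BaseException] = exc.__cause__
    if cause is None or not cause.args:
        return "no cause recorded"
    return str(cause.args[0])
```

Whether to narrow or keep the one-line ignore is a style choice; the PR keeps the original one-liner.
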
2 changes: 1 addition & 1 deletion exasol_integration_test_docker_environment/doctor.py
@@ -84,5 +84,5 @@ def health_checkup() -> Iterable[error.ExaError]:
]
for is_fine, diagnosis in examinations:
if not is_fine():
for error_code in diagnosis():
for error_code in diagnosis(): # type: ignore
yield error_code
11 changes: 7 additions & 4 deletions exasol_integration_test_docker_environment/lib/api/api_errors.py
@@ -12,13 +12,16 @@ class HealthProblem(RuntimeError):
class TaskFailures(Exception):
"""Represents a potential cause of a TaskRuntimeError"""

def __init__(self, inner: List[str] = None):
def __init__(self, inner: Optional[List[str]] = None):
super().__init__(self._construct_exception_message(inner))
self.inner = inner

def _construct_exception_message(self, failures: Iterable[str]) -> str:
formatted_task_failures = "\n".join(failures)
return f"Following task failures were caught during the execution:\n{formatted_task_failures}"
def _construct_exception_message(self, failures: Optional[Iterable[str]]) -> str:
if failures is not None:
formatted_task_failures = "\n".join(failures)
return f"Following task failures were caught during the execution:\n{formatted_task_failures}"
else:
return f"No task failures were caught during the execution:"


class TaskRuntimeError(RuntimeError):
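
For context on this hunk: with implicit Optional disabled (mypy's default in recent versions), a default of `None` requires the parameter to be spelled `Optional[List[str]]`, and every use of the value must then handle the `None` branch, which is what the new `if failures is not None` does. A condensed, hypothetical illustration of the same pattern, not the repo's exact class:

```python
from typing import Iterable, List, Optional


class TaskFailures(Exception):
    """Condensed stand-in for the class above, for illustration only."""

    def __init__(self, inner: Optional[List[str]] = None):
        super().__init__(self._format(inner))
        self.inner = inner

    @staticmethod
    def _format(failures: Optional[Iterable[str]]) -> str:
        # The None branch must be handled explicitly once the parameter
        # is declared Optional.
        if failures is None:
            return "No task failures were caught during the execution"
        joined = "\n".join(failures)
        return f"Following task failures were caught during the execution:\n{joined}"
```
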
33 changes: 15 additions & 18 deletions exasol_integration_test_docker_environment/lib/api/common.py
@@ -28,7 +28,7 @@

class JobCounterSingleton(object):
"""
We use here a Singleton to avoid a unprotected global variable.
We use here a Singleton to avoid an unprotected global variable.
However, this counter needs to be global counter to guarantee unique job ids.
This is needed in case of task that finish in less than a second, to avoid duplicated job ids
"""
@@ -42,18 +42,18 @@ def __new__(cls):
return cls._instance

def get_next_value(self) -> int:
self._counter += 1
return self._counter
self._counter += 1 # type: ignore
return self._counter # type: ignore


def set_build_config(force_rebuild: bool,
force_rebuild_from: Tuple[str, ...],
force_pull: bool,
log_build_context_content: bool,
output_directory: str,
temporary_base_directory: str,
cache_directory: str,
build_name: str, ):
temporary_base_directory: Optional[str],
cache_directory: Optional[str],
build_name: Optional[str], ):
luigi.configuration.get_config().set('build_config', 'force_rebuild', str(force_rebuild))
luigi.configuration.get_config().set('build_config', 'force_rebuild_from', json.dumps(force_rebuild_from))
luigi.configuration.get_config().set('build_config', 'force_pull', str(force_pull))
@@ -72,9 +72,8 @@ def set_output_directory(output_directory):
luigi.configuration.get_config().set('build_config', 'output_directory', output_directory)


def set_docker_repository_config(docker_password: str, docker_repository_name: str, docker_username: str,
tag_prefix: str,
kind: str):
def set_docker_repository_config(docker_password: Optional[str], docker_repository_name: Optional[str],
docker_username: Optional[str], tag_prefix: str, kind: str):
config_class = f'{kind}_docker_repository_config'
luigi.configuration.get_config().set(config_class, 'tag_prefix', tag_prefix)
if docker_repository_name is not None:
@@ -105,8 +104,8 @@ def import_build_steps(flavor_path: Tuple[str, ...]):
path_to_build_steps = Path(path).joinpath("flavor_base/build_steps.py")
module_name_for_build_steps = extract_modulename_for_build_steps(path)
spec = importlib.util.spec_from_file_location(module_name_for_build_steps, path_to_build_steps)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
module = importlib.util.module_from_spec(spec) # type: ignore
spec.loader.exec_module(module) # type: ignore


def generate_root_task(task_class, *args, **kwargs) -> DependencyLoggerBaseTask:
Expand All @@ -119,7 +118,7 @@ def generate_root_task(task_class, *args, **kwargs) -> DependencyLoggerBaseTask:

def run_task(task_creator: Callable[[], DependencyLoggerBaseTask], workers: int = 2,
task_dependencies_dot_file: Optional[str] = None,
log_level: str = None, use_job_specific_log_file: bool = False) \
log_level: Optional[str] = None, use_job_specific_log_file: bool = False) \
-> Any:
setup_worker()
task = task_creator()
@@ -141,7 +140,6 @@ def run_task(task_creator: Callable[[], DependencyLoggerBaseTask], workers: int
f"The detailed log of the integration-test-docker-environment can be found at: {log_file_path}")
task.cleanup(success)


def _run_task_with_logging_config(
task: DependencyLoggerBaseTask,
log_file_path: Path,
@@ -153,7 +151,6 @@ def _run_task_with_logging_config(
local_scheduler=True, **run_kwargs)
return no_scheduling_errors


def _handle_task_result(
no_scheduling_errors: bool,
success: bool,
@@ -181,7 +178,7 @@ def _configure_logging(
use_job_specific_log_file: bool) -> Iterator[Dict[str, str]]:
with get_luigi_log_config(log_file_target=log_file_path,
log_level=log_level,
use_job_specific_log_file=use_job_specific_log_file) as luigi_config:
use_job_specific_log_file=use_job_specific_log_file) as luigi_config: # type: Path
no_configure_logging, run_kwargs = _configure_logging_parameter(
log_level=log_level,
luigi_config=luigi_config,
Expand All @@ -199,7 +196,7 @@ def _configure_logging(
yield run_kwargs


def _configure_logging_parameter(log_level: str, luigi_config: Path, use_job_specific_log_file: bool) \
def _configure_logging_parameter(log_level: Optional[str], luigi_config: Path, use_job_specific_log_file: bool) \
-> Tuple[bool, Dict[str, str]]:
if use_job_specific_log_file:
no_configure_logging = False
@@ -226,13 +223,13 @@ def generate_graph_from_task_dependencies(task: DependencyLoggerBaseTask, task_d


def collect_dependencies(task: DependencyLoggerBaseTask) -> Set[TaskDependency]:
dependencies = set()
dependencies : Set[TaskDependency] = set()
for root, directories, files in os.walk(task._get_dependencies_path_for_job()):
for file in files:
file_path = Path(root).joinpath(file)
with open(file_path) as f:
for line in f.readlines():
task_dependency = TaskDependency.from_json(line) # type: TaskDependency
task_dependency : TaskDependency = TaskDependency.from_json(line) # type: ignore
if task_dependency.state == DependencyState.requested.name:
dependencies.add(task_dependency)
return dependencies
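
The two `# type: ignore` comments in `import_build_steps` stem from `importlib.util.spec_from_file_location` returning `Optional[ModuleSpec]` and `spec.loader` itself being `Optional`. A small sketch of the assert-based alternative (a hypothetical helper, not part of this PR):

```python
import importlib.util
from pathlib import Path


def load_module(name: str, path: Path):
    # spec_from_file_location returns Optional[ModuleSpec] and spec.loader is
    # Optional, which is what mypy flags in import_build_steps. Asserting the
    # values are present narrows them without "# type: ignore".
    spec = importlib.util.spec_from_file_location(name, path)
    assert spec is not None and spec.loader is not None
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```
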
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
import functools
from typing import Tuple, Optional, Callable
from typing import Tuple, Optional, Callable, Any
import humanfriendly

from exasol_integration_test_docker_environment.lib.api.common import (
@@ -26,8 +26,10 @@


def _cleanup(environment_info: EnvironmentInfo) -> None:
remove_docker_container([environment_info.database_info.container_info.container_name])
remove_docker_volumes([environment_info.database_info.container_info.volume_name])
if environment_info.database_info.container_info is not None:
remove_docker_container([environment_info.database_info.container_info.container_name])
if environment_info.database_info.container_info.volume_name is not None:
remove_docker_volumes([environment_info.database_info.container_info.volume_name])
remove_docker_networks([environment_info.network_info.network_name])


@@ -69,7 +71,7 @@ def spawn_test_environment(
raises: TaskRuntimeError if spawning the test environment fails

"""
def str_or_none(x: any) -> str:
def str_or_none(x: Any) -> Optional[str]:
return str(x) if x is not None else None

parsed_db_mem_size = humanfriendly.parse_size(db_mem_size)
@@ -78,6 +80,8 @@ def str_or_none(x: any) -> str:
parsed_db_disk_size = humanfriendly.parse_size(db_disk_size)
if parsed_db_disk_size < humanfriendly.parse_size("100 MiB"):
raise ArgumentConstraintError("db_disk_size", "needs to be at least 100 MiB")
db_os_access_value = DbOsAccess[db_os_access] if db_os_access else DbOsAccess.DOCKER_EXEC

set_build_config(False,
tuple(),
False,
Expand All @@ -101,7 +105,7 @@ def str_or_none(x: any) -> str:
docker_runtime=docker_runtime,
docker_db_image_version=docker_db_image_version,
docker_db_image_name=docker_db_image_name,
db_os_access=DbOsAccess[db_os_access],
db_os_access=db_os_access_value,
db_user="sys",
db_password="exasol",
bucketfs_write_password="write",
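
The `str_or_none` change fixes two separate issues: the lowercase builtin `any` is a function, not a type, so `x: any` is rejected, and the helper can return `None`, so the return annotation must be `Optional[str]`. The same helper with a couple of illustrative assertions (the assertions are mine, not from the repo):

```python
from typing import Any, Optional


def str_or_none(x: Any) -> Optional[str]:
    # typing.Any is the intended annotation; the builtin any() is not a type.
    # None passes through unchanged, hence Optional[str] as the return type.
    return str(x) if x is not None else None


assert str_or_none(512) == "512"
assert str_or_none(None) is None
```
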
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
import functools
from typing import Tuple, Optional, Callable
from typing import Tuple, Optional, Callable, Any
import humanfriendly

from exasol_integration_test_docker_environment.lib.api.common import set_build_config, set_docker_repository_config, \
@@ -22,14 +22,15 @@
.docker_db_test_environment_parameter import DbOsAccess

def _cleanup(environment_info: EnvironmentInfo) -> None:
if environment_info.test_container_info is None:
remove_docker_container([environment_info.database_info.container_info.container_name])
else:
remove_docker_container([environment_info.test_container_info.container_name,
environment_info.database_info.container_info.container_name])
remove_docker_volumes([environment_info.database_info.container_info.volume_name])
remove_docker_networks([environment_info.network_info.network_name])
if test_container_info := environment_info.test_container_info:
remove_docker_container([test_container_info.container_name])

if db_container_info := environment_info.database_info.container_info:
remove_docker_container([db_container_info.container_name])
if name := db_container_info.volume_name:
remove_docker_volumes([name])

remove_docker_networks([environment_info.network_info.network_name])

@no_cli_function
def spawn_test_environment_with_test_container(
@@ -71,7 +72,7 @@ def spawn_test_environment_with_test_container(
raises: TaskRuntimeError if spawning the test environment fails
"""
def str_or_none(x: any) -> str:
def str_or_none(x: Any) -> Optional[str]:
return str(x) if x is not None else None
parsed_db_mem_size = humanfriendly.parse_size(db_mem_size)
if parsed_db_mem_size < humanfriendly.parse_size("1 GiB"):
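
The rewritten `_cleanup` uses assignment expressions to narrow the `Optional` container and volume attributes before touching them. A self-contained sketch of that pattern with hypothetical stand-in types (`ContainerInfo` here is not the repo's class):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ContainerInfo:
    container_name: str
    volume_name: Optional[str] = None


def cleanup(container_info: Optional[ContainerInfo]) -> None:
    # "if x := expr:" both assigns and narrows the Optional to a non-None,
    # truthy value inside the block, so mypy accepts the attribute accesses
    # without any ignores.
    if info := container_info:
        print("removing container", info.container_name)
        if name := info.volume_name:
            print("removing volume", name)
```
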
10 changes: 5 additions & 5 deletions exasol_integration_test_docker_environment/lib/base/base_task.py
@@ -3,7 +3,7 @@
import logging
import shutil
from pathlib import Path
from typing import Dict, List, Generator, Any, Union, Set, Optional
from typing import Dict, List, Generator, Any, Union, Set

import luigi
import six
@@ -85,8 +85,8 @@ def get_output(self) -> Any:


class BaseTask(Task):
caller_output_path = luigi.ListParameter([], significant=False, visibility=ParameterVisibility.HIDDEN)
job_id = luigi.Parameter()
caller_output_path : List[str] = luigi.ListParameter([], significant=False, visibility=ParameterVisibility.HIDDEN) # type: ignore
job_id : str = luigi.Parameter() # type: ignore

def __init__(self, *args, **kwargs):
self._registered_tasks = []
@@ -153,7 +153,7 @@ def get_output_path(self) -> Path:

def _get_output_path_for_job(self) -> Path:
return Path(build_config().output_directory,
"jobs", self.job_id)
"jobs", self.job_id) # type: ignore

def _extend_output_path(self):
extension = self.extend_output_path()
@@ -184,7 +184,7 @@ def get_log_path(self) -> Path:
return path

def get_cache_path(self) -> Path:
path = Path(build_config().output_directory, "cache")
path = Path(str(build_config().output_directory), "cache")
path.mkdir(parents=True, exist_ok=True)
return path

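
On the `# type: ignore` annotations for `caller_output_path` and `job_id`: luigi parameters are class-level `Parameter` objects whose per-instance value is the parsed parameter, so annotating the intended runtime type clashes with the assigned `Parameter` instance. An illustrative, hypothetical task showing the same pattern:

```python
import luigi


class ExampleTask(luigi.Task):
    # At class level the attribute is a luigi.Parameter object; at runtime each
    # task instance carries the parsed value (a str). Annotating the runtime
    # type helps callers, but mypy sees a Parameter assigned to a str
    # annotation, hence the ignore used in BaseTask above.
    job_id: str = luigi.Parameter()  # type: ignore

    def run(self) -> None:
        print(self.job_id.upper())  # mypy now treats job_id as str
```
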
Original file line number Diff line number Diff line change
@@ -3,7 +3,7 @@
import docker
import time
from docker import DockerClient
from typing import Protocol, runtime_checkable
from typing import Protocol, runtime_checkable, Optional
from docker.models.containers import Container, ExecResult
from exasol_integration_test_docker_environment \
.lib.base.ssh_access import SshKey
@@ -37,19 +37,18 @@ class DbOsExecutor(Protocol):
concrete implementations in sub-classes ``DockerExecutor`` and
``SshExecutor``.
"""
@abstractmethod
def exec(self, cmd: str) -> ExecResult:
...

def prepare(self):
pass
...


class DockerExecutor(DbOsExecutor):
def __init__(self, docker_client: DockerClient, container_name: str):
self._client = docker_client
self._container_name = container_name
self._container = None
self._container : Optional[Container] = None

def __enter__(self):
self._container = self._client.containers.get(self._container_name)
@@ -62,8 +61,12 @@ def __del__(self):
self.close()

def exec(self, cmd: str) -> ExecResult:
assert self._container
return self._container.exec_run(cmd)

def prepare(self):
pass

def close(self):
self._container = None
if self._client is not None:
@@ -75,7 +78,7 @@ class SshExecutor(DbOsExecutor):
def __init__(self, connect_string: str, key_file: str):
self._connect_string = connect_string
self._key_file = key_file
self._connection = None
self._connection : Optional[fabric.Connection] = None

def __enter__(self):
key = SshKey.read_from(self._key_file)
@@ -92,6 +95,7 @@ def __del__(self):
self.close()

def exec(self, cmd: str) -> ExecResult:
assert self._connection
result = self._connection.run(cmd, warn=True, hide=True)
output = result.stdout.encode("utf-8")
return ExecResult(result.exited, output)
@@ -143,9 +147,11 @@ def executor(self) -> DbOsExecutor:
return DockerExecutor(client, self._container_name)



class SshExecFactory(DbOsExecFactory):
@classmethod
def from_database_info(cls, info: DatabaseInfo):
assert info.ssh_info
return SshExecFactory(
f"{info.ssh_info.user}@{info.host}:{info.ports.ssh}",
info.ssh_info.key_file,
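
This hunk tightens `DbOsExecutor` by giving the protocol methods `...` bodies and by asserting the lazily-set connection/container attributes before use. A small, hypothetical analogue of both ideas (`Executor` and `EchoExecutor` are stand-ins, not the repo's classes):

```python
from typing import Optional, Protocol, runtime_checkable


@runtime_checkable
class Executor(Protocol):
    # A Protocol only declares the required signature; "..." as the body avoids
    # shipping a misleading default implementation.
    def exec(self, cmd: str) -> str:
        ...


class EchoExecutor:
    def __init__(self) -> None:
        self._session: Optional[str] = None  # set on __enter__ in the real classes

    def __enter__(self) -> "EchoExecutor":
        self._session = "open"
        return self

    def __exit__(self, *args: object) -> None:
        self._session = None

    def exec(self, cmd: str) -> str:
        # Asserting the Optional attribute is set narrows it for mypy,
        # mirroring the asserts added to DockerExecutor and SshExecutor.
        assert self._session is not None
        return f"[{self._session}] {cmd}"


with EchoExecutor() as ex:
    assert isinstance(ex, Executor)
    print(ex.exec("ls"))
```
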
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
from pathlib import Path
from typing import Dict, Any
from typing import Dict, Any, List

import luigi

@@ -8,7 +8,7 @@


class FlavorsBaseTask(DependencyLoggerBaseTask):
flavor_paths = luigi.ListParameter()
flavor_paths : List[str] = luigi.ListParameter() # type: ignore

def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)