Feature / Update to Apache Arrow 13 (#393)
* Update Python requirements to Apache Arrow 13

* Update Java dependencies to Apache Arrow 13

* Bump postgresql version due to minor compliance warning

* Remove compliance exception for old version of Netty (upgrade depended on Arrow 13)

* Change the oldest tested Python version in CI to 3.8.0

* Change Python requirement in setup.cfg

* Update README for Python versions

* Add a false positive for CVE in Postgres (CVE is for the server, not JDBC driver)
Martin Traverse authored Aug 25, 2023
1 parent c3f937e commit 6181d84
Showing 7 changed files with 16 additions and 16 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/build.yml
@@ -102,9 +102,9 @@ jobs:

# Oldest supported versions, force testing against .0 for Python, Pandas and PySpark
# If those don't work due to bugs, we need to update README for the supported versions
-      - { ENV_NAME: "Linux, 3.7, Pandas 1.2, PySpark 3.0",
+      - { ENV_NAME: "Linux, 3.8, Pandas 1.2, PySpark 3.0",
          PLATFORM: 'ubuntu-latest',
-         PYTHON_VERSION: "3.7",
+         PYTHON_VERSION: "3.8",
PANDAS_VERSION: "== 1.2.0",
PYSPARK_VERSION: "== 3.0.0" }

2 changes: 1 addition & 1 deletion README.md
@@ -64,7 +64,7 @@ For more information see the

With TRAC D.A.P. you can build and run production-ready models right on your desktop!
All you need is an IDE, Python and the tracdap-runtime Python package.
-TRAC D.A.P. requires Python 3.7 or later.
+TRAC D.A.P. requires Python 3.8 or later.

The [modelling tutorial](https://tracdap.finos.org/en/stable/modelling/tutorial/chapter_1_hello_world.html)
shows you how to get set up and write your first models. You can write models locally using
10 changes: 5 additions & 5 deletions dev/compliance/owasp-false-positives.xml
@@ -220,12 +220,12 @@
<vulnerabilityName>CVE-2023-35116</vulnerabilityName>
</suppress>

-    <!-- Currently TRAC uses Netty version 4.1.93.Final -->
-    <!-- This is the latest version compatible with Apache Arrow 12 -->
-    <!-- There appears to be a recent fix in Netty, we need Apache Arrow 13 before we can update -->
+    <!-- More NVD Noise - this is a vulnerability in Postgres itself, not the JDBC driver -->
+    <!-- CVE itself is very weak, assumes attacker is on the server console and can send signals to the DB process -->

    <suppress>
-        <packageUrl regex="true">^pkg:maven/io\.netty/netty\-.*@4.1.93.Final</packageUrl>
-        <vulnerabilityName>CVE-2023-34462</vulnerabilityName>
+        <packageUrl regex="true">^pkg:maven/org\.postgresql/postgresql@.*$</packageUrl>
+        <vulnerabilityName>CVE-2020-21469</vulnerabilityName>
</suppress>

</suppressions>
6 changes: 3 additions & 3 deletions gradle/versions.gradle
@@ -22,14 +22,14 @@ ext {
proto_plugin_version = "0.8.13"

// Core platform technologies
-    netty_version = '4.1.93.Final'
+    netty_version = '4.1.97.Final'
guava_version = '32.0.1-jre'
proto_version = '3.23.2'
grpc_version = '1.57.2'
gapi_version = '2.20.0'

// Data technologies
-    arrow_version = '12.0.1'
+    arrow_version = '13.0.0'
jackson_version = '2.15.2'
jackson_databind_version = '2.15.2'

@@ -44,7 +44,7 @@ ext {
h2_version = '2.1.214'
mysql_version = '8.0.33'
mariadb_version = '3.0.8'
-    postgresql_version = '42.5.1'
+    postgresql_version = '42.6.0'
sqlserver_version = '9.4.1.jre11' // Update to SqlServer 10.x driver series is a breaking change
oracle_version = '19.14.0.0'

4 changes: 2 additions & 2 deletions tracdap-runtime/python/README.md
@@ -16,11 +16,11 @@ Documentation for the TRAC platform is available on our website at

The TRAC runtime for Python has these requirements:

-* Python: 3.7 up to 3.11.x
+* Python: 3.8 up to 3.11.x
* Pandas: 1.2 up to 1.5.x
* PySpark 3.0 up to 3.4.x (optional)

-Not every combination of versions will work, e.g. using PySpark 3 requires Python 3.8.
+3rd party libraries may impose additional constraints on supported versions of Python, Pandas or PySpark.
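The support window above (e.g. Python 3.8 up to 3.11.x) can be checked mechanically before installing. A minimal sketch, assuming nothing from tracdap itself; the `parse_version` and `in_range` helpers are hypothetical illustrations, not part of the runtime:

```python
def parse_version(v: str) -> tuple:
    # Convert "3.10.2" -> (3, 10, 2); non-numeric suffixes are ignored
    parts = []
    for p in v.split("."):
        digits = "".join(ch for ch in p if ch.isdigit())
        if digits:
            parts.append(int(digits))
    return tuple(parts)

def in_range(version: str, low: str, high_exclusive: str) -> bool:
    # True if low <= version < high_exclusive, using tuple comparison
    return parse_version(low) <= parse_version(version) < parse_version(high_exclusive)

# "3.8 up to 3.11.x" is the half-open range >= 3.8, < 3.12
print(in_range("3.7.16", "3.8", "3.12"))   # False - below the window
print(in_range("3.11.4", "3.8", "3.12"))   # True  - inside the window
print(in_range("3.12.0", "3.8", "3.12"))   # False - above the window
```

The same half-open convention matches the `python_requires = >= 3.8, < 3.12` constraint in setup.cfg.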


## Installing the runtime
2 changes: 1 addition & 1 deletion tracdap-runtime/python/requirements.txt
@@ -10,7 +10,7 @@
protobuf == 4.23.2

# Core data framework is based on Arrow
-pyarrow == 12.0.1
+pyarrow == 13.0.0

# PyYAML is used to load config supplied in YAML format
pyyaml == 6.0
4 changes: 2 additions & 2 deletions tracdap-runtime/python/setup.cfg
@@ -54,11 +54,11 @@ package_dir =

# Support a range of Python versions
# (These versions are tested in CI)
-python_requires = >= 3.7, < 3.12
+python_requires = >= 3.8, < 3.12

install_requires =
protobuf == 4.23.2
-    pyarrow == 12.0.1
+    pyarrow == 13.0.0
pyyaml == 6.0.0
dulwich == 0.21.5
requests == 2.31.0