62 changes: 30 additions & 32 deletions .github/workflows/ci.yaml
@@ -249,35 +249,33 @@ jobs:
file: ./coverage.xml
token: ${{ secrets.CODECOV_TOKEN }}

# The authentication of these tests is currently failing.
# These tests should be reenabled asap.

# bigquery:
# if: ${{ contains(github.event.pull_request.labels.*.name, 'bigquery') || contains(github.event.pull_request.labels.*.name, 'ready') || github.ref == 'refs/heads/main' }}
# name: "BigQuery"
# runs-on: ubuntu-latest
# strategy:
# fail-fast: false
# matrix:
# env:
# - bigquery-py38
# - bigquery-sa1
# steps:
# - name: 'Authenticate to Google Cloud'
# uses: 'google-github-actions/auth@71fee32a0bb7e97b4d33d548e7d957010649d8fa'
# with:
# credentials_json: '${{ secrets.GCP_KEY }}'
# - name: Checkout branch
# uses: actions/checkout@v6
# - name: Set up pixi
# uses: prefix-dev/setup-pixi@v0.9.3
# with:
# environments: ${{ matrix.env }}
# - run: |
# pixi run -e ${{ matrix.env }} postinstall
# pixi run -e ${{ matrix.env }} coverage
# - name: Generate code coverage report
# uses: codecov/codecov-action@v5.5.2
# with:
# file: ./coverage.xml
# token: ${{ secrets.CODECOV_TOKEN }}
bigquery:
if: ${{ contains(github.event.pull_request.labels.*.name, 'bigquery') || contains(github.event.pull_request.labels.*.name, 'ready') || github.ref == 'refs/heads/main' }}
name: "BigQuery"
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
env:
- bigquery-py310
- bigquery-sa1
- bigquery-sa2
steps:
- name: Checkout branch
uses: actions/checkout@v6
- name: "Authenticate to Google Cloud"
uses: "google-github-actions/auth@71fee32a0bb7e97b4d33d548e7d957010649d8fa"
with:
credentials_json: "${{ secrets.GCP_KEY }}"
- name: Set up pixi
uses: prefix-dev/setup-pixi@v0.9.3
with:
environments: ${{ matrix.env }}
- run: |
pixi run -e ${{ matrix.env }} postinstall
pixi run -e ${{ matrix.env }} coverage
- name: Generate code coverage report
uses: codecov/codecov-action@v5.5.2
with:
file: ./coverage.xml
token: ${{ secrets.CODECOV_TOKEN }}
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,9 @@
# Changelog

## 1.13.0 - 2026.01.21

- Deprecate `sqlalchemy` <2.0.0.

## 1.12.0 - 2026.01.12

- Drop support for Impala as a backend.
2 changes: 1 addition & 1 deletion README.md
@@ -22,7 +22,7 @@ Express and test specifications against data from database.

`datajudge` can either be installed via pypi with `pip install datajudge` or via conda-forge with `conda install datajudge -c conda-forge`.

Please refer to the [Getting Started](https://datajudge.readthedocs.io/en/latest/getting_started.html) section of our documentation for details.
Please refer to the [Getting Started](https://datajudge.readthedocs.io/en/latest/getting-started/) section of our documentation for details.
Review comment (collaborator, PR author): Unrelated correction of a URL that hasn't been updated since the mkdocs migration.


Expressing an expectation between different tables from a database may look as follows:

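The concrete example is elided from this diff view; as a rough sketch only (database, schema, and table names are placeholders, and the particular constraint used here is an assumption rather than the README's verbatim example), such an expectation could read:

```python
# Sketch of a between-table requirement: the two tables are expected to
# contain the same number of rows. All names below are placeholders.
from datajudge import BetweenRequirement

requirement = BetweenRequirement.from_tables(
    db_name1="example_db",
    schema_name1="public",
    table_name1="companies",
    db_name2="example_db",
    schema_name2="public",
    table_name2="companies_archive",
)
requirement.add_n_rows_equality_constraint()
# Requirements like this one are then typically collected and executed as pytest tests.
```
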
7 changes: 5 additions & 2 deletions docs/development.md
@@ -24,8 +24,8 @@ To run integration tests against Postgres, first start a docker container with a
./start_postgres.sh
```

In your current environment, install the `psycopg2` package.
After this, you may execute integration tests as follows:
Then, you can run tests against the database you just started with one of the Postgres-specific
pixi environments, e.g.:

```bash
pixi run -e postgres-py312 test
@@ -50,3 +50,6 @@ pixi run -e mssql-py312 test_freetds
```

depending on the driver you'd like to use.

Please note that running tests against Snowflake and BigQuery requires authentication to be
set up properly.
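
For BigQuery, one common option for local runs (an assumption about typical setups, not something this PR prescribes) is to rely on application-default credentials via the gcloud CLI:

```bash
# Example only: make application-default credentials available to the
# BigQuery client libraries used by the integration tests.
gcloud auth application-default login
```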
715 changes: 697 additions & 18 deletions pixi.lock

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions pixi.toml
@@ -159,5 +159,6 @@ duckdb-sa2 = ["duckdb", "sa2", "test"]

bigquery-py310 = ["bigquery", "py310", "test"]
bigquery-sa1 = ["bigquery", "sa1", "test"]
bigquery-sa2 = ["bigquery", "sa2", "test"]

lint = { features = ["lint"], no-default-feature = true }
14 changes: 13 additions & 1 deletion src/datajudge/__init__.py
@@ -1,5 +1,17 @@
"""datajudge allows to assess whether data from database complies with referenceinformation."""

import warnings

import sqlalchemy as sa

if sa.__version__.startswith("1."):
warnings.warn(
"SQLAlchemy 1.x is deprecated and will no longer be supported in future "
"versions of datajudge. Please upgrade to SQLAlchemy 2.x.",
FutureWarning,
stacklevel=2,
)

from .condition import Condition
from .constraints.base import Constraint
from .data_source import DataSource
@@ -14,4 +26,4 @@
"WithinRequirement",
]

__version__ = "1.12.0"
__version__ = "1.13.0"
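
As a purely illustrative aside (not part of this diff): downstream projects still pinned to SQLAlchemy 1.x that want to keep their logs clean during the migration window can filter the new warning with the standard library, e.g.:

```python
# Hypothetical downstream snippet: suppress the FutureWarning that datajudge
# emits at import time when SQLAlchemy 1.x is installed.
import warnings

with warnings.catch_warnings():
    warnings.simplefilter("ignore", FutureWarning)
    import datajudge  # the deprecation warning raised here is filtered
```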
1 change: 0 additions & 1 deletion tests/integration/conftest.py
@@ -63,7 +63,6 @@ def get_engine(backend) -> sa.engine.Engine:
connection_string = f"snowflake://{user}@{account}/datajudge/DBO?warehouse=datajudge&role=accountadmin"
connect_args["private_key"] = pkb
elif "bigquery" in backend:
# gcp_project = os.environ.get("GOOGLE_CLOUD_PROJECT", "scratch-361908")
connection_string = "bigquery://"
elif backend == "duckdb":
connection_string = "duckdb:///:memory:"