INSERT/REPLACE dimension target column types are validated against source input expressions #15962

Merged: 20 commits, Mar 25, 2024

Commits
acfa4ca
* address remaining comments from https://github.com/apache/druid/pul…
zachjsh Feb 24, 2024
e6c3b3b
Merge remote-tracking branch 'apache/master' into validate-target-col…
zachjsh Feb 26, 2024
d30a729
* address remaining comments from https://github.com/apache/druid/pu…
zachjsh Feb 27, 2024
4947aba
* add test that exposes relational algebra issue
zachjsh Mar 6, 2024
50bef96
* simplify test exposing issue
zachjsh Mar 6, 2024
23c6173
* fix
zachjsh Mar 6, 2024
f58a417
Merge remote-tracking branch 'apache/master' into validate-target-col…
zachjsh Mar 7, 2024
841dba5
* add tests for sealed / non-sealed
zachjsh Mar 7, 2024
d771bdb
* update test descriptions
zachjsh Mar 7, 2024
cd4bfa9
* fix test failure when -Ddruid.generic.useDefaultValueForNull=true
zachjsh Mar 8, 2024
2d3e86e
* check type assignment based on native Druid types
zachjsh Mar 8, 2024
651ad56
* add tests that cover missing jacoco coverage
zachjsh Mar 13, 2024
1460bea
* add replace tests
zachjsh Mar 13, 2024
ddf8189
Merge remote-tracking branch 'apache/master' into validate-target-col…
zachjsh Mar 13, 2024
f85c21a
* add more tests and comments about column ordering
zachjsh Mar 13, 2024
7544ac9
* simplify tests
zachjsh Mar 13, 2024
162bd09
* review comments
zachjsh Mar 14, 2024
55b3e90
* remove commented line
zachjsh Mar 14, 2024
bb5b8c0
Merge remote-tracking branch 'apache/master' into validate-target-col…
zachjsh Mar 19, 2024
a89b4a3
* STRING family types should be validated as non-null
zachjsh Mar 19, 2024
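To make the change concrete, here is a minimal, hypothetical illustration of the validation the PR title describes. The datasource, column, and external source names below (tbl, dim1, m1, ext) are invented for the example and do not come from the PR; assume the catalog declares tbl with a BIGINT column dim1. With this change, the planner should reject an INSERT whose source expression type cannot be assigned to the declared column type, while a compatible expression plans as before.

// Hypothetical example only; table, column, and source names are invented.
// Assumed catalog declaration: datasource "tbl" with a declared BIGINT column "dim1".

// Expected to fail validation: the source expression for "dim1" is a VARCHAR literal,
// which cannot be assigned to the declared BIGINT column.
String rejected =
    "INSERT INTO tbl\n"
    + "SELECT __time, 'not a number' AS dim1\n"
    + "FROM \"ext\"\n"
    + "PARTITIONED BY DAY";

// Expected to remain valid: the source expression is explicitly cast to the declared type.
String accepted =
    "INSERT INTO tbl\n"
    + "SELECT __time, CAST(m1 AS BIGINT) AS dim1\n"
    + "FROM \"ext\"\n"
    + "PARTITIONED BY DAY";

The point, as the title states it, is that the mismatch is caught while the INSERT/REPLACE statement is validated against the catalog-declared column types, rather than surfacing later during ingestion.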
@@ -0,0 +1,114 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

package org.apache.druid.catalog.sql;

import org.apache.druid.catalog.CatalogException;
import org.apache.druid.catalog.model.TableMetadata;
import org.apache.druid.catalog.model.facade.DatasourceFacade;
import org.apache.druid.catalog.model.table.ClusterKeySpec;
import org.apache.druid.catalog.model.table.TableBuilder;
import org.apache.druid.catalog.storage.CatalogStorage;
import org.apache.druid.catalog.storage.CatalogTests;
import org.apache.druid.catalog.sync.CachedMetadataCatalog;
import org.apache.druid.catalog.sync.MetadataCatalog;
import org.apache.druid.metadata.TestDerbyConnector.DerbyConnectorRule5;
import org.apache.druid.sql.calcite.CalciteCatalogInsertTest;
import org.apache.druid.sql.calcite.planner.CatalogResolver;
import org.apache.druid.sql.calcite.table.DatasourceTable;
import org.apache.druid.sql.calcite.util.SqlTestFramework;
import org.junit.jupiter.api.extension.RegisterExtension;

import static org.junit.Assert.fail;

/**
* Test the use of catalog specs to drive MSQ ingestion.
*/
public class CatalogInsertTest extends CalciteCatalogInsertTest
Review comment (Member): There is a lot of duplication between CatalogInsertTest and CatalogReplaceTest (only 4 lines differ), and CatalogQueryTest is also very similar, just with a different method order. Could you put all of these into some common place (a rule?) and reuse it everywhere?

(One possible shape for such a shared helper is sketched after this file.)

{
  @RegisterExtension
  public static final DerbyConnectorRule5 DERBY_CONNECTION_RULE = new DerbyConnectorRule5();

  private static CatalogStorage storage;

  // Backs the planner's catalog resolver with the Derby-based catalog storage fixture.
  @Override
  public CatalogResolver createCatalogResolver()
  {
    CatalogTests.DbFixture dbFixture = new CatalogTests.DbFixture(DERBY_CONNECTION_RULE);
    storage = dbFixture.storage;
    MetadataCatalog catalog = new CachedMetadataCatalog(
        storage,
        storage.schemaRegistry(),
        storage.jsonMapper()
    );
    return new LiveCatalogResolver(catalog);
  }

  @Override
  public void finalizeTestFramework(SqlTestFramework sqlTestFramework)
  {
    super.finalizeTestFramework(sqlTestFramework);
    buildDatasources();
  }

  /**
   * Registers catalog metadata for each resolved datasource, mirroring its columns,
   * hidden columns, sealed flag, and cluster keys into the catalog storage.
   */
  public void buildDatasources()
  {
    resolvedTables.forEach((datasourceName, datasourceTable) -> {
      DatasourceFacade catalogMetadata = ((DatasourceTable) datasourceTable).effectiveMetadata().catalogMetadata();
      TableBuilder tableBuilder = TableBuilder.datasource(datasourceName, catalogMetadata.segmentGranularityString());
      catalogMetadata.columnFacades().forEach(
          columnFacade -> {
            tableBuilder.column(columnFacade.spec().name(), columnFacade.spec().dataType());
          }
      );

      if (catalogMetadata.hiddenColumns() != null && !catalogMetadata.hiddenColumns().isEmpty()) {
        tableBuilder.hiddenColumns(catalogMetadata.hiddenColumns());
      }

      if (catalogMetadata.isSealed()) {
        tableBuilder.sealed(true);
      }

      if (catalogMetadata.clusterKeys() != null && !catalogMetadata.clusterKeys().isEmpty()) {
        tableBuilder.clusterColumns(catalogMetadata.clusterKeys().toArray(new ClusterKeySpec[0]));
      }

      createTableMetadata(tableBuilder.build());
    });
    // Note: this trailing builder for "foo" is constructed but not used further.
    DatasourceFacade catalogMetadata =
        ((DatasourceTable) resolvedTables.get("foo")).effectiveMetadata().catalogMetadata();
    TableBuilder tableBuilder = TableBuilder.datasource("foo", catalogMetadata.segmentGranularityString());
    catalogMetadata.columnFacades().forEach(
        columnFacade -> {
          tableBuilder.column(columnFacade.spec().name(), columnFacade.spec().dataType());
        }
    );
  }

  // Fails the test if the catalog rejects the table.
  private void createTableMetadata(TableMetadata table)
  {
    try {
      storage.tables().create(table);
    }
    catch (CatalogException e) {
      fail(e.getMessage());
    }
  }
}
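For reference, the reviewer's suggestion could take roughly the following shape: a small shared helper that owns the Derby-backed catalog wiring so that CatalogInsertTest, CatalogReplaceTest, and CatalogQueryTest do not each repeat it. This is a sketch only, assuming the types already imported by the file above; the class and method names (CatalogTestHelper, resolver(), createTable()) are invented here and are not part of the PR.

package org.apache.druid.catalog.sql;

import org.apache.druid.catalog.CatalogException;
import org.apache.druid.catalog.model.TableMetadata;
import org.apache.druid.catalog.storage.CatalogStorage;
import org.apache.druid.catalog.storage.CatalogTests;
import org.apache.druid.catalog.sync.CachedMetadataCatalog;
import org.apache.druid.catalog.sync.MetadataCatalog;
import org.apache.druid.metadata.TestDerbyConnector.DerbyConnectorRule5;
import org.apache.druid.sql.calcite.planner.CatalogResolver;

// Sketch only: a hypothetical shared helper, not part of the PR.
public class CatalogTestHelper
{
  private final CatalogStorage storage;

  public CatalogTestHelper(DerbyConnectorRule5 derbyRule)
  {
    // Same wiring each catalog test currently performs inline in createCatalogResolver().
    this.storage = new CatalogTests.DbFixture(derbyRule).storage;
  }

  // Builds the catalog-aware resolver that the tests hand to the planner.
  public CatalogResolver resolver()
  {
    MetadataCatalog catalog = new CachedMetadataCatalog(
        storage,
        storage.schemaRegistry(),
        storage.jsonMapper()
    );
    return new LiveCatalogResolver(catalog);
  }

  // Stores table metadata, surfacing catalog errors as unchecked exceptions
  // (the test above calls fail() instead).
  public void createTable(TableMetadata table)
  {
    try {
      storage.tables().create(table);
    }
    catch (CatalogException e) {
      throw new IllegalStateException(e);
    }
  }
}

Whether this is best expressed as a plain helper, a JUnit 5 extension, or a shared base class is exactly the design question the reviewer raises; the sketch is only meant to show that the duplicated wiring factors out cleanly.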