Vastly refactored property --> jdbc value mapping api #1517

Open · wants to merge 1 commit into base: main
@@ -125,6 +125,15 @@ default <R> R readAndResolve(Class<R> type, RowDocument source, Identifier ident
*/
SQLType getTargetSqlType(RelationalPersistentProperty property);

/**
* The SQL type constant used when using this property as a parameter for a SQL statement.
*
* @return Must not be {@code null}.
* @see java.sql.Types
* @since 2.0
*/
SQLType getTargetSqlType(Class<?> property);

@Override
RelationalMappingContext getMappingContext();

@@ -21,7 +21,6 @@
import java.sql.SQLType;
import java.util.Iterator;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;

import org.apache.commons.logging.Log;
@@ -152,6 +151,11 @@ public SQLType getTargetSqlType(RelationalPersistentProperty property) {
return JdbcUtil.targetSqlTypeFor(getColumnType(property));
}

@Override
public SQLType getTargetSqlType(Class<?> property) {
Member:
I am not quite sure this is the right approach. Right below, there is a getColumnType method that operates on the level of a property.

Conversion is supposed to take custom converters into account, and depending on the database dialect, the SQLType might change.

I suppose we need a larger revision of the converter to ensure that we sketch out the possible cases for obtaining a SQLType (string-based query, in the context of a property) and define a path forward regarding the necessary changes.
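
To make the dialect/custom-converter concern concrete, here is a hypothetical example (the converter and types below are illustrative, not part of this PR): once such a converter is registered, the effective write target of a UUID property changes, and with it the SQLType that should be bound, which a lookup keyed only on the raw Class cannot take into account.

```java
import java.util.UUID;

import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

// Hypothetical example: with this converter registered, a UUID property is
// written as a String and should be bound as VARCHAR; without it (or on a
// dialect with native UUID support) the target SQL type would be different.
// A Class-only SQLType lookup has no way to see this context.
@WritingConverter
class UuidToStringConverter implements Converter<UUID, String> {

	@Override
	public String convert(UUID source) {
		return source.toString();
	}
}
```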

Contributor Author:
Let me break my change into several parts:

The reason I introduced this method is to optimize the target SQLType search. The public getTargetSqlType(RelationalPersistentProperty property) first resolves the Class of the given property and then looks up the SQLType for that Class. The new method exists mainly for cases where we already know which Class actually backs the property, so we do not need to discover it over and over again.

Now, let me explain my thoughts on your suggestion. I generally agree with you, and that certainly makes sense, but this PR is solving a different problem: eliminating the incorrect conversions that are applied at the wrong times (see the linked bugs). I suggest the following: if we recognize that there is a need to determine the SQLType depending on the dialect (from some issue, for instance), we will do so, but for now I suggest leaving it as is.

What are your thoughts on this, @mp911de?
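
To illustrate the two lookup paths described above, a minimal sketch (it assumes the new overload sits on JdbcConverter next to the existing one and follows the repository's package layout; only the shape of the calls matters here, and the Class-based overload exists only with this PR applied):

```java
import java.sql.SQLType;

import org.springframework.data.jdbc.core.convert.JdbcConverter;
import org.springframework.data.relational.core.mapping.RelationalPersistentProperty;

class TargetSqlTypeLookupSketch {

	// Existing path: resolves the backing Class of the property first,
	// then maps that Class to a SQLType.
	SQLType viaProperty(JdbcConverter converter, RelationalPersistentProperty property) {
		return converter.getTargetSqlType(property);
	}

	// Path proposed in this PR: if the column Class is already known
	// (here resolved once via getColumnType), the property resolution
	// does not have to be repeated for the SQLType lookup.
	SQLType viaKnownColumnType(JdbcConverter converter, RelationalPersistentProperty property) {
		Class<?> columnType = converter.getColumnType(property);
		return converter.getTargetSqlType(columnType);
	}
}
```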

Member:
My main concern is that we keep adding functionality by putting patches around something that is not designed the right way, instead of fixing the core issue.

I think @schauder should decide how to proceed here.

return JdbcUtil.targetSqlTypeFor(property);
}

@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
return doGetColumnType(property);
@@ -205,13 +209,13 @@ public Object readValue(@Nullable Object value, TypeInformation<?> type) {

@Override
@Nullable
public Object writeValue(@Nullable Object value, TypeInformation<?> type) {
public Object writeValue(@Nullable Object value, TypeInformation<?> targetType) {

if (value == null) {
return null;
}

return super.writeValue(value, type);
return super.writeValue(value, targetType);
}

private boolean canWriteAsJdbcValue(@Nullable Object value) {
@@ -236,8 +240,9 @@ private boolean canWriteAsJdbcValue(@Nullable Object value) {
return true;
}

Optional<Class<?>> customWriteTarget = getConversions().getCustomWriteTarget(value.getClass());
return customWriteTarget.isPresent() && customWriteTarget.get().isAssignableFrom(JdbcValue.class);
return getCustomConversions().getCustomWriteTarget(value.getClass())
.filter(it -> it.isAssignableFrom(JdbcValue.class))
.isPresent();
}

@Override
@@ -353,7 +358,7 @@ public <T> T getPropertyValue(RelationalPersistentProperty property) {

AggregatePath aggregatePath = this.context.aggregatePath();

if (getConversions().isSimpleType(property.getActualType())) {
if (getCustomConversions().isSimpleType(property.getActualType())) {
return (T) delegate.getValue(aggregatePath);
}

@@ -381,10 +381,12 @@ private JdbcValue convertToJdbcValue(RelationalPersistentProperty property, @Nul
*/
private JdbcValue getWriteValue(RelationalPersistentProperty property, Object value) {

Class<?> columnType = converter.getColumnType(property);

return converter.writeJdbcValue( //
value, //
converter.getColumnType(property), //
converter.getTargetSqlType(property) //
columnType, //
converter.getTargetSqlType(columnType) //
);
}

@@ -15,8 +15,6 @@
*/
package org.springframework.data.jdbc.core.mapping;

import java.util.Objects;

import org.springframework.lang.Nullable;
import org.springframework.util.Assert;

@@ -27,6 +25,7 @@
* @param <ID> the type of the id of the referenced aggregate root.
* @author Jens Schauder
* @author Myeonghyeon Lee
* @author Mikhail Polivakha
* @since 1.0
*/
public interface AggregateReference<T, ID> {
@@ -48,42 +47,15 @@ static <T, ID> AggregateReference<T, ID> to(ID id) {
* @param <T>
* @param <ID>
*/
class IdOnlyAggregateReference<T, ID> implements AggregateReference<T, ID> {

private final ID id;

public IdOnlyAggregateReference(ID id) {
record IdOnlyAggregateReference<T, ID>(ID id) implements AggregateReference<T, ID> {

public IdOnlyAggregateReference {
Assert.notNull(id, "Id must not be null");

this.id = id;
}

@Override
public ID getId() {
return id;
}

@Override
public boolean equals(@Nullable Object o) {

if (this == o)
return true;
if (o == null || getClass() != o.getClass())
return false;
IdOnlyAggregateReference<?, ?> that = (IdOnlyAggregateReference<?, ?>) o;
return id.equals(that.id);
}

@Override
public int hashCode() {
return Objects.hash(id);
}

@Override
public String toString() {

return "IdOnlyAggregateReference{" + "id=" + id + '}';
return id();
}
}
}
@@ -101,7 +101,7 @@ public <R> R read(Class<R> type, Row row, @Nullable RowMetadata metadata) {
return type.cast(row);
}

if (getConversions().hasCustomReadTarget(Row.class, rawType)
if (getCustomConversions().hasCustomReadTarget(Row.class, rawType)
&& getConversionService().canConvert(Row.class, rawType)) {
return getConversionService().convert(row, rawType);
}
@@ -170,7 +170,7 @@ public void write(Object source, OutboundRow sink) {

Class<?> userClass = ClassUtils.getUserClass(source);

Optional<Class<?>> customTarget = getConversions().getCustomWriteTarget(userClass, OutboundRow.class);
Optional<Class<?>> customTarget = getCustomConversions().getCustomWriteTarget(userClass, OutboundRow.class);
if (customTarget.isPresent()) {

OutboundRow result = getConversionService().convert(source, OutboundRow.class);
@@ -212,7 +212,7 @@ private void writeProperties(OutboundRow sink, RelationalPersistentEntity<?> ent
continue;
}

if (getConversions().isSimpleType(value.getClass())) {
if (getCustomConversions().isSimpleType(value.getClass())) {
writeSimpleInternal(sink, value, isNew, property);
} else {
writePropertyInternal(sink, value, isNew, property);
@@ -286,7 +286,7 @@ private List<Object> writeCollectionInternal(Collection<?> source, @Nullable Typ

Class<?> elementType = element == null ? null : element.getClass();

if (elementType == null || getConversions().isSimpleType(elementType)) {
if (elementType == null || getCustomConversions().isSimpleType(elementType)) {
collection.add(getPotentiallyConvertedSimpleWrite(element,
componentType != null ? componentType.getType() : Object.class));
} else if (element instanceof Collection || elementType.isArray()) {
@@ -306,7 +306,7 @@ private void writeNullInternal(OutboundRow sink, RelationalPersistentProperty pr

private Class<?> getPotentiallyConvertedSimpleNullType(Class<?> type) {

Optional<Class<?>> customTarget = getConversions().getCustomWriteTarget(type);
Optional<Class<?>> customTarget = getCustomConversions().getCustomWriteTarget(type);

if (customTarget.isPresent()) {
return customTarget.get();
@@ -353,7 +353,7 @@ private Object getPotentiallyConvertedSimpleWrite(@Nullable Object value, Class<
}
}

Optional<Class<?>> customTarget = getConversions().getCustomWriteTarget(value.getClass());
Optional<Class<?>> customTarget = getCustomConversions().getCustomWriteTarget(value.getClass());

if (customTarget.isPresent()) {
return getConversionService().convert(value, customTarget.get());
@@ -393,7 +393,7 @@ public Object getArrayValue(ArrayColumns arrayColumns, RelationalPersistentPrope
@Override
public Class<?> getTargetType(Class<?> valueType) {

Optional<Class<?>> writeTarget = getConversions().getCustomWriteTarget(valueType);
Optional<Class<?>> writeTarget = getCustomConversions().getCustomWriteTarget(valueType);

return writeTarget.orElseGet(() -> {
return Enum.class.isAssignableFrom(valueType) ? String.class : valueType;
@@ -402,7 +402,7 @@ public Class<?> getTargetType(Class<?> valueType) {

@Override
public boolean isSimpleType(Class<?> type) {
return getConversions().isSimpleType(type);
return getCustomConversions().isSimpleType(type);
}

// ----------------------------------
@@ -852,7 +852,7 @@ public <T> RowsFetchSpec<T> getRowsFetchSpec(DatabaseClient.GenericExecuteSpec e

// Bridge-code: Consider Converter<Row, T> until we have fully migrated to RowDocument
if (converter instanceof AbstractRelationalConverter relationalConverter
&& relationalConverter.getConversions().hasCustomReadTarget(Row.class, resultType)) {
&& relationalConverter.getCustomConversions().hasCustomReadTarget(Row.class, resultType)) {

ConversionService conversionService = relationalConverter.getConversionService();
rowMapper = (row, rowMetadata) -> (T) conversionService.convert(row, resultType);
@@ -18,6 +18,7 @@
import static org.assertj.core.api.Assertions.*;
import static org.mockito.Mockito.*;

import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;

import org.springframework.beans.factory.ListableBeanFactory;
@@ -26,6 +27,8 @@
import org.springframework.data.r2dbc.repository.R2dbcRepository;
import org.springframework.data.spel.EvaluationContextProvider;
import org.springframework.data.spel.ReactiveExtensionAwareEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveExtensionAwareQueryMethodEvaluationContextProvider;
import org.springframework.data.repository.query.ReactiveQueryMethodEvaluationContextProvider;
import org.springframework.r2dbc.core.DatabaseClient;
import org.springframework.test.util.ReflectionTestUtils;

@@ -56,7 +59,5 @@ void shouldConfigureReactiveExtensionAwareQueryMethodEvaluationContextProvider()

static class Person {}

interface PersonRepository extends R2dbcRepository<Person, String>

{}
interface PersonRepository extends R2dbcRepository<Person, String> {}
}
@@ -89,7 +89,7 @@ public ConversionService getConversionService() {
return conversionService;
}

public CustomConversions getConversions() {
public CustomConversions getCustomConversions() {
Member:
This is a breaking change.

Contributor Author:
I agree. I just renamed it for readability, so it is not confused with ConversionService in the code. Do you think it should be rolled back, or will this PR go into the next major release so the rename can wait?

Member:
Please do not introduce breaking changes if you're looking for inclusion in a minor release.

return conversions;
}
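
For reference on the breaking-change discussion above, a sketch of what a non-breaking rename could look like; this is purely illustrative and not something the PR currently does (the diff simply renames the accessor):

```java
import org.springframework.data.convert.CustomConversions;

// Illustrative only: keep the old accessor as a deprecated delegate so that
// existing callers of getConversions() keep compiling while new code moves to
// getCustomConversions().
class RenameCompatibilitySketch {

	private final CustomConversions conversions;

	RenameCompatibilitySketch(CustomConversions conversions) {
		this.conversions = conversions;
	}

	public CustomConversions getCustomConversions() {
		return conversions;
	}

	/**
	 * @deprecated use {@link #getCustomConversions()} instead.
	 */
	@Deprecated
	public CustomConversions getConversions() {
		return getCustomConversions();
	}
}
```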
