forked from apache/druid
Commit
Fix flaky test in WikiParquetReaderTest.java
Showing 118 changed files with 3,433 additions and 1 deletion.
91 changes: 91 additions & 0 deletions
...RvZjZWBMvmAZavjv+Vzdg=/TEST-org.apache.druid.data.input.parquet.WikiParquetReaderTest.xml
Large diffs are not rendered by default.
11 changes: 11 additions & 0 deletions
...nsions-core/parquet-extensions/.nondex/EifZ2gTEwmY1laASNAqKzRvZjZWBMvmAZavjv+Vzdg=/config
@@ -0,0 +1,11 @@
nondexFilter=.*
nondexMode=FULL
nondexSeed=974622
nondexStart=0
nondexEnd=9223372036854775807
nondexPrintstack=false
nondexDir=/home/wenxiuw2/druid/extensions-core/parquet-extensions/.nondex
nondexJarDir=/home/wenxiuw2/druid/extensions-core/parquet-extensions/.nondex
nondexExecid=EifZ2gTEwmY1laASNAqKzRvZjZWBMvmAZavjv+Vzdg=
nondexLogging=CONFIG
test=
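The file above is a NonDex run configuration (shuffle filter, mode, seed, execution window, and output directories) recorded for one exploration run against the test. As a purely illustrative sketch — not code from this commit and not the actual fix — the snippet below shows the kind of iteration-order assumption that a NonDex run with nondexMode=FULL is designed to expose; the class name and column values are hypothetical.

// Hypothetical illustration only (not part of this commit): the kind of
// order-dependence NonDex shuffles when nondexMode=FULL.
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

public class OrderSensitiveSketch
{
  public static void main(String[] args)
  {
    Set<String> columns = new HashSet<>(Arrays.asList("page", "language", "user"));

    // Flaky pattern: HashSet iteration order is unspecified, so a test that
    // asserts on this list's order can pass or fail depending on the shuffle seed.
    List<String> asEncountered = List.copyOf(columns);
    System.out.println("unspecified order: " + asEncountered);

    // Robust pattern: impose an explicit order (or compare as sets) before asserting.
    List<String> sorted = List.copyOf(new TreeSet<>(columns));
    System.out.println("sorted order:      " + sorted);
  }
}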
Empty file.
2 changes: 2 additions & 0 deletions
...s-core/parquet-extensions/.nondex/EifZ2gTEwmY1laASNAqKzRvZjZWBMvmAZavjv+Vzdg=/invocations
@@ -0,0 +1,2 @@
COUNT:2367
SHUFFLES:2367
68 changes: 68 additions & 0 deletions
...ZjZWBMvmAZavjv+Vzdg=/org.apache.druid.data.input.parquet.WikiParquetReaderTest-output.txt
@@ -0,0 +1,68 @@
2023-11-30T22:09:03,956 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-11-30T22:09:04,334 WARN [main] org.apache.parquet.CorruptStatistics - Ignoring statistics because created_by could not be parsed (see PARQUET-251): parquet-mr (build 6aa21f8776625b5fa6b18059cfebe7549f2e00cb)
org.apache.parquet.VersionParser$VersionParseException: Could not parse created_by: parquet-mr (build 6aa21f8776625b5fa6b18059cfebe7549f2e00cb) using format: (.*?)\s+version\s*(?:([^(]*?)\s*(?:\(\s*build\s*([^)]*?)\s*\))?)?
at org.apache.parquet.VersionParser.parse(VersionParser.java:109) ~[parquet-common-1.13.0.jar:1.13.0]
at org.apache.parquet.CorruptStatistics.shouldIgnoreStatistics(CorruptStatistics.java:72) ~[parquet-column-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.fromParquetStatisticsInternal(ParquetMetadataConverter.java:814) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.fromParquetStatistics(ParquetMetadataConverter.java:834) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.buildColumnChunkMetaData(ParquetMetadataConverter.java:1502) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.fromParquetMetadata(ParquetMetadataConverter.java:1592) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.readParquetMetadata(ParquetMetadataConverter.java:1490) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:582) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetFileReader.<init>(ParquetFileReader.java:790) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetFileReader.open(ParquetFileReader.java:657) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetReader.initReader(ParquetReader.java:162) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetReader.read(ParquetReader.java:135) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.druid.data.input.parquet.ParquetReader$1.hasNext(ParquetReader.java:113) ~[classes/:?]
at org.apache.druid.java.util.common.parsers.CloseableIteratorWithMetadata$1.hasNext(CloseableIteratorWithMetadata.java:71) ~[druid-processing-28.0.0-SNAPSHOT.jar:28.0.0-SNAPSHOT]
at org.apache.druid.data.input.IntermediateRowParsingReader$1.hasNext(IntermediateRowParsingReader.java:66) ~[druid-processing-28.0.0-SNAPSHOT.jar:28.0.0-SNAPSHOT]
at java.util.Iterator.forEachRemaining(Iterator.java:132) ~[?:?]
at org.apache.druid.data.input.parquet.BaseParquetReaderTest.readAllRows(BaseParquetReaderTest.java:64) ~[test-classes/:?]
at org.apache.druid.data.input.parquet.WikiParquetReaderTest.testWiki(WikiParquetReaderTest.java:55) ~[test-classes/:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) ~[junit-4.13.2.jar:4.13.2]
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) ~[junit-4.13.2.jar:4.13.2]
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.run(ParentRunner.java:413) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.Suite.runChild(Suite.java:128) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.Suite.runChild(Suite.java:27) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.run(ParentRunner.java:413) ~[junit-4.13.2.jar:4.13.2]
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:49) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:120) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:95) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:69) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:146) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:385) ~[surefire-booter-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) ~[surefire-booter-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:507) ~[surefire-booter-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:495) ~[surefire-booter-3.1.2.jar:3.1.2]
2023-11-30T22:09:04,541 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - RecordReader initialized will read a total of 5 records.
2023-11-30T22:09:04,542 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - at row 0. reading next block
2023-11-30T22:09:04,594 INFO [main] org.apache.hadoop.io.compress.CodecPool - Got brand-new decompressor [.gz]
2023-11-30T22:09:04,601 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - block read in memory in 59 ms. row count = 5
2023-11-30T22:09:04,907 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - RecordReader initialized will read a total of 5 records.
2023-11-30T22:09:04,908 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - at row 0. reading next block
2023-11-30T22:09:04,909 INFO [main] org.apache.hadoop.io.compress.CodecPool - Got brand-new decompressor [.gz]
2023-11-30T22:09:04,913 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - block read in memory in 5 ms. row count = 5
4 changes: 4 additions & 0 deletions
...NAqKzRvZjZWBMvmAZavjv+Vzdg=/org.apache.druid.data.input.parquet.WikiParquetReaderTest.txt
@@ -0,0 +1,4 @@
-------------------------------------------------------------------------------
Test set: org.apache.druid.data.input.parquet.WikiParquetReaderTest
-------------------------------------------------------------------------------
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.374 s -- in org.apache.druid.data.input.parquet.WikiParquetReaderTest
91 changes: 91 additions & 0 deletions
...PpVGMCO6Rb3cKks2zs+jc=/TEST-org.apache.druid.data.input.parquet.WikiParquetReaderTest.xml
Large diffs are not rendered by default.
11 changes: 11 additions & 0 deletions
...nsions-core/parquet-extensions/.nondex/FbrTZyMkrGPJrvSqloW2ePpVGMCO6Rb3cKks2zs+jc=/config
@@ -0,0 +1,11 @@
nondexFilter=.*
nondexMode=FULL
nondexSeed=974622
nondexStart=0
nondexEnd=9223372036854775807
nondexPrintstack=false
nondexDir=/home/wenxiuw2/druid/extensions-core/parquet-extensions/.nondex
nondexJarDir=/home/wenxiuw2/druid/extensions-core/parquet-extensions/.nondex
nondexExecid=FbrTZyMkrGPJrvSqloW2ePpVGMCO6Rb3cKks2zs+jc=
nondexLogging=CONFIG
test=
Empty file.
2 changes: 2 additions & 0 deletions
...s-core/parquet-extensions/.nondex/FbrTZyMkrGPJrvSqloW2ePpVGMCO6Rb3cKks2zs+jc=/invocations
@@ -0,0 +1,2 @@
COUNT:2370
SHUFFLES:2370
68 changes: 68 additions & 0 deletions
...VGMCO6Rb3cKks2zs+jc=/org.apache.druid.data.input.parquet.WikiParquetReaderTest-output.txt
@@ -0,0 +1,68 @@
2023-11-30T22:02:56,371 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-11-30T22:02:57,813 WARN [main] org.apache.parquet.CorruptStatistics - Ignoring statistics because created_by could not be parsed (see PARQUET-251): parquet-mr (build 6aa21f8776625b5fa6b18059cfebe7549f2e00cb)
org.apache.parquet.VersionParser$VersionParseException: Could not parse created_by: parquet-mr (build 6aa21f8776625b5fa6b18059cfebe7549f2e00cb) using format: (.*?)\s+version\s*(?:([^(]*?)\s*(?:\(\s*build\s*([^)]*?)\s*\))?)?
at org.apache.parquet.VersionParser.parse(VersionParser.java:109) ~[parquet-common-1.13.0.jar:1.13.0]
at org.apache.parquet.CorruptStatistics.shouldIgnoreStatistics(CorruptStatistics.java:72) ~[parquet-column-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.fromParquetStatisticsInternal(ParquetMetadataConverter.java:814) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.fromParquetStatistics(ParquetMetadataConverter.java:834) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.buildColumnChunkMetaData(ParquetMetadataConverter.java:1502) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.fromParquetMetadata(ParquetMetadataConverter.java:1592) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.format.converter.ParquetMetadataConverter.readParquetMetadata(ParquetMetadataConverter.java:1490) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:582) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetFileReader.<init>(ParquetFileReader.java:790) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetFileReader.open(ParquetFileReader.java:657) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetReader.initReader(ParquetReader.java:162) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.parquet.hadoop.ParquetReader.read(ParquetReader.java:135) ~[parquet-hadoop-1.13.0.jar:1.13.0]
at org.apache.druid.data.input.parquet.ParquetReader$1.hasNext(ParquetReader.java:113) ~[classes/:?]
at org.apache.druid.java.util.common.parsers.CloseableIteratorWithMetadata$1.hasNext(CloseableIteratorWithMetadata.java:71) ~[druid-processing-28.0.0-SNAPSHOT.jar:28.0.0-SNAPSHOT]
at org.apache.druid.data.input.IntermediateRowParsingReader$1.hasNext(IntermediateRowParsingReader.java:66) ~[druid-processing-28.0.0-SNAPSHOT.jar:28.0.0-SNAPSHOT]
at java.util.Iterator.forEachRemaining(Iterator.java:132) ~[?:?]
at org.apache.druid.data.input.parquet.BaseParquetReaderTest.readAllRows(BaseParquetReaderTest.java:64) ~[test-classes/:?]
at org.apache.druid.data.input.parquet.WikiParquetReaderTest.testWiki(WikiParquetReaderTest.java:54) ~[test-classes/:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) ~[junit-4.13.2.jar:4.13.2]
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) ~[junit-4.13.2.jar:4.13.2]
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.run(ParentRunner.java:413) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.Suite.runChild(Suite.java:128) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.Suite.runChild(Suite.java:27) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) ~[junit-4.13.2.jar:4.13.2]
at org.junit.runners.ParentRunner.run(ParentRunner.java:413) ~[junit-4.13.2.jar:4.13.2]
at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:49) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:120) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:95) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:69) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:146) ~[surefire-junit47-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:385) ~[surefire-booter-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) ~[surefire-booter-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:507) ~[surefire-booter-3.1.2.jar:3.1.2]
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:495) ~[surefire-booter-3.1.2.jar:3.1.2]
2023-11-30T22:02:58,471 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - RecordReader initialized will read a total of 5 records.
2023-11-30T22:02:58,472 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - at row 0. reading next block
2023-11-30T22:02:58,583 INFO [main] org.apache.hadoop.io.compress.CodecPool - Got brand-new decompressor [.gz]
2023-11-30T22:02:58,623 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - block read in memory in 150 ms. row count = 5
2023-11-30T22:02:59,240 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - RecordReader initialized will read a total of 5 records.
2023-11-30T22:02:59,241 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - at row 0. reading next block
2023-11-30T22:02:59,242 INFO [main] org.apache.hadoop.io.compress.CodecPool - Got brand-new decompressor [.gz]
2023-11-30T22:02:59,244 INFO [main] org.apache.parquet.hadoop.InternalParquetRecordReader - block read in memory in 3 ms. row count = 5
4 changes: 4 additions & 0 deletions
...loW2ePpVGMCO6Rb3cKks2zs+jc=/org.apache.druid.data.input.parquet.WikiParquetReaderTest.txt
@@ -0,0 +1,4 @@
-------------------------------------------------------------------------------
Test set: org.apache.druid.data.input.parquet.WikiParquetReaderTest
-------------------------------------------------------------------------------
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.617 s -- in org.apache.druid.data.input.parquet.WikiParquetReaderTest
@@ -0,0 +1,4 @@
uPbJ1rMwKGejGTWUjRUmBJnepo+sLU9+fi7h2tdpmQo=
oVganU+DIvhCvtZAxjmozVpJ18XmAwP0YUmGoU9jss=
Ugy3BpooKHzl7yAzogLg533xY9TemTZrWqZaVx01nM=
clean_x52C54ffcdm77gYUXXmCE9u6gKku4vmRP4c5WfGMm4=
91 changes: 91 additions & 0 deletions
...33xY9TemTZrWqZaVx01nM=/TEST-org.apache.druid.data.input.parquet.WikiParquetReaderTest.xml
Large diffs are not rendered by default.
11 changes: 11 additions & 0 deletions
...nsions-core/parquet-extensions/.nondex/Ugy3BpooKHzl7yAzogLg533xY9TemTZrWqZaVx01nM=/config
@@ -0,0 +1,11 @@
nondexFilter=.*
nondexMode=FULL
nondexSeed=1016066
nondexStart=0
nondexEnd=9223372036854775807
nondexPrintstack=false
nondexDir=/home/wenxiuw2/druid/extensions-core/parquet-extensions/.nondex
nondexJarDir=/home/wenxiuw2/druid/extensions-core/parquet-extensions/.nondex
nondexExecid=Ugy3BpooKHzl7yAzogLg533xY9TemTZrWqZaVx01nM=
nondexLogging=CONFIG
test=
Empty file.
2 changes: 2 additions & 0 deletions
...s-core/parquet-extensions/.nondex/Ugy3BpooKHzl7yAzogLg533xY9TemTZrWqZaVx01nM=/invocations
@@ -0,0 +1,2 @@
COUNT:2366
SHUFFLES:2366