Change in Spark caused the 3.1.0 CI to fail (NVIDIA#988)
* fix CI

* empty commit

Signed-off-by: Raza Jafri <rjafri@nvidia.com>

Co-authored-by: Raza Jafri <rjafri@nvidia.com>
razajafri authored Oct 20, 2020
1 parent 44b30be commit 19016e3
Showing 1 changed file with 9 additions and 2 deletions.
@@ -29,14 +29,21 @@ import org.apache.spark.sql.types.StructType
 /**
  * This class exposes the ParquetRecordMaterializer
  */
-class ParquetRecordMaterializer(parquetSchema: MessageType,
+class ParquetRecordMaterializer(
+    parquetSchema: MessageType,
     catalystSchema: StructType,
     schemaConverter: ParquetToSparkSchemaConverter,
     convertTz: Option[ZoneId],
     datetimeRebaseMode: LegacyBehaviorPolicy.Value) extends RecordMaterializer[InternalRow] {

   private val rootConverter = new ParquetRowConverter(
-    schemaConverter, parquetSchema, catalystSchema, convertTz, datetimeRebaseMode, NoopUpdater)
+    schemaConverter,
+    parquetSchema,
+    catalystSchema,
+    convertTz,
+    datetimeRebaseMode,
+    LegacyBehaviorPolicy.EXCEPTION,
+    NoopUpdater)

   override def getCurrentRecord: InternalRow = rootConverter.currentRecord
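The extra LegacyBehaviorPolicy.EXCEPTION argument lines up with a new rebase-mode constructor parameter that Spark 3.1.0 added to ParquetRowConverter (in Spark it governs rebasing of INT96 timestamps; the parameter name int96RebaseMode below is an assumption based on that Spark change, and the classes are stand-ins, not Spark's). A minimal, Spark-free sketch of why this kind of change breaks downstream callers until they pass the new argument:

```scala
// Self-contained sketch only: LegacyBehaviorPolicy mirrors the Spark
// enumeration of the same name (values EXCEPTION, LEGACY, CORRECTED);
// nothing here depends on Spark itself.
object LegacyBehaviorPolicy extends Enumeration {
  // EXCEPTION fails on values that are ambiguous between calendars;
  // LEGACY and CORRECTED each pick a calendar interpretation.
  val EXCEPTION, LEGACY, CORRECTED = Value
}

// Before the Spark change, a converter-like class took one rebase mode.
class ConverterBefore(datetimeRebaseMode: LegacyBehaviorPolicy.Value)

// After the change it takes a second mode (assumed name: int96RebaseMode),
// so every existing call site stops compiling until it supplies the new
// argument -- which is exactly the one-line fix made in this commit.
class ConverterAfter(
    datetimeRebaseMode: LegacyBehaviorPolicy.Value,
    int96RebaseMode: LegacyBehaviorPolicy.Value)

object Demo {
  def main(args: Array[String]): Unit = {
    // Analogue of the fix: pass EXCEPTION for the newly added parameter.
    val converter = new ConverterAfter(
      LegacyBehaviorPolicy.EXCEPTION,
      LegacyBehaviorPolicy.EXCEPTION)
    println(LegacyBehaviorPolicy.values.mkString(", "))
  }
}
```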