[SPARK-49739][SQL] Throw Custom error condition for invalid options in SaveIntoDataSourceCommand class #48188

Open · wants to merge 1 commit into base: master
5 changes: 5 additions & 0 deletions common/utils/src/main/resources/error/error-conditions.json
@@ -2752,6 +2752,11 @@
"message" : [
"A type of keys and values in `map()` must be string, but got <mapType>."
]
},
"NULL_VALUE" : {
Member
How about reusing the existing error condition NULL_DATA_SOURCE_OPTION?

Contributor Author
I actually missed this condition; it makes more sense.
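
A minimal sketch of what reusing that condition might look like in run() (hypothetical; assumes NULL_DATA_SOURCE_OPTION takes an <option> message parameter naming the offending key):

    // Hypothetical alternative to the new INVALID_OPTIONS.NULL_VALUE condition:
    // reuse NULL_DATA_SOURCE_OPTION and report the first key with a null value.
    options.collectFirst { case (key, value) if value == null => key }.foreach { key =>
      throw new SparkIllegalArgumentException(
        errorClass = "NULL_DATA_SOURCE_OPTION",
        messageParameters = Map("option" -> key))
    }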

"message" : [
"Values of keys and values in `map()` must be string, but got null in value."
Member
Users can build option values dynamically, and it may not be clear which one is null. Let's help them troubleshoot by providing the key that has the null value.
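
If the new condition is kept instead of being dropped in favor of NULL_DATA_SOURCE_OPTION, its message could name the key; a hypothetical variant with a <nullKey> parameter:

    "NULL_VALUE" : {
      "message" : [
        "Option <nullKey> must not have a null value."
      ]
    }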

]
}
},
"sqlState" : "42K06"
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/SaveIntoDataSourceCommand.scala
@@ -19,7 +19,7 @@ package org.apache.spark.sql.execution.datasources

import scala.util.control.NonFatal

import org.apache.spark.SparkThrowable
import org.apache.spark.{SparkIllegalArgumentException, SparkThrowable}
import org.apache.spark.sql.{Dataset, Row, SaveMode, SparkSession}
import org.apache.spark.sql.catalyst.plans.QueryPlan
import org.apache.spark.sql.catalyst.plans.logical.{CTEInChildren, CTERelationDef, LogicalPlan, WithCTE}
@@ -46,6 +46,10 @@ case class SaveIntoDataSourceCommand(
override def innerChildren: Seq[QueryPlan[_]] = Seq(query)

override def run(sparkSession: SparkSession): Seq[Row] = {
if (options.values.exists(_ == null)) {
Member
How about keys? Should we check them too?

Contributor Author
No, the keys cannot be null since Map[String, String] can't accept null keys.

throw new SparkIllegalArgumentException(
errorClass = "INVALID_OPTIONS.NULL_VALUE")
}
var relation: BaseRelation = null

try {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/SaveIntoDataSourceCommandSuite.scala
@@ -17,6 +17,7 @@

package org.apache.spark.sql.execution.datasources

import org.apache.spark.SparkIllegalArgumentException
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{AnalysisException, DataFrame, QueryTest, Row, SaveMode, SparkSession, SQLContext}
import org.apache.spark.sql.sources.{BaseRelation, CreatableRelationProvider, RelationProvider, TableScan}
@@ -72,6 +73,20 @@ class SaveIntoDataSourceCommandSuite extends QueryTest with SharedSparkSession {
FakeV1DataSource.data = null
}

test("Does not except null value in options") {
val invalidOptions: Map[String, String] = Map(
"key1" -> "val1",
"key2" -> null
)
val cmd = SaveIntoDataSourceCommand(null, null, invalidOptions, SaveMode.Ignore)
checkError(
exception = intercept[SparkIllegalArgumentException] {
cmd.run(spark)
},
condition = "INVALID_OPTIONS.NULL_VALUE",
parameters = Map.empty)
}
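
If the error ends up reporting the offending key, as suggested in the review comments above, the assertion would change accordingly; a hypothetical sketch assuming NULL_DATA_SOURCE_OPTION with an <option> parameter:

    // Hypothetical: assert on the condition that names the null-valued key.
    checkError(
      exception = intercept[SparkIllegalArgumentException] {
        cmd.run(spark)
      },
      condition = "NULL_DATA_SOURCE_OPTION",
      parameters = Map("option" -> "key2"))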

test("Data type support") {

val dataSource = DataSource(