Build against specified spark-rapids-jni snapshot jar [skip ci] #5501

Merged May 19, 2022 (6 commits)
Changes from 2 commits
10 changes: 7 additions & 3 deletions build/buildall
@@ -47,8 +47,12 @@ function print_usage() {
echo " Build in parallel, N (4 by default) is passed via -P to xargs"
echo " --install"
echo "    Install the resulting jar instead of just building it"
echo "  -o=MVN_OPT, --option=MVN_OPT"
echo "    Pass additional options to the Maven build, e.g., --option='-Dcudf.version=cuda11'"
}

export MVN_OPT=${MVN_OPT:-''}

function bloopInstall() {
BLOOP_DIR="${BLOOP_DIR:-$PWD/.bloop}"
mkdir -p $BLOOP_DIR
@@ -60,7 +64,7 @@ function bloopInstall() {
mkdir -p "$bloop_config_dir"
rm -f "$bloop_config_dir"/*

mvn install ch.epfl.scala:maven-bloop_${BLOOP_SCALA_VERSION}:${BLOOP_VERSION}:bloopInstall -pl dist -am \
mvn $MVN_OPT install ch.epfl.scala:maven-bloop_${BLOOP_SCALA_VERSION}:${BLOOP_VERSION}:bloopInstall -pl dist -am \
-Dbloop.configDirectory="$bloop_config_dir" \
-DdownloadSources=true \
-Dbuildver="$bv" \
@@ -234,7 +238,7 @@ function build_single_shim() {
fi

echo "#### REDIRECTING mvn output to $LOG_FILE ####"
mvn -U "$MVN_PHASE" \
mvn $MVN_OPT -U "$MVN_PHASE" \
-DskipTests \
-Dbuildver="$BUILD_VER" \
-Drat.skip="$SKIP_CHECKS" \
@@ -270,7 +274,7 @@ time (
# a negligible increase of the build time by ~2 seconds.
joinShimBuildFrom="aggregator"
echo "Resuming from $joinShimBuildFrom build only using $BASE_VER"
mvn $FINAL_OP -rf $joinShimBuildFrom $MODULE_OPT $MVN_PROFILE_OPT $INCLUDED_BUILDVERS_OPT \
mvn $MVN_OPT $FINAL_OP -rf $joinShimBuildFrom $MODULE_OPT $MVN_PROFILE_OPT $INCLUDED_BUILDVERS_OPT \
-Dbuildver="$BASE_VER" \
-DskipTests -Dskip -Dmaven.javadoc.skip
)
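The new `-o=MVN_OPT`/`--option=MVN_OPT` flag feeds extra properties into every nested `mvn` call through the exported `MVN_OPT` variable. A minimal sketch of that argument handling (a hypothetical `parse_mvn_opt` helper standing in for buildall's real option loop, not verbatim from the script):

```shell
#!/usr/bin/env bash
# parse_mvn_opt: illustrative stand-in for buildall's option parsing.
# Captures the value of -o=/--option= into MVN_OPT, defaulting to empty.
parse_mvn_opt() {
  MVN_OPT=''
  for arg in "$@"; do
    case "$arg" in
      -o=*|--option=*) MVN_OPT="${arg#*=}" ;;
    esac
  done
  export MVN_OPT
}

parse_mvn_opt --install --option='-Dcudf.version=cuda11'
echo "would run: mvn $MVN_OPT -U package"
```

Because `MVN_OPT` is exported, downstream invocations such as `mvn $MVN_OPT install ...` in `bloopInstall` and `build_single_shim` pick the options up without further plumbing.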
8 changes: 4 additions & 4 deletions jenkins/databricks/build.sh
@@ -20,9 +20,9 @@ set -ex
SPARKSRCTGZ=$1
# version of Apache Spark we are building against
BASE_SPARK_VERSION=$2
BUILD_PROFILES=$3
MVN_OPT=$3
BASE_SPARK_VERSION_TO_INSTALL_DATABRICKS_JARS=$4
BUILD_PROFILES=${BUILD_PROFILES:-'databricks312,!snapshot-shims'}
MVN_OPT=${MVN_OPT:-''}
BASE_SPARK_VERSION=${BASE_SPARK_VERSION:-'3.1.2'}
BUILDVER=$(echo ${BASE_SPARK_VERSION} | sed 's/\.//g')db
# the version of Spark used when we install the Databricks jars in .m2
@@ -34,7 +34,7 @@ SPARK_MAJOR_VERSION_STRING=spark_${SPARK_MAJOR_VERSION_NUM_STRING}

echo "tgz is $SPARKSRCTGZ"
echo "Base Spark version is $BASE_SPARK_VERSION"
echo "build profiles $BUILD_PROFILES"
echo "mvn options $MVN_OPT"
echo "BASE_SPARK_VERSION_TO_INSTALL_DATABRICKS_JARS is $BASE_SPARK_VERSION_TO_INSTALL_DATABRICKS_JARS"

sudo apt install -y maven rsync
@@ -442,7 +442,7 @@ mvn -B install:install-file \
-Dversion=$SPARK_VERSION_TO_INSTALL_DATABRICKS_JARS \
-Dpackaging=jar

mvn -B -Ddatabricks -Dbuildver=$BUILDVER clean package -DskipTests
mvn -B -Ddatabricks -Dbuildver=$BUILDVER clean package -DskipTests $MVN_OPT

cd /home/ubuntu
tar -zcf spark-rapids-built.tgz spark-rapids
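After this change the third positional argument of `jenkins/databricks/build.sh` carries Maven options rather than build profiles. A sketch of how `$3` flows into the final package command (the tarball path and versions below are placeholder example values):

```shell
#!/usr/bin/env bash
# Simulate the script's positional-argument handling with example values:
# $1=SPARKSRCTGZ  $2=BASE_SPARK_VERSION  $3=MVN_OPT  $4=BASE_SPARK_VERSION_TO_INSTALL_DATABRICKS_JARS
set -- /tmp/spark-3.1.2.tgz 3.1.2 '-Dspark-rapids-jni.version=22.06.0-SNAPSHOT' 3.1.2

MVN_OPT=$3
MVN_OPT=${MVN_OPT:-''}                 # empty default when $3 is omitted
BUILDVER=$(echo "$2" | sed 's/\.//g')db  # e.g. 3.1.2 -> 312db

echo "mvn -B -Ddatabricks -Dbuildver=$BUILDVER clean package -DskipTests $MVN_OPT"
```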
17 changes: 10 additions & 7 deletions jenkins/spark-nightly-build.sh
@@ -22,6 +22,9 @@ set -ex
## export 'M2DIR' so that shims can get the correct Spark dependency info
export M2DIR="$WORKSPACE/.m2"

## Maven options for the build, e.g. '-Dspark-rapids-jni.version=xxx' to specify the spark-rapids-jni dependency version.
MVN_OPT=${MVN_OPT:-''}

TOOL_PL=${TOOL_PL:-"tools"}
DIST_PL="dist"
function mvnEval {
@@ -64,7 +67,7 @@ function distWithReducedPom {
;;
esac

mvn -B $mvnCmd $MVN_URM_MIRROR \
mvn $MVN_OPT -B $mvnCmd $MVN_URM_MIRROR \
-Dcuda.version=$CUDA_CLASSIFIER \
-Dmaven.repo.local=$M2DIR \
-Dfile="${DIST_FPATH}.jar" \
@@ -76,9 +79,9 @@
}

# build the Spark 2.x explain jar
mvn -B $MVN_URM_MIRROR -Dmaven.repo.local=$M2DIR -Dbuildver=24X clean install -DskipTests
mvn $MVN_OPT -B $MVN_URM_MIRROR -Dmaven.repo.local=$M2DIR -Dbuildver=24X clean install -DskipTests
[[ $SKIP_DEPLOY != 'true' ]] && \
mvn -B deploy $MVN_URM_MIRROR \
mvn $MVN_OPT -B deploy $MVN_URM_MIRROR \
-Dmaven.repo.local=$M2DIR \
-DskipTests \
-Dbuildver=24X
@@ -89,19 +92,19 @@
# Deploy jars unless SKIP_DEPLOY is 'true'

for buildver in "${SPARK_SHIM_VERSIONS[@]:1}"; do
mvn -U -B clean install -pl '!tools' $MVN_URM_MIRROR -Dmaven.repo.local=$M2DIR \
mvn $MVN_OPT -U -B clean install -pl '!tools' $MVN_URM_MIRROR -Dmaven.repo.local=$M2DIR \
-Dcuda.version=$CUDA_CLASSIFIER \
-Dbuildver="${buildver}"
distWithReducedPom "install"
[[ $SKIP_DEPLOY != 'true' ]] && \
mvn -B deploy -pl '!tools,!dist' $MVN_URM_MIRROR \
mvn $MVN_OPT -B deploy -pl '!tools,!dist' $MVN_URM_MIRROR \
-Dmaven.repo.local=$M2DIR \
-Dcuda.version=$CUDA_CLASSIFIER \
-DskipTests \
-Dbuildver="${buildver}"
done

mvn -B clean install -pl '!tools' \
mvn $MVN_OPT -B clean install -pl '!tools' \
$DIST_PROFILE_OPT \
-Dbuildver=$SPARK_BASE_SHIM_VERSION \
$MVN_URM_MIRROR \
@@ -115,7 +118,7 @@ if [[ $SKIP_DEPLOY != 'true' ]]; then
distWithReducedPom "deploy"

# this deploy includes 'tools' that is unconditionally built with Spark 3.1.1
mvn -B deploy -pl '!dist' \
mvn $MVN_OPT -B deploy -pl '!dist' \
-Dbuildver=$SPARK_BASE_SHIM_VERSION \
$MVN_URM_MIRROR -Dmaven.repo.local=$M2DIR \
-Dcuda.version=$CUDA_CLASSIFIER \
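In `spark-nightly-build.sh`, `MVN_OPT` is taken from the environment rather than a positional argument; `${MVN_OPT:-''}` supplies an explicit empty default so the later `mvn $MVN_OPT ...` lines expand to nothing when no options are set. A small demonstration of that expansion (the property value is illustrative):

```shell
#!/usr/bin/env bash
unset MVN_OPT
MVN_OPT=${MVN_OPT:-''}               # unset -> empty string
echo "default: [$MVN_OPT]"

MVN_OPT='-Dspark-rapids-jni.version=22.06.0-SNAPSHOT'
MVN_OPT=${MVN_OPT:-''}               # a caller-supplied value is preserved
echo "caller:  [$MVN_OPT]"
```

So callers can run `MVN_OPT='-Dspark-rapids-jni.version=xxx' ./jenkins/spark-nightly-build.sh` or simply invoke the script with no options at all.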
@@ -313,11 +313,18 @@ object RapidsExecutorPlugin {
* patch version then the actual patch version must be greater than or equal.
* For example, version 7.1 is not satisfied by version 7.2, but version 7.1 is satisfied by
* version 7.1.1.
* If the expected cudf version is a timestamped snapshot ('timestamp-seq'), then it is
* satisfied by the corresponding SNAPSHOT version.
* For example, version 7.1-yyyymmdd.hhmmss-seq is satisfied by version 7.1-SNAPSHOT.
*/
def cudfVersionSatisfied(expected: String, actual: String): Boolean = {
val expHyphen = if (expected.indexOf('-') >= 0) expected.indexOf('-') else expected.length
val actHyphen = if (actual.indexOf('-') >= 0) actual.indexOf('-') else actual.length
if (actual.substring(actHyphen) != expected.substring(expHyphen)) return false
if (actual.substring(actHyphen) != expected.substring(expHyphen) &&
!(actual.substring(actHyphen) == "-SNAPSHOT" &&
expected.substring(expHyphen).matches("-([0-9]{8}).([0-9]{6})-([1-9][0-9]*)"))) {
return false
}

val (expMajorMinor, expPatch) = expected.substring(0, expHyphen).split('.').splitAt(2)
val (actMajorMinor, actPatch) = actual.substring(0, actHyphen).split('.').splitAt(2)
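The relaxed check above accepts an expected timestamped snapshot qualifier (e.g. `7.1-20220101.123456-1`) when the actual cudf build reports the matching `-SNAPSHOT` qualifier. A bash sketch of just the qualifier comparison (a hypothetical `qualifier_satisfied` helper; the real check is the Scala code in `RapidsExecutorPlugin`):

```shell
#!/usr/bin/env bash
# qualifier_satisfied EXPECTED_QUALIFIER ACTUAL_QUALIFIER
# Mirrors the diff's pattern -([0-9]{8}).([0-9]{6})-([1-9][0-9]*):
# equal qualifiers always match; a timestamped expected qualifier is
# additionally satisfied by an actual -SNAPSHOT qualifier.
qualifier_satisfied() {
  local exp_q="$1" act_q="$2"
  [[ "$exp_q" == "$act_q" ]] && return 0
  [[ "$act_q" == "-SNAPSHOT" ]] &&
    [[ "$exp_q" =~ ^-[0-9]{8}\.[0-9]{6}-[1-9][0-9]*$ ]]
}

qualifier_satisfied "-20220101.123456-1" "-SNAPSHOT" && echo "satisfied"
qualifier_satisfied "-cuda11" "-SNAPSHOT" || echo "not satisfied"
```

The major/minor/patch comparison on the numeric prefix is unchanged; only the qualifier rule is loosened.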