
Merged
Add dependency override for shapeless
Resolves a compiletime/runtime mismatch when Spark is in user classpath
jeremyrsmith committed Nov 6, 2024
commit 8f0ce3f72b54a29de91608db7878c27605974ba7
1 change: 1 addition & 0 deletions build.sbt
@@ -187,6 +187,7 @@ val `polynote-kernel` = project.settings(
"com.github.javaparser" % "javaparser-symbol-solver-core" % versions.javaparser,
"org.scalamock" %% "scalamock" % "4.4.0" % "test"
),
dependencyOverrides += "com.chuusai" %% "shapeless" % "2.3.2",
Collaborator:
Do we need to pick a specific version based on what matches Spark?

Contributor Author:
Yeah, I think so. I wish there were a way to tell sbt "use the version that you get from Spark, not a later version," but there isn't, as far as I know.
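Since sbt has no "match whatever Spark resolves" setting, the closest option is to name the pinned version explicitly so the override documents its own intent. A minimal sketch — the exact shapeless version bundled with a given Spark release is an assumption here; check the spark-core POM for the release you target:

```scala
// build.sbt fragment (sketch, not the PR's exact change): pin shapeless to
// the version Spark is assumed to ship, so cell compile time and cell
// runtime see the same macro implementation.
val sparkShapelessVersion = "2.3.2"  // assumed to match the targeted Spark release

dependencyOverrides += "com.chuusai" %% "shapeless" % sparkShapelessVersion
```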

Collaborator:
Is there no "provided"-style dependency? compileOnly or whatever?
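For context, a Provided dependency in sbt is compiled against but excluded from the assembly, on the expectation that the deployment environment supplies it at runtime. A sketch, assuming a `versions.spark` key alongside the build's existing `versions` object:

```scala
// build.sbt fragment (sketch): Provided scope means "compile against this,
// but don't bundle it" -- the runtime environment (e.g. a Spark
// distribution) must put the artifact on the classpath itself.
libraryDependencies += "org.apache.spark" %% "spark-core" % versions.spark % Provided
```

As the reply below notes, this controls what gets bundled, not which of two copies already on the classpath wins.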

Contributor Author:
There is provided... but I don't see how that would solve this issue?

Contributor Author:
Oh, by "compiletime/runtime" I mean cell compile time and cell runtime. The issue is that the shapeless in polynote-kernel-assembly (which needs to be there in the absence of Spark) gets used during compile time for some reason, and its macros are what get executed. But during runtime, Spark's shapeless gets used, which doesn't have the code that the macro referred to.
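A quick way to confirm which copy of a class wins at runtime is to ask the JVM where it was loaded from. A diagnostic sketch (the helper is hypothetical, not part of the PR; in a notebook cell you would pass a shapeless class such as `classOf[shapeless.Generic[_]]`):

```scala
// Hypothetical diagnostic helper: report the jar a class was loaded from,
// to see whether the kernel assembly's shapeless or Spark's shapeless is
// on the runtime classpath.
object ClasspathOrigin {
  def originOf(cls: Class[_]): String =
    Option(cls.getProtectionDomain.getCodeSource) // null for bootstrap classes
      .map(_.getLocation.toString)
      .getOrElse("<bootstrap or unknown>")

  def main(args: Array[String]): Unit =
    // In a real session you'd pass a shapeless class; String is loaded by
    // the bootstrap loader and so has no code source.
    println(originOf(classOf[String]))
}
```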

distFiles := Seq(assembly.value) ++ (Compile / dependencyClasspath).value.collect {
case jar if jar.data.name.matches(".*scala-(library|reflect|compiler|collection-compat|xml).*") => jar.data
},