Trying to Run Dummy Application on Databricks with Unity Catalog #1219

@davinder-veeam

Description

I have a use case that requires Databricks service credentials, which are only available if I use the Unity Catalog feature on a Spark compute cluster by setting data_security_mode to DATA_SECURITY_MODE_AUTO and moving my application artifact and the Spark runner jar to a Volume in the catalog.
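
For reference, the dummy application is just the standard Microsoft.Spark bootstrap; a minimal sketch of what it does is below (the Dummy.Example namespace matches the executable name in the log, while the app name string is only a placeholder). As the stack trace further down shows, the failure happens on the very first JvmBridge call, inside SparkSession.Builder(), before any application logic runs.

```csharp
using Microsoft.Spark.Sql;

namespace Dummy.Example
{
    class Program
    {
        static void Main(string[] args)
        {
            // SparkSession.Builder() is the first call that goes through the JvmBridge;
            // this is where "Connection refused 127.0.0.1:<port>" surfaces on the UC cluster.
            SparkSession spark = SparkSession
                .Builder()
                .AppName("dummy-example") // placeholder app name
                .GetOrCreate();

            // Trivial query just to exercise the session.
            DataFrame df = spark.Sql("SELECT 1 AS id");
            df.Show();

            spark.Stop();
        }
    }
}
```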

  • With a Unity Catalog enabled cluster (some internal components fail to connect to the JVM bridge):
25/08/25 12:46:25 INFO DatabricksEdgeConfigs: serverlessEnabled : false
25/08/25 12:46:25 INFO DatabricksEdgeConfigs: perfPackEnabled : false
25/08/25 12:46:25 INFO DatabricksEdgeConfigs: classicSqlEnabled : false
25/08/25 12:46:26 INFO Log4jUsageLogger: clusterRuntimeMode=1.0, tags=List(classicSqlEnabled=false, perfPackEnabled=false, serverlessEnabled=false), blob=null
25/08/25 12:46:26 INFO DatabricksEdgeConfigs: spark.databricks.test.default.enabled : false
25/08/25 12:46:26 INFO DotnetRunner: Copying user file /Volumes/davinder_dbx_poc/poc-infra-schema/poc-infra-volume/example.zip to /home/spark-6fda0355-3cbf-4d3a-b821-fd
25/08/25 12:46:26 INFO Utils: Copying /Volumes/davinder_dbx_poc/poc-infra-schema/poc-infra-volume/example.zip to /home/spark-6fda0355-3cbf-4d3a-b821-fd/example.zip
25/08/25 12:46:34 INFO DotnetRunner: Unzipping .NET driver example.zip to /home/spark-6fda0355-3cbf-4d3a-b821-fd/example
25/08/25 12:46:35 INFO DotnetRunner: Starting DotnetBackend with /home/spark-6fda0355-3cbf-4d3a-b821-fd/example/Dummy.Example.
25/08/25 12:46:35 INFO DotnetBackend: The number of DotnetBackend threads is set to 10.
25/08/25 12:46:36 INFO DotnetRunner: Port number used by DotnetBackend is 42775


[2025-08-25T12:46:36.7138160Z] [0822-145301-0m2lrhde-10-139-64-4] [Info] [ConfigurationService] Using port 42775 for connection.
[2025-08-25T12:46:36.7263562Z] [0822-145301-0m2lrhde-10-139-64-4] [Info] [JvmBridge] JvMBridge port is 42775
[2025-08-25T12:46:36.7280276Z] [0822-145301-0m2lrhde-10-139-64-4] [Info] [JvmBridge] The number of JVM backend thread is set to 10. The max number of concurrent sockets in JvmBridge is set to 7.
[2025-08-25T12:46:37.8974133Z] [0822-145301-0m2lrhde-10-139-64-4] [Exception] [JvmBridge] Connection refused 127.0.0.1:42775
   at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)
   at System.Net.Sockets.Socket.Connect(EndPoint remoteEP)
   at Microsoft.Spark.Network.DefaultSocketWrapper.Connect(IPAddress remoteaddr, Int32 port, String secret) in /_/src/csharp/Microsoft.Spark/Network/DefaultSocketWrapper.cs:line 75
   at Microsoft.Spark.Interop.Ipc.JvmBridge.GetConnection() in /_/src/csharp/Microsoft.Spark/Interop/Ipc/JvmBridge.cs:line 82
   at Microsoft.Spark.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] args) in /_/src/csharp/Microsoft.Spark/Interop/Ipc/JvmBridge.cs:line 224
Unhandled exception. System.Net.Sockets.SocketException (111): Connection refused 127.0.0.1:42775
   at System.Net.Sockets.Socket.DoConnect(EndPoint endPointSnapshot, SocketAddress socketAddress)
   at System.Net.Sockets.Socket.Connect(EndPoint remoteEP)
   at Microsoft.Spark.Network.DefaultSocketWrapper.Connect(IPAddress remoteaddr, Int32 port, String secret) in /_/src/csharp/Microsoft.Spark/Network/DefaultSocketWrapper.cs:line 75
   at Microsoft.Spark.Interop.Ipc.JvmBridge.GetConnection() in /_/src/csharp/Microsoft.Spark/Interop/Ipc/JvmBridge.cs:line 82
   at Microsoft.Spark.Interop.Ipc.JvmBridge.CallJavaMethod(Boolean isStatic, Object classNameOrJvmObjectReference, String methodName, Object[] args) in /_/src/csharp/Microsoft.Spark/Interop/Ipc/JvmBridge.cs:line 224
   at Microsoft.Spark.Interop.Ipc.JvmBridge.CallStaticJavaMethod(String className, String methodName, Object[] args) in /_/src/csharp/Microsoft.Spark/Interop/Ipc/JvmBridge.cs:line 108
   at Microsoft.Spark.Sql.Builder..ctor() in /_/src/csharp/Microsoft.Spark/Sql/Builder.cs:line 21
   at Microsoft.Spark.Sql.SparkSession.Builder() in /_/src/csharp/Microsoft.Spark/Sql/SparkSession.cs:line 60
  • Without a Unity Catalog enabled cluster (it works perfectly fine but, as expected, cannot access the service credentials)

Any suggestions on where and how to debug this further?
