The Databricks JDBC driver implements the JDBC interface to provide connectivity to Databricks SQL warehouses. Please refer to the Databricks documentation for more information.
Databricks JDBC is compatible with Java 11 and higher. CI testing runs on Java versions 11, 17, and 21.
Add the following dependency to your `pom.xml`:
```xml
<dependency>
  <groupId>com.databricks</groupId>
  <artifactId>databricks-jdbc</artifactId>
  <version>1.0.9-oss</version>
</dependency>
```
For applications requiring explicit dependency management, use the thin JAR variant:
```xml
<!-- Note: Available from version 1.0.10-oss onwards -->
<dependency>
  <groupId>com.databricks</groupId>
  <artifactId>databricks-jdbc-thin</artifactId>
  <version>1.0.10-oss</version>
</dependency>
```
The thin JAR contains only the driver code and declares all dependencies in its POM, enabling dependency introspection and version management.
- Clone the repository
- Run the following command:
  ```shell
  mvn clean package
  ```
- The following JAR files are generated:
  - `target/databricks-jdbc-<version>.jar` (standard JAR with bundled dependencies)
  - `target/databricks-jdbc-<version>-thin.jar` (thin JAR without dependencies)
- The test coverage report is generated in `target/site/jacoco/index.html`
To install the thin JAR locally with dependency metadata:
```shell
VERSION=$(grep -m1 '<version>' pom.xml | sed 's/.*<version>\(.*\)<\/version>.*/\1/')
mvn install:install-file -Dfile="target/databricks-jdbc-${VERSION}-thin.jar" -DpomFile=thin_public_pom.xml
```
The connection URL has the following format:

```
jdbc:databricks://<host>:<port>;transportMode=http;ssl=1;AuthMech=3;httpPath=<path>;UID=token;PWD=<token>
```
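The URL above can also be assembled programmatically. A minimal sketch in Java — the hostname, warehouse path, and token below are placeholder assumptions, not real values:

```java
// Sketch: building a Databricks JDBC URL from its parts and (optionally)
// opening a connection. All concrete values here are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;

public class ConnectExample {
    // Assemble the URL in the format documented above (AuthMech=3 = PAT auth).
    static String buildUrl(String host, int port, String httpPath, String token) {
        return "jdbc:databricks://" + host + ":" + port
                + ";transportMode=http;ssl=1;AuthMech=3"
                + ";httpPath=" + httpPath
                + ";UID=token;PWD=" + token;
    }

    public static void main(String[] args) throws Exception {
        String url = buildUrl("example.cloud.databricks.com", 443,
                "/sql/1.0/warehouses/abc123", "example-token");
        System.out.println(url);
        // Against a reachable warehouse, the connection would be opened as:
        // try (Connection conn = DriverManager.getConnection(url)) { ... }
    }
}
```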
The JDBC driver supports the following authentication methods:
Use `AuthMech=3` for personal access token authentication:

```
AuthMech=3;UID=token;PWD=<your_token>
```
Use `AuthMech=11` for OAuth2-based authentication. Several OAuth flows are supported:

Direct use of an existing OAuth token:

```
AuthMech=11;Auth_Flow=0;Auth_AccessToken=<your_access_token>
```
Configure the standard OAuth client credentials flow:

```
AuthMech=11;Auth_Flow=1;OAuth2ClientId=<client_id>;OAuth2Secret=<client_secret>
```
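Connection parameters such as these can also be passed as `java.util.Properties` rather than embedded in the URL, which keeps secrets out of connection strings. A sketch of this approach — the host, HTTP path, client ID, and secret are placeholder assumptions:

```java
// Sketch: passing OAuth client-credentials (M2M) settings via Properties
// instead of the URL. All concrete values are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class OAuthM2MExample {
    // Collect the M2M flow parameters into a Properties object.
    static Properties m2mProperties(String clientId, String clientSecret) {
        Properties props = new Properties();
        props.setProperty("AuthMech", "11");
        props.setProperty("Auth_Flow", "1");
        props.setProperty("OAuth2ClientId", clientId);
        props.setProperty("OAuth2Secret", clientSecret);
        return props;
    }

    public static void main(String[] args) throws Exception {
        String url = "jdbc:databricks://example.cloud.databricks.com:443;"
                + "transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/abc123";
        Properties props = m2mProperties("my-client-id", "my-client-secret");
        System.out.println(props);
        // With a valid service principal and a reachable warehouse:
        // try (Connection conn = DriverManager.getConnection(url, props)) { ... }
    }
}
```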
Optional parameters:

- `AzureTenantId` – Azure tenant ID for Azure Databricks (default: null). If set, the driver includes refreshed Azure Active Directory (AAD) service principal OAuth tokens with every request.
Interactive browser-based OAuth flow with PKCE:

```
AuthMech=11;Auth_Flow=2
```
Optional parameters:

- `OAuth2ClientId` – Client ID for OAuth2 (default: databricks-cli)
- `OAuth2RedirectUrlPort` – Port(s) for the redirect URL (default: 8020)
- `EnableOIDCDiscovery` – Enable OIDC discovery (default: 1)
- `OAuthDiscoveryURL` – OIDC discovery endpoint (default: /oidc/.well-known/oauth-authorization-server)
- `EnableSQLValidationForIsValid` – Enable SQL-query-based validation in `isValid()` connection checks (default: 0)
The driver supports both SLF4J and Java Util Logging (JUL) frameworks:
- SLF4J: Enable with `-Dcom.databricks.jdbc.loggerImpl=SLF4JLOGGER`
- JUL: Enable with `-Dcom.databricks.jdbc.loggerImpl=JDKLOGGER` (default)
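Since the logger is chosen through a standard JVM system property, it can also be selected programmatically before the driver is first loaded, as in this sketch:

```java
// Sketch: selecting the driver's logging framework via the
// com.databricks.jdbc.loggerImpl system property. Equivalent to passing
// -Dcom.databricks.jdbc.loggerImpl=SLF4JLOGGER on the command line; it must
// run before the driver class is loaded to take effect.
public class LoggingConfigExample {
    public static void main(String[] args) {
        System.setProperty("com.databricks.jdbc.loggerImpl", "SLF4JLOGGER");
        System.out.println(System.getProperty("com.databricks.jdbc.loggerImpl"));
    }
}
```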
For detailed logging configuration options, see Logging Documentation.
Basic test execution:

```shell
mvn test
```
Note: JDK 16 introduced a change that causes a compatibility issue with the Apache Arrow library used by the JDBC driver, so runtime errors may occur when running the driver on JDK 16 or later. To avoid these errors, start your application or driver with the following JVM option:

```
--add-opens=java.base/java.nio=org.apache.arrow.memory.core,ALL-UNNAMED
```
For more detailed information about integration tests and fake services, see Testing Documentation.
For more information, see the following resources: