Merged
Commits
52 commits
013f144
SK-2274 test internal release
skyflow-bharti Sep 3, 2025
8bbf2d6
[AUTOMATED] Private Release 3.0.0-beta.3-dev-013f144
skyflow-bharti Sep 3, 2025
b03c677
SK-2286 V3 release/25.9.3 (#224)
skyflow-shravan Sep 9, 2025
ad33628
SK-2286 override vault url and accept skyflow creds from env (#225)
skyflow-shravan Sep 9, 2025
61d26c7
SK-2289 Fix upsert operation not working issue
skyflow-vivek Sep 10, 2025
d3257dc
Merge pull request #226 from skyflowapi/skyflow-vivek/SK-2289-fix-ups…
skyflow-vivek Sep 10, 2025
6f23911
[AUTOMATED] Private Release 2.0.0-beta.4-dev-d3257dc
skyflow-vivek Sep 10, 2025
b50c410
SK-2292 Update default concurrency limit from 10 to 1
skyflow-vivek Sep 11, 2025
33f6d97
SK-2292 fix unit tests
skyflow-bharti Sep 11, 2025
082387d
SK-2292 fix log in detokenize
skyflow-bharti Sep 11, 2025
d7908d1
SK-2292 fix log in detokenize
skyflow-bharti Sep 11, 2025
a25231a
Merge branch 'v3-release/25.9.5' into skyflow-vivek/SK-2292-update-de…
skyflow-bharti Sep 11, 2025
8e725b7
Merge pull request #232 from skyflowapi/skyflow-vivek/SK-2292-update-…
skyflow-bharti Sep 11, 2025
0186972
[AUTOMATED] Private Release 2.0.0-dev-8e725b7
skyflow-bharti Sep 11, 2025
903835b
Merge pull request #230 from skyflowapi/skyflow-vivek/SK-2292-update-…
skyflow-bharti Sep 11, 2025
7b1b3bc
[AUTOMATED] Private Release 2.0.0-dev-903835b
skyflow-bharti Sep 11, 2025
e96d8dc
Merge branch 'v3-release/25.9.2' into v3-release/25.9.5
skyflow-bharti Sep 11, 2025
ff60830
[AUTOMATED] Private Release 2.0.0-dev-e96d8dc
skyflow-bharti Sep 11, 2025
10d960f
SK-2302 add multi table insert support
skyflow-bharti Sep 19, 2025
3fa14f8
SK-2291: Add unit test cases for V3 SDK. (#234)
saileshwar-skyflow Sep 19, 2025
b7d34bb
[AUTOMATED] Private Release 2.0.0-dev-3fa14f8
saileshwar-skyflow Sep 19, 2025
ae75c8d
SK-2302 add changes for bulk async changes
skyflow-bharti Sep 19, 2025
e561622
SK-2302 updated the messages
skyflow-bharti Sep 19, 2025
a5514d3
SK-2302 updated the messages
skyflow-bharti Sep 19, 2025
c5595d0
SK-2302 updated variable names
skyflow-bharti Sep 19, 2025
01ce5e0
SK-2302 updated limit
skyflow-bharti Sep 19, 2025
ae9f2f4
[AUTOMATED] Private Release 2.0.0-dev-01ce5e0
skyflow-bharti Sep 19, 2025
e0db412
Merge branch 'v3-release/25.9.5' into v3-release/25.9.11
skyflow-bharti Sep 19, 2025
72916bd
[AUTOMATED] Private Release 2.0.0-dev-e0db412
skyflow-bharti Sep 19, 2025
7d67954
Merge branch 'origin/v3-release/25.9.11' into SK-2302-multi-table-ins…
skyflow-vivek Sep 19, 2025
4ecc162
SK-2302 Fix unit test
skyflow-vivek Sep 19, 2025
e9d055a
Merge pull request #242 from skyflowapi/SK-2302-multi-table-insert-su…
skyflow-bharti Sep 19, 2025
ff35817
[AUTOMATED] Private Release 2.0.0-dev-e9d055a
skyflow-bharti Sep 19, 2025
6506b91
SK-2302 fixed error index
skyflow-bharti Sep 19, 2025
937cc18
Merge pull request #244 from skyflowapi/SK-2302-multi-table-insert-su…
skyflow-bharti Sep 19, 2025
2cc408f
[AUTOMATED] Private Release 2.0.0-dev-937cc18
skyflow-bharti Sep 19, 2025
1cc246e
SK-2302 fixed detokenize token redaction
skyflow-bharti Sep 19, 2025
b40c279
SK-2302 Update Concurrency values for insert and detokenize
skyflow-vivek Sep 20, 2025
e6833d8
[AUTOMATED] Private Release 2.0.0-dev-b40c279
skyflow-vivek Sep 20, 2025
3a80c07
SK-2302 add unit tests
skyflow-bharti Sep 22, 2025
4401e48
[AUTOMATED] Private Release 2.0.0-dev-3a80c07
skyflow-bharti Sep 22, 2025
1aa170f
Merge branch 'v3-release/25.9.11' into SK-2302-multi-table-insert-sup…
skyflow-bharti Sep 22, 2025
a742b35
SK-2302 remove pom chnages and added samples for multi table insert s…
skyflow-bharti Sep 22, 2025
7b83641
[AUTOMATED] Private Release 2.0.0-dev-a742b35
skyflow-bharti Sep 22, 2025
24cb613
SK-2302 changed Update enum to Upsert
skyflow-bharti Sep 22, 2025
339d8f7
Merge branch 'v3-release/25.9.11' into SK-2302-multi-table-insert-sup…
skyflow-bharti Sep 22, 2025
c3f911e
Merge pull request #245 from skyflowapi/SK-2302-multi-table-insert-su…
skyflow-bharti Sep 22, 2025
0537764
[AUTOMATED] Private Release 2.0.0-dev-c3f911e
skyflow-bharti Sep 22, 2025
6993fc0
SK-2302 updated the samples
skyflow-bharti Sep 22, 2025
fa3042a
[AUTOMATED] Private Release 2.0.0-dev-6993fc0
skyflow-bharti Sep 22, 2025
baba121
SK-2305 Fix few small behaviour inconsistencies
skyflow-vivek Sep 22, 2025
224b191
[AUTOMATED] Private Release 2.0.0-dev-baba121
skyflow-vivek Sep 22, 2025
10 changes: 9 additions & 1 deletion common/src/main/java/com/skyflow/errors/ErrorMessage.java
@@ -10,7 +10,10 @@ public enum ErrorMessage {
ConnectionIdAlreadyInConfigList("%s0 Validation error. ConnectionId is present in an existing config. Specify a connectionId in config."),
ConnectionIdNotInConfigList("%s0 Validation error. ConnectionId is missing from the config. Specify the connectionIds from configs."),
EmptyCredentials("%s0 Validation error. Invalid credentials. Credentials must not be empty."),

TableSpecifiedInRequestAndRecordObject("%s0 Validation error. Table name cannot be specified at both the request and record levels. Please specify the table name in only one place."),
UpsertTableRequestAtRecordLevel("%s0 Validation error. Table name should be present at each record level when upsert is present at record level."),
UpsertTableRequestAtRequestLevel("%s0 Validation error. Upsert should be present at each record level when table name is present at record level."),
TableNotSpecifiedInRequestAndRecordObject("%s0 Validation error. Table name is missing. Table name should be specified at one place either at the request level or record level. Please specify the table name at one place."),
// Vault config
InvalidVaultId("%s0 Initialization failed. Invalid vault ID. Specify a valid vault ID."),
EmptyVaultId("%s0 Initialization failed. Invalid vault ID. Vault ID must not be empty."),
@@ -60,6 +63,10 @@ public enum ErrorMessage {
TableKeyError("%s0 Validation error. 'table' key is missing from the payload. Specify a 'table' key."),
EmptyTable("%s0 Validation error. 'table' can't be empty. Specify a table."),
ValuesKeyError("%s0 Validation error. 'values' key is missing from the payload. Specify a 'values' key."),
EmptyRecords("%s0 Validation error. 'records' can't be empty. Specify records."),
EmptyKeyInRecords("%s0 Validation error. Invalid key in data in records. Specify a valid key."),
EmptyValueInRecords("%s0 Validation error. Invalid value in records. Specify a valid value."),
RecordsKeyError("%s0 Validation error. 'records' key is missing from the payload. Specify a 'records' key."),
EmptyValues("%s0 Validation error. 'values' can't be empty. Specify values."),
EmptyKeyInValues("%s0 Validation error. Invalid key in values. Specify a valid key."),
EmptyValueInValues("%s0 Validation error. Invalid value in values. Specify a valid value."),
@@ -164,6 +171,7 @@ public enum ErrorMessage {
NullRedactionInTokenGroup("%s0 Validation error. Redaction in TokenGroupRedactions is null or empty. Specify a valid redaction."),

NullTokenGroupNameInTokenGroup("%s0 Validation error. TokenGroupName in TokenGroupRedactions is null or empty. Specify a valid tokenGroupName."),
InvalidRecord("%s0 Validation error. InsertRecord object in the list is invalid. Specify a valid InsertRecord object."),
;
private final String message;
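The new validation messages encode the table-placement rules for multi-table insert: the table name must be supplied exactly once, either on the request or on every record, and record-level upsert settings must come with record-level table names. A minimal sketch of the combinations, assuming the InsertRequest and InsertRecord builders shown in the samples further down this diff; the table and column names below are placeholders, not values from the PR:

import com.skyflow.vault.data.InsertRecord;
import com.skyflow.vault.data.InsertRequest;

import java.util.ArrayList;
import java.util.HashMap;

public class TablePlacementSketch {
    public static void main(String[] args) {
        HashMap<String, Object> data = new HashMap<>();
        data.put("card_number", "4111111111111111");

        // Valid: table at the request level only (single-table insert).
        ArrayList<InsertRecord> plainRecords = new ArrayList<>();
        plainRecords.add(InsertRecord.builder().data(data).build());
        InsertRequest requestLevelTable = InsertRequest.builder()
                .table("customers")
                .records(plainRecords)
                .build();

        // Valid: table on each record, none on the request (multi-table insert).
        ArrayList<InsertRecord> tabledRecords = new ArrayList<>();
        tabledRecords.add(InsertRecord.builder().data(data).table("customers").build());
        InsertRequest recordLevelTable = InsertRequest.builder()
                .records(tabledRecords)
                .build();

        // Invalid: table on both levels   -> TableSpecifiedInRequestAndRecordObject.
        // Invalid: table on neither level -> TableNotSpecifiedInRequestAndRecordObject.
        // Invalid: record-level upsert without a record-level table -> UpsertTableRequestAtRecordLevel.
    }
}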
10 changes: 8 additions & 2 deletions common/src/main/java/com/skyflow/logs/ErrorLogs.java
@@ -51,6 +51,9 @@ public enum ErrorLogs {
EMPTY_TABLE_NAME("Invalid %s1 request. Table name can not be empty."),
VALUES_IS_REQUIRED("Invalid %s1 request. Values are required."),
EMPTY_VALUES("Invalid %s1 request. Values can not be empty."),
RECORDS_IS_REQUIRED("Invalid %s1 request. Records are required."),
EMPTY_RECORDS("Invalid %s1 request. Records can not be empty."),
INVALID_RECORD("Invalid %s1 request. Invalid record. Specify a valid record."),
RECORD_SIZE_EXCEED("Maximum number of records exceeded. The limit is 10000."),
TOKENS_SIZE_EXCEED("Maximum number of tokens exceeded. The limit is 10000."),
EMPTY_OR_NULL_VALUE_IN_VALUES("Invalid %s1 request. Value can not be null or empty in values for key \"%s2\"."),
@@ -143,8 +146,11 @@ public enum ErrorLogs {

UNEXPECTED_ERROR_DURING_BATCH_PROCESSING("Unexpected error occurred during batch processing. Error: %s1"),

PROCESSING_ERROR_RESPONSE("Processing error response.");

PROCESSING_ERROR_RESPONSE("Processing error response."),
TABLE_SPECIFIED_AT_BOTH_PLACE("Invalid %s1 request. Table name cannot be specified at both the request and record levels. Please specify the table name at only one place."),
TABLE_NOT_SPECIFIED_AT_BOTH_PLACE("Invalid %s1 request. Table name is missing. Table name should be specified at one place either at the request level or record level. Please specify the table name at one place."),
UPSERT_TABLE_REQUEST_AT_RECORD_LEVEL("Invalid %s1 request. Table name should be present at each record level when upsert is present at record level."),
UPSERT_TABLE_REQUEST_AT_REQUEST_LEVEL("Invalid %s1 request. Upsert should be present at each record level when table name is present at record level.");
private final String log;

ErrorLogs(String log) {
Original file line number Diff line number Diff line change
@@ -59,7 +59,7 @@ private static V1GetAuthTokenResponse generateBearerTokenFromCredentials(
FileReader reader = new FileReader(String.valueOf(credentialsFile));
JsonObject serviceAccountCredentials = JsonParser.parseReader(reader).getAsJsonObject();
return getBearerTokenFromCredentials(serviceAccountCredentials, context, roles);
} catch (JsonSyntaxException e) {
} catch (JsonSyntaxException | IllegalStateException e) {
LogUtil.printErrorLog(ErrorLogs.INVALID_CREDENTIALS_FILE_FORMAT.getLog());
throw new SkyflowException(ErrorCode.INVALID_INPUT.getCode(), BaseUtils.parameterizedString(
ErrorMessage.FileInvalidJson.getMessage(), credentialsFile.getPath()));
@@ -81,7 +81,7 @@ private static V1GetAuthTokenResponse generateBearerTokenFromCredentialString(
}
JsonObject serviceAccountCredentials = JsonParser.parseString(credentials).getAsJsonObject();
return getBearerTokenFromCredentials(serviceAccountCredentials, context, roles);
} catch (JsonSyntaxException e) {
} catch (JsonSyntaxException | IllegalStateException e) {
LogUtil.printErrorLog(ErrorLogs.INVALID_CREDENTIALS_STRING_FORMAT.getLog());
throw new SkyflowException(ErrorCode.INVALID_INPUT.getCode(),
ErrorMessage.CredentialsStringInvalidJson.getMessage());
Original file line number Diff line number Diff line change
@@ -56,7 +56,7 @@ private static List<SignedDataTokenResponse> generateSignedTokenFromCredentialsF
FileReader reader = new FileReader(String.valueOf(credentialsFile));
JsonObject serviceAccountCredentials = JsonParser.parseReader(reader).getAsJsonObject();
responseToken = generateSignedTokensFromCredentials(serviceAccountCredentials, dataTokens, timeToLive, context);
} catch (JsonSyntaxException e) {
} catch (JsonSyntaxException | IllegalStateException e) {
LogUtil.printErrorLog(ErrorLogs.INVALID_CREDENTIALS_FILE_FORMAT.getLog());
throw new SkyflowException(ErrorCode.INVALID_INPUT.getCode(), BaseUtils.parameterizedString(
ErrorMessage.FileInvalidJson.getMessage(), credentialsFile.getPath()));
@@ -80,7 +80,7 @@ private static List<SignedDataTokenResponse> generateSignedTokensFromCredentials
}
JsonObject serviceAccountCredentials = JsonParser.parseString(credentials).getAsJsonObject();
responseToken = generateSignedTokensFromCredentials(serviceAccountCredentials, dataTokens, timeToLive, context);
} catch (JsonSyntaxException e) {
} catch (JsonSyntaxException | IllegalStateException e) {
LogUtil.printErrorLog(ErrorLogs.INVALID_CREDENTIALS_STRING_FORMAT.getLog());
throw new SkyflowException(ErrorCode.INVALID_INPUT.getCode(),
ErrorMessage.CredentialsStringInvalidJson.getMessage());
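Catching IllegalStateException alongside JsonSyntaxException covers credentials that are valid JSON but not a JSON object: JsonParser.parseString (or parseReader) succeeds, and it is the subsequent getAsJsonObject() call that throws IllegalStateException. A small illustration of that failure mode, using an assumed inline input rather than a real credentials file:

import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import com.google.gson.JsonSyntaxException;

public class CredentialsParsingSketch {
    public static void main(String[] args) {
        try {
            // "[1, 2, 3]" parses as valid JSON, but it is an array, so
            // getAsJsonObject() throws IllegalStateException, not JsonSyntaxException.
            JsonObject credentials = JsonParser.parseString("[1, 2, 3]").getAsJsonObject();
            System.out.println(credentials);
        } catch (JsonSyntaxException | IllegalStateException e) {
            // With the widened catch, both failure modes surface as an
            // invalid-credentials-format error instead of an unhandled exception.
            System.out.println("Invalid credentials format: " + e);
        }
    }
}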
11 changes: 0 additions & 11 deletions common/src/main/java/com/skyflow/vault/data/BaseInsertRequest.java
@@ -14,15 +14,9 @@ public String getTable() {
return this.builder.table;
}

public ArrayList<HashMap<String, Object>> getValues() {
return this.builder.values;
}

static class BaseInsertRequestBuilder {
protected String table;
protected ArrayList<HashMap<String, Object>> values;
protected String upsert;

protected BaseInsertRequestBuilder() {
}

@@ -31,11 +25,6 @@ public BaseInsertRequestBuilder table(String table) {
return this;
}

public BaseInsertRequestBuilder values(ArrayList<HashMap<String, Object>> values) {
this.values = values;
return this;
}

}
}

41 changes: 29 additions & 12 deletions samples/src/main/java/com/example/vault/BulkInsertAsync.java
@@ -5,11 +5,14 @@
import com.skyflow.config.VaultConfig;
import com.skyflow.enums.Env;
import com.skyflow.enums.LogLevel;
import com.skyflow.enums.UpdateType;
import com.skyflow.vault.data.InsertRecord;
import com.skyflow.vault.data.InsertRequest;
import com.skyflow.vault.data.InsertResponse;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;

@@ -44,24 +47,38 @@ public static void main(String[] args) {
.build();

// Step 4: Prepare first record for insertion
HashMap<String, Object> record1 = new HashMap<>();
record1.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
record1.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");
HashMap<String, Object> recordData1 = new HashMap<>();
recordData1.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
recordData1.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");

InsertRecord insertRecord1 = InsertRecord
.builder()
.data(recordData1)
.build();

// Step 5: Prepare second record for insertion
HashMap<String, Object> record2 = new HashMap<>();
record2.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
record2.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");
HashMap<String, Object> recordData2 = new HashMap<>();
recordData2.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
recordData2.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");

InsertRecord insertRecord2 = InsertRecord
.builder()
.data(recordData2)
.build();

// Step 6: Combine records into a single list
ArrayList<HashMap<String, Object>> values = new ArrayList<>();
values.add(record1);
values.add(record2);
// Step 6: Combine records into an InsertRecord list
ArrayList<InsertRecord> insertRecords = new ArrayList<>();
insertRecords.add(insertRecord1);
insertRecords.add(insertRecord2);

// Step 7: Build the insert request with table name and values
List<String> upsertColumns = new ArrayList<>();
upsertColumns.add("<YOUR_COLUMN_NAME_1>");
// Step 7: Build the insert request with the table name, upsert columns, and insert records
InsertRequest request = InsertRequest.builder()
.table("<YOUR_TABLE_NAME>")
.values(values)
.upsert(upsertColumns)
.upsertType(UpdateType.REPLACE)
.records(insertRecords)
.build();

// Step 8: Execute the async bulk insert operation and handle response using callbacks
41 changes: 29 additions & 12 deletions samples/src/main/java/com/example/vault/BulkInsertSync.java
@@ -5,12 +5,14 @@
import com.skyflow.config.VaultConfig;
import com.skyflow.enums.Env;
import com.skyflow.enums.LogLevel;
import com.skyflow.enums.UpdateType;
import com.skyflow.errors.SkyflowException;
import com.skyflow.vault.data.InsertRecord;
import com.skyflow.vault.data.InsertRequest;
import com.skyflow.vault.data.InsertResponse;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

/**
* This sample demonstrates how to perform a synchronous bulk insert operation using the Skyflow Java SDK.
@@ -43,24 +45,39 @@ public static void main(String[] args) {
.build();

// Step 4: Prepare first record for insertion
HashMap<String, Object> record1 = new HashMap<>();
record1.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
record1.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");
HashMap<String, Object> recordData1 = new HashMap<>();
recordData1.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
recordData1.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");

InsertRecord insertRecord1 = InsertRecord
.builder()
.data(recordData1)
.build();

// Step 5: Prepare second record for insertion
HashMap<String, Object> record2 = new HashMap<>();
record2.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
record2.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");
HashMap<String, Object> recordData2 = new HashMap<>();
recordData2.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
recordData2.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");

InsertRecord insertRecord2 = InsertRecord
.builder()
.data(recordData2)
.build();

// Step 6: Combine records into an InsertRecord list
ArrayList<InsertRecord> insertRecords = new ArrayList<>();
insertRecords.add(insertRecord1);
insertRecords.add(insertRecord2);

// Step 6: Combine records into a single list
ArrayList<HashMap<String, Object>> values = new ArrayList<>();
values.add(record1);
values.add(record2);
List<String> upsertColumns = new ArrayList<>();
upsertColumns.add("<YOUR_COLUMN_NAME_1>");

// Step 7: Build the insert request with table name and values
// Step 7: Build the insert request with the table name, upsert columns, and insert records
InsertRequest request = InsertRequest.builder()
.table("<YOUR_TABLE_NAME>")
.values(values)
.upsert(upsertColumns)
.upsertType(UpdateType.REPLACE)
.records(insertRecords)
.build();

// Step 8: Execute the bulk insert operation and print the response
100 changes: 100 additions & 0 deletions samples/src/main/java/com/example/vault/BulkMultiTableInsertAsync.java
@@ -0,0 +1,100 @@
package com.example.vault;

import com.skyflow.Skyflow;
import com.skyflow.config.Credentials;
import com.skyflow.config.VaultConfig;
import com.skyflow.enums.Env;
import com.skyflow.enums.LogLevel;
import com.skyflow.enums.UpdateType;
import com.skyflow.vault.data.InsertRecord;
import com.skyflow.vault.data.InsertRequest;
import com.skyflow.vault.data.InsertResponse;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;

/**
* This sample demonstrates how to perform an asynchronous bulk insert operation using the Skyflow Java SDK.
* The process involves:
* 1. Setting up credentials and vault configuration
* 2. Creating multiple records to be inserted
* 3. Building and executing an async bulk insert request
* 4. Handling the insert response or errors using CompletableFuture
*/
public class BulkMultiTableInsertAsync {

public static void main(String[] args) {
try {
// Step 1: Initialize credentials with the path to your service account key file
String filePath = "<YOUR_CREDENTIALS_FILE_PATH>";
Credentials credentials = new Credentials();
credentials.setPath(filePath);

// Step 2: Configure the vault with required parameters
VaultConfig vaultConfig = new VaultConfig();
vaultConfig.setVaultId("<YOUR_VAULT_ID>");
vaultConfig.setClusterId("<YOUR_CLUSTER_ID>");
vaultConfig.setEnv(Env.PROD);
vaultConfig.setCredentials(credentials);

// Step 3: Create Skyflow client instance with error logging
Skyflow skyflowClient = Skyflow.builder()
.setLogLevel(LogLevel.ERROR)
.addVaultConfig(vaultConfig)
.build();

// Step 4: Prepare first record for insertion
HashMap<String, Object> recordData1 = new HashMap<>();
recordData1.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
recordData1.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");

List<String> upsertColumns = new ArrayList<>();
upsertColumns.add("<YOUR_COLUMN_NAME_1>");

InsertRecord insertRecord1 = InsertRecord
.builder()
.data(recordData1)
.table("<YOUR_TABLE_NAME>")
.upsert(upsertColumns)
.upsertType(UpdateType.UPDATE)
.build();

// Step 5: Prepare second record for insertion
HashMap<String, Object> recordData2 = new HashMap<>();
recordData2.put("<YOUR_COLUMN_NAME_1>", "<YOUR_VALUE_1>");
recordData2.put("<YOUR_COLUMN_NAME_2>", "<YOUR_VALUE_1>");

InsertRecord insertRecord2 = InsertRecord
.builder()
.data(recordData2)
.table("<YOUR_TABLE_NAME>")
.build();

// Step 6: Combine records into an InsertRecord list
ArrayList<InsertRecord> insertRecords = new ArrayList<>();
insertRecords.add(insertRecord1);
insertRecords.add(insertRecord2);

// Step 7: Build the insert request with the insert records (table names are set per record)
InsertRequest request = InsertRequest.builder()
.records(insertRecords)
.build();

// Step 8: Execute the async bulk insert operation and handle response using callbacks
CompletableFuture<InsertResponse> future = skyflowClient.vault().bulkInsertAsync(request);
// Add success and error callbacks
future.thenAccept(response -> {
System.out.println("Async bulk insert resolved with response:\t" + response);
}).exceptionally(throwable -> {
System.err.println("Async bulk insert rejected with error:\t" + throwable.getMessage());
throw new CompletionException(throwable);
});
} catch (Exception e) {
// Step 9: Handle any synchronous errors that occur during setup
System.err.println("Error in Skyflow operations:\t" + e.getMessage());
}
}
}