Originally from the User Slack
@hamonica: hello~, I’m currently developing in Java and performing batch data inserts into ScyllaDB. I get warning messages when inserting approximately 500 to 2,000 rows at a time.
Upon reviewing the batch-related settings, I found that the default values are:
batch_size_warn_threshold_in_kb: 128
batch_size_fail_threshold_in_kb: 1024
However, for my batch operations, I require significantly larger sizes. Therefore, I have increased these thresholds to:
batch_size_warn_threshold_in_kb: 512
batch_size_fail_threshold_in_kb: 5120
Given these adjustments, could you please advise whether increasing the thresholds to these values could lead to any stability issues?
@hamonica: Would it be a better approach to split the batch into chunks of 100 rows and run the inserts in parallel?
@avi: yes, better to have smaller batches
@hamonica: @avi Currently I’m performing batch inserts with only 500 rows, and there don’t seem to be any major issues based on the tests conducted so far. If problems arise in the future, I’ll experiment with dividing the work into smaller batches.
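For reference, here is a minimal sketch of the smaller-batch approach @avi suggests, using the Java driver 4.x. The table schema, chunk size, and class names are illustrative, not from the thread:

```java
import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.AsyncResultSet;
import com.datastax.oss.driver.api.core.cql.BatchStatement;
import com.datastax.oss.driver.api.core.cql.BatchStatementBuilder;
import com.datastax.oss.driver.api.core.cql.DefaultBatchType;
import com.datastax.oss.driver.api.core.cql.PreparedStatement;

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class ChunkedBatchInsert {

    // Rows per batch: small enough to stay well under batch_size_warn_threshold_in_kb.
    private static final int CHUNK_SIZE = 100;

    public static void insertInChunks(CqlSession session, List<Object[]> rows) {
        // Hypothetical schema; replace with your own keyspace, table, and columns.
        PreparedStatement insert = session.prepare(
                "INSERT INTO my_ks.my_table (id, payload) VALUES (?, ?)");

        List<CompletableFuture<AsyncResultSet>> inFlight = new ArrayList<>();
        for (int start = 0; start < rows.size(); start += CHUNK_SIZE) {
            BatchStatementBuilder batch = BatchStatement.builder(DefaultBatchType.UNLOGGED);
            int end = Math.min(start + CHUNK_SIZE, rows.size());
            for (Object[] row : rows.subList(start, end)) {
                batch.addStatement(insert.bind(row));
            }
            // executeAsync lets the chunks run concurrently instead of as one big serial batch.
            inFlight.add(session.executeAsync(batch.build()).toCompletableFuture());
        }

        // Block until every chunk has been acknowledged.
        CompletableFuture.allOf(inFlight.toArray(new CompletableFuture[0])).join();
    }
}
```

Note that an UNLOGGED batch spanning many partitions still funnels through a single coordinator, which is part of why smaller batches (ideally grouped by partition key) behave better. In production you would also want to cap the number of in-flight futures, for example with a semaphore, rather than launching all of them at once.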