What is considered “best practice” for inserting, say, 100,000 rows into a ScyllaDB table in general, or specifically with the Rust driver?
The options I know about are:
- Sequential insertion - One query at a time, awaiting each before sending the next. I find it hard to imagine this is viable at this scale.
- Batch insertion - Sounds like a good option, but requires choosing a chunk size so each batch stays under the server’s batch size limit; all 100k rows can’t go into a single batch statement.
- Parallel insertion - Seems like the best option at first glance, but requires extra code to bound concurrency and handle failures. Is the added complexity worth the performance gain?
Please share your thoughts and opinions on the best way to insert this many rows as quickly as possible.