Originally from the User Slack
@Arkam_Fahry: Hello, can anyone help me please? I currently have a file upload system that sends events to a queue for many event types, like upload completion. I would like events to be reliably published at least once by first persisting them in ScyllaDB and then publishing them from ScyllaDB. The current system runs MongoDB with an outbox collection: when a file is uploaded, the file metadata object and the upload event are persisted in a multi-document transaction, and the events are then processed from the outbox collection every 250ms in micro-batches. Is a dual-table write with a batch operation a viable way to persist both the event and the entity state?
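For context, here is a minimal sketch of the dual-table write being asked about, using the Python cassandra-driver (which also speaks to ScyllaDB). The keyspace, table names, and schema (`file_metadata`, an `outbox` table with a `bucket` partition key) are hypothetical, not from the original conversation:

```python
import json
import uuid
from datetime import datetime, timezone

from cassandra.cluster import Cluster
from cassandra.query import BatchStatement, BatchType

# Hypothetical schema for illustration:
#   CREATE TABLE file_metadata (file_id uuid PRIMARY KEY, name text,
#                               size bigint, uploaded_at timestamp);
#   CREATE TABLE outbox (bucket int, event_id timeuuid, event_type text,
#                        payload text, PRIMARY KEY (bucket, event_id));
cluster = Cluster(["127.0.0.1"])
session = cluster.connect("files")  # hypothetical keyspace

insert_file = session.prepare(
    "INSERT INTO file_metadata (file_id, name, size, uploaded_at) "
    "VALUES (?, ?, ?, ?)"
)
insert_event = session.prepare(
    "INSERT INTO outbox (bucket, event_id, event_type, payload) "
    "VALUES (?, ?, ?, ?)"
)

file_id = uuid.uuid4()

# A LOGGED batch spanning two tables (two partitions): ScyllaDB
# guarantees that either all statements eventually apply or none do
# (atomicity, not isolation), so the entity state and the outbox
# event land together.
batch = BatchStatement(batch_type=BatchType.LOGGED)
batch.add(insert_file, (file_id, "report.pdf", 1_048_576,
                        datetime.now(timezone.utc)))
batch.add(insert_event, (0, uuid.uuid1(), "upload_completed",
                         json.dumps({"file_id": str(file_id)})))
session.execute(batch)
```

A separate poller would then read the `outbox` partition on its 250ms tick, publish each event to the queue, and delete (or mark) the published rows, mirroring the MongoDB setup described above.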
@Felipe_Cardeneti_Mendes: I think I addressed a similar question last year; see if it sheds some light (and check the references/history, as they provide additional context): https://forum.scylladb.com/t/are-there-any-plans-for-triggers-procedures-of-any-kind-on-the-scylladb-roadmap/889/4?u=felipemendes
@Arkam_Fahry: Thanks, I checked out that answer before posting here. Will using batch operations negatively affect performance?
@Felipe_Cardeneti_Mendes: Well, it depends on the kind of batch. But if you're far from overwhelming the server, it works just fine.
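To unpack "the kind of batch": a single-partition unlogged batch is applied by the coordinator as one mutation and is essentially free, while a multi-partition logged batch (like the metadata-plus-outbox write sketched earlier) first persists a batchlog entry on other replicas, adding extra round trips. A hedged sketch of the contrast, reusing the hypothetical `session` and `insert_event` names from the example above:

```python
import json
import uuid

from cassandra.query import BatchStatement, BatchType

# Single-partition UNLOGGED batch: every row targets bucket 0, so the
# coordinator writes them as a single mutation -- cheap batching.
fast_batch = BatchStatement(batch_type=BatchType.UNLOGGED)
for i in range(3):
    fast_batch.add(insert_event, (0, uuid.uuid1(), "upload_completed",
                                  json.dumps({"seq": i})))
session.execute(fast_batch)

# Multi-partition LOGGED batch (the outbox write above): the batchlog
# round trip makes it heavier. Fine at moderate rates, but this is the
# kind of batch that can overwhelm a coordinator if overused.
```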