I am trying to execute the queries below, but I am getting exceptions and am unable to fix them:

CREATE KEYSPACE mykeyspace4
    WITH replication = {'class': 'NetworkTopologyStrategy', 'replication_factor' : 3, 'DC2': 2}

Exception: `ConfigurationException: Unrecognized strategy option {DC1} passed to org.apache.cassandra.locator.NetworkTopologyStrategy for keyspace mykeyspace2`
CREATE TABLE ks.cf (key blob PRIMARY KEY,  val blob) WITH tombstone_gc = {'mode':'repair'};

Exception: `ConfigurationException: tombstone_gc option with mode = repair not supported for table with RF one or local replication strategy`
CREATE TABLE data_atrest (
     pk text PRIMARY KEY,
     c0 int
 ) WITH scylla_encryption_options = {
    'cipher_algorithm' : 'AES/ECB/PKCS5Padding',
    'secret_key_strength' : 128,
    'key_provider': 'LocalFileSystemKeyProviderFactory',
    'secret_key_file': '/etc/scylla/data_encryption_keys/secret_key'};

Exception: `ConfigurationException: Validation failed:std::_Nested_exception<std::runtime_error> (Could not write key file '/etc/scylla/data_encryption_keys/secret_key'): std::filesystem::__cxx11::filesystem_error (error system:13, filesystem error: mkdir failed: Permission denied [/etc/scylla/data_encryption_keys/])`
CREATE KEYSPACE mykeyspace2
    WITH REPLICATION = { 'class' : 'SimpleStrategy', 'replication_factor' : 3 }
    AND STORAGE = { 'type' : 'S3', 'bucket' : '/tmp/b1', 'endpoint' : 'localhost' } ;

Exception: `InvalidRequest: Error from server: code=2200 [Invalid query] message="Keyspace storage options not supported in the cluster"`

(1) You are using a non-existent datacenter in the keyspace definition; this is why the statement fails. The error message is confusing, and I opened an issue for it.
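A working definition has to name the datacenters your cluster actually reports (you can check them with `nodetool status`). A sketch, assuming a datacenter named `DC1` (the name and replication factor here are placeholders):

```cql
CREATE KEYSPACE mykeyspace4
    WITH replication = {'class': 'NetworkTopologyStrategy', 'DC1': 3};
```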

(2) tombstone_gc with mode = repair can only be enabled on tables that are part of a keyspace with a replication factor of at least 2. If your keyspace has a replication factor of 1 (which is really not recommended), just set gc_grace_seconds = 0 in your schema options to speed up tombstone purging.
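A sketch of that suggestion, reusing the table from the question:

```cql
-- With RF=1 there are no replicas to reconcile, so tombstones can be
-- purged immediately instead of relying on repair-based GC.
CREATE TABLE ks.cf (key blob PRIMARY KEY, val blob) WITH gc_grace_seconds = 0;
```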

(3) Looks like you need to give ScyllaDB read/write rights to /etc/scylla/data_encryption_keys/.

(4) The STORAGE option is a new and experimental feature, which is only available in the latest OSS releases. I suppose the version you are using either doesn’t have this feature, or the experimental feature is not enabled.

Hi @denesb

  1. I am also not able to execute JSON queries; how do I execute these?
INSERT INTO mytable JSON '{ "\"myKey\"": 0, "value": 0}'

  2. Regarding (3), "Looks like you need to give ScyllaDB read/write rights to /etc/scylla/data_encryption_keys/": how do I do this? Any idea?

  3. How do I enable the STORAGE option as an experimental feature? I am using the configuration below in scylla.yaml:

experimental: true
experimental_features:
     - udf
#     - alternator-streams
#     - alternator-ttl
#     - raft
enable_user_defined_functions: true

  4. ALTER KEYSPACE with STORAGE is not working:

CREATE KEYSPACE mykeyspace2
    WITH REPLICATION = { 'class' : 'SimpleStrategy', 'replication_factor' : 3 }
    AND STORAGE = { 'type' : 'S3', 'bucket' : '/tmp/b1', 'endpoint' : 'localhost' } ;

The CREATE query above runs fine.


ALTER KEYSPACE mykeyspace2 WITH REPLICATION = { 'class' : 'NetworkStrategy', 'replication_factor' : 3 }
    AND STORAGE = { 'type' : 'S3', 'bucket': '/tmp/b2', 'endpoint' : 'localhost' } ;
InvalidRequest: Error from server: code=2200 [Invalid query] message="Cannot alter storage options: S3 to S3 is not supported"

 ALTER KEYSPACE mykeyspace2 WITH REPLICATION = { 'class' : 'NetworkStrategy', 'replication_factor' : 3 }
    AND STORAGE = { 'type' : 'LOCAL', 'bucket': '/tmp/b2', 'endpoint' : 'localhost' } ;
InvalidRequest: Error from server: code=2200 [Invalid query] message="Local storage does not accept any custom options"

ALTER KEYSPACE mykeyspace2 WITH REPLICATION = { 'class' : 'NetworkStrategy', 'replication_factor' : 3 }
    AND STORAGE = {'bucket': '/tmp/b2', 'endpoint' : 'localhost' } ;
InvalidRequest: Error from server: code=2200 [Invalid query] message="Cannot alter storage options: S3 to LOCAL is not supported"

(2) You need to use chown or chmod to change the permissions on said directory so that the user under which ScyllaDB runs (this depends on the installation) has read/write access to it.
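A minimal sketch, assuming a package install where ScyllaDB runs as the `scylla` user (check the actual user with `ps -o user= -C scylla` and adjust accordingly):

```shell
# Create the key directory and hand it to the user ScyllaDB runs as
# ('scylla' is an assumption here; adjust to your installation).
sudo mkdir -p /etc/scylla/data_encryption_keys
sudo chown -R scylla:scylla /etc/scylla/data_encryption_keys
sudo chmod 700 /etc/scylla/data_encryption_keys
```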

(3) You need to add keyspace-storage-options to the list of enabled experimental features.
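Based on the configuration you posted, that would look something like this in scylla.yaml (keeping your existing entries):

```yaml
experimental_features:
    - udf
    - keyspace-storage-options
```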

(4) We do not support changing the storage options on a keyspace.
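Since the options cannot be altered in place, a workaround sketch would be to create a fresh keyspace with the desired storage options and migrate the data into it (the keyspace name below is a placeholder):

```cql
CREATE KEYSPACE mykeyspace3
    WITH REPLICATION = { 'class' : 'SimpleStrategy', 'replication_factor' : 3 }
    AND STORAGE = { 'type' : 'S3', 'bucket' : '/tmp/b2', 'endpoint' : 'localhost' };
```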

Hi @Botond_Denes

1- Even though I gave write permission to the directory using sudo chmod +w /etc/scylla/, I still get:

CREATE TABLE data_atrest (
     pk text PRIMARY KEY,
     c0 int
 ) WITH scylla_encryption_options = {
    'cipher_algorithm' : 'AES/ECB/PKCS5Padding',
    'secret_key_strength' : 128,
    'key_provider': 'LocalFileSystemKeyProviderFactory',
    'secret_key_file': '/etc/scylla/data_encryption_keys/secret_key'};
	
ConfigurationException: Validation failed:std::_Nested_exception<std::runtime_error> (Could not write key file '/etc/scylla/data_encryption_keys/secret_key'): std::filesystem::__cxx11::filesystem_error (error system:13, filesystem error: mkdir failed: Permission denied [/etc/scylla/data_encryption_keys/])

2- The queries below are showing errors:

SELECT * FROM myTable WHERE date >= currentDate() - 2d;	
SyntaxException: line 1:50  : syntax error…

LIST SELECT PERMISSIONS OF carlos;	
SyntaxException: line 1:14  : syntax error…

CREATE TRIGGER myTrigger ON t USING 'org.apache.cassandra.triggers.InvertedIndex';
SyntaxException: line 1:0 no viable alternative at input 'CREATE'
3- I am also not able to execute JSON queries:
CREATE TABLE Memo2 (id int PRIMARY KEY)
  WITH compaction = {
  'class' : 'IncrementalCompactionStrategy',
  'bucket_high' : 1.5,
  'bucket_low' : 0.5,
  'min_sstable_size' : 50,
  'min_threshold' : 4,
  'max_threshold' : 32,
  'sstable_size_in_mb' : 1000,
  'space_amplification_goal' : 1.25
  };

SELECT JSON id FROM Memo2;

Result of the SELECT query:

[json]
--------

Then I execute the query below, but get an exception:

INSERT INTO mytable JSON '{ "\"json\"": 0, "value": 0}';
InvalidRequest: Error from server: code=2200 [Invalid query] message="JSON values map contains unrecognized column: value"