What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?
A. Every Kafka message is in JSON or Avro format.
B. The default retention time for Kafka topics is 14 days.
C. The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).
D. The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe, it will result in an exception.
What considerations need to be taken when using database cloning as a tool for data lifecycle management in a development environment? (Choose two.)
A. Any pipes in the source are not cloned.
B. Any pipes in the source referring to internal stages are not cloned.
C. Any pipes in the source referring to external stages are not cloned.
D. The clone inherits all granted privileges of all child objects in the source object, including the database.
E. The clone inherits all granted privileges of all child objects in the source object, excluding the database.
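For hands-on verification of how pipes behave across a clone, a development copy can be created and inspected directly (database names here are hypothetical):

```sql
-- Clone a production database into a development environment
CREATE DATABASE dev_db CLONE prod_db;

-- List the pipes in the clone and compare against the source; this makes
-- the cloning behavior for internal- vs. external-stage pipes visible
SHOW PIPES IN DATABASE dev_db;
SHOW PIPES IN DATABASE prod_db;
```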
What transformations are supported in the below SQL statement? (Choose three.)
CREATE PIPE ... AS COPY ... FROM (...)
A. Data can be filtered by an optional WHERE clause.
B. Columns can be reordered.
C. Columns can be omitted.
D. Type casts are supported.
E. Incoming data can be joined with other tables.
F. The ON_ERROR = ABORT_STATEMENT copy option can be used.
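As a sketch of the syntax the question refers to, a pipe with a transforming COPY might look like the following (pipe, table, stage, and column names are hypothetical):

```sql
-- Pipe whose COPY reorders columns, omits one, and applies type casts
CREATE PIPE events_pipe AS
  COPY INTO events (event_id, event_ts)
  FROM (
    SELECT $2::NUMBER,        -- columns reordered relative to the file
           $1::TIMESTAMP_NTZ  -- explicit type cast
    FROM @events_stage        -- any additional file columns are omitted
  )
  FILE_FORMAT = (TYPE = 'CSV');
```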
What is a key consideration when setting up search optimization service for a table?
A. Search optimization service works best with a column that has a minimum of 100K distinct values.
B. Search optimization service can significantly improve query performance on partitioned external tables.
C. Search optimization service can help to optimize storage usage by compressing the data into a GZIP format.
D. The table must be clustered with a key having multiple columns for effective search optimization.
When using the COPY INTO [table] command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?
A. It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.
B. The parameter will be ignored.
C. The command will return an error.
D. The command will return a warning stating that the file has unmatched columns.
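For reference, the parameter in question appears in a COPY statement like this (table, stage, and file names are hypothetical; how Snowflake treats the parameter for CSV is exactly what the question is testing):

```sql
COPY INTO products
  FROM @landing_stage/products.csv
  FILE_FORMAT = (TYPE = 'CSV')
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```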
Based on the architecture in the image, how can the data from DB1 be copied into TBL2? (Choose two.)
A
B
C
D
E
How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Choose two.)
A. Set masking policy conditions using CURRENT_ROLE targeting the role in use for the current session.
B. Set masking policy conditions using IS_ROLE_IN_SESSION targeting the role in use for the current account.
C. Set masking policy conditions using INVOKER_ROLE targeting the executing role in a SQL statement.
D. Determine if there are OWNERSHIP privileges on the masking policy that would allow the use of any function.
E. Assign the ACCOUNTADMIN role to the user who is executing the object.
A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).
What is the recommended sequence of operations that must be followed to meet this requirement?
A.
1. Create a share and add the database privileges to the share
2. Create a new listing on the Snowflake Marketplace
3. Alter the listing and add the share
4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace
B.
1. Ask the customer to create a new Snowflake account in Azure East US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Alter the share and add the customer's Snowflake account to the share
C.
1. Create a new Snowflake account in Azure East US 2 (Virginia)
2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared
3. Create a share and add the database privileges to the share
4. Alter the share and add the customer's Snowflake account to the share
D.
1. Create a reader account in Azure East US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Add the reader account to the share
4. Share the reader account's URL and credentials with the customer
A Snowflake Architect created a new data share and would like to verify that only specific records in secure views are visible within the data share by the consumers.
What is the recommended way to validate data accessibility by the consumers?
A. Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.
create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44', type = reader;
B. Create a row access policy as shown below and assign it to the data share.
create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;
C. Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.
alter session set simulated_data_sharing_consumer = 'Consumer Acct1';
D. Alter the share settings as shown below, in order to impersonate a specific consumer account.
alter share sales_share set accounts = 'Consumer1' share_restrictions = true;
What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?
A. Privileges can be granted at the database level and can be inherited by all underlying objects.
B. A user can use a "super-user" access along with SECURITYADMIN to bypass authorization checks and access all databases, schemas, and underlying objects.
C. A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.
D. A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.
A group of Data Analysts have been granted the role ANALYST_ROLE. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.
How should these requirements be met?
A. Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.
B. Grant SYSADMIN OWNERSHIP of the database, but grant the CREATE SCHEMA privilege on the database to the ANALYST_ROLE.
C. Make every schema in the database a MANAGED ACCESS schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.
D. Grant ANALYST_ROLE OWNERSHIP on the database, but grant the OWNERSHIP ON FUTURE [object type]s in database privilege to SYSADMIN.
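A managed access schema, as mentioned in the options above, can be sketched as follows (database, schema, and role names are hypothetical):

```sql
-- In a managed access schema, grant decisions are centralized with the
-- schema owner rather than with the individual object owners
CREATE SCHEMA analytics.workspace WITH MANAGED ACCESS;
GRANT CREATE TABLE, CREATE VIEW ON SCHEMA analytics.workspace TO ROLE analyst_role;
```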
Which columns can be included in an external table schema? (Choose three.)
A. VALUE
B. METADATA$ROW_ID
C. METADATA$ISUPDATE
D. METADATA$FILENAME
E. METADATA$FILE_ROW_NUMBER
F. METADATA$EXTERNAL_TABLE_PARTITION
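An external table definition that references a metadata pseudocolumn can be sketched like this (table, stage, and column names are hypothetical):

```sql
CREATE EXTERNAL TABLE ext_sales (
  sale_date DATE   AS TO_DATE(value:c1::STRING),  -- derived from the raw VALUE column
  src_file  STRING AS METADATA$FILENAME           -- metadata pseudocolumn
)
LOCATION = @sales_stage/daily/
FILE_FORMAT = (TYPE = 'CSV');
```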
How can the Snowpipe REST API be used to keep a log of data load history?
A. Call insertReport every 20 minutes, fetching the last 10,000 entries.
B. Call loadHistoryScan every minute for the maximum time range.
C. Call insertReport every 8 minutes for a 10-minute time range.
D. Call loadHistoryScan every 10 minutes for a 15-minute time range.
A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes.
Cost is not a concern as long as the solution is the best available.
The plan so far consists of the following steps:
Deployment of Snowflake accounts on two different cloud providers.
Selection of cloud provider regions that are geographically far apart.
The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.
Implementation of Snowflake client redirect.
What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?
A. Connect the applications using the - URL. Use the Business Critical Snowflake edition.
B. Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.
C. Connect the applications using the - URL. Use the Enterprise Snowflake edition.
D. Connect the applications using the - URL. Use the Business Critical Snowflake edition.
Data is being imported and stored as JSON in a VARIANT column. Query performance was fine, but most recently, poor query performance has been reported.
What could be causing this?
A. There were JSON nulls in the recent data imports.
B. The order of the keys in the JSON was changed.
C. The recent data imports contained fewer fields than usual.
D. There were variations in string lengths for the JSON values in the recent data imports.
Which SQL ALTER command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?
A. alter warehouse snowpark_opt_wh set max_concurrency_level = 1;
B. alter warehouse snowpark_opt_wh set max_concurrency_level = 2;
C. alter warehouse snowpark_opt_wh set max_concurrency_level = 8;
D. alter warehouse snowpark_opt_wh set max_concurrency_level = 16;
An Architect clones a database and all of its objects, including tasks. After the cloning, the tasks stop running.
Why is this occurring?
A. Tasks cannot be cloned.
B. The objects that the tasks reference are not fully qualified.
C. Cloned tasks are suspended by default and must be manually resumed.
D. The Architect has insufficient privileges to alter tasks on the cloned database.
Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?
A. A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.
B. If a user in the provider account with role authority to CREATE or ALTER SHARE adds an Enterprise account as a consumer, it can import the share.
C. If a user in the provider account with a share-owning role sets SHARE_RESTRICTIONS to False when adding an Enterprise consumer account, it can import the share.
D. If a user in the provider account with a share-owning role that also has the OVERRIDE SHARE RESTRICTIONS privilege sets SHARE_RESTRICTIONS to False when adding an Enterprise consumer account, it can import the share.
Which Snowflake objects can be used in a data share? (Choose two.)
A. Standard view
B. Secure view
C. Stored procedure
D. External table
E. Stream
What are characteristics of the use of transactions in Snowflake? (Choose two.)
A. Explicit transactions can contain DDL, DML, and query statements.
B. The AUTOCOMMIT setting can be changed inside a stored procedure.
C. A transaction can be started explicitly by executing a BEGIN WORK statement and ended explicitly by executing a COMMIT WORK statement.
D. A transaction can be started explicitly by executing a BEGIN TRANSACTION statement and ended explicitly by executing an END TRANSACTION statement.
E. Explicit transactions should contain only DML statements and query statements. All DDL statements implicitly commit active transactions.
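The explicit transaction forms named in the options above follow this general shape (table and values are hypothetical):

```sql
BEGIN TRANSACTION;                                 -- or BEGIN WORK
INSERT INTO orders (id, amount) VALUES (1, 100);   -- DML inside the transaction
UPDATE orders SET amount = 110 WHERE id = 1;
COMMIT;                                            -- or COMMIT WORK
```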
The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:
Step 1: Data files are loaded in a stage.
Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe, by way of Amazon Simple Queue Service (SQS), that files are ready to load. Snowpipe copies the files into a queue.
Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.
If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?
A. The pipe will continue to receive the messages as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.
B. The pipe will no longer be able to receive the messages and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.
C. The pipe will continue to receive the messages as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.
D. The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.
What step will improve the performance of queries executed against an external table?
A. Partition the external table.
B. Shorten the names of the source files.
C. Convert the source files' character encoding to UTF-8.
D. Use an internal stage instead of an external stage to store the source files.
A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: PRODUCT_CATEGORY, and PRODUCT_DETAILS. Both tables can be joined by the PRODUCT_ID column. Data access should be governed, and only the partner should have access to the records.
The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.
Which design will be the MOST cost-effective and secure, while using the required Snowflake features?
A. Use Secure Data Sharing with an S3 bucket as a destination.
B. Publish PRODUCT_CATEGORY and PRODUCT_DETAILS data sets on the Snowflake Marketplace.
C. Create a database user for the partner and give them access to the required data sets.
D. Create a reader account for the partner and share the data sets as secure views.
Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Choose three.)
A. Database
B. Schema
C. Table
D. Stage
E. Role
F. Warehouse
Which of the following ingestion methods can be used to load near real-time data by using the messaging services provided by a cloud provider?