SnowPro Advanced Architect

By Snowflake
Aug 2025


Question 1

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

  • A: The MERGE command
  • B: The UPSERT command
  • C: The CHANGES clause
  • D: A STREAM object
  • E: The CHANGE_DATA_CAPTURE command
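
For reference, a minimal sketch of the two features that read change tracking metadata, using a hypothetical SALES_RAW table:

-- Change tracking must be enabled on the table (streams enable it implicitly).
ALTER TABLE sales_raw SET CHANGE_TRACKING = TRUE;

-- The CHANGES clause queries change metadata between two points in time.
SELECT * FROM sales_raw
  CHANGES (INFORMATION => DEFAULT)
  AT (TIMESTAMP => DATEADD(hour, -1, CURRENT_TIMESTAMP()));

-- A STREAM object records changes for downstream processing (e.g., with MERGE).
CREATE OR REPLACE STREAM sales_raw_stream ON TABLE sales_raw;
SELECT * FROM sales_raw_stream;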

Question 2

Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)

  • A: Changing the name of the organization
  • B: Creating an account
  • C: Viewing a list of organization accounts
  • D: Changing the name of an account
  • E: Deleting an account
  • F: Enabling the replication of a database
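
As a refresher, the ORGADMIN-level operations look like this (account names and credentials are hypothetical):

USE ROLE ORGADMIN;

-- Create an account in the organization.
CREATE ACCOUNT research_acct
  ADMIN_NAME = admin_user
  ADMIN_PASSWORD = 'ChangeMe123!'
  EMAIL = 'admin@example.com'
  EDITION = ENTERPRISE;

-- View all accounts in the organization.
SHOW ORGANIZATION ACCOUNTS;

-- Enable database replication for an account in the organization.
SELECT SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER(
  'myorg.research_acct', 'ENABLE_ACCOUNT_DATABASE_REPLICATION', 'true');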

Question 3

A retail company has over 3000 stores all using the same Point Of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones, and transaction volumes vary widely from minute to minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transaction files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10 MB in size.

How can the near real-time results be provided to the category managers? (Choose two.)

  • A: All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.
  • B: A Snowpipe should be created and configured with AUTO_INGEST = TRUE. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.
  • C: A STREAM should be created to accumulate the near real-time data and a TASK should be created that runs at a frequency that matches the real-time analytics needs.
  • D: An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.
  • E: The COPY INTO command with a task scheduled to run every second should be used to achieve the near real-time requirement.
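
A minimal sketch of the Snowpipe-plus-task pattern this scenario points toward (stage, table, and warehouse names are hypothetical):

-- Auto-ingest pipe fed by cloud storage event notifications.
CREATE OR REPLACE PIPE pos_pipe AUTO_INGEST = TRUE AS
  COPY INTO pos_raw FROM @pos_stage
  FILE_FORMAT = (TYPE = CSV);

-- Stream accumulates newly ingested rows.
CREATE OR REPLACE STREAM pos_raw_stream ON TABLE pos_raw;

-- Task runs at the cadence the analytics needs, only when the stream has data.
CREATE OR REPLACE TASK pos_agg_task
  WAREHOUSE = pos_wh
  SCHEDULE = '1 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('pos_raw_stream')
AS
  INSERT INTO pos_results
  SELECT store_number, COUNT(*) AS txn_count
  FROM pos_raw_stream
  GROUP BY store_number;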

Question 4

A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: PRODUCT_CATEGORY and PRODUCT_DETAILS. The tables can be joined on the PRODUCT_ID column. Data access should be governed, and only the partner should have access to the records.

The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.

Which design will be the MOST cost-effective and secure, while using the required Snowflake features?

  • A: Use Secure Data Sharing with an S3 bucket as a destination.
  • B: Publish PRODUCT_CATEGORY and PRODUCT_DETAILS data sets on the Snowflake Marketplace.
  • C: Create a database user for the partner and give them access to the required data sets.
  • D: Create a reader account for the partner and share the data sets as secure views.
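
For context, sharing with a non-Snowflake consumer via a reader account and a secure view looks roughly like this (all object names and credentials are hypothetical):

-- Secure view joining the two catalog tables.
CREATE OR REPLACE SECURE VIEW catalog_db.shared.product_catalog_v AS
  SELECT c.product_id, c.category_name, d.product_name
  FROM catalog_db.public.product_category c
  JOIN catalog_db.public.product_details d USING (product_id);

-- Reader account managed (and paid for) by the provider.
CREATE MANAGED ACCOUNT partner_reader
  ADMIN_NAME = partner_admin, ADMIN_PASSWORD = 'ChangeMe123!', TYPE = READER;

-- Share the secure view with the reader account.
CREATE OR REPLACE SHARE product_share;
GRANT USAGE ON DATABASE catalog_db TO SHARE product_share;
GRANT USAGE ON SCHEMA catalog_db.shared TO SHARE product_share;
GRANT SELECT ON VIEW catalog_db.shared.product_catalog_v TO SHARE product_share;
ALTER SHARE product_share ADD ACCOUNTS = myorg.partner_reader;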

Question 5

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

  • A: 1. Create a share and add the database privileges to the share. 2. Create a new listing on the Snowflake Marketplace. 3. Alter the listing and add the share. 4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace.
  • B: 1. Ask the customer to create a new Snowflake account in Azure East US 2 (Virginia). 2. Create a share and add the database privileges to the share. 3. Alter the share and add the customer's Snowflake account to the share.
  • C: 1. Create a new Snowflake account in Azure East US 2 (Virginia). 2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared. 3. Create a share and add the database privileges to the share. 4. Alter the share and add the customer's Snowflake account to the share.
  • D: 1. Create a reader account in Azure East US 2 (Virginia). 2. Create a share and add the database privileges to the share. 3. Add the reader account to the share. 4. Share the reader account's URL and credentials with the customer.
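
The cross-cloud mechanics behind these options, in outline (account and object names are hypothetical; schema- and table-level grants are elided):

-- 1. The provider creates an account in the consumer's region/cloud (as ORGADMIN).
-- 2. Enable replication of the database to that account:
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_eastus2_acct;

-- 3. In the Azure account, create and refresh the secondary database:
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_oregon_acct.sales_db;
ALTER DATABASE sales_db REFRESH;

-- 4. Create the share there and add the customer's account:
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = customer_org.customer_acct;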

Question 6

Company A has recently acquired company B. The Snowflake deployment for company B is located in the Azure West Europe region.

As part of the integration process, an Architect has been asked to consolidate company B's sales data into company A's Snowflake account which is located in the AWS us-east-1 region.

How can this requirement be met?

  • A: Replicate the sales data from company B's Snowflake account into company A's Snowflake account using cross-region data replication within Snowflake. Configure a direct share from company B's account to company A's account.
  • B: Export the sales data from company B's Snowflake account as CSV files, and transfer the files to company A's Snowflake account. Import the data using Snowflake's data loading capabilities.
  • C: Migrate company B's Snowflake deployment to the same region as company A's Snowflake deployment, ensuring data locality. Then perform a direct database-to-database merge of the sales data.
  • D: Build a custom data pipeline using Azure Data Factory or a similar tool to extract the sales data from company B's Snowflake account. Transform the data, then load it into company A's Snowflake account.

Question 7

A Snowflake Architect created a new data share and would like to verify that only specific records in the secure views are visible to the consumers of the data share.

What is the recommended way to validate data accessibility by the consumers?

  • A: Create reader accounts as shown below and impersonate the consumers by logging in with their credentials. create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44', type = reader;
  • B: Create a row access policy as shown below and assign it to the data share. create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;
  • C: Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts. alter session set simulated_data_sharing_consumer = 'Consumer Acct1';
  • D: Alter the share settings as shown below, in order to impersonate a specific consumer account. alter share sales_share set accounts = 'Consumer1' share_restrictions = true;
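
For reference, the provider-side simulation parameter works like this (the view name is hypothetical):

-- Impersonate a consumer account to test secure view filtering before sharing.
ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'consumer_acct1';
SELECT * FROM sales_db.shared.secure_sales_v;  -- returns only rows that account would see
ALTER SESSION UNSET SIMULATED_DATA_SHARING_CONSUMER;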

Question 8

A company is using Snowflake in Azure in the Netherlands. The company's analyst team also has data in JSON format, stored in an Amazon S3 bucket in the AWS Singapore region, that the team wants to analyze.

The Architect has been given the following requirements:

  1. Provide access to frequently changing data
  2. Keep egress costs to a minimum
  3. Maintain low latency

How can these requirements be met with the LEAST amount of operational overhead?

  • A: Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.
  • B: Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.
  • C: Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.
  • D: Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.
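
A sketch of the external-table-plus-materialized-view pattern (stage name and JSON paths are hypothetical):

-- External table over the S3 bucket in Singapore; auto-refresh keeps metadata current.
CREATE OR REPLACE EXTERNAL TABLE analytics.ext_events
  LOCATION = @s3_singapore_stage
  AUTO_REFRESH = TRUE
  FILE_FORMAT = (TYPE = JSON);

-- The materialized view stores the parsed results locally in the Snowflake region,
-- so repeated queries avoid cross-cloud scans (and the egress costs they incur).
CREATE MATERIALIZED VIEW analytics.events_mv AS
  SELECT value:event_id::STRING  AS event_id,
         value:ts::TIMESTAMP_NTZ AS event_ts
  FROM analytics.ext_events;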

Question 9

Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Choose three.)

  • A: Database
  • B: Schema
  • C: Table
  • D: Stage
  • E: Role
  • F: Warehouse

Question 10

What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?

  • A: Privileges can be granted at the database level and can be inherited by all underlying objects.
  • B: A user can use "super-user" access along with SECURITYADMIN to bypass authorization checks and access all databases, schemas, and underlying objects.
  • C: A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.
  • D: A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.
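
A quick illustration of managed access schemas (object and role names are hypothetical):

-- In a managed access schema, object owners cannot grant privileges;
-- only the schema owner (or a role with MANAGE GRANTS) can.
CREATE SCHEMA analytics.governed WITH MANAGED ACCESS;

-- Future grants are applied by the schema owner and cover new objects too.
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.governed TO ROLE analyst;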

Question 11

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Choose two.)

  • A: Developers create their own datasets to work against transformed versions of the live data.
  • B: Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.
  • C: Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.
  • D: Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.
  • E: The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.
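
As a reminder, zero-copy cloning is metadata-only and works within a single account; crossing an account boundary requires replication, which physically copies data (names are hypothetical):

-- Same account: instant, no data copied.
CREATE DATABASE dev_db CLONE prod_db;

-- Separate account: the data must be replicated (copied) to the target account.
ALTER DATABASE prod_db ENABLE REPLICATION TO ACCOUNTS myorg.dev_acct;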

Question 12

A new user user_01 is created within Snowflake. The following two commands are executed:

Command 1 --> show grants to user user_01;
Command 2 --> show grants on user user_01;

What inferences can be made about these commands?

  • A: Command 1 defines which user owns user_01; Command 2 defines all the grants which have been given to user_01.
  • B: Command 1 defines all the grants which are given to user_01; Command 2 defines which user owns user_01.
  • C: Command 1 defines which role owns user_01; Command 2 defines all the grants which have been given to user_01.
  • D: Command 1 defines all the grants which are given to user_01; Command 2 defines which role owns user_01.

Question 13

A Data Engineer is designing a near real-time ingestion pipeline for a retail company to ingest event logs into Snowflake to derive insights. A Snowflake Architect has been asked to define security best practices for configuring the access control privileges needed to load data with Snowpipe auto-ingest.
What are the MINIMUM object privileges required for the Snowpipe user to execute Snowpipe?

  • A: OWNERSHIP on the named pipe, USAGE on the named stage, target database, and schema, and INSERT and SELECT on the target table
  • B: OWNERSHIP on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table
  • C: CREATE on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table
  • D: USAGE on the named pipe, named stage, target database, and schema, and INSERT and SELECT on the target table
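
The minimum grant set written out as statements (role and object names are hypothetical):

GRANT USAGE ON DATABASE ingest_db TO ROLE snowpipe_role;
GRANT USAGE ON SCHEMA ingest_db.raw TO ROLE snowpipe_role;
-- USAGE for an external stage; READ instead if the stage is internal.
GRANT USAGE ON STAGE ingest_db.raw.event_stage TO ROLE snowpipe_role;
GRANT INSERT, SELECT ON TABLE ingest_db.raw.events TO ROLE snowpipe_role;
GRANT OWNERSHIP ON PIPE ingest_db.raw.event_pipe TO ROLE snowpipe_role;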

Question 14

The IT Security team has identified an ongoing credential stuffing attack on many of their organization's systems.
What is the BEST way to find recent and ongoing login attempts to Snowflake?

  • A: Call the LOGIN_HISTORY Information Schema table function.
  • B: Query the LOGIN_HISTORY view in the ACCOUNT_USAGE schema in the SNOWFLAKE database.
  • C: View the History tab in the Snowflake UI and set up a filter for SQL text that contains the text "LOGIN".
  • D: View the Users section in the Account tab in the Snowflake UI and review the last login column.
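
For reference, a query against the ACCOUNT_USAGE view, which covers all users and retains a year of history (with up to about two hours of latency):

SELECT event_timestamp, user_name, client_ip, is_success, error_message
FROM snowflake.account_usage.login_history
WHERE event_timestamp > DATEADD(hour, -24, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;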

Question 15

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing connection and disconnection timestamps, usernames, and summary statistics.
What should the Architect do to enable the Snowflake search optimization service on this table?

  • A: Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.
  • B: Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
  • C: Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
  • D: Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
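
The command itself, for reference (the column name in the second variant is a hypothetical example):

-- Enable search optimization for the whole table.
ALTER TABLE security_logs.vpn_access_logs ADD SEARCH OPTIMIZATION;

-- Or target specific access patterns, e.g., equality lookups on username.
ALTER TABLE security_logs.vpn_access_logs ADD SEARCH OPTIMIZATION ON EQUALITY(username);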

Question 16

A table contains five columns and has millions of records. The cardinality distribution of the columns is shown below:

[Image 1: cardinality distribution of columns C1–C5]

Columns C4 and C5 are mostly used by SELECT queries in the GROUP BY and ORDER BY clauses, whereas columns C1, C2, and C3 are heavily used in the filter and join conditions of SELECT queries.
The Architect must design a clustering key for this table to improve query performance.
Based on Snowflake recommendations, how should the clustering key columns be ordered when defining the multi-column clustering key?

  • A: C5, C4, C2
  • B: C3, C4, C5
  • C: C1, C3, C2
  • D: C2, C1, C3
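
Whichever ordering is chosen, the key is applied and inspected like this (the table name is hypothetical and the column order shown is illustrative only):

ALTER TABLE sales_fact CLUSTER BY (c3, c4, c5);

-- Check clustering quality on a candidate key.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales_fact', '(c3, c4, c5)');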

Question 17

Which security, governance, and data protection features require, at a MINIMUM, the Business Critical edition of Snowflake? (Choose two.)

  • A: Extended Time Travel (up to 90 days)
  • B: Customer-managed encryption keys through Tri-Secret Secure
  • C: Periodic rekeying of encrypted data
  • D: AWS, Azure, or Google Cloud private connectivity to Snowflake
  • E: Federated authentication and SSO

Question 18

A company wants to deploy its Snowflake accounts inside its corporate network with no visibility on the internet. The company is using a VPN infrastructure and Virtual Desktop Infrastructure (VDI) for its Snowflake users. The company also wants to re-use the login credentials set up for the VDI to eliminate redundancy when managing logins.
What Snowflake functionality should be used to meet these requirements? (Choose two.)

  • A: Set up replication to allow users to connect from outside the company VPN.
  • B: Provision a unique company Tri-Secret Secure key.
  • C: Use private connectivity from a cloud provider.
  • D: Set up SSO for federated authentication.
  • E: Use a proxy Snowflake account outside the VPN, enabling client redirect for user logins.
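
For reference, federated authentication is configured with a SAML2 security integration (all IdP values below are hypothetical placeholders):

CREATE SECURITY INTEGRATION vdi_sso
  TYPE = SAML2
  ENABLED = TRUE
  SAML2_ISSUER = 'https://idp.example.com'
  SAML2_SSO_URL = 'https://idp.example.com/sso/saml'
  SAML2_PROVIDER = 'CUSTOM'
  SAML2_X509_CERT = 'MIIC...';  -- IdP signing certificate (truncated placeholder)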

Question 19

How do Snowflake databases that are created from shares differ from standard databases that are not created from shares? (Choose three.)

  • A: Shared databases are read-only.
  • B: Shared databases must be refreshed in order for new data to be visible.
  • C: Shared databases cannot be cloned.
  • D: Shared databases are not supported by Time Travel.
  • E: Shared databases will have the PUBLIC or INFORMATION_SCHEMA schemas without explicitly granting these schemas to the share.
  • F: Shared databases can also be created as transient databases.
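
Consumer-side creation of a database from a share, for reference (identifiers are hypothetical):

-- The resulting database is read-only: no DML, no cloning, no Time Travel.
CREATE DATABASE partner_data FROM SHARE provider_org.provider_acct.product_share;
GRANT IMPORTED PRIVILEGES ON DATABASE partner_data TO ROLE analyst;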

Question 20

What integration object should be used to place restrictions on where data may be exported?

  • A: Stage integration
  • B: Security integration
  • C: Storage integration
  • D: API integration
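
A sketch of the relevant object, with allowed/blocked location restrictions (the role ARN and bucket names are hypothetical):

CREATE STORAGE INTEGRATION export_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_export'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('s3://approved-bucket/exports/')
  STORAGE_BLOCKED_LOCATIONS = ('s3://approved-bucket/exports/restricted/');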

Question 21

The following DDL command was used to create a task based on a stream:

[Image 1: DDL command creating a task based on a stream]

Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

  • A: The warehouse MY_WH will be made active every five minutes to check the stream.
  • B: The warehouse MY_WH will only be active when there are results in the stream.
  • C: The warehouse MY_WH will never suspend.
  • D: The warehouse MY_WH will automatically resize to accommodate the size of the stream.
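
A representative task of this shape (the exam's exact DDL is in the image; object names here are hypothetical):

CREATE OR REPLACE TASK process_stream_task
  WAREHOUSE = MY_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('my_stream')
AS
  INSERT INTO target_tbl SELECT * FROM my_stream;

-- The WHEN condition is evaluated with Snowflake-managed compute, so MY_WH
-- resumes only when the stream actually contains data to process.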

Question 22

When using the Snowflake Connector for Kafka, what data formats are supported for the messages? (Choose two.)

  • A: CSV
  • B: XML
  • C: Avro
  • D: JSON
  • E: Parquet
