
Practice Free ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers With Explanations

We at Crack4sure are committed to giving students preparing for the Snowflake ARA-C01 exam the most current and reliable questions. To help people study, we have made some of our SnowPro Advanced: Architect Certification exam materials available for free to everyone. You can take the free ARA-C01 practice test as many times as you want. The answers to the practice questions are provided, and each answer is explained.

Question # 6

What considerations need to be taken into account when using database cloning as a tool for data lifecycle management in a development environment? (Select TWO).

A.

Any pipes in the source are not cloned.

B.

Any pipes in the source referring to internal stages are not cloned.

C.

Any pipes in the source referring to external stages are not cloned.

D.

The clone inherits all granted privileges of all child objects in the source object, including the database.

E.

The clone inherits all granted privileges of all child objects in the source object, excluding the database.

Question # 7

A company wants to deploy its Snowflake accounts inside its corporate network with no visibility on the internet. The company is using a VPN infrastructure and Virtual Desktop Infrastructure (VDI) for its Snowflake users. The company also wants to re-use the login credentials set up for the VDI to eliminate redundancy when managing logins.

What Snowflake functionality should be used to meet these requirements? (Choose two.)

A.

Set up replication to allow users to connect from outside the company VPN.

B.

Provision a unique company Tri-Secret Secure key.

C.

Use private connectivity from a cloud provider.

D.

Set up SSO for federated authentication.

E.

Use a proxy Snowflake account outside the VPN, enabling client redirect for user logins.

Question # 8

A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors.

The company’s Snowflake Architects need to design a solution that would allow the business divisions to decide what to share, while minimizing the level of effort spent on configuration and management. Most of the company divisions use Snowflake accounts in the same cloud deployments with a few exceptions for European-based divisions.

According to Snowflake recommended best practice, how should these requirements be met?

A.

Migrate the European accounts to the global region and manage shares in a connected graph architecture. Deploy a Data Exchange.

B.

Deploy a Private Data Exchange in combination with data shares for the European accounts.

C.

Deploy to the Snowflake Marketplace making sure that invoker_share() is used in all secure views.

D.

Deploy a Private Data Exchange and use replication to allow European data shares in the Exchange.

Question # 9

There are two databases in an account, named fin_db and hr_db, which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because the database is maintained by human resources personnel.

An Architect needs to create a read-only role for certain employees working in the human resources department.

Which permission sets must be granted to this role?

A.

USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

B.

USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

C.

MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

D.

USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db
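For reference, a minimal sketch of the grant pattern these options describe (the role name hr_read_only is hypothetical, and the FUTURE grant is an optional extra not mentioned in the question):

GRANT USAGE ON DATABASE hr_db TO ROLE hr_read_only;
GRANT USAGE ON ALL SCHEMAS IN DATABASE hr_db TO ROLE hr_read_only;
GRANT SELECT ON ALL TABLES IN DATABASE hr_db TO ROLE hr_read_only;
-- Optionally, also cover tables created later:
GRANT SELECT ON FUTURE TABLES IN DATABASE hr_db TO ROLE hr_read_only;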

Question # 10

An Architect runs the following SQL query:

[Screenshot of the SQL query omitted]

How can this query be interpreted?

A.

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.

B.

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.

C.

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.

D.

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.

Question # 11

A Developer is having a performance issue with a Snowflake query. The query receives up to 10 different values for one parameter and then performs an aggregation over the majority of a fact table. It then joins against a smaller dimension table. This parameter value is selected by the different query users when they execute it during business hours. Both the fact and dimension tables are loaded with new data in an overnight import process.

On a Small or Medium-sized virtual warehouse, the query performs slowly. Performance is acceptable on a size Large or bigger warehouse. However, there is no budget to increase costs. The Developer needs a recommendation that does not increase compute costs to run this query.

What should the Architect recommend?

A.

Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The query results will then be cached and ready to respond quickly when the users re-issue the query.

B.

Create a task that will run the 10 different variations of the query corresponding to the 10 different parameters before the users come in to work. The task will be scheduled to align with the users' working hours in order to allow the warehouse cache to be used.

C.

Enable the search optimization service on the table. When the users execute the query, the search optimization service will automatically adjust the query execution plan based on the frequently-used parameters.

D.

Create a dedicated size Large warehouse for this particular set of queries. Create a new role that has USAGE permission on this warehouse and has the appropriate read permissions over the fact and dimension tables. Have users switch to this role and use this warehouse when they want to access this data.

Question # 12

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Select TWO).

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Question # 13

A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.

How can this data be shared?

A.

The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.

B.

By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.

C.

Contact Snowflake and they will execute the share request for the healthcare company.

D.

Set the share_restriction parameter on the shared object to false.

Question # 14

A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:

* Confirmed Private Link URLs are working by logging in with a username/password account

* Verified DNS resolution by running nslookups against Private Link URLs

* Validated connectivity using SnowCD

* Disabled public access using a network policy set to use the company’s IP address range

However, the following error message is received when using SSO to log into the company account:

IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.

What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)

A.

Alter the Azure security integration to use the Private Link URLs.

B.

Add the IP address in the error message to the allowed list in the network policy.

C.

Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.

D.

Update the configuration of the Azure AD SSO to use the Private Link URLs.

E.

Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.

Question # 15

Data is being imported and stored as JSON in a VARIANT column. Query performance was fine, but most recently, poor query performance has been reported.

What could be causing this?

A.

There were JSON nulls in the recent data imports.

B.

The order of the keys in the JSON was changed.

C.

The recent data imports contained fewer fields than usual.

D.

There were variations in string lengths for the JSON values in the recent data imports.

Question # 16

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

A.

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.

Use PURGE = TRUE in the COPY INTO command.

C.

Use PURGE = FALSE in the COPY INTO command.

D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.
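As context for these options, here is a hedged sketch of a COPY INTO statement using the PURGE option (the table and stage names are hypothetical). PURGE = TRUE deletes staged files after they load successfully, which avoids storage charges for files that are no longer needed:

COPY INTO sales_raw
  FROM @migration_stage/csv/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  PURGE = TRUE;  -- remove staged files after a successful load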

Question # 17

A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

Which requirements will be addressed with this approach? (Choose two.)

A.

There needs to be fewer objects per tenant.

B.

Security and Role-Based Access Control (RBAC) policies must be simple to configure.

C.

Compute costs must be optimized.

D.

Tenant data shape may be unique per tenant.

E.

Storage costs must be optimized.

Question # 18

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

A.

Create a clone of the primary database then replicate the database.

B.

Move the external tables to a database that is not replicated, then replicate the primary database.

C.

Replicate the database ensuring the replicated database is in the same region as the external tables.

D.

Share the primary database with an account in the same region that the database will be replicated to.

Question # 19

Is it possible for a data provider account with a Snowflake Business Critical edition to share data with an Enterprise edition data consumer account?

A.

A Business Critical account cannot be a data sharing provider to an Enterprise consumer. Any consumer accounts must also be Business Critical.

B.

If a user in the provider account with role authority to create or alter share adds an Enterprise account as a consumer, it can import the share.

C.

If a user in the provider account with a share-owning role sets share_restrictions to False when adding an Enterprise consumer account, it can import the share.

D.

If a user in the provider account with a share-owning role that also has the OVERRIDE SHARE RESTRICTIONS privilege sets share_restrictions to False when adding an Enterprise consumer account, it can import the share.

Question # 20

Database DB1 has schema S1 which has one table, T1.

DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

The user runs the following command:

Drop Database DB1;

What will the Time Travel retention period be for T1?

A.

10 days

B.

20 days

C.

30 days

D.

37 days
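For context, the retention periods in this scenario would have been set with statements along these lines (a sketch; DATA_RETENTION_TIME_IN_DAYS can be set at the account, database, schema, or table level, with the most specific setting applying while the object exists):

ALTER DATABASE DB1 SET DATA_RETENTION_TIME_IN_DAYS = 10;
ALTER SCHEMA DB1.S1 SET DATA_RETENTION_TIME_IN_DAYS = 20;
ALTER TABLE DB1.S1.T1 SET DATA_RETENTION_TIME_IN_DAYS = 30;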

Question # 21

What integration object should be used to place restrictions on where data may be exported?

A.

Stage integration

B.

Security integration

C.

Storage integration

D.

API integration

Question # 22

Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

A.

IDEF1X

B.

Schema-on-write

C.

Schema-on-read

D.

Information schema

Question # 23

Two queries are run on the customer_address table:

create or replace TABLE CUSTOMER_ADDRESS (
    CA_ADDRESS_SK NUMBER(38,0),
    CA_ADDRESS_ID VARCHAR(16),
    CA_STREET_NUMBER VARCHAR(10),
    CA_STREET_NAME VARCHAR(60),
    CA_STREET_TYPE VARCHAR(15),
    CA_SUITE_NUMBER VARCHAR(10),
    CA_CITY VARCHAR(60),
    CA_COUNTY VARCHAR(30),
    CA_STATE VARCHAR(2),
    CA_ZIP VARCHAR(10),
    CA_COUNTRY VARCHAR(20),
    CA_GMT_OFFSET NUMBER(5,2),
    CA_LOCATION_TYPE VARCHAR(20)
);

ALTER TABLE DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS ADD SEARCH OPTIMIZATION ON SUBSTRING(CA_ADDRESS_ID);

Which queries will benefit from the use of the search optimization service? (Select TWO).

A.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where substring(CA_ADDRESS_ID,1,8) = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,8);

B.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,16);

C.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%BAAASKD%';

D.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID LIKE '%PHPP%';

E.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS where CA_ADDRESS_ID NOT LIKE '%AAAAAAAAPHPPL%';
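To inspect which search methods and columns are active after such an ALTER, Snowflake provides a describe command (shown here for reference only, not as part of the question):

DESCRIBE SEARCH OPTIMIZATION ON DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS;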

Question # 24

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Question # 25

An Architect is using an event table associated with a Sales database (sales_db) to track logging and tracing of procedures and functions. The event table is also used to refresh dynamic tables.

A stored procedure causing issues resides in the Marketing database (marketing_db). Both databases are in the same Snowflake account. The Marketing database is not associated with a specific event table.

How can the Architect investigate the issue?

A.

Add a new event table and associate it with the Marketing database.

B.

Add a new event table and associate it with the Snowflake account.

C.

Query the event table SNOWFLAKE.TELEMETRY.EVENTS.

D.

Query the event table MARKETING_DB.TELEMETRY.EVENTS.

Question # 26

What is a characteristic of loading data into Snowflake using the Snowflake Connector for Kafka?

A.

The Connector only works in Snowflake regions that use AWS infrastructure.

B.

The Connector works with all file formats, including text, JSON, Avro, ORC, Parquet, and XML.

C.

The Connector creates and manages its own stage, file format, and pipe objects.

D.

Loads using the Connector will have lower latency than Snowpipe and will ingest data in real time.

Question # 27

A company needs to have the following features available in its Snowflake account:

1. Support for Multi-Factor Authentication (MFA)

2. A minimum of 2 months of Time Travel availability

3. Database replication in between different regions

4. Native support for JDBC and ODBC

5. Customer-managed encryption keys using Tri-Secret Secure

6. Support for Payment Card Industry Data Security Standards (PCI DSS)

In order to provide all the listed services, what is the MINIMUM Snowflake edition that should be selected during account creation?

A.

Standard

B.

Enterprise

C.

Business Critical

D.

Virtual Private Snowflake (VPS)

Question # 28

A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transactions files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

How can the near real-time results be provided to the category managers? (Select TWO).

A.

All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

B.

A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.

C.

A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

D.

An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

E.

The COPY INTO command with a task scheduled to run every second should be used to achieve the near real-time requirement.
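As background for the stream-and-task options above, a typical pattern in Snowflake looks like the following sketch (all object names are hypothetical; a newly created task must be resumed before it runs):

CREATE OR REPLACE STREAM sales_stream ON TABLE raw_sales;

CREATE OR REPLACE TASK sales_agg_task
  WAREHOUSE = transform_wh
  SCHEDULE = '1 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('SALES_STREAM')  -- skip runs when no new rows
AS
  INSERT INTO sales_results
    SELECT store_number, SUM(amount) AS total_sales
    FROM sales_stream
    GROUP BY store_number;

ALTER TASK sales_agg_task RESUME;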

Question # 29

A Snowflake Architect is designing an application and tenancy strategy for an organization where strong legal isolation rules as well as multi-tenancy are requirements.

Which approach will meet these requirements if Role-Based Access Control (RBAC) is a viable option for isolating tenants?

A.

Create accounts for each tenant in the Snowflake organization.

B.

Create an object for each tenant strategy if row level security is viable for isolating tenants.

C.

Create an object for each tenant strategy if row level security is not viable for isolating tenants.

D.

Create a multi-tenant table strategy if row level security is not viable for isolating tenants.

Question # 30

A company is storing large numbers of small JSON files (ranging from 1-4 bytes) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider.

What is the MOST cost-effective way to bring this data into a Snowflake table?

A.

An external table

B.

A pipe

C.

A stream

D.

A copy command at regular intervals
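For context, a pipe that loads files automatically as cloud event notifications arrive is created roughly as follows (a sketch with hypothetical names; AUTO_INGEST = TRUE also requires event notifications to be configured on the cloud storage side):

CREATE OR REPLACE PIPE iot_pipe AUTO_INGEST = TRUE AS
  COPY INTO iot_events
  FROM @iot_stage
  FILE_FORMAT = (TYPE = JSON);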

Question # 31

An Architect is designing a data lake with Snowflake. The company has structured, semi-structured, and unstructured data. The company wants to save the data inside the data lake within the Snowflake system. The company is planning on sharing data among its corporate branches using Snowflake data sharing.

What should be considered when sharing the unstructured data within Snowflake?

A.

A pre-signed URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with no time limit for the URL.

B.

A scoped URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 24-hour time limit for the URL.

C.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 7-day time limit for the URL.

D.

A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with the "expiration_time" argument defined for the URL time limit.
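As a point of reference, a scoped URL for a staged file is generated with BUILD_SCOPED_FILE_URL, and scoped URLs expire 24 hours after they are generated (the stage and path below are hypothetical):

SELECT BUILD_SCOPED_FILE_URL(@docs_stage, 'branch_a/report.pdf');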

Question # 32

When using the copy into <table> command with the CSV file format, how does the match_by_column_name parameter behave?

A.

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.

The parameter will be ignored.

C.

The command will return an error.

D.

The command will return a warning stating that the file has unmatched columns.

Question # 33

A Snowflake Architect created a new data share and would like to verify that only specific records in secure views are visible within the data share by the consumers.

What is the recommended way to validate data accessibility by the consumers?

A.

Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

CREATE MANAGED ACCOUNT reader_acct1 ADMIN_NAME = user1, ADMIN_PASSWORD = 'Sdfed43da!44', TYPE = READER;

B.

Create a row access policy as shown below and assign it to the data share.

CREATE OR REPLACE ROW ACCESS POLICY rap_acct AS (acct_id VARCHAR) RETURNS BOOLEAN ->
  CASE WHEN 'acct1_role' = CURRENT_ROLE() THEN TRUE ELSE FALSE END;

C.

Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'Consumer_Acct1';

D.

Alter the share settings as shown below, in order to impersonate a specific consumer account.

ALTER SHARE sales_share SET ACCOUNTS = 'Consumer1' SHARE_RESTRICTIONS = TRUE;

Question # 34

What are the purposes for creating a storage integration? (Choose three.)

A.

Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.

B.

Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.

C.

Support multiple external stages using one single Snowflake object.

D.

Avoid supplying credentials when creating a stage or when loading or unloading data.

E.

Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.

F.

Manage credentials from multiple cloud providers in one single Snowflake object.
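For context, a storage integration and a stage that uses it are created roughly as follows (a sketch; the integration name, role ARN, and bucket are hypothetical). Note that the stage definition supplies no credentials:

CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::001234567890:role/snowflake_access'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('s3://example-bucket/data/');

CREATE STAGE ext_stage
  URL = 's3://example-bucket/data/'
  STORAGE_INTEGRATION = s3_int;  -- no credentials supplied here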

Question # 35

What is a valid object hierarchy when building a Snowflake environment?

A.

Account --> Database --> Schema --> Warehouse

B.

Organization --> Account --> Database --> Schema --> Stage

C.

Account --> Schema --> Table --> Stage

D.

Organization --> Account --> Stage --> Table --> View

Question # 36

Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

A) [Query screenshot not available]

B) [Query screenshot not available]

C) [Query screenshot not available]

D) [Query screenshot not available]

A.

Option A

B.

Option B

C.

Option C

D.

Option D

Question # 37

You are a Snowflake Architect in an organization. The business team has come to you to deploy a use case that requires loading some data that they can visualize through Tableau. Every day new data comes in, and the old data is no longer required.

What type of table will you use in this case to optimize cost?

A.

TRANSIENT

B.

TEMPORARY

C.

PERMANENT
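For reference, a transient table is created with syntax like the sketch below (hypothetical names). Transient tables have no Fail-safe period and at most 1 day of Time Travel, which reduces storage costs for data that does not need recovery protection:

CREATE OR REPLACE TRANSIENT TABLE daily_sales_load (
  sale_id NUMBER,
  payload VARIANT
) DATA_RETENTION_TIME_IN_DAYS = 0;  -- no Time Travel retention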

Question # 38

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, the maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.

Which design will meet these requirements?

A.

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

Question # 39

User1 and User2 are new users that were granted different functional roles.

User1 was granted the IT_ANALYST_ROLE

User2 was granted the FIN_ANALYST_ROLE

Review the following security design (as shown in the diagram):

[Diagram: security design not shown]

A database (DB) grants USAGE and SELECT on all tables to DB_IT_RO_ROLE

DB_IT_RO_ROLE is granted to IT_ANALYST_ROLE

IT_SCHEMA contains TABLE1

FINANCE_SCHEMA grants USAGE and SELECT to DB_FIN_ROLE

DB_FIN_ROLE is granted to FIN_ANALYST_ROLE

FINANCE_SCHEMA contains FIN_TABLE

Which tables can each user read?

A.

User1 will be the only user able to read tables from both schemas, since the DB_IT_RO_ROLE has SELECT privileges on all database tables.

B.

User1 will be able to read tables from both schemas, while User2 will be able to read only the FINANCE_SCHEMA tables.

C.

User2 will be able to read tables from the FINANCE_SCHEMA, while User1 will be unable to read any table.

D.

User2 will be able to read tables from both schemas, while User1 will be able to read tables only in IT_SCHEMA.

Question # 40

A company is following the Data Mesh principles, including domain separation, and chose one Snowflake account for its data platform.

An Architect created two data domains to produce two data products. The Architect needs a third data domain that will use both of the data products to create an aggregate data product. The read access to the data products will be granted through a separate role.

Based on the Data Mesh principles, how should the third domain be configured to create the aggregate product if it has been granted the two read roles?

A.

Use secondary roles for all users.

B.

Create a hierarchy between the two read roles.

C.

Request a technical ETL user with the sysadmin role.

D.

Request that the two data domains share data using the Data Exchange.

Question # 41

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file has Parquet-formatted data, and the other has CSV-formatted data.

How should the data be joined and aggregated to produce a final result set?

A.

Use Snowpipe to ingest the two files, then create a materialized view to produce the final result set.

B.

Create a task using Snowflake scripting that will import the files, and then call a User-Defined Function (UDF) to produce the final result set.

C.

Create a JavaScript stored procedure to read, join, and aggregate the data directly from the external stage, and then store the results in a table.

D.

Create a materialized view to read, join, and aggregate the data directly from the external stage, and use the view to produce the final result set.

Question # 42

A company has several sites in different regions from which the company wants to ingest data.

Which of the following will enable this type of data ingestion?

A.

The company must have a Snowflake account in each cloud region to be able to ingest data to that account.

B.

The company must replicate data between Snowflake accounts.

C.

The company should provision a reader account to each site and ingest the data through the reader accounts.

D.

The company should use a storage integration for the external stage.

Question # 43

Role A has the following permissions:

. USAGE on db1

. USAGE and CREATE VIEW on schema1 in db1

. SELECT on table1 in schema1

Role B has the following permissions:

. USAGE on db2

. USAGE and CREATE VIEW on schema2 in db2

. SELECT on table2 in schema2

A user has Role A set as the primary role and Role B as a secondary role.

What command will fail for this user?

A.

use database db1; use schema schema1; create view v1 as select * from db2.schema2.table2;

B.

use database db2; use schema schema2; create view v2 as select * from db1.schema1.table1;

C.

use database db2; use schema schema2; select * from db1.schema1.table1 union select * from table2;

D.

use database db1; use schema schema1; select * from db2.schema2.table2;
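As background, a session's secondary roles are enabled with USE SECONDARY ROLES, and the current setting can be inspected with CURRENT_SECONDARY_ROLES() (shown for reference only):

USE SECONDARY ROLES ALL;
SELECT CURRENT_SECONDARY_ROLES();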

Question # 44

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

A.

1. Create a share and add the database privileges to the share
2. Create a new listing on the Snowflake Marketplace
3. Alter the listing and add the share
4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace

B.

1. Ask the customer to create a new Snowflake account in Azure East US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Alter the share and add the customer's Snowflake account to the share

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia)
2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared
3. Create a share and add the database privileges to the share
4. Alter the share and add the customer's Snowflake account to the share

D.

1. Create a reader account in Azure East US 2 (Virginia)
2. Create a share and add the database privileges to the share
3. Add the reader account to the share
4. Share the reader account's URL and credentials with the customer

Question # 45

Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

A.

Choose columns that are frequently used in join predicates.

B.

Choose lower cardinality columns to support clustering keys and cost effectiveness.

C.

Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.

D.

Choose cluster columns that are most actively used in selective filters.

E.

Choose cluster columns that are actively used in the GROUP BY clauses.

Question # 46

An Architect has selected the Snowflake Connector for Python to integrate and manipulate Snowflake data using Python to handle large data sets and complex analyses.

Which features should the Architect consider in terms of query execution and data type conversion? (Select TWO).

A.

The large queries will require conn.cursor() to execute.

B.

The Connector supports asynchronous and synchronous queries.

C.

The Connector converts NUMBER data types to DECIMAL by default.

D.

The Connector converts Snowflake data types to native Python data types by default.

E.

The Connector converts data types to STRING by default.

Question # 47

A retailer's enterprise data organization is exploring the use of Data Vault 2.0 to model its data lake solution. A Snowflake Architect has been asked to provide recommendations for using Data Vault 2.0 on Snowflake.

What should the Architect tell the data organization? (Select TWO).

A.

Change data capture can be performed using the Data Vault 2.0 HASH_DIFF concept.

B.

Change data capture can be performed using the Data Vault 2.0 HASH_DELTA concept.

C.

Using the multi-table insert feature in Snowflake, multiple Point-in-Time (PIT) tables can be loaded in parallel from a single join query from the data vault.

D.

Using the multi-table insert feature, multiple Point-in-Time (PIT) tables can be loaded sequentially from a single join query from the data vault.

E.

There are performance challenges when using Snowflake to load multiple Point-in-Time (PIT) tables in parallel from a single join query from the data vault.

Question # 48

An Architect executes the following statements in order:

CREATE TABLE emp (id INTEGER);

INSERT INTO emp VALUES (1),(2);

CREATE TEMPORARY TABLE emp (id INTEGER);

INSERT INTO emp VALUES (1);

Then executes:

SELECT COUNT(*) FROM emp;

DROP TABLE emp;

SELECT COUNT(*) FROM emp;

What will be the result?

A.

COUNT(*) = 2

COUNT(*) = 1

B.

COUNT(*) = 1

COUNT(*) = 2

C.

COUNT(*) = 2

COUNT(*) = 2

D.

The final query results in an error.

Question # 49

Which Snowflake data modeling approach is designed for BI queries?

A.

3NF

B.

Star schema

C.

Data Vault

D.

Snowflake schema

Question # 50

An Architect with the ORGADMIN role wants to change a Snowflake account from an Enterprise edition to a Business Critical edition.

How should this be accomplished?

A.

Run an ALTER ACCOUNT command and create a tag of EDITION and set the tag to Business Critical.

B.

Use the account's ACCOUNTADMIN role to change the edition.

C.

Failover to a new account in the same region and specify the new account's edition upon creation.

D.

Contact Snowflake Support and request that the account's edition be changed.

Question # 51

A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow the required IP address to access Snowflake? (Select TWO).

A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY = 'ANALYST_POLICY';

B.

ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';

C.

ALTER USER ANALYST_USER SET NETWORK_POLICY = '10.1.1.20';

D.

USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

E.

USE ROLE USERADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

Question # 52

An Architect wants to stream website logs near real time to Snowflake using the Snowflake Connector for Kafka.

What characteristics should the Architect consider regarding the different ingestion methods? (Select TWO).

A.

Snowpipe Streaming is the default ingestion method.

B.

Snowpipe Streaming supports schema detection.

C.

Snowpipe has lower latency than Snowpipe Streaming.

D.

Snowpipe Streaming automatically flushes data every one second.

E.

Snowflake can handle jumps or resetting offsets by default.

Question # 53

When activating Tri-Secret Secure in a hierarchical encryption model in a Snowflake account, at what level is the customer-managed key used?

[Diagram: hierarchical encryption key model not shown]

A.

At the root level (HSM)

B.

At the account level (AMK)

C.

At the table level (TMK)

D.

At the micro-partition level

Question # 54

Which SQL alter command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?

A) [SQL command not shown]

B) [SQL command not shown]

C) [SQL command not shown]

D) [SQL command not shown]

A.

Option A

B.

Option B

C.

Option C

D.

Option D
