
SAA-C02 PDF

$49.5

$109.99

3 Months Free Update

  • Printable Format
  • Value for Money
  • 100% Pass Assurance
  • Verified Answers
  • Researched by Industry Experts
  • Based on Real Exam Scenarios
  • 100% Real Questions

SAA-C02 PDF + Testing Engine

$79.2

$175.99

3 Months Free Update

  • Exam Name: AWS Certified Solutions Architect - Associate (SAA-C02)
  • Last Update: May 13, 2022
  • Questions and Answers: 640
  • Free Real Questions Demo
  • Recommended by Industry Experts
  • Best Economical Package
  • Immediate Access

SAA-C02 Engine

$59.4

$131.99

3 Months Free Update

  • Best Testing Engine
  • One-Click Installation
  • Recommended by Teachers
  • Easy to Use
  • 3 Modes of Learning
  • State-of-the-Art Technology
  • 100% Real Questions Included

SAA-C02 AWS Certified Solutions Architect - Associate (SAA-C02) Questions and Answers

Question # 6

A company created and hosts a legacy software application for its customers. The application runs on a dedicated Linux server for each customer. The application stores no persistent data except for MySQL data.

The company experienced some data corruption issues in the past and wants to move the application to AWS. The company needs to implement a solution to optimize the stability of the application. The solution also must give the company the ability to restore a customer's database to a specific point in time. The company will migrate customer data by using AWS Database Migration Service (AWS DMS).

Which architecture should a solutions architect recommend to meet these requirements?

A.

Set up a shared Amazon Aurora database. Configure an Amazon EC2 launch template for each customer.

B.

Set up a shared Amazon Aurora database. Create an Amazon EC2 Amazon Machine Image (AMI) for each customer. Use the AMI to launch the application.

C.

Set up an Amazon RDS database and an Amazon EC2 instance for each customer. Download the installation script. Run the script to install and configure the application.

D.

Set up an Amazon RDS database for each customer. Deploy the application by using an Amazon EC2 launch template. Use user data to configure the customer-specific data.

Full Access
Question # 7

A company recently migrated multiple applications and databases from an on-premises data center to the AWS Cloud. Most of the applications run on AWS Fargate, and some of the applications run on Amazon EC2 instances. Most of the databases run on Amazon RDS, and a small number of databases run on EC2 instances.

All the applications and databases must be available 24 hours a day, 7 days a week. The company uses AWS Organizations to manage AWS accounts. A solutions architect must recommend how to minimize the cost of these workloads over the next 3 years.

Which solution meets these requirements?

A.

Purchase All Upfront Reserved Instances with a 3-year term for Amazon EC2 and Fargate

B.

Purchase All Upfront Reserved Instances with a 3-year term for Amazon EC2 and Amazon RDS

C.

Purchase All Upfront Compute Savings Plans with a 3-year term for Amazon EC2 and Fargate. Purchase All Upfront Reserved Instances with a 3-year term for Amazon RDS.

D.

Purchase All Upfront EC2 Instance Savings Plans with a 3-year term for Amazon EC2 and Fargate. Purchase All Upfront Reserved Instances with a 3-year term for Amazon RDS.

Full Access
Question # 8

A three-tier web application processes orders from customers. The web tier consists of Amazon EC2 instances behind an Application Load Balancer, a middle tier of three EC2 instances decoupled from the web tier using Amazon SQS, and an Amazon DynamoDB backend. At peak times, customers who submit orders using the site have to wait much longer than normal to receive confirmations due to lengthy processing times. A solutions architect needs to reduce these processing times.

Which action will be MOST effective in accomplishing this?

A.

Replace the SQS queue with Amazon Kinesis Data Firehose

B.

Use Amazon ElastiCache for Redis in front of the DynamoDB backend tier

C.

Add an Amazon CloudFront distribution to cache the responses for the web tier.

D.

Use Amazon EC2 Auto Scaling to scale out the middle tier instances based on the SQS queue depth

Full Access
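
Note: scaling a middle tier on SQS queue depth (the approach named in option D) is typically done by publishing a backlog-per-instance custom metric that a target tracking policy can follow. The Python (boto3) sketch below illustrates the idea only; the queue URL, Auto Scaling group name, and metric namespace are hypothetical.

import boto3

sqs = boto3.client('sqs')
autoscaling = boto3.client('autoscaling')
cloudwatch = boto3.client('cloudwatch')

QUEUE_URL = 'https://sqs.us-east-1.amazonaws.com/123456789012/orders'  # hypothetical
ASG_NAME = 'middle-tier-asg'                                           # hypothetical

# Current queue depth.
attrs = sqs.get_queue_attributes(
    QueueUrl=QUEUE_URL,
    AttributeNames=['ApproximateNumberOfMessages'],
)
backlog = int(attrs['Attributes']['ApproximateNumberOfMessages'])

# Number of instances currently in the middle-tier Auto Scaling group.
asg = autoscaling.describe_auto_scaling_groups(AutoScalingGroupNames=[ASG_NAME])
instances = max(len(asg['AutoScalingGroups'][0]['Instances']), 1)

# Publish backlog per instance; a target tracking scaling policy on this
# metric keeps the per-instance backlog near a chosen target value.
cloudwatch.put_metric_data(
    Namespace='OrderProcessing',          # hypothetical namespace
    MetricData=[{
        'MetricName': 'BacklogPerInstance',
        'Value': backlog / instances,
        'Unit': 'Count',
    }],
)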
Question # 9

A company hosts its enterprise content management platform in one AWS Region but needs to operate the platform across multiple Regions. The company has an Amazon Elastic Kubernetes Service (Amazon EKS) cluster that runs its microservices. The EKS cluster stores and retrieves objects from Amazon S3. The EKS cluster also stores and retrieves metadata from Amazon DynamoDB.

Which combination of steps should a solutions architect take to deploy the platform across multiple Regions? (Select TWO.)

A.

Replicate the EKS cluster with cross-Region replication.

B.

Use Amazon API Gateway to create a global endpoint to the EKS cluster

C.

Use AWS Global Accelerator endpoints to distribute the traffic to multiple Regions

D.

Use Amazon S3 access points to give access to the objects across multiple Regions. Configure DynamoDB Accelerator (DAX). Connect DAX to the relevant tables.

E.

Deploy an EKS cluster and an S3 bucket in another Region. Configure cross-Region replication on both S3 buckets. Turn on global tables for DynamoDB.

Full Access
Question # 10

A company wants to build an immutable infrastructure for its software applications. The company wants to test the software applications before sending traffic to them. The company seeks an efficient solution that limits the effects of application bugs.

Which combination of steps should a solutions architect recommend? (Select TWO.)

A.

Use AWS CloudFormation to update the production infrastructure and roll back the stack if the update fails

B.

Apply Amazon Route 53 weighted routing to test the staging environment and gradually increase the traffic as the tests pass

C.

Apply Amazon Route 53 failover routing to test the staging environment and fail over to the production environment if the tests pass

D.

Use AWS CloudFormation with a parameter set to the staging value in a separate environment other than the production environment

E.

Use AWS CloudFormation to deploy the staging environment with a snapshot deletion policy and reuse the resources in the production environment if the tests pass

Full Access
Question # 11

A company has three AWS accounts: Management, Development, and Production. These accounts use AWS services only in the us-east-1 Region. All accounts have a VPC with VPC Flow Logs configured to publish data to an Amazon S3 bucket in each separate account. For compliance reasons, the company needs an ongoing method to aggregate all the VPC flow logs across all accounts into one destination S3 bucket in the Management account.

What should a solutions architect do to meet these requirements with the LEAST operational overhead?

A.

Add S3 Same-Region Replication rules in each S3 bucket that stores VPC flow logs to replicate objects to the destination S3 bucket. Configure the destination S3 bucket to allow objects to be received from the S3 buckets in other accounts.

B.

Set up an IAM user in the Management account. Grant permissions to the IAM user to access the S3 buckets that contain the VPC flow logs. Run the aws s3 sync command in the AWS CLI to copy the objects to the destination S3 bucket.

C.

Use an S3 inventory report to specify which objects in the S3 buckets to copy. Perform an S3 batch operation to copy the objects into the destination S3 bucket in the Management account with a single request.

D.

Create an AWS Lambda function in the Management account. Grant S3 GET permissions on the source S3 buckets. Grant S3 PUT permissions on the destination S3 bucket. Configure the function to invoke when objects are loaded in the source S3 buckets.

Full Access
Question # 12

A startup company is using the AWS Cloud to develop a traffic control monitoring system for a large city. The system must be highly available and must provide near-real-time results for residents and city officials even during peak events.

Gigabytes of data will come in daily from IoT devices that run at intersections and freeway ramps across the city. The system must process the data sequentially to provide the correct timeline. However, results need to show only what has happened in the last 24 hours.

Which solution will meet these requirements MOST cost-effectively?

A.

Deploy Amazon Kinesis Data Firehose to accept incoming data from the IoT devices and write the data to Amazon S3. Build a web dashboard to display the data from the last 24 hours.

B.

Deploy an Amazon API Gateway API endpoint and an AWS Lambda function to process incoming data from the IoT devices and store the data in Amazon DynamoDB. Build a web dashboard to display the data from the last 24 hours.

C.

Deploy an Amazon API Gateway API endpoint and an Amazon Simple Notification Service (Amazon SNS) topic to process incoming data from the IoT devices. Write the data to Amazon Redshift. Build a web dashboard to display the data from the last 24 hours.

D.

Deploy an Amazon Simple Queue Service (Amazon SQS) FIFO queue and an AWS Lambda function to process incoming data from the IoT devices and store the data in an Amazon RDS DB instance. Build a web dashboard to display the data from the last 24 hours.

Full Access
Question # 13

A company serves content to its subscribers across the world using an application running on AWS. The application has several Amazon EC2 instances in a private subnet behind an Application Load Balancer (ALB). Due to a recent change in copyright restrictions, the chief information officer (CIO) wants to block access for certain countries.

Which action will meet these requirements?

A.

Modify the ALB security group to deny incoming traffic from blocked countries

B.

Modify the security group for EC2 instances to deny incoming traffic from blocked countries

C.

Use Amazon CloudFront to serve the application and deny access to blocked countries

D.

Use ALB listener rules to return access denied responses to incoming traffic from blocked countries

Full Access
Question # 14

A company recently deployed a new auditing system to centralize information about operating system versions, patching, and installed software for Amazon EC2 instances. A solutions architect must ensure all instances provisioned through EC2 Auto Scaling groups successfully send reports to the auditing system as soon as they are launched and terminated.

Which solution achieves these goals MOST efficiently?

A.

Use a scheduled AWS Lambda function and run a script remotely on all EC2 instances to send data to the audit system.

B.

Use EC2 Auto Scaling lifecycle hooks to run a custom script to send data to the audit system when instances are launched and terminated

C.

Use an EC2 Auto Scaling launch configuration to run a custom script through user data to send data to the audit system when instances are launched and terminated

D.

Run a custom script on the instance operating system to send data to the audit system. Configure the script to be invoked by the EC2 Auto Scaling group when the instance starts and is terminated.

Full Access
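
Note: lifecycle hooks of the kind described in option B are attached to the Auto Scaling group so that launch and terminate events pause until a script has reported to the audit system. A rough boto3 sketch follows; the group name, SNS topic, and IAM role ARNs are placeholders.

import boto3

autoscaling = boto3.client('autoscaling')

# Pause instance launch and termination until the audit report is sent.
for transition, hook_name in [
    ('autoscaling:EC2_INSTANCE_LAUNCHING', 'audit-on-launch'),
    ('autoscaling:EC2_INSTANCE_TERMINATING', 'audit-on-terminate'),
]:
    autoscaling.put_lifecycle_hook(
        AutoScalingGroupName='web-asg',                                    # placeholder
        LifecycleHookName=hook_name,
        LifecycleTransition=transition,
        NotificationTargetARN='arn:aws:sns:us-east-1:123456789012:audit',  # placeholder
        RoleARN='arn:aws:iam::123456789012:role/asg-lifecycle-hook',       # placeholder
        HeartbeatTimeout=300,
        DefaultResult='CONTINUE',
    )

# After the custom script finishes reporting, the instance releases the hook:
# autoscaling.complete_lifecycle_action(
#     LifecycleHookName='audit-on-launch', AutoScalingGroupName='web-asg',
#     InstanceId=instance_id, LifecycleActionResult='CONTINUE')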
Question # 15

A company has a remote factory that has unreliable connectivity. The factory needs to gather and process machine data and sensor data so that it can sense products on its conveyor belts and initiate a robotic movement to direct the products to the right location. Predictable low-latency compute processing is essential for the on-premises control systems.

Which solution should the factory use to process the data?

A.

Amazon CloudFront Lambda@Edge functions

B.

An Amazon EC2 instance that has enhanced networking enabled

C.

An Amazon EC2 instance that uses an AWS Global Accelerator endpoint

D.

An Amazon Elastic Block Store (Amazon EBS) volume on an AWS Snowball Edge cluster

Full Access
Question # 16

A company is planning to migrate its virtual server-based workloads to AWS. The company has internet-facing load balancers backed by application servers. The application servers rely on patches from an internet-hosted repository.

Which services should a solutions architect recommend be hosted on the public subnet? (Select TWO.)

A.

NAT gateway

B.

Amazon RDS DB instances

C.

Application Load Balancers

D.

Amazon EC2 application servers

E.

Amazon Elastic File System (Amazon EFS) volumes

Full Access
Question # 17

A company hosts a marketing website in an on-premises data center. The website consists of static documents and runs on a single server. An administrator updates the website content infrequently and uses an SFTP client to upload new documents.

The company decides to host its website on AWS and to use Amazon CloudFront. The company's solutions architect creates a CloudFront distribution. The solutions architect must design the most cost-effective and resilient architecture for website hosting to serve as the CloudFront origin

Which solution will meet these requirements?

A.

Create a virtual server by using Amazon Lightsail. Configure the web server in the Lightsail instance. Upload website content by using an SFTP client.

B.

Create an AWS Auto Scaling group for Amazon EC2 instances. Use an Application Load Balancer. Upload website content by using an SFTP client.

C.

Create a private Amazon S3 bucket. Use an S3 bucket policy to allow access from a CloudFront origin access identity (OAI). Upload website content by using the AWS CLI.

D.

Create a public Amazon S3 bucket. Configure AWS Transfer for SFTP. Configure the S3 bucket for website hosting. Upload website content by using the SFTP client.

Full Access
Question # 18

A medical company is designing a new application that gathers symptoms from patients. The company has decided to use Amazon Simple Queue Service (Amazon SQS) and Amazon Simple Notification Service (Amazon SNS) in the architecture.

A solutions architect is reviewing the infrastructure design. Data must be encrypted while at rest and in transit. Only authorized personnel of the company can access the data.

Which combination of steps should the solutions architect take to meet these requirements? (Select TWO.)

A.

Turn on server-side encryption on the SQS components. Update the default key policy to restrict key usage to a set of authorized principals.

B.

Turn on server-side encryption on the SNS components by using a custom CMK. Apply a key policy to restrict key usage to a set of authorized principals.

C.

Turn on encryption on the SNS components. Update the default key policy to restrict key usage to a set of authorized principals. Set a condition in the topic policy to allow only encrypted connections over TLS.

D.

Turn on server-side encryption on the SQS components by using a custom CMK. Apply a key policy to restrict key usage to a set of authorized principals. Set a condition in the queue policy to allow only encrypted connections over TLS.

E.

Turn on server-side encryption on the SQS components by using a custom CMK. Apply an IAM policy to restrict key usage to a set of authorized principals. Set a condition in the queue policy to allow only encrypted connections over TLS.

Full Access
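
Note: for reference, enabling SQS server-side encryption with a customer managed CMK and restricting the queue to TLS-only connections could look like the boto3 sketch below; the queue name, key alias, and account number are made up.

import json
import boto3

sqs = boto3.client('sqs')

# Queue encrypted at rest with a customer managed KMS key (alias is a placeholder).
queue = sqs.create_queue(
    QueueName='patient-symptoms',
    Attributes={'KmsMasterKeyId': 'alias/patient-data-cmk'},
)

# Deny any request that does not arrive over TLS (encryption in transit).
policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'DenyNonTLS',
        'Effect': 'Deny',
        'Principal': '*',
        'Action': 'sqs:*',
        'Resource': 'arn:aws:sqs:us-east-1:123456789012:patient-symptoms',
        'Condition': {'Bool': {'aws:SecureTransport': 'false'}},
    }],
}
sqs.set_queue_attributes(
    QueueUrl=queue['QueueUrl'],
    Attributes={'Policy': json.dumps(policy)},
)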
Question # 19

A solutions architect needs to host a high performance computing (HPC) workload in the AWS Cloud. The workload will run on hundreds of Amazon EC2 instances and will require parallel access to a shared file system to enable distributed processing of large datasets. Datasets will be accessed across multiple instances simultaneously. The workload requires access latency within 1 ms. After processing has completed, engineers will need access to the dataset for manual postprocessing.

Which solution will meet these requirements?

A.

Use Amazon Elastic File System (Amazon EFS) as a shared file system. Access the dataset from Amazon EFS.

B.

Mount an Amazon S3 bucket to serve as the shared file system. Perform postprocessing directly from the S3 bucket.

C.

Use Amazon FSx for Lustre as a shared file system. Link the file system to an Amazon S3 bucket for postprocessing.

D.

Configure AWS Resource Access Manager to share an Amazon S3 bucket so that it can be mounted to all instances for processing and postprocessing.

Full Access
Question # 20

A company runs an application on several Amazon EC2 instances that store persistent data on an Amazon Elastic File System (Amazon EFS) file system. The company needs to replicate the data to another AWS Region by using an AWS managed service solution

Which solution will meet these requirements MOST cost-effectively?

A.

Use the EFS-to-EFS backup solution to replicate the data to an EFS file system in another Region

B.

Run a nightly script to copy data from the EFS file system to an Amazon S3 bucket. Enable S3 Cross-Region Replication on the S3 bucket.

C.

Create a VPC in another Region. Establish a cross-Region VPC peering connection. Run a nightly rsync to copy data from the original Region to the new Region.

D.

Use AWS Backup to create a backup plan with a rule that takes a daily backup and replicates it to another Region. Assign the EFS file system resource to the backup plan.

Full Access
Question # 21

A company has a web application that users access from around the world. The company has web servers in multiple AWS Regions to support the traffic. A solutions architect must configure an Amazon Route 53 routing policy to send traffic to only the active web servers.

Which configuration meets this requirement?

A.

Create a simple routing policy that uses health checks for each Region

B.

Create a multivalue answer routing policy that uses health checks for each Region

C.

Create a geoproximity routing policy with a health check bias of 99 for each Region

D.

Create a weighted routing policy with a health check weight of 100 for each Region

Full Access
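
Note: a multivalue answer record set returns only healthy endpoints when each value is tied to a health check. The boto3 sketch below shows one such record for a single Region; the hosted zone ID, domain, health check ID, and IP address are placeholders.

import boto3

route53 = boto3.client('route53')

route53.change_resource_record_sets(
    HostedZoneId='Z0EXAMPLE',                     # placeholder hosted zone
    ChangeBatch={'Changes': [{
        'Action': 'UPSERT',
        'ResourceRecordSet': {
            'Name': 'www.example.com',
            'Type': 'A',
            'SetIdentifier': 'us-east-1-web',     # one record set per Region
            'MultiValueAnswer': True,
            'TTL': 60,
            'HealthCheckId': 'abcd1234-example',  # health check for this Region's server
            'ResourceRecords': [{'Value': '203.0.113.10'}],
        },
    }]},
)

Repeating the record set with a different SetIdentifier, health check, and IP address for each Region lets Route 53 answer only with the Regions that are currently passing their checks.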
Question # 22

A company's web application uses an Amazon RDS PostgreSQL DB instance to store its application data. During the financial closing period at the start of every month, accountants run large queries that impact the database's performance due to high usage. The company wants to minimize the impact that the reporting activity has on the web application.

What should a solutions architect do to reduce the impact on the database with the LEAST amount of effort?

A.

Create a read replica and direct reporting traffic to the replica

B.

Create a Multi-AZ database and direct reporting traffic to the standby

C.

Create a cross-Region read replica and direct reporting traffic to the replica.

D.

Create an Amazon Redshift database and direct reporting traffic to the Amazon Redshift database

Full Access
Question # 23

A mobile gaming company runs application servers on Amazon EC2 instances. The servers receive updates from players every 15 minutes. The mobile game creates a JSON object of the progress made in the game since the last update and sends the JSON object to an Application Load Balancer. As the mobile game is played, game updates are being lost. The company wants to create a durable way to get the updates in order.

What should a solutions architect recommend to decouple the system?

A.

Use Amazon Kinesis Data Streams to capture the data and store the JSON object in Amazon S3

B.

Use Amazon Kinesis Data Firehose to capture the data and store the JSON object in Amazon S3

C.

Use Amazon Simple Queue Service (Amazon SQS) FIFO queues to capture the data and EC2 instances to process the messages in the queue

D.

Use Amazon Simple Notification Service (Amazon SNS) to capture the data and EC2 instances to process the messages sent to the Application Load Balancer

Full Access
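
Note: an SQS FIFO queue preserves ordering within a message group, which matches the per-player ordering described above. A brief boto3 sketch (queue name and payload are illustrative):

import json
import boto3

sqs = boto3.client('sqs')

# FIFO queue; content-based deduplication avoids storing duplicate updates.
queue = sqs.create_queue(
    QueueName='game-progress.fifo',
    Attributes={'FifoQueue': 'true', 'ContentBasedDeduplication': 'true'},
)

# Each player maps to a message group, so that player's updates stay in order.
sqs.send_message(
    QueueUrl=queue['QueueUrl'],
    MessageBody=json.dumps({'player_id': 'p-42', 'level': 7, 'score': 1850}),
    MessageGroupId='p-42',
)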
Question # 24

A company has an application that collects data from IoT sensors on automobiles. The data is streamed and stored in Amazon S3 through Amazon Kinesis Data Firehose. The data produces trillions of S3 objects each year. Each morning, the company uses the data from the previous 30 days to retrain a suite of machine learning (ML) models.

Four times each year, the company uses the data from the previous 12 months to perform analysis and train other ML models The data must be available with minimal delay for up to 1 year. After 1 year, the data must be retained for archival purposes.

Which storage solution meets these requirements MOST cost-effectively?

A.

Use the S3 Intelligent-Tiering storage class. Create an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive after 1 year

B.

Use the S3 Intelligent-Tiering storage class. Configure S3 Intelligent-Tiering to automatically move objects to S3 Glacier Deep Archive after 1 year.

C.

Use the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive after 1 year.

D.

Use the S3 Standard storage class. Create an S3 Lifecycle policy to transition objects to S3 Standard-Infrequent Access (S3 Standard-IA) after 30 days, and then to S3 Glacier Deep Archive after 1 year.

Full Access
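
Note: lifecycle transitions like the ones described in option D are configured as bucket rules. A minimal boto3 sketch, with a hypothetical bucket name:

import boto3

s3 = boto3.client('s3')

s3.put_bucket_lifecycle_configuration(
    Bucket='vehicle-sensor-archive',   # hypothetical bucket
    LifecycleConfiguration={'Rules': [{
        'ID': 'tier-then-archive',
        'Status': 'Enabled',
        'Filter': {'Prefix': ''},      # apply to every object
        'Transitions': [
            {'Days': 30, 'StorageClass': 'STANDARD_IA'},    # after the 30-day retraining window
            {'Days': 365, 'StorageClass': 'DEEP_ARCHIVE'},  # archive after 1 year
        ],
    }]},
)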
Question # 25

A company's legacy application is currently relying on a single-instance Amazon RDS MySQL database without encryption. Due to new compliance requirements, all existing and new data in this database must be encrypted.

How should this be accomplished?

A.

Create an Amazon S3 bucket with server-side encryption enabled. Move all the data to Amazon S3. Delete the RDS instance.

B.

Enable RDS Multi-AZ mode with encryption at rest enabled. Perform a failover to the standby instance to delete the original instance.

C.

Take a snapshot of the RDS instance. Create an encrypted copy of the snapshot. Restore the RDS instance from the encrypted snapshot.

D.

Create an RDS read replica with encryption at rest enabled. Promote the read replica to master and switch the application over to the new master. Delete the old RDS instance.

Full Access
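
Note: because an existing unencrypted RDS instance cannot be encrypted in place, the snapshot-copy-restore path in option C is the usual route. A rough boto3 sketch with placeholder identifiers:

import boto3

rds = boto3.client('rds')

# 1. Snapshot the unencrypted instance.
rds.create_db_snapshot(
    DBInstanceIdentifier='legacy-mysql',
    DBSnapshotIdentifier='legacy-mysql-plain',
)
rds.get_waiter('db_snapshot_available').wait(DBSnapshotIdentifier='legacy-mysql-plain')

# 2. Copy the snapshot with encryption enabled (KMS key alias is a placeholder).
rds.copy_db_snapshot(
    SourceDBSnapshotIdentifier='legacy-mysql-plain',
    TargetDBSnapshotIdentifier='legacy-mysql-encrypted',
    KmsKeyId='alias/rds-data-key',
)
rds.get_waiter('db_snapshot_available').wait(DBSnapshotIdentifier='legacy-mysql-encrypted')

# 3. Restore a new, encrypted instance from the encrypted snapshot.
rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier='legacy-mysql-v2',
    DBSnapshotIdentifier='legacy-mysql-encrypted',
)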
Question # 26

A company runs an application on a group of Amazon Linux EC2 instances. For compliance reasons, the company must retain all application log files for 7 years. The log files will be analyzed by a reporting tool that must be able to access all the files concurrently.

Which storage solution meets these requirements MOST cost-effectively?

A.

Amazon Elastic Block Store (Amazon EBS)

B.

Amazon Elastic File System (Amazon EFS)

C.

Amazon EC2 instance store

D.

Amazon S3

Full Access
Question # 27

A company captures ordered clickstream data from multiple websites and uses batch processing to analyze the data. The company receives 100 million event records, all approximately 1 KB in size, each day. The company loads the data into Amazon Redshift each night, and business analysts consume the data.

The company wants to move toward near-real-time data processing for timely insights. The solution should process the streaming data while requiring the least possible operational overhead.

Which combination of AWS services will meet these requirements MOST cost-effectively? (Select TWO.)

A.

Amazon EC2

B.

AWS Batch

C.

Amazon Simple Queue Service (Amazon SQS)

D.

Amazon Kinesis Data Firehose

E.

Amazon Kinesis Data Analytics

Full Access
Question # 28

A company needs to connect its on-premises data center network to a new VPC. The data center network has a 100 Mbps symmetrical internet connection. An application that is running on premises will transfer multiple gigabytes of data each day. The application will use an Amazon Kinesis Data Firehose delivery stream for processing

What should a solutions architect recommend for maximum performance?

A.

Create a VPC peering connection between the on-premises network and the VPC. Configure routing for the on-premises network to use the VPC peering connection.

B.

Procure an AWS Snowball Edge Storage Optimized device. After several days' worth of data has accumulated, copy the data to the device and ship the device to AWS for expedited transfer to Kinesis Data Firehose. Repeat as needed.

C.

Create an AWS Site-to-Site VPN connection between the on-premises network and the VPC. Configure BGP routing between the customer gateway and the virtual private gateway. Use the VPN connection to send the data from on premises to Kinesis Data Firehose.

D.

Use AWS PrivateLink to create an interface VPC endpoint for Kinesis Data Firehose in the VPC. Set up a 1 Gbps AWS Direct Connect connection between the on-premises network and AWS. Use the PrivateLink endpoint to send the data from on premises to Kinesis Data Firehose.

Full Access
Question # 29

A company needs to store 160 TB of data for an indefinite period of time. The company must be able to use standard SQL and business intelligence tools to query all of the data. The data will be queried no more than twice each month.

What is the MOST cost-effective solution that meets these requirements?

A.

Store the data in Amazon Aurora Serverless with MySQL. Use a SQL client to query the data.

B.

Store the data in Amazon S3. Use AWS Glue, Amazon Athena, and JDBC and ODBC drivers to query the data.

C.

Store the data in an Amazon EMR cluster with EMR File System (EMRFS) as the storage layer. Use Apache Presto to query the data.

D.

Store a subset of the data in Amazon Redshift, and store the remaining data in Amazon S3. Use Amazon Redshift Spectrum to query the S3 data.

Full Access
Question # 30

A medical records company is hosting an application on Amazon EC2 instances. The application processes customer data files that are stored on Amazon S3. The EC2 instances are hosted in public subnets. The EC2 instances access Amazon S3 over the internet, but they do not require any other network access.

A new requirement mandates that the network traffic for file transfers take a private route and not be sent over the internet.

Which change to the network architecture should a solutions architect recommend to meet this requirement?

A.

Create a NAT gateway. Configure the route table for the public subnets to send traffic to Amazon S3 through the NAT gateway.

B.

Configure the security group for the EC2 instances to restrict outbound traffic so that only traffic to the S3 prefix list is permitted.

C.

Move the EC2 instances to private subnets. Create a VPC endpoint for Amazon S3, and link the endpoint to the route table for the private subnets

D.

Remove the internet gateway from the VPC. Set up an AWS Direct Connect connection, and route traffic to Amazon S3 over the Direct Connect connection.

Full Access
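
Note: the private route in option C relies on a gateway VPC endpoint for S3 attached to the private subnets' route tables. A short boto3 sketch; the VPC ID, route table IDs, and Region are placeholders.

import boto3

ec2 = boto3.client('ec2')

# A gateway endpoint keeps S3 traffic on the AWS network instead of the internet.
ec2.create_vpc_endpoint(
    VpcId='vpc-0abc1234',                              # placeholder VPC
    ServiceName='com.amazonaws.us-east-1.s3',          # S3 in the VPC's Region
    VpcEndpointType='Gateway',
    RouteTableIds=['rtb-0priv1111', 'rtb-0priv2222'],  # private subnet route tables
)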
Question # 31

A solutions architect must design a database solution for a high-traffic ecommerce web application. The database stores customer profiles and shopping cart information. The database must support a peak load of several million requests each second and deliver responses in milliseconds. The operational overhead for managing and scaling the database must be minimized.

Which database solution should the solutions architect recommend?

A.

Amazon Aurora

B.

Amazon DynamoDB

C.

Amazon RDS

D.

Amazon Redshift

Full Access
Question # 32

A company has created a multi-tier application for its ecommerce website. The website uses an Application Load Balancer that resides in the public subnets, a web tier in the public subnets, and a MySQL cluster hosted on Amazon EC2 instances in the private subnets. The MySQL database needs to retrieve product catalog and pricing information that is hosted on the internet by a third-party provider. A solutions architect must devise a strategy that maximizes security without increasing operational overhead.

What should the solutions architect do to meet these requirements?

A.

Deploy a NAT instance in the VPC. Route all the internet-based traffic through the NAT instance.

B.

Deploy a NAT gateway in the public subnets. Modify the private subnet route table to direct all internet-bound traffic to the NAT gateway.

C.

Configure an internet gateway and attach it to the VPC. Modify the private subnet route table to direct internet-bound traffic to the internet gateway.

D.

Configure a virtual private gateway and attach it to the VPC. Modify the private subnet route table to direct internet-bound traffic to the virtual private gateway.

Full Access
Question # 33

A solutions architect is designing a security solution for a company that wants to provide developers with individual AWS accounts through AWS Organizations, while also maintaining standard security controls. Because the individual developers will have AWS account root user-level access to their own accounts, the solutions architect wants to ensure that the mandatory AWS CloudTrail configuration that is applied to new developer accounts is not modified.

Which action meets these requirements?

A.

Create an IAM policy that prohibits changes to CloudTrail, and attach it to the root user

B.

Create a new trail in CloudTrail from within the developer accounts with the organization trails option enabled.

C.

Create a service control policy (SCP) that prohibits changes to CloudTrail, and attach it to the developer accounts.

D.

Create a service-linked role for CloudTrail with a policy condition that allows changes only from an Amazon Resource Name (ARN) in the master account

Full Access
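
Note: a service control policy of the kind option C describes is an explicit Deny on the CloudTrail write actions, attached to the developer accounts or their organizational unit. A hedged sketch with boto3; the OU ID and policy name are placeholders.

import json
import boto3

org = boto3.client('organizations')

scp = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'DenyCloudTrailChanges',
        'Effect': 'Deny',
        'Action': [
            'cloudtrail:StopLogging',
            'cloudtrail:DeleteTrail',
            'cloudtrail:UpdateTrail',
            'cloudtrail:PutEventSelectors',
        ],
        'Resource': '*',
    }],
}

policy = org.create_policy(
    Name='deny-cloudtrail-changes',                  # placeholder name
    Description='Protect the mandatory CloudTrail configuration',
    Type='SERVICE_CONTROL_POLICY',
    Content=json.dumps(scp),
)
org.attach_policy(
    PolicyId=policy['Policy']['PolicySummary']['Id'],
    TargetId='ou-ab12-developers',                   # placeholder OU of developer accounts
)

Unlike an IAM policy, an SCP also constrains the root user of a member account, which is why this control survives root-level access.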
Question # 34

A company needs to migrate a legacy application from an on-premises data center to the AWS Cloud because of hardware capacity constraints. The application runs 24 hours a day, 7 days a week. The application database storage continues to grow over time.

What should a solutions architect do to meet these requirements MOST cost-effectively?

A.

Migrate the application layer to Amazon EC2 Spot Instances. Migrate the data storage layer to Amazon S3.

B.

Migrate the application layer to Amazon EC2 Reserved Instances. Migrate the data storage layer to Amazon RDS On-Demand Instances.

C.

Migrate the application layer to Amazon EC2 Reserved Instances. Migrate the data storage layer to Amazon Aurora Reserved Instances.

D.

Migrate the application layer to Amazon EC2 On-Demand Instances. Migrate the data storage layer to Amazon RDS Reserved Instances.

Full Access
Question # 35

A company runs an application on Amazon EC2 instances that are part of an Auto Scaling group. Traffic to the application increases substantially during business hours. A solutions architect needs to implement an Auto Scaling policy that addresses user latency concerns during periods of high traffic. The company does not want to provision more compute than is necessary.

What should the solutions architect do to meet these requirements?

A.

Configure a predictive scaling policy with the appropriate scaling metric.

B.

Configure a dynamic target tracking scaling policy with the appropriate scaling metric

C.

Configure a scheduled scaling policy that launches additional EC2 instances during business hours

D.

Configure dynamic step or simple scaling policies with Amazon CloudWatch alarms to add and remove EC2 instances based on alarm status

Full Access
Question # 36

A company's order fulfillment service uses a MySQL database. The database needs to support a large number of concurrent queries and transactions. Developers are spending time patching and tuning the database. This is causing delays in releasing new product features.

The company wants to use cloud-based services to help address this new challenge. The solution must allow the developers to migrate the database with little or no code changes and must optimize performance.

Which service should a solutions architect use to meet these requirements?

A.

Amazon Aurora

B.

Amazon DynamoDB

C.

Amazon ElastiCache

D.

MySQL on Amazon EC2

Full Access
Question # 37

A company is migrating a distributed application to AWS. The application serves variable workloads. The legacy platform consists of a primary server that coordinates jobs across multiple compute nodes. The company wants to modernize the application with a solution that maximizes resiliency and scalability.

How should a solutions architect design the architecture to meet these requirements?

A.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling to use scheduled scaling.

B.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling based on the size of the queue.

C.

Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure AWS CloudTrail as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the primary server.

D.

Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure Amazon EventBridge (Amazon CloudWatch Events) as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the compute nodes.

Full Access
Question # 38

A company has a web application that is based on Java and PHP. The company plans to move the application from on premises to AWS. The company needs the ability to test new site features frequently. The company also needs a highly available and managed solution that requires minimum operational overhead.

Which solution will meet these requirements?

A.

Create an Amazon S3 bucket. Enable static web hosting on the S3 bucket. Upload the static content to the S3 bucket. Use AWS Lambda to process all dynamic content.

B.

Deploy the web application to an AWS Elastic Beanstalk environment. Use URL swapping to switch between multiple Elastic Beanstalk environments for feature testing.

C.

Deploy the web application to Amazon EC2 instances that are configured with Java and PHP. Use Auto Scaling groups and an Application Load Balancer to manage the website's availability.

D.

Containerize the web application. Deploy the web application to Amazon EC2 instances. Use the AWS Load Balancer Controller to dynamically route traffic between containers that contain the new site features for testing.

Full Access
Question # 39

A company uses Application Load Balancers (ALBs) in different AWS Regions. The ALBs receive inconsistent traffic that can spike and drop throughout the year. The company's networking team needs to allow the IP addresses of the ALBs in the on-premises firewall to enable connectivity.

Which solution is the MOST scalable with minimal configuration changes?

A.

Write an AWS Lambda script to get the IP addresses of the ALBs in different Regions. Update the on-premises firewall's rule to allow the IP addresses of the ALBs

B.

Migrate all ALBs in different Regions to Network Load Balancers (NLBs). Update the on-premises firewall's rule to allow the Elastic IP addresses of all the NLBs.

C.

Launch AWS Global Accelerator. Register the ALBs in different Regions to the accelerator. Update the on-premises firewall's rule to allow static IP addresses associated with the accelerator.

D.

Launch a Network Load Balancer (NLB) in one Region. Register the private IP addresses of the ALBs in different Regions with the NLB. Update the on-premises firewall's rule to allow the Elastic IP address attached to the NLB.

Full Access
Question # 40

A company is building its web application by using containers on AWS. The company requires three instances of the web application to run at all times. The application must be highly available and must be able to scale to meet increases in demand.

Which solution meets these requirements?

A.

Use the AWS Fargate launch type to create an Amazon Elastic Container Service (Amazon ECS) cluster. Create a task definition for the web application. Create an ECS service that has a desired count of three tasks.

B.

Use the Amazon EC2 launch type to create an Amazon Elastic Container Service (Amazon ECS) cluster that has three container instances in one Availability Zone. Create a task definition for the web application. Place one task for each container instance.

C.

Use the AWS Fargate launch type to create an Amazon Elastic Container Service (Amazon ECS) cluster that has three container instances in three different Availability Zones. Create a task definition for the web application. Create an ECS service that has a desired count of three tasks.

D.

Use the Amazon EC2 launch type to create an Amazon Elastic Container Service (Amazon ECS) cluster that has one container instance in two different Availability Zones. Create a task definition for the web application. Place two tasks on one container instance. Place one task on the remaining container instance.

Full Access
Question # 41

A development team needs to host a website that will be accessed by other teams. The website contents consist of HTML, CSS, client-side JavaScript, and images. Which method is the MOST cost-effective for hosting the website?

A.

Containerize the website and host it in AWS Fargate.

B.

Create an Amazon S3 bucket and host the website there

C.

Deploy a web server on an Amazon EC2 instance to host the website.

D.

Configure an Application Load Balancer with an AWS Lambda target that uses the Express.js framework.

Full Access
Question # 42

A company is implementing new data retention policies for all databases that run on Amazon RDS DB instances. The company must retain daily backups for a minimum period of 2 years. The backups must be consistent and restorable.

Which solution should a solutions architect recommend to meet these requirements?

A.

Create a backup vault in AWS Backup to retain RDS backups. Create a new backup plan with a daily schedule and an expiration period of 2 years after creation. Assign the RDS DB instances to the backup plan.

B.

Configure a backup window for the RDS DB instances for daily snapshots. Assign a snapshot retention policy of 2 years to each RDS DB instance. Use Amazon Data Lifecycle Manager (Amazon DLM) to schedule snapshot deletions.

C.

Configure database transaction logs to be automatically backed up to Amazon CloudWatch Logs with an expiration period of 2 years.

D.

Configure an AWS Database Migration Service (AWS DMS) replication task. Deploy a replication instance, and configure a change data capture (CDC) task to stream database changes to Amazon S3 as the target. Configure S3 Lifecycle policies to delete the snapshots after 2 years.

Full Access
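
Note: option A amounts to a backup plan with a daily rule, a 730-day retention lifecycle, and a selection that targets the RDS instances. A minimal boto3 sketch; the vault name, role ARN, and resource ARN are placeholders.

import boto3

backup = boto3.client('backup')

plan = backup.create_backup_plan(BackupPlan={
    'BackupPlanName': 'rds-daily-2yr',
    'Rules': [{
        'RuleName': 'daily-0500-utc',
        'TargetBackupVaultName': 'rds-backup-vault',    # placeholder vault
        'ScheduleExpression': 'cron(0 5 ? * * *)',      # daily at 05:00 UTC
        'Lifecycle': {'DeleteAfterDays': 730},          # retain roughly 2 years
    }],
})

backup.create_backup_selection(
    BackupPlanId=plan['BackupPlanId'],
    BackupSelection={
        'SelectionName': 'rds-instances',
        'IamRoleArn': 'arn:aws:iam::123456789012:role/aws-backup-role',    # placeholder
        'Resources': ['arn:aws:rds:us-east-1:123456789012:db:orders-db'],  # placeholder
    },
)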
Question # 43

A company hosts its multi-tier, public web application in the AWS Cloud. The web application runs on Amazon EC2 instances, and its database runs on Amazon RDS. The company is anticipating a large increase in sales during an upcoming holiday weekend. A solutions architect needs to build a solution to analyze the performance of the web application with a granularity of no more than 2 minutes.

What should the solutions architect do to meet this requirement?

A.

Send Amazon CloudWatch logs to Amazon Redshift. Use Amazon QuickSight to perform further analysis.

B.

Enable detailed monitoring on all EC2 instances. Use Amazon CloudWatch metrics to perform further analysis.

C.

Create an AWS Lambda function to fetch EC2 logs from Amazon CloudWatch Logs. Use Amazon CloudWatch metrics to perform further analysis

D.

Send EC2 logs to Amazon S3. Use Amazon Redshift to fetch logs from the S3 bucket to process raw data for further analysis with Amazon QuickSight

Full Access
Question # 44

A company is automating an order management application. The company's development team has decided to use SFTP to transfer and store the business-critical information files. The files must be encrypted and must be highly available. The files also must be automatically deleted a month after they are created.

Which solution meets these requirements with the LEAST operational overhead?

A.

Configure an Amazon S3 bucket with encryption enabled. Use AWS Transfer for SFTP to securely transfer the files to the S3 bucket. Apply an AWS Transfer for SFTP file retention policy to delete the files after a month.

B.

Install an SFTP service on an Amazon EC2 instance. Mount an Amazon Elastic File System (Amazon EFS) file share on the EC2 instance. Enable cron to delete the files after a month.

C.

Configure an Amazon Elastic File System (Amazon EFS) file system with encryption enabled. Use AWS Transfer for SFTP to securely transfer the files to the EFS file system. Apply an EFS lifecycle policy to automatically delete the files after a month.

D.

Configure an Amazon S3 bucket with encryption enabled. Use AWS Transfer for SFTP to securely transfer the files to the S3 bucket. Apply S3 Lifecycle rules to automatically delete the files after a month.

Full Access
Question # 45

A company operates a website on Amazon EC2 Linux instances. Some of the instances are failing. Troubleshooting points to insufficient swap space on the failed instances. The operations team lead needs a solution to monitor this.

What should a solutions architect recommend?

A.

Configure an Amazon CloudWatch SwapUsage metric dimension. Monitor the SwapUsage dimension in the EC2 metrics in CloudWatch.

B.

Use EC2 metadata to collect information, then publish it to Amazon CloudWatch custom metrics. Monitor SwapUsage metrics in CloudWatch.

C.

Install an Amazon CloudWatch agent on the instances. Run an appropriate script on a set schedule. Monitor SwapUtilization metrics in CloudWatch.

D.

Enable detailed monitoring in the EC2 console. Create an Amazon CloudWatch SwapUtilization custom metric. Monitor SwapUtilization metrics in CloudWatch.

Full Access
Question # 46

A company processes large amounts of data. The output data is stored in Amazon S3 Standard storage in an S3 bucket, where it is analyzed for 1 month. The data must remain immediately accessible after the 1-month analysis period.

Which storage solution meets these requirements MOST cost-effectively?

A.

Configure an S3 Lifecycle policy to transition the objects to S3 Glacier after 30 days.

B.

Configure S3 Intelligent-Tiering to transition the objects to S3 Glacier after 30 days.

C.

Configure an S3 Lifecycle policy to transition the objects to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 30 days.

D.

Configure an S3 Lifecycle policy to delete the objects after 30 days. Enable versioning on the S3 bucket so that deleted objects can still be immediately restored as needed.

Full Access
Question # 47

A company's order system sends requests from clients to Amazon EC2 instances. The EC2 instances process the orders and then store the orders in a database on Amazon RDS. Users report that they must reprocess orders when the system fails. The company wants a resilient solution that can process orders automatically if a system outage occurs.

What should a solutions architect do to meet these requirements?

A.

Move the EC2 instances into an Auto Scaling group. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to target an Amazon Elastic Container Service (Amazon ECS) task.

B.

Move the EC2 instances into an Auto Scaling group behind an Application Load Balancer (ALB). Update the order system to send messages to the ALB endpoint.

C.

Move the EC2 instances into an Auto Scaling group. Configure the order system to send messages to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the EC2 instances to consume messages from the queue.

D.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function, and subscribe the function to the SNS topic. Configure the order system to send messages to the SNS topic. Send a command to the EC2 instances to process the messages by using AWS Systems Manager Run Command.

Full Access
Question # 48

A company runs an application on Amazon EC2 instances. The application is deployed in private subnets in three Availability Zones of the us-east-1 Region. The instances must be able to connect to the internet to download files. The company wants a design that is highly available across the Region.

Which solution should be implemented to ensure that there are no disruptions to internet connectivity?

A.

Deploy a NAT instance in a private subnet of each Availability Zone.

B.

Deploy a NAT gateway in a public subnet of each Availability Zone

C.

Deploy a transit gateway in a private subnet of each Availability Zone.

D.

Deploy an internet gateway in a public subnet of each Availability Zone

Full Access
Question # 49

A company is migrating a large, mission-critical database to AWS. A solutions architect has decided to use an Amazon RDS for MySQL Multi-AZ DB instance that is deployed with 80,000 Provisioned IOPS for storage. The solutions architect is using AWS Database Migration Service (AWS DMS) to perform the data migration. The migration is taking longer than expected, and the company wants to speed up the process. The company's network team has ruled out bandwidth as a limiting factor.

Which actions should the solutions architect take to speed up the migration? (Select TWO.)

A.

Disable Multi-AZ on the target DB instance.

B.

Create a new DMS instance that has a larger instance size.

C.

Turn off logging on the target DB instance until the initial load is complete.

D.

Restart the DMS task on a new DMS instance with transfer acceleration enabled.

E.

Change the storage type on the target DB instance to Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2).

Full Access
Question # 50

A company has an ecommerce application that stores data in an on-premises SQL database. The company has decided to migrate this database to AWS. However, as part of the migration, the company wants to find a way to attain sub-millisecond responses to common read requests

A solutions architect knows that the increase in speed is paramount and that a small percentage of stale data returned in the database reads is acceptable.

What should the solutions architect recommend?

A.

Build Amazon RDS read replicas.

B.

Build the database as a larger instance type.

C.

Build a database cache using Amazon ElastiCache

D.

Build a database cache using Amazon Elasticsearch Service (Amazon ES).

Full Access
Question # 51

An ecommerce company is creating an application that requires a connection to a third-party payment service to process payments. The payment service needs to explicitly allow the public IP address of the server that is making the payment request. However, the company's security policies do not allow any server to be exposed directly to the public internet.

Which solution will meet these requirements?

A.

Provision an Elastic IP address. Host the application servers on Amazon EC2 instances in a private subnet. Assign the public IP address to the application servers.

B.

Create a NAT gateway in a public subnet. Host the application servers on Amazon EC2 instances in a private subnet. Route payment requests through the NAT gateway.

C.

Deploy an Application Load Balancer (ALB). Host the application servers on Amazon EC2 instances in a private subnet. Route the payment requests through the ALB.

D.

Set up an AWS Client VPN connection to the payment service. Host the application servers on Amazon EC2 instances in a private subnet. Route the payment requests through the VPN.

Full Access
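
Note: with option B, the NAT gateway's Elastic IP becomes the fixed public address the payment provider allowlists, while the servers stay in private subnets. A rough boto3 sketch with placeholder subnet and route table IDs:

import boto3

ec2 = boto3.client('ec2')

# The Elastic IP is the address the payment service sees and allows.
eip = ec2.allocate_address(Domain='vpc')

natgw = ec2.create_nat_gateway(
    SubnetId='subnet-0public11',          # placeholder public subnet
    AllocationId=eip['AllocationId'],
)
nat_id = natgw['NatGateway']['NatGatewayId']
ec2.get_waiter('nat_gateway_available').wait(NatGatewayIds=[nat_id])

# Send the private subnet's internet-bound traffic through the NAT gateway.
ec2.create_route(
    RouteTableId='rtb-0private11',        # placeholder private route table
    DestinationCidrBlock='0.0.0.0/0',
    NatGatewayId=nat_id,
)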
Question # 52

A company has an application that processes customer orders. The company hosts the application on an Amazon EC2 instance that saves the orders to an Amazon Aurora database. Occasionally when traffic is high, the workload does not process orders fast enough.

What should a solutions architect do to write the orders reliably to the database as quickly as possible?

A.

Increase the instance size of the EC2 instance when traffic is high. Write orders to Amazon Simple Notification Service (Amazon SNS). Subscribe the database endpoint to the SNS topic.

B.

Write orders to an Amazon Simple Queue Service (Amazon SQS) queue. Use EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SQS queue and process orders into the database.

C.

Write orders to Amazon Simple Notification Service (Amazon SNS). Subscribe the database endpoint to the SNS topic. Use EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SNS topic.

D.

Write orders to an Amazon Simple Queue Service (Amazon SQS) queue when the EC2 instance reaches CPU threshold limits. Use scheduled scaling of EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SQS queue and process orders into the database

Full Access
Question # 53

A company is running a critical business application on Amazon EC2 instances behind an Application Load Balancer. The EC2 instances run in an Auto Scaling group and access an Amazon RDS DB instance.

The design did not pass an operational review because the EC2 instances and the DB instance are all located in a single Availability Zone. A solutions architect must update the design to use a second Availability Zone.

Which solution will make the application highly available?

A.

Provision a subnet in each Availability Zone. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance with connections to each network.

B.

Provision two subnets that extend across both Availability Zones. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance with connections to each network.

C.

Provision a subnet in each Availability Zone. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance for Multi-AZ deployment.

D.

Provision a subnet that extends across both Availability Zones. Configure the Auto Scaling group to distribute the EC2 instances across both Availability Zones. Configure the DB instance for Multi-AZ deployment.

Full Access
Question # 54

A company is building an ecommerce application and needs to store sensitive customer information. The company needs to give customers the ability to complete purchase transactions on the website. The company also needs to ensure that sensitive customer data is protected, even from database administrators.

Which solution meets these requirements?

A.

Store sensitive data in an Amazon Elastic Block Store (Amazon EBS) volume. Use EBS encryption to encrypt the data. Use an IAM instance role to restrict access.

B.

Store sensitive data in Amazon RDS for MySQL. Use AWS Key Management Service (AWS KMS) client-side encryption to encrypt the data.

C.

Store sensitive data in Amazon S3. Use AWS Key Management Service (AWS KMS) server-side encryption to encrypt the data. Use S3 bucket policies to restrict access.

D.

Store sensitive data in Amazon FSx for Windows Server. Mount the file share on application servers. Use Windows file permissions to restrict access.

Full Access
Question # 55

A company runs its two-tier ecommerce website on AWS. The web tier consists of a load balancer that sends traffic to Amazon EC2 instances. The database tier uses an Amazon RDS DB instance. The EC2 instances and the RDS DB instance should not be exposed to the public internet. The EC2 instances require internet access to complete payment processing of orders through a third-party web service. The application must be highly available.

Which combination of configuration options will meet these requirements? (Select TWO.)

A.

Use an Auto Scaling group to launch the EC2 instances in private subnets. Deploy an RDS Multi-AZ DB instance in private subnets.

B.

Configure a VPC with two private subnets and two NAT gateways across two Availability Zones. Deploy an Application Load Balancer in the private subnets.

C.

Use an Auto Scaling group to launch the EC2 instances in public subnets across two Availability Zones. Deploy an RDS Multi-AZ DB instance in private subnets.

D.

Configure a VPC with one public subnet, one private subnet, and two NAT gateways across two Availability Zones. Deploy an Application Load Balancer in the public subnet.

E.

Configure a VPC with two public subnets, two private subnets, and two NAT gateways across two Availability Zones. Deploy an Application Load Balancer in the public subnets.

Full Access
Question # 56

A company maintains a searchable repository of items on its website. The data is stored in an Amazon RDS for MySQL database table that contains more than 10 million rows. The database has 2 TB of General Purpose SSD storage. There are millions of updates against this data every day through the company's website.

The company has noticed that some insert operations are taking 10 seconds or longer. The company has determined that the database storage performance is the problem.

Which solution addresses this performance issue?

A.

Change the storage type to Provisioned IOPS SSD

B.

Change the DB instance to a memory optimized instance class

C.

Change the DB instance to a burstable performance instance class

D.

Enable Multi-AZ RDS read replicas with MySQL native asynchronous replication.

Full Access
Question # 57

A company has 150 TB of archived image data stored on-premises that needs to be moved to the AWS Cloud within the next month. The company's current network connection allows up to 100 Mbps uploads for this purpose during the night only.

What is the MOST cost-effective mechanism to move this data and meet the migration deadline?

A.

Use AWS Snowmobile to ship the data to AWS.

B.

Order multiple AWS Snowball devices to ship the data to AWS.

C.

Enable Amazon S3 Transfer Acceleration and securely upload the data.

D.

Create an Amazon S3 VPC endpoint and establish a VPN to upload the data

Full Access
Question # 58

A company hosts a website on Amazon EC2 instances behind an Application Load Balancer (ALB). The website serves static content. Website traffic is increasing, and the company is concerned about a potential increase in cost.

What should a solutions architect do to reduce the cost of the website?

A.

Create an Amazon CloudFront distribution to cache static files at edge locations.

B.

Create an Amazon ElastiCache cluster. Connect the ALB to the ElastiCache cluster to serve cached files.

C.

Create an AWS WAF web ACL, and associate it with the ALB. Add a rule to the web ACL to cache static files.

D.

Create a second ALB in an alternative AWS Region. Route user traffic to the closest Region to minimize data transfer costs.

Full Access
Question # 59

A solutions architect is designing a two-tier web application. The application consists of a public-facing web tier hosted on Amazon EC2 in public subnets. The database tier consists of Microsoft SQL Server running on Amazon EC2 in a private subnet. Security is a high priority for the company.

How should security groups be configured in this situation? (Select TWO.)

A.

Configure the security group for the web tier to allow inbound traffic on port 443 from 0.0.0.0/0.

B.

Configure the security group for the web tier to allow outbound traffic on port 443 from 0.0.0.0/0.

C.

Configure the security group for the database tier to allow inbound traffic on port 1433 from the security group for the web tier.

D.

Configure the security group for the database tier to allow outbound traffic on ports 443 and 1433 to the security group for the web tier.

E.

Configure the security group for the database tier to allow inbound traffic on ports 443 and 1433 from the security group for the web tier.

Full Access
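
Note: the usual pattern pairs a web-tier group that accepts HTTPS from anywhere with a database-tier group that accepts SQL Server traffic only from the web-tier group. A brief boto3 sketch; the VPC ID and group names are placeholders.

import boto3

ec2 = boto3.client('ec2')

web_sg = ec2.create_security_group(
    GroupName='web-tier-sg', Description='Web tier', VpcId='vpc-0abc1234')        # placeholder VPC
db_sg = ec2.create_security_group(
    GroupName='db-tier-sg', Description='SQL Server tier', VpcId='vpc-0abc1234')

# Web tier: inbound HTTPS from the internet.
ec2.authorize_security_group_ingress(
    GroupId=web_sg['GroupId'],
    IpPermissions=[{'IpProtocol': 'tcp', 'FromPort': 443, 'ToPort': 443,
                    'IpRanges': [{'CidrIp': '0.0.0.0/0'}]}],
)

# Database tier: inbound SQL Server (port 1433) only from the web tier's security group.
ec2.authorize_security_group_ingress(
    GroupId=db_sg['GroupId'],
    IpPermissions=[{'IpProtocol': 'tcp', 'FromPort': 1433, 'ToPort': 1433,
                    'UserIdGroupPairs': [{'GroupId': web_sg['GroupId']}]}],
)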
Question # 60

A company is running an ASP.NET MVC application on a single Amazon EC2 instance. A recent increase in application traffic is causing slow response times for users during lunch hours. The company needs to resolve this concern with the least amount of configuration.

What should a solutions architect recommend to meet these requirements?

A.

Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling and time-based scaling to handle scaling during lunch hours

B.

Move the application to Amazon Elastic Container Service (Amazon ECS). Create an AWS Lambda function to handle scaling during lunch hours.

C.

Move the application to Amazon Elastic Container Service (Amazon ECS). Configure scheduled scaling for AWS Application Auto Scaling during lunch hours.

D.

Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling, and create an AWS Lambda function to handle scaling during lunch hours.

Full Access
Question # 61

A company is expecting rapid growth in the near future. A solutions architect needs to configure existing users and grant permissions to new users on AWS. The solutions architect has decided to create IAM groups. The solutions architect will add the new users to IAM groups based on department.

Which additional action is the MOST secure way to grant permissions to the new users?

A.

Apply service control policies (SCPs) to manage access permissions

B.

Create IAM roles that have least privilege permissions. Attach the roles to the IAM groups.

C.

Create an IAM policy that grants least privilege permissions. Attach the policy to the IAM groups.

D.

Create IAM roles. Associate the roles with a permissions boundary that defines the maximum permissions.

Full Access
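To illustrate option C above, the following boto3 sketch creates a least-privilege customer managed policy and attaches it to an existing IAM group. The group name, bucket name, and allowed actions are hypothetical examples, not part of the question.

```python
import json
import boto3

iam = boto3.client("iam")

# Hypothetical least-privilege policy: read-only access to one department bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::finance-department-bucket",
            "arn:aws:s3:::finance-department-bucket/*",
        ],
    }],
}

response = iam.create_policy(
    PolicyName="FinanceReadOnly",
    PolicyDocument=json.dumps(policy_document),
)

# Attach the managed policy to the department's IAM group; new users added to the
# group inherit these permissions automatically.
iam.attach_group_policy(
    GroupName="finance",
    PolicyArn=response["Policy"]["Arn"],
)
```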
Question # 62

A company is migrating a NoSQL database cluster to Amazon EC2. The database automatically replicates data to maintain at least three copies of the data. I/O throughput of the servers is the highest priority.

Which instance type should a solutions architect recommend for the migration?

A.

Storage optimized instances with instance store

B.

Burstable general purpose instances with an Amazon Elastic Block Store (Amazon EBS) volume

C.

Memory optimized instances with Amazon Elastic Block Store (Amazon EBS) optimization enabled

D.

Compute optimized instances with Amazon Elastic Block Store (Amazon EBS) optimization enabled

Full Access
Question # 63

A company hosts an application on AWS Lambda functions that are invoked by an Amazon API Gateway API. The Lambda functions save customer data to an Amazon Aurora MySQL database. Whenever the company upgrades the database, the Lambda functions fail to establish database connections until the upgrade is complete. The result is that customer data is not recorded for some of the events.

A solutions architect needs to design a solution that stores customer data that is created during database upgrades.

Which solution will meet these requirements?

A.

Provision an Amazon RDS Proxy to sit between the Lambda functions and the database. Configure the Lambda functions to connect to the RDS Proxy.

B.

Increase the run time of the Lambda functions to the maximum. Create a retry mechanism in the code that stores the customer data in the database.

C.

Persist the customer data to Lambda local storage. Configure new Lambda functions to scan the local storage to save the customer data to the database.

D.

Store the customer data in an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Create a new Lambda function that polls the queue and stores the customer data in the database.

Full Access
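As a sketch of option D, a producer function could write incoming records to an SQS FIFO queue instead of the database, and a separate consumer function could drain the queue into the database once it is reachable again. The queue URL, message group key, and payload fields below are hypothetical, and this is an illustrative sketch rather than an official answer.

```python
import json
import uuid
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/customer-data.fifo"  # hypothetical

def enqueue_customer_record(record: dict) -> None:
    """Producer side: buffer the write instead of calling the database directly."""
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps(record),
        MessageGroupId=str(record["customer_id"]),   # preserve per-customer ordering
        MessageDeduplicationId=str(uuid.uuid4()),    # or enable content-based deduplication
    )

def drain_queue(write_to_database) -> None:
    """Consumer side: poll the queue and persist records when the database is available."""
    while True:
        resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
        messages = resp.get("Messages", [])
        if not messages:
            break
        for msg in messages:
            write_to_database(json.loads(msg["Body"]))
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```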
Question # 64

A company is creating a new application that will store a large amount of data. The data will be analyzed hourly and will be modified by several Amazon EC2 Linux instances that are deployed across multiple Availability Zones. The needed amount of storage space will continue to grow for the next 6 Months.

Which storage solution should a solutions architect recommend to meet these requirements?

A.

Store the data in Amazon S3 Glacier. Update the S3 Glacier vault policy to allow access to the application instances.

B.

Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the application instances.

C.

Store the data in an Amazon Elastic File System (Amazon EFS) file system. Mount the file system on the application instances.

D.

Store the data in an Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS volume shared between the application instances.

Full Access
Question # 65

A company hosts its web application on AWS using seven Amazon EC2 instances. The company requires that the IP addresses of all healthy EC2 instances be returned in response to DNS queries.

Which policy should be used to meet this requirement?

A.

Simple routing policy

B.

Latency routing policy

C.

Multivalue routing policy

D.

Geolocation routing policy

Full Access
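A multivalue answer routing policy (option C) returns up to eight healthy IP addresses per DNS query. The sketch below registers one record per instance, each tied to a health check; the hosted zone ID, domain, IP addresses, and health check IDs are hypothetical placeholders.

```python
import boto3

route53 = boto3.client("route53")

HOSTED_ZONE_ID = "Z0123456789EXAMPLE"  # hypothetical hosted zone
INSTANCES = [
    {"id": "web-1", "ip": "203.0.113.11", "health_check": "11111111-aaaa-bbbb-cccc-000000000001"},
    {"id": "web-2", "ip": "203.0.113.12", "health_check": "11111111-aaaa-bbbb-cccc-000000000002"},
    # ... one entry per EC2 instance
]

changes = [{
    "Action": "UPSERT",
    "ResourceRecordSet": {
        "Name": "www.example.com",
        "Type": "A",
        "TTL": 60,
        "SetIdentifier": inst["id"],           # must be unique per record
        "MultiValueAnswer": True,              # return multiple healthy values
        "HealthCheckId": inst["health_check"],
        "ResourceRecords": [{"Value": inst["ip"]}],
    },
} for inst in INSTANCES]

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={"Changes": changes},
)
```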
Question # 66

A company deploys Amazon EC2 instances that run in a VPC. The EC2 instances load source data into Amazon S3 buckets so that the data can be processed in the future. According to compliance laws, the data must not be transmitted over the public internet. Servers in the company's on-premises data center will consume the output from an application that runs on the EC2 instances.

Which solution will meet these requirements?

A.

Deploy an interface VPC endpoint for Amazon EC2. Create an AWS Site-to-Site VPN connection between the company and the VPC.

B.

Deploy a gateway VPC endpoint for Amazon S3. Set up an AWS Direct Connect connection between the on-premises network and the VPC.

C.

Set up an AWS Transit Gateway connection from the VPC to the S3 buckets. Create an AWS Site-to-Site VPN connection between the company and the VPC.

D.

Set up proxy EC2 instances that have routes to NAT gateways. Configure the proxy EC2 instances to fetch S3 data and feed the application instances.

Full Access
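The S3 portion of option B can be sketched as a gateway VPC endpoint, which adds S3 routes to the VPC's route tables so traffic never leaves the AWS network. The VPC ID, route table ID, and Region below are hypothetical; the Direct Connect side of the option is provisioned separately and is not shown.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Gateway endpoints for S3 have no hourly charge and route S3 traffic privately.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",               # hypothetical VPC
    ServiceName="com.amazonaws.us-east-1.s3",    # S3 service name for the Region
    RouteTableIds=["rtb-0123456789abcdef0"],     # route table(s) used by the EC2 instances
)
```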
Question # 67

A marketing company is storing CSV files in an Amazon S3 bucket for statistical analysis. An application on an Amazon EC2 instance needs permission to efficiently process the CSV data stored in the S3 bucket.

A.

Attach a resource-based policy to the S3 bucket

B.

Create an IAM user for the application with specific permissions to the S3 bucket

C.

Associate an IAM role with least privilege permissions to the EC2 instance profile

D.

Store AWS credentials directly on the EC2 instance for applications on the instance to use for API calls

Full Access
Question # 68

A company has a popular gaming platform running on AWS. The application is sensitive to latency because latency can impact the user experience and introduce unfair advantages to some players. The application is deployed in every AWS Region. It runs on Amazon EC2 instances that are part of Auto Scaling groups configured behind Application Load Balancers (ALBs). A solutions architect needs to implement a mechanism to monitor the health of the application and redirect traffic to healthy endpoints.

Which solution meets these requirements?

A.

Configure an accelerator in AWS Global Accelerator. Add a listener for the port that the application listens on, and attach it to a Regional endpoint in each Region. Add the ALB as the endpoint.

B.

Create an Amazon CloudFront distribution and specify the ALB as the origin server. Configure the cache behavior to use origin cache headers. Use AWS Lambda functions to optimize the traffic.

C.

Create an Amazon CloudFront distribution and specify Amazon S3 as the origin server. Configure the cache behavior to use origin cache headers. Use AWS Lambda functions to optimize the traffic.

D.

Configure an Amazon DynamoDB database to serve as the data store for the application. Create a DynamoDB Accelerator (DAX) cluster to act as the in-memory cache for DynamoDB hosting the application data.

Full Access
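Option A can be sketched with the Global Accelerator API: create an accelerator, add a listener for the application port, and attach one endpoint group per Region with that Region's ALB as the endpoint. The ARNs and Regions below are hypothetical; note that the Global Accelerator control-plane API is called through the us-west-2 endpoint.

```python
import boto3

ga = boto3.client("globalaccelerator", region_name="us-west-2")  # control plane lives in us-west-2

accelerator = ga.create_accelerator(Name="gaming-accelerator", IpAddressType="IPV4", Enabled=True)
accelerator_arn = accelerator["Accelerator"]["AcceleratorArn"]

listener = ga.create_listener(
    AcceleratorArn=accelerator_arn,
    Protocol="TCP",
    PortRanges=[{"FromPort": 443, "ToPort": 443}],  # port that the application listens on
)
listener_arn = listener["Listener"]["ListenerArn"]

# One endpoint group per Region, each pointing at that Region's ALB (hypothetical ARNs).
regional_albs = {
    "us-east-1": "arn:aws:elasticloadbalancing:us-east-1:123456789012:loadbalancer/app/game/abc",
    "eu-west-1": "arn:aws:elasticloadbalancing:eu-west-1:123456789012:loadbalancer/app/game/def",
}
for region, alb_arn in regional_albs.items():
    ga.create_endpoint_group(
        ListenerArn=listener_arn,
        EndpointGroupRegion=region,
        EndpointConfigurations=[{"EndpointId": alb_arn, "Weight": 128}],
    )
```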
Question # 69

A company plans to host a survey website on AWS. The company anticipates an unpredictable amount of traffic. This traffic results in asynchronous updates to the database. The company wants to ensure that writes to the database hosted on AWS do not get dropped.

How should the company write its application to handle these database requests?

A.

Configure the application to publish to an Amazon Simple Notification Service (Amazon SNS) topic Subscribe the database to the SNS topic.

B.

Configure the application to subscribe to an Amazon Simple Notification Service (Amazon SNS) topic. Publish the database updates to the SNS topic

C.

Use Amazon Simple Queue Service (Amazon SQS) FIFO queues to queue the database connection until the database has resources to write the data.

D.

Use Amazon Simple Queue Service (Amazon SQS) FIFO queues for capturing the writes and draining the queue as each write is made to the database.

Full Access
Question # 70

A company stores call recordings on a monthly basis. Users access the recorded files randomly within 1 year of recording, but users rarely access the files after 1 year. The company wants to optimize its solution by allowing only files that are newer than 1 year old to be queried and retrieved as quickly as possible. A delay in retrieving older files is acceptable.

Which solution meets these requirements MOST cost-effectively?

A.

Store individual files in Amazon S3 Glacier. Store search metadata in object tags that are created in S3 Glacier. Query the S3 Glacier tags to retrieve the files from S3 Glacier.

B.

Store individual files in Amazon S3. Use S3 Lifecycle policies to move the files to S3 Glacier after 1 year. Query and retrieve the files that are in Amazon S3 by using Amazon Athena. Query and retrieve the files that are in S3 Glacier by using S3 Glacier Select.

C.

Store individual files in Amazon S3. Store search metadata for each archive in Amazon S3. Use S3 Lifecycle policies to move the files to S3 Glacier after 1 year. Query and retrieve the files by searching for metadata from Amazon S3.

D.

Store individual files in Amazon S3. Use S3 Lifecycle policies to move the files to S3 Glacier after 1 year. Store search metadata in Amazon RDS. Query the metadata from Amazon RDS. Retrieve the files from Amazon S3 or S3 Glacier.

Full Access
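The lifecycle portion of options B and D amounts to a single rule that transitions objects to S3 Glacier 365 days after creation. A minimal boto3 sketch follows; the bucket name and rule ID are hypothetical, and this is not presented as the official answer.

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="call-recordings-example",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-after-1-year",
            "Filter": {"Prefix": ""},   # apply to every object in the bucket
            "Status": "Enabled",
            "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
        }]
    },
)
```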
Question # 71

A company has a multi-tier application deployed on several Amazon EC2 instances in an Auto Scaling group. An Amazon RDS for Oracle instance is the application's data layer that uses Oracle-specific
PL/SQL functions. Traffic to the application has been steadily increasing. This is causing the EC2 instances to become overloaded and the RDS instance to run out of storage. The Auto Scaling group does not have any scaling metrics and defines the minimum healthy instance count only. The company predicts that traffic will continue to increase at a steady but unpredictable rate before leveling off.

What should a solutions architect do to ensure the system can automatically scale for the increased traffic? (Select TWO.)

A.

Configure storage Auto Scaling on the RDS for Oracle Instance.

B.

Migrate the database to Amazon Aurora to use Auto Scaling storage.

C.

Configure an alarm on the RDS for Oracle Instance for low free storage space

D.

Configure the Auto Scaling group to use the average CPU as the scaling metric

E.

Configure the Auto Scaling group to use the average free memory as the scaling metric

Full Access
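Options A and D map to two API calls: enabling storage autoscaling on the RDS for Oracle instance by setting a maximum allocated storage, and adding a target tracking policy on the Auto Scaling group keyed to average CPU. The identifiers and thresholds below are hypothetical examples.

```python
import boto3

rds = boto3.client("rds")
autoscaling = boto3.client("autoscaling")

# Option A: enable RDS storage autoscaling by defining an upper storage limit (in GiB).
rds.modify_db_instance(
    DBInstanceIdentifier="oracle-prod",   # hypothetical DB instance identifier
    MaxAllocatedStorage=2000,
    ApplyImmediately=True,
)

# Option D: scale the EC2 fleet on average CPU utilization with target tracking.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="app-tier-asg",  # hypothetical Auto Scaling group name
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {"PredefinedMetricType": "ASGAverageCPUUtilization"},
        "TargetValue": 60.0,
    },
)
```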
Question # 72

A company runs an application in a branch office within a small data closet with no virtualized compute resources. The application data is stored on an NFS volume. Compliance standards require a daily offsite backup of the NFS volume.

Which solution meets these requirements?

A.

Install an AWS Storage Gateway file gateway on premises to replicate the data to Amazon S3

B.

Install an AWS Storage Gateway file gateway hardware appliance on premises to replicate the data to Amazon S3.

C.

Install an AWS Storage Gateway volume gateway with stored volumes on premises to replicate the data to Amazon S3

D.

Install an AWS Storage Gateway volume gateway with cached volumes on premises to replicate the data to Amazon S3.

Full Access
Question # 73

A company is running an application on AWS to process weather sensor data that is stored in an Amazon S3 bucket. Three batch jobs run hourly to process the data in the S3 bucket for different purposes. The company wants to reduce the overall processing time by running the three applications in parallel using an event-based approach.

What should a solutions architect do to meet these requirements?

A.

Enable S3 Event Notifications for new objects to an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Subscribe all applications to the queue for processing.

B.

Enable S3 Event Notifications for new objects to an Amazon Simple Queue Service (Amazon SQS) standard queue. Create an additional SQS queue for all applications, and subscribe all applications to the initial queue for processing.

C.

Enable S3 Event Notifications for new objects to separate Amazon Simple Queue Service (Amazon SQS) FIFO queues. Create an additional SQS queue for each application, and subscribe each queue to the initial topic for processing.

D.

Enable S3 Event Notifications for new objects to an Amazon Simple Notification Service (Amazon SNS) topic. Create an Amazon Simple Queue Service (Amazon SQS) queue for each application, and subscribe each queue to the topic for processing.

Full Access
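The fan-out pattern in option D (one S3 event notification to an SNS topic, with one SQS queue subscribed per batch job) might be wired up roughly as follows. All names are hypothetical, and the SNS topic policy that lets S3 publish, plus the SQS queue policies that let SNS deliver messages, are omitted for brevity.

```python
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")
sqs = boto3.client("sqs")

BUCKET = "weather-sensor-data"  # hypothetical bucket
topic_arn = sns.create_topic(Name="sensor-object-created")["TopicArn"]

# One queue per batch job, each subscribed to the same topic.
for job in ("job-a", "job-b", "job-c"):
    queue_url = sqs.create_queue(QueueName=f"{job}-queue")["QueueUrl"]
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url, AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]
    sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)

# Publish an event to the topic whenever a new object lands in the bucket.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "TopicConfigurations": [{"TopicArn": topic_arn, "Events": ["s3:ObjectCreated:*"]}]
    },
)
```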
Question # 74

A company wants to move a multi-tiered application from on premises to the AWS Cloud to improve the application’s performance. The application consists of application tiers that communicate with each other by way of

Which solution meets these requirements and is the MOST operationally efficient?

A.

Use Amazon API Gateway and direct transactions to the AWS Lambda functions as the application layer. Use Amazon Simple Queue Service (Amazon SQS) as the communication layer between application services.

B.

Use Amazon CloudWatch metrics to analyze the application performance history to determine the servers' peak utilization during the performance failures. Increase the size of the application servers' Amazon EC2 instances to meet the peak requirements.

C.

Use Amazon Simple Notification Service (Amazon SNS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SNS queue length and scale up and down as required.

D.

Use Amazon Simple Queue Service (Amazon SQS) to handle the messaging between application servers running on Amazon EC2 in an Auto Scaling group. Use Amazon CloudWatch to monitor the SQS queue length and scale up when communication failures are detected.

Full Access
Question # 75

A company runs a fleet of web servers using an Amazon RDS for PostgreSQL DB instance. After a routine compliance check, the company sets a standard that requires a recovery point objective (RPO) of less than 1 second for all its production databases.

Which solution meets this requirement?

A.

Enable a Multi-AZ deployment for the DB Instance

B.

Enable auto scaling for the DB instance in one Availability Zone.

C.

Configure the DB instance in one Availability Zone, and create multiple read replicas in a separate Availability Zone.

D.

Configure the DB instance in one Availability Zone, and configure AWS Database Migration Service (AWS DMS) change data capture (CDC) tasks.

Full Access
Question # 76

A company runs an application in the AWS Cloud and uses Amazon DynamoDB as the database. The company deploys Amazon EC2 instances to a private network to process data from the database. The company uses two NAT instances to provide connectivity to DynamoDB.

The company wants to retire the NAT instances. A solutions architect must implement a solution that provides connectivity to DynamoDB and that does not require ongoing management.

What is the MOST cost-effective solution that meets these requirements?

A.

Create a gateway VPC endpoint to provide connectivity to DynamoDB.

B.

Configure a managed NAT gateway to provide connectivity to DynamoDB.

C.

Establish an AWS Direct Connect connection between the private network and DynamoDB.

D.

Deploy an AWS PrivateLink endpoint service between the private network and DynamoDB.

Full Access
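Option A is analogous to the S3 gateway endpoint shown earlier but targets the DynamoDB service name; gateway endpoints for DynamoDB carry no hourly charge and need no ongoing management. The IDs below are hypothetical placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",                     # hypothetical VPC
    ServiceName="com.amazonaws.us-east-1.dynamodb",
    RouteTableIds=["rtb-0aaaabbbbccccdddd"],           # route tables of the private subnets
)
```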
Question # 77

A company needs to save the results from a medical trial to an Amazon S3 repository. The repository must allow a few scientists to add new files and must restrict all other users to read-only access. No users can have the ability to modify or delete any files in the repository. The company must keep every file in the repository for a minimum of 1 year after its creation date.

Which solution will meet these requirements?

A.

Use S3 Object Lock in governance mode with a legal hold of 1 year.

B.

Use S3 Object Lock in compliance mode with a retention period of 365 days.

C.

Use an IAM role to restrict all users from deleting or changing objects in the S3 bucket. Use an S3 bucket policy to only allow the IAM role.

D.

Configure the S3 bucket to invoke an AWS Lambda function every time an object is added. Configure the function to track the hash of the saved object so that modified objects can be marked accordingly.

Full Access
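Option B could look like the following: create the bucket with Object Lock enabled (which also turns on versioning), then apply a default retention of 365 days in compliance mode so that object versions cannot be overwritten or deleted during the retention period. The bucket name and Region are hypothetical, and this sketch is not presented as the official answer.

```python
import boto3

s3 = boto3.client("s3", region_name="us-west-2")

BUCKET = "medical-trial-results"  # hypothetical bucket name

# Object Lock is enabled when the bucket is created (versioning is enabled automatically).
s3.create_bucket(
    Bucket=BUCKET,
    ObjectLockEnabledForBucket=True,
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)

# Default retention: compliance mode, 365 days.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 365}},
    },
)
```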
Question # 78

A company wants to automate the security assessment of its Amazon EC2 instances. The company needs to validate and demonstrate that it is meeting security and compliance standards throughout the development process.

What should a solutions architect do to meet these requirements?

A.

Use Amazon Macie to automatically discover, classify and protect the EC2 instances

B.

Use Amazon GuardDuty on the EC2 instances to publish Amazon Simple Notification Service (Amazon SNS) notifications

C.

Use Amazon Inspector with Amazon CloudWatch to publish Amazon Simple Notification Service (Amazon SNS) notifications

D.

Use Amazon EventBridge (Amazon CloudWatch Events) to detect and react to changes in the status of AWS Trusted Advisor checks

Full Access
Question # 79

A company wants to migrate its accounting system from an on-premises data center to the AWS Cloud in a single AWS Region. Data security and an immutable audit log are the top priorities. The company must monitor all AWS activities for compliance auditing. The company has enabled AWS CloudTrail but wants to make sure it meets these requirements.

Which actions should a solutions architect take to protect and secure CloudTrail? (Select TWO.)

A.

Enable CloudTrail log file validation.

B.

Enable the CloudTrail Processing Library.

C.

Enable logging of Insights events in CloudTrail.

D.

Enable custom logging from the on-premises resources

E.

Create an AWS Config rule to monitor whether CloudTrail is configured to use server-side encryption with AWS KMS managed encryption keys (SSE-KMS)

Full Access
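Options A and E can be sketched in a single call against an existing trail: turn on log file validation and have CloudTrail encrypt its log files with an AWS KMS key (SSE-KMS). The trail name and key alias are hypothetical, and the AWS Config rule mentioned in option E would be configured separately.

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

cloudtrail.update_trail(
    Name="org-audit-trail",             # hypothetical existing trail
    EnableLogFileValidation=True,       # produces digest files for log integrity validation
    KmsKeyId="alias/cloudtrail-logs",   # hypothetical KMS key used for SSE-KMS
)
```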
Question # 80

A company has an AWS Lambda function that needs read access to an Amazon S3 bucket that is located in the same AWS account. Which solution will meet this requirement in the MOST secure manner?

A.

Apply an S3 bucket policy that grants read access to the S3 bucket

B.

Apply an IAM role to the Lambda function. Apply an IAM policy to the role to grant read access to the S3 bucket.

C.

Embed an access key and a secret key in the Lambda function's code to grant the required IAM permissions for read access to the S3 bucket

D.

Apply an IAM role to the Lambda function. Apply an IAM policy to the role to grant read access to all S3 buckets In the account

Full Access
Question # 81

Some of the company's customers are retrieving records frequently, leading to an increase in costs for the company. The company wants to limit retrieval requests in the future. The company also wants to ensure that if one customer reaches its retrieval limit, other customers will not be affected.

Which solution will meet these requirements?

A.

Set up server-side throttling limits for API Gateway.

B.

Limit DynamoDB read throughput on the table to an amount that results in the maximum cost that the company is willing to incur.

C.

Set up a usage plan for API Gateway. Implement throttling limits for each customer, and distribute API keys to each customer.

D.

Set up AWS Budgets. Monitor the usage of API Gateway and DynamoDB. Configure an alarm to provide an alert when the cost exceeds a certain threshold each month.

Full Access
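Option C corresponds roughly to the following sequence: create a usage plan with per-customer throttling, issue each customer an API key, and associate the key with the plan so one customer hitting its limit does not affect the others. The API ID, stage, and limit values are hypothetical.

```python
import boto3

apigw = boto3.client("apigateway")

plan = apigw.create_usage_plan(
    name="per-customer-retrieval",
    apiStages=[{"apiId": "a1b2c3d4e5", "stage": "prod"}],   # hypothetical API and stage
    throttle={"rateLimit": 50.0, "burstLimit": 100},        # steady-state and burst request limits
    quota={"limit": 100000, "period": "MONTH"},             # optional monthly cap per key
)

# One API key per customer, attached to the usage plan.
key = apigw.create_api_key(name="customer-acme", enabled=True)
apigw.create_usage_plan_key(
    usagePlanId=plan["id"],
    keyId=key["id"],
    keyType="API_KEY",
)
```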
Question # 82

A company is using a fleet of Amazon EC2 instances to ingest data from on-premises data sources. The data is in JSON format, and ingestion rates can be as high as 1 MB/s. When an EC2 instance is rebooted, the data in flight is lost. The company's data science team wants to query ingested data in near-real time.

Which solution provides near-real-time data querying that is scalable with minimal data loss?

A.

Publish data to Amazon Kinesis Data Streams. Use Kinesis Data Analytics to query the data.

B.

Publish data to Amazon Kinesis Data Firehose with Amazon Redshift as the destination. Use Amazon Redshift to query the data.

C.

Store ingested data in an EC2 instance store. Publish data to Amazon Kinesis Data Firehose with Amazon S3 as the destination. Use Amazon Athena to query the data.

D.

Store ingested data in an Amazon Elastic Block Store (Amazon EBS) volume. Publish data to Amazon ElastiCache for Redis. Subscribe to the Redis channel to query the data.

Full Access
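For option A, the ingestion fleet would publish each JSON record to a Kinesis data stream instead of holding it on the instance, so a reboot loses at most the records not yet sent. The stream name and record shape below are hypothetical.

```python
import json
import boto3

kinesis = boto3.client("kinesis")

def publish_reading(reading: dict) -> None:
    """Send one JSON sensor reading to the stream; shard routing uses the sensor ID."""
    kinesis.put_record(
        StreamName="sensor-ingest",                 # hypothetical stream name
        Data=json.dumps(reading).encode("utf-8"),
        PartitionKey=str(reading["sensor_id"]),
    )

publish_reading({"sensor_id": "s-42", "temperature_c": 21.7, "ts": "2022-05-13T12:00:00Z"})
```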
Question # 83

An entertainment company is using Amazon DynamoDB to store media metadata. The application is read intensive and experiences delays. The company does not have staff to handle additional operational overhead and needs to improve the performance efficiency of DynamoDB without reconfiguring the application.

What should a solutions architect recommend to meet this requirement?

A.

Use Amazon ElastiCache for Redis

B.

Use Amazon DynamoDB Accelerator (DAX).

C.

Replicate data by using DynamoDB global tables

D.

Use Amazon ElastiCache for Memcached with Auto Discovery enabled.

Full Access
Question # 84

A company has NFS servers in an on-premises data center that need to periodically back up small amounts of data to Amazon S3.

Which solution meets these requirements and is MOST cost-effective?

A.

Set up AWS Glue to copy the data from the on-premises servers to Amazon S3.

B.

Set up an AWS DataSync agent on the on-premises servers, and sync the data to Amazon S3.

C.

Set up an SFTP sync using AWS Transfer for SFTP to sync data from on premises to Amazon S3.

D.

Set up an AWS Direct Connect connection between the on-premises data center and a VPC, and copy the data to Amazon S3

Full Access
Question # 85

A company is running a multi-tier ecommerce web application in the AWS Cloud. The application runs on Amazon EC2 instances with an Amazon RDS for MySQL Multi-AZ DB instance. Amazon RDS is configured with the latest generation DB instance and 2,000 GB of storage in a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume. The database performance affects the application during periods of high demand.

A database administrator analyzes the logs in Amazon CloudWatch Logs and discovers that the application performance always degrades when the number of read and write IOPS is higher than 20,000.

What should a solutions architect do to improve the application performance?

A.

Replace the volume with a magnetic volume.

B.

Increase the number of IOPS on the gp3 volume.

C.

Replace the volume with a Provisioned IOPS SSD (io2) volume.

D.

Replace the 2,000 GB gp3 volume with two 1,000 GB gp3 volumes.

Full Access
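Option B amounts to raising the provisioned IOPS on the DB instance's gp3 storage. A minimal boto3 sketch follows, assuming a hypothetical instance identifier and target IOPS value; the exact IOPS ceiling depends on the allocated storage size, and this is an illustration rather than the official answer.

```python
import boto3

rds = boto3.client("rds")

rds.modify_db_instance(
    DBInstanceIdentifier="ecommerce-mysql",  # hypothetical Multi-AZ DB instance
    Iops=24000,                              # raise provisioned IOPS above the observed 20,000 ceiling
    ApplyImmediately=True,
)
```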
Question # 86

A solutions architect is designing a new API using Amazon API Gateway that will receive requests from users. The volume of requests is highly variable; several hours can pass without receiving a single request. The data processing will take place asynchronously, but it should be completed within a few seconds after a request is made.

Which compute service should the solutions architect have the API invoke to deliver the requirements at the lowest cost?

A.

An AWS Glue job

B.

An AWS Lambda function

C.

A containerized service hosted in Amazon Elastic Kubernetes Service (Amazon EKS)

D.

A containerized service hosted in Amazon ECS with Amazon EC2

Full Access
Question # 87

A solutions architect is designing a multi-Region disaster recovery solution for an application that will provide public API access. The application will use Amazon EC2 instances with a userdata script to load application code and an Amazon RDS for MySQL database. The Recovery Time Objective (RTO) is 3 hours, and the Recovery Point Objective (RPO) is 24 hours.

Which architecture would meet these requirements at the LOWEST cost?

A.

Use an Application Load Balancer for Region failover. Deploy new EC2 instances with the userdata script. Deploy separate RDS instances in each Region.

B.

Use Amazon Route 53 for Region failover. Deploy new EC2 instances with the userdata script. Create a read replica of the RDS instance in a backup Region.

C.

Use Amazon API Gateway for the public APIs and Region failover. Deploy new EC2 instances with the userdata script. Create a MySQL read replica of the RDS instance in a backup Region.

D.

Use Amazon Route 53 for Region failover. Deploy new EC2 instances with the userdata script for APIs, and create a snapshot of the RDS instance daily for a backup. Replicate the snapshot to a backup Region.

Full Access
Question # 88

A solutions architect has created a new AWS account and must secure AWS account root user access. Which combination of actions will accomplish this? (Select TWO.)

A.

Ensure the root user uses a strong password

B.

Enable multi-factor authentication for the root user

C.

Store root user access keys in an encrypted Amazon S3 bucket

D.

Add the root user to a group containing administrative permissions

E.

Apply the required permissions to the root user with an inline policy document

Full Access
Question # 89

A company has an on-premises application that collects data and stores it to an on-premises NFS server. The company recently set up a 10 Gbps AWS Direct Connect connection. The company is running out of storage capacity on premises. The company needs to migrate the application data from on premises to the AWS Cloud while maintaining low-latency access to the data from the on-premises application.

What should a solutions architect do to meet these requirements?

A.

Deploy AWS Storage Gateway for the application data and use the file gateway to store the data in Amazon S3. Connect the on-premises application servers to the file gateway using NFS.

B.

Attach an Amazon Elastic File System (Amazon EFS) file system to the NFS server and copy the application data to the EFS file system. Then connect the on-premises application to Amazon EFS

C.

Configure AWS Storage Gateway as a volume gateway. Make the application data available to the on-premises application from the NFS server and with Amazon Elastic Block Store (Amazon EBS) snapshots.

D.

Create an AWS DataSync agent with the NFS server as the source location and an Amazon Elastic File System (Amazon EFS) file system as the destination for application data transfer. Connect the on-premises application to the EFS file system.

Full Access
Question # 90

An ecommerce company has noticed performance degradation of its Amazon RDS based web application. The performance degradation is attributed to an increase in the number of read-only SQL queries triggered by business analysts. A solutions architect needs to solve the problem with minimal changes to the existing web application.

What should the solutions architect recommend?

A.

Export the data to Amazon DynamoDB and have the business analysts run their queries

B.

Load the data into Amazon ElastiCache and have the business analysts run their queries

C.

Create a read replica of the primary database and have the business analysts run their queries

D.

Copy the data into an Amazon Redshift cluster and have the business analysts run their queries

Full Access
Question # 91

A company runs a web-based portal that provides users with global breaking news, local alerts, and weather updates. The portal delivers each user a personalized view by using a mixture of static and dynamic content. Content is served over HTTPS through an API server running on an Amazon EC2 instance behind an Application Load Balancer (ALB). The company wants the portal to provide this content to its users across the world as quickly as possible.

How should a solutions architect design the application to ensure the LEAST amount of latency for all users?

A.

Deploy the application stack in a single AWS Region. Use Amazon CloudFront to serve all static and dynamic content by specifying the ALB as an origin.

B.

Deploy the application stack in two AWS Regions. Use an Amazon Route 53 latency routing policy to serve all content from the ALB in the closest Region.

C.

Deploy the application stack in a single AWS Region. Use Amazon CloudFront to serve the static content. Serve the dynamic content directly from the ALB.

D.

Deploy the application stack in two AWS Regions. Use an Amazon Route 53 geolocation routing policy to serve all content from the ALB in the closest Region.

Full Access
Question # 92

A company sells datasets to customers who do research in artificial intelligence and machine learning (AI/ML). The datasets are large, formatted files that are stored in an Amazon S3 bucket in the us-east-1 Region. The company hosts a web application that the customers use to purchase access to a given dataset. The web application is deployed on multiple Amazon EC2 instances behind an Application Load Balancer. After a purchase is made, customers receive an S3 signed URL that allows access to the files.

The customers are distributed across North America and Europe. The company wants to reduce the cost that is associated with data transfers and wants to maintain or improve performance.

What should a solutions architect do to meet these requirements?

A.

Configure S3 Transfer Acceleration on the existing S3 bucket. Direct customer requests to the S3 Transfer Acceleration endpoint. Continue to use S3 signed URLs for access control.

B.

Deploy an Amazon CloudFront distribution with the existing S3 bucket as the origin. Direct customer requests to the CloudFront URL. Switch to CloudFront signed URLs for access control.

C.

Set up a second S3 bucket in the eu-central-1 Region with S3 Cross-Region Replication between the buckets. Direct customer requests to the closest Region. Continue to use S3 signed URLs for access control.

D.

Modify the web application to enable streaming of the datasets to end users. Configure the web application to read the data from the existing S3 bucket. Implement access control directly in the application.

Full Access
Question # 93

A solutions architect is creating an application that will handle batch processing of large amounts of data. The input data will be held in Amazon S3, and the output data will be stored in a different S3 bucket. For processing, the application will transfer the data over the network between multiple Amazon EC2 instances.

What should the solutions architect do to reduce the overall data transfer costs?

A.

Place all the EC2 instances in an Auto Scaling group

B.

Place all the EC2 instances in the same AWS Region

C.

Place all the EC2 instances in the same Availability Zone

D.

Place all the EC2 Instances in private subnets in multiple Availability Zones

Full Access
Question # 94

A company runs an application on a large fleet of Amazon EC2 instances. The application reads and writes entries in an Amazon DynamoDB table. The size of the DynamoDB table continuously grows, but the application needs only data from the last 30 days. The company needs a solution that minimizes cost and development effort.

Which solution meets these requirements?

A.

Use an AWS CloudFormation template to deploy the complete solution. Redeploy the CloudFormation stack every 30 days, and delete the original stack.

B.

Use an EC2 instance that runs a monitoring application from AWS Marketplace. Configure the monitoring application to use Amazon DynamoDB Streams to store the timestamp when a new item is created in the table. Use a script that runs on the EC2 instance to delete items that have a timestamp that is older than 30 days.

C.

Configure Amazon DynamoDB Streams to invoke an AWS Lambda function when a new item is created in the table. Configure the Lambda function to delete items in the table that are older than 30 days.

D.

Extend the application to add an attribute that has a value of the current timestamp plus 30 days to each new item that is created in the table. Configure DynamoDB to use the attribute as the TTL attribute.

Full Access
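Option D has two parts: enable TTL on the table with a chosen attribute name, and have the application stamp each new item with an epoch timestamp 30 days in the future. The table name, attribute name, and item shape below are hypothetical.

```python
import time
import boto3

TABLE = "app-entries"          # hypothetical table name
TTL_ATTRIBUTE = "expires_at"   # hypothetical TTL attribute name

dynamodb = boto3.client("dynamodb")

# One-time setup: tell DynamoDB which attribute holds the expiry epoch timestamp.
dynamodb.update_time_to_live(
    TableName=TABLE,
    TimeToLiveSpecification={"Enabled": True, "AttributeName": TTL_ATTRIBUTE},
)

# Application write path: stamp each new item with "now + 30 days" (in seconds).
expires_at = int(time.time()) + 30 * 24 * 60 * 60
dynamodb.put_item(
    TableName=TABLE,
    Item={
        "pk": {"S": "entry#123"},
        "payload": {"S": "example payload"},
        TTL_ATTRIBUTE: {"N": str(expires_at)},
    },
)
```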
Question # 95

A company runs an application using Amazon ECS. The application creates resized versions of an original image and then makes Amazon S3 API calls to store the resized images in Amazon S3.

How can a solutions architect ensure that the application has permission to access Amazon S3?

A.

Update the S3 role in AWS IAM to allow read/write access from Amazon ECS and then relaunch the container.

B.

Create an IAM role with S3 permissions and then specify that role as the taskRoleArn in the task definition.

C.

Create a security group that allows access from Amazon ECS to Amazon S3 and update the launch configuration used by the ECS cluster.

D.

Create an IAM user with S3 permissions, and then relaunch the Amazon EC2 instances for the ECS cluster while logged in as this account.

Full Access
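Option B can be sketched as a task definition that sets taskRoleArn to an IAM role carrying the S3 permissions; containers in the task then obtain temporary credentials for that role automatically. The role ARNs, image URI, and sizing values below are hypothetical placeholders, not an official answer.

```python
import boto3

ecs = boto3.client("ecs")

ecs.register_task_definition(
    family="image-resizer",
    requiresCompatibilities=["FARGATE"],
    networkMode="awsvpc",
    cpu="256",
    memory="512",
    taskRoleArn="arn:aws:iam::123456789012:role/ImageResizerS3Access",       # hypothetical role with S3 permissions
    executionRoleArn="arn:aws:iam::123456789012:role/ecsTaskExecutionRole",  # pulls the image, writes logs
    containerDefinitions=[{
        "name": "resizer",
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/resizer:latest",  # hypothetical image
        "essential": True,
    }],
)
```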