Master the SAA-C03 AWS Certified Solutions Architect - Associate exam content and be ready for exam day success with this free Testking SAA-C03 download. We make it a reality and give you real SAA-C03 questions in our Amazon-Web-Services SAA-C03 braindumps. The latest valid Amazon-Web-Services SAA-C03 exam questions are on the page below. You can use our Amazon-Web-Services SAA-C03 braindumps to prepare for and pass your exam.

Amazon-Web-Services SAA-C03 Free Dumps Questions Online, Read and Test Now.

NEW QUESTION 1
A company has a production workload that runs on 1,000 Amazon EC2 Linux instances. The workload is powered by third-party software. The company needs to patch the third-party software on all EC2 instances as quickly as possible to remediate a critical security vulnerability.
What should a solutions architect do to meet these requirements?

  • A. Create an AWS Lambda function to apply the patch to all EC2 instances.
  • B. Configure AWS Systems Manager Patch Manager to apply the patch to all EC2 instances.
  • C. Schedule an AWS Systems Manager maintenance window to apply the patch to all EC2 instances.
  • D. Use AWS Systems Manager Run Command to run a custom command that applies the patch to all EC2 instances.

Answer: D
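Run Command fits here because it pushes one ad-hoc command to the whole fleet at once, typically targeted by tag, with concurrency and error thresholds to control the rollout. A minimal Python sketch of the request that would be passed to boto3's `ssm.send_command` (the tag names and patch command are illustrative assumptions, not part of the question):

```python
def build_run_command_request(patch_command,
                              tag_key="Patch",
                              tag_value="third-party",
                              max_concurrency="10%",
                              max_errors="5%"):
    """Build kwargs for ssm.send_command: run a shell command on all tagged instances."""
    return {
        "DocumentName": "AWS-RunShellScript",  # built-in SSM document for Linux shell commands
        "Targets": [{"Key": f"tag:{tag_key}", "Values": [tag_value]}],
        "Parameters": {"commands": [patch_command]},
        "MaxConcurrency": max_concurrency,     # patch 10% of the fleet at a time
        "MaxErrors": max_errors,               # halt the rollout if more than 5% of instances fail
    }

request = build_run_command_request("sudo /opt/vendor/bin/apply-security-patch")
# With credentials configured, the real call would be:
#   boto3.client("ssm").send_command(**request)
```

The `MaxConcurrency`/`MaxErrors` pair is what makes Run Command safe for 1,000 instances: the command fans out quickly but stops automatically if the patch starts failing.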

NEW QUESTION 2
A company uses NFS to store large video files in on-premises network-attached storage. Each video file ranges in size from 1 MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth.
Which solution will meet these requirements?

  • A. Create an S3 bucket. Create an IAM role that has permissions to write to the S3 bucket. Use the AWS CLI to copy all files locally to the S3 bucket.
  • B. Create an AWS Snowball Edge job. Receive a Snowball Edge device on premises. Use the Snowball Edge client to transfer data to the device. Return the device so that AWS can import the data into Amazon S3.
  • C. Deploy an S3 File Gateway on premises. Create a public service endpoint to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.
  • D. Set up an AWS Direct Connect connection between the on-premises network and AWS. Deploy an S3 File Gateway on premises. Create a public virtual interface (VIF) to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.

Answer: B

NEW QUESTION 3
A company is planning on deploying a newly built application on AWS in a default VPC. The application will consist of a web layer and database layer. The web server was created in public subnets, and the MySQL database was created in private subnets. All subnets are created with the default network ACL settings, and the default security group in the VPC will be replaced with new custom security groups.
Which combination of steps should a solutions architect recommend to secure the application? (Select TWO.)

  • A. Create a database server security group with inbound and outbound rules for MySQL port 3306 traffic to and from anywhere (0.0.0.0/0).
  • B. Create a database server security group with an inbound rule for MySQL port 3306 and specify the source as a web server security group.
  • C. Create a web server security group with an inbound allow rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0) and an inbound deny rule for IP range 182.20.0.0/16.
  • D. Create a web server security group with an inbound rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0). Create network ACL inbound and outbound deny rules for IP range 182.20.0.0/16.
  • E. Create a web server security group with inbound and outbound rules for HTTPS port 443 traffic to and from anywhere (0.0.0.0/0). Create a network ACL inbound deny rule for IP range 182.20.0.0/16.

Answer: BD
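The database-tier rule here hinges on a feature of security groups: a rule's source can be another security group rather than a CIDR block, so only members of the web tier's group can reach the database. A hedged sketch of the parameters that would be passed to boto3's `ec2.authorize_security_group_ingress` (the group IDs are placeholders):

```python
def mysql_ingress_from_web_tier(db_sg_id, web_sg_id):
    """Allow MySQL (TCP 3306) into the database SG only from members of the web SG."""
    return {
        "GroupId": db_sg_id,
        "IpPermissions": [{
            "IpProtocol": "tcp",
            "FromPort": 3306,
            "ToPort": 3306,
            # Source is the web tier's security group, not a 0.0.0.0/0 CIDR
            "UserIdGroupPairs": [{"GroupId": web_sg_id}],
        }],
    }

params = mysql_ingress_from_web_tier("sg-0db0000000000000a", "sg-0web000000000000b")
```

Note that security groups support only allow rules; explicit deny rules (as for the blocked IP range) are possible only in network ACLs, which is why the distractor options mixing "deny" into a security group are wrong.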

NEW QUESTION 4
A company has an Amazon S3 bucket that contains critical data. The company must protect the data from accidental deletion.
Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)

  • A. Enable versioning on the S3 bucket.
  • B. Enable MFA Delete on the S3 bucket.
  • C. Create a bucket policy on the S3 bucket.
  • D. Enable default encryption on the S3 bucket.
  • E. Create a lifecycle policy for the objects in the S3 bucket.

Answer: AB
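Both protections are applied with a single `put_bucket_versioning` call; MFA Delete additionally requires the root user's MFA device serial and a current token. A sketch of the request kwargs for boto3's `s3.put_bucket_versioning` (the bucket name and device serial are placeholders):

```python
def versioning_with_mfa_delete(bucket, mfa_device_serial, mfa_code):
    """Kwargs for s3.put_bucket_versioning enabling versioning and MFA Delete."""
    return {
        "Bucket": bucket,
        # The MFA field is the device serial and the current token, space separated
        "MFA": f"{mfa_device_serial} {mfa_code}",
        "VersioningConfiguration": {
            "Status": "Enabled",     # keep every object version; deletes become delete markers
            "MFADelete": "Enabled",  # require MFA to permanently delete a version
        },
    }

req = versioning_with_mfa_delete("critical-data-bucket",
                                 "arn:aws:iam::111122223333:mfa/root-device", "123456")
```

Versioning alone makes deletions recoverable; MFA Delete adds a second factor before a version can be permanently removed.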

NEW QUESTION 5
A company's web application consists of an Amazon API Gateway API in front of an AWS Lambda function and an Amazon DynamoDB database. The Lambda function
handles the business logic, and the DynamoDB table hosts the data. The application uses Amazon Cognito user pools to identify the individual users of the application. A solutions architect needs to update the application so that only users who have a subscription can access premium content.
Which solution will meet this requirement?

  • A. Enable API caching and throttling on the API Gateway API
  • B. Set up AWS WAF on the API Gateway API Create a rule to filter users who have a subscription
  • C. Apply fine-grained IAM permissions to the premium content in the DynamoDB table
  • D. Implement API usage plans and API keys to limit the access of users who do not have a subscription.

Answer: C

NEW QUESTION 6
A company has two applications: a sender application that sends messages with payloads to be processed and a processing application intended to receive the messages with payloads. The company wants to implement an AWS service to handle messages between the two applications. The sender application can send about 1,000 messages each hour. The messages may take up to 2 days to be processed. If the messages fail to process, they must be retained so that they do not impact the processing of any remaining messages.
Which solution meets these requirements and is the MOST operationally efficient?

  • A. Set up an Amazon EC2 instance running a Redis database. Configure both applications to use the instance. Store, process, and delete the messages, respectively.
  • B. Use an Amazon Kinesis data stream to receive the messages from the sender application. Integrate the processing application with the Kinesis Client Library (KCL).
  • C. Integrate the sender and processor applications with an Amazon Simple Queue Service (Amazon SQS) queue. Configure a dead-letter queue to collect the messages that failed to process.
  • D. Subscribe the processing application to an Amazon Simple Notification Service (Amazon SNS) topic to receive notifications to process. Integrate the sender application to write to the SNS topic.

Answer: C

Explanation:
https://aws.amazon.com/blogs/compute/building-loosely-coupled-scalable-c-applications-with-amazon-sqs-and- https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-dead-letter-queues.htm
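The redrive policy that moves repeatedly failing messages to the dead-letter queue is a JSON-encoded attribute on the main queue. A sketch of the `sqs.create_queue` parameters (the queue name, ARN, and thresholds are assumptions for illustration):

```python
import json

def queue_with_dead_letter(dlq_arn, max_receive_count=3):
    """Kwargs for sqs.create_queue: a work queue that redrives failures to a DLQ."""
    return {
        "QueueName": "payload-queue",
        "Attributes": {
            # 4 days of retention comfortably covers the 2-day processing window
            "MessageRetentionPeriod": str(4 * 24 * 3600),
            "RedrivePolicy": json.dumps({
                "deadLetterTargetArn": dlq_arn,
                "maxReceiveCount": max_receive_count,  # after 3 failed receives, move to the DLQ
            }),
        },
    }

req = queue_with_dead_letter("arn:aws:sqs:us-east-1:111122223333:payload-dlq")
policy = json.loads(req["Attributes"]["RedrivePolicy"])
```

Because failed messages are shunted aside rather than redelivered forever, they cannot block the remaining messages, which is exactly the retention requirement in the question.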

NEW QUESTION 7
A gaming company hosts a browser-based application on AWS. The users of the application consume a large number of videos and images that are stored in Amazon S3. This content is the same for all users.
The application has increased in popularity, and millions of users worldwide are accessing these media files. The company wants to provide the files to the users while reducing the load on the origin.
Which solution meets these requirements MOST cost-effectively?

  • A. Deploy an AWS Global Accelerator accelerator in front of the web servers
  • B. Deploy an Amazon CloudFront web distribution in front of the S3 bucket
  • C. Deploy an Amazon ElastiCache for Redis instance in front of the web servers
  • D. Deploy an Amazon ElastiCache for Memcached instance in front of the web servers

Answer: B

Explanation:
CloudFront uses Edge Locations to cache content while Global Accelerator uses Edge Locations to find an optimal pathway to the nearest regional endpoint.

NEW QUESTION 8
A company has hired a solutions architect to design a reliable architecture for its application. The application consists of one Amazon RDS DB instance and two manually provisioned Amazon EC2 instances that run web servers. The EC2 instances are located in a single Availability Zone.
What should the solutions architect do to maximize reliability of the application Infrastructure?

  • A. Delete one EC2 instance and enable termination protection on the other EC2 instance. Update the DB instance to be Multi-AZ, and enable deletion protection.
  • B. Update the DB instance to be Multi-AZ, and enable deletion protection. Place the EC2 instances behind an Application Load Balancer, and run them in an EC2 Auto Scaling group across multiple Availability Zones.
  • C. Create an additional DB instance along with an Amazon API Gateway and an AWS Lambda function. Configure the application to invoke the Lambda function through API Gateway. Have the Lambda function write the data to the two DB instances.
  • D. Place the EC2 instances in an EC2 Auto Scaling group that has multiple subnets located in multiple Availability Zones. Use Spot Instances instead of On-Demand Instances. Set up Amazon CloudWatch alarms to monitor the health of the instances. Update the DB instance to be Multi-AZ, and enable deletion protection.

Answer: B

Explanation:
An Application Load Balancer with an Auto Scaling group that spans multiple Availability Zones removes the single-AZ web tier as a point of failure, and a Multi-AZ DB instance with deletion protection does the same for the data tier.

NEW QUESTION 9
A solutions architect must design a highly available infrastructure for a website. The website is powered by Windows web servers that run on Amazon EC2 instances. The solutions architect must implement a solution that can mitigate a large-scale DDoS attack that originates from thousands of IP addresses. Downtime is not acceptable for the website.
Which actions should the solutions architect take to protect the website from such an attack? (Select TWO.)

  • A. Use AWS Shield Advanced to stop the DDoS attack.
  • B. Configure Amazon GuardDuty to automatically block the attackers.
  • C. Configure the website to use Amazon CloudFront for both static and dynamic content.
  • D. Use an AWS Lambda function to automatically add attacker IP addresses to VPC network ACLs.
  • E. Use EC2 Spot Instances in an Auto Scaling group with a target tracking scaling policy that is set to 80% CPU utilization

Answer: AC

NEW QUESTION 10
A company uses a legacy application to produce data in CSV format. The legacy application stores the output data in Amazon S3. The company is deploying a new commercial off-the-shelf (COTS) application that can perform complex SQL queries to analyze data that is stored in Amazon Redshift and Amazon S3 only. However, the COTS application cannot process the .csv files that the legacy application produces. The company cannot update the legacy application to produce data in another format. The company needs to implement a solution so that the COTS application can use the data that the legacy application produces.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Create an AWS Glue extract, transform, and load (ETL) job that runs on a schedule. Configure the ETL job to process the .csv files and store the processed data in Amazon Redshift.
  • B. Develop a Python script that runs on Amazon EC2 instances to convert the .csv files to .sql files. Invoke the Python script on a cron schedule to store the output files in Amazon S3.
  • C. Create an AWS Lambda function and an Amazon DynamoDB table. Use an S3 event to invoke the Lambda function. Configure the Lambda function to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in the DynamoDB table.
  • D. Use Amazon EventBridge (Amazon CloudWatch Events) to launch an Amazon EMR cluster on a weekly schedule. Configure the EMR cluster to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in an Amazon Redshift table.

Answer: A
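The scheduled Glue ETL option is typically wired up with a trigger attached to the job. A sketch of the parameters for boto3's `glue.create_trigger` (the job name, trigger name, and cron schedule are illustrative assumptions):

```python
def nightly_trigger(job_name):
    """Kwargs for glue.create_trigger: run the ETL job on a nightly cron schedule."""
    return {
        "Name": f"{job_name}-nightly",
        "Type": "SCHEDULED",
        "Schedule": "cron(0 2 * * ? *)",    # 02:00 UTC every day
        "Actions": [{"JobName": job_name}], # the Glue job that converts .csv to Redshift rows
        "StartOnCreation": True,
    }

trigger = nightly_trigger("csv-to-redshift-etl")
```

Glue is serverless, which is what makes it the least-operational-overhead choice over managing EC2 cron hosts or EMR clusters.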

NEW QUESTION 11
A company is running a publicly accessible serverless application that uses Amazon API Gateway and AWS Lambda. The application’s traffic recently spiked due to fraudulent requests from botnets.
Which steps should a solutions architect take to block requests from unauthorized users? (Select TWO.)

  • A. Create a usage plan with an API key that is shared with genuine users only.
  • B. Integrate logic within the Lambda function to ignore the requests from fraudulent IP addresses.
  • C. Implement an AWS WAF rule to target malicious requests and trigger actions to filter them out.
  • D. Convert the existing public API to a private API. Update the DNS records to redirect users to the new API endpoint.
  • E. Create an IAM role for each user attempting to access the API. A user will assume the role when making the API call.

Answer: AC

NEW QUESTION 12
A rapidly growing ecommerce company is running its workloads in a single AWS Region. A solutions architect must create a disaster recovery (DR) strategy that includes a different AWS Region. The company wants its database to be up to date in the DR Region with the least possible latency. The remaining infrastructure in the DR Region needs to run at reduced capacity and must be able to scale up if necessary.
Which solution will meet these requirements with the LOWEST recovery time objective (RTO)?

  • A. Use an Amazon Aurora global database with a pilot light deployment.
  • B. Use an Amazon Aurora global database with a warm standby deployment.
  • C. Use an Amazon RDS Multi-AZ DB instance with a pilot light deployment.
  • D. Use an Amazon RDS Multi-AZ DB instance with a warm standby deployment.

Answer: B
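An Aurora global database is created by designating the existing production cluster as the primary of a global cluster, then adding a secondary Region for DR. A sketch of the request kwargs for boto3's `rds.create_global_cluster` (the identifiers are placeholders):

```python
def promote_to_global(primary_cluster_arn):
    """Kwargs for rds.create_global_cluster using the existing cluster as the primary."""
    return {
        "GlobalClusterIdentifier": "ecommerce-global",
        # Existing production Aurora cluster becomes the global primary;
        # a secondary cluster in the DR Region is added afterwards.
        "SourceDBClusterIdentifier": primary_cluster_arn,
    }

req = promote_to_global("arn:aws:rds:us-east-1:111122223333:cluster:ecommerce-prod")
```

Global databases replicate with typically sub-second latency across Regions, which is why they beat Multi-AZ (a single-Region feature) for this DR scenario; warm standby keeps the reduced-capacity stack running so the RTO is lower than pilot light.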

NEW QUESTION 13
A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible.
The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
  • B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
  • C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
  • D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.

Answer: D

NEW QUESTION 14
A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand.
Which migration solution will meet these requirements?

  • A. Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.
  • B. Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.
  • C. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.
  • D. Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.

Answer: C
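Aurora Auto Scaling is configured through Application Auto Scaling on the cluster's replica count. A sketch of the parameters for boto3's `application-autoscaling` client method `register_scalable_target` (the cluster name and capacity bounds are assumptions):

```python
def aurora_replica_scaling(cluster_id, min_replicas=1, max_replicas=8):
    """Kwargs for application-autoscaling register_scalable_target on Aurora replicas."""
    return {
        "ServiceNamespace": "rds",
        "ResourceId": f"cluster:{cluster_id}",
        # Aurora scales by adding/removing read replicas within these bounds
        "ScalableDimension": "rds:cluster:ReadReplicaCount",
        "MinCapacity": min_replicas,
        "MaxCapacity": max_replicas,
    }

target = aurora_replica_scaling("transactions-aurora")
```

Aurora's MySQL compatibility satisfies the application-compatibility requirement, which Redshift and DynamoDB cannot.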

NEW QUESTION 15
A company has an application that processes customer orders. The company hosts the application on an Amazon EC2 instance that saves the orders to an Amazon Aurora database. Occasionally when traffic is high, the workload does not process orders fast enough.
What should a solutions architect do to write the orders reliably to the database as quickly as possible?

  • A. Increase the instance size of the EC2 instance when traffic is high.
  • B. Write orders to Amazon Simple Notification Service (Amazon SNS). Subscribe the database endpoint to the SNS topic.
  • C. Write orders to an Amazon Simple Queue Service (Amazon SQS) queue. Use EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SQS queue and process orders into the database.
  • D. Write orders to Amazon Simple Notification Service (Amazon SNS). Subscribe the database endpoint to the SNS topic. Use EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SNS topic.
  • E. Write orders to an Amazon Simple Queue Service (Amazon SQS) queue when the EC2 instance reaches the CPU threshold limit. Use scheduled scaling of EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SQS queue and process orders into the database.

Answer: C
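The SQS-based design buffers orders so that a slow database never loses writes: a failed order simply stays on the queue for a later attempt. The mechanic can be pictured with a plain in-memory queue standing in for SQS (no AWS calls; `write_to_db` is a stub, and all names here are illustrative):

```python
from collections import deque

def drain_orders(queue, write_to_db, max_batch=10):
    """Process up to max_batch orders; a failed order goes back on the queue,
    like an unacknowledged SQS message becoming visible again."""
    processed = 0
    for _ in range(min(max_batch, len(queue))):
        order = queue.popleft()
        try:
            write_to_db(order)
            processed += 1
        except Exception:
            queue.append(order)  # retried on a later pass instead of being lost
    return processed

orders = deque({"id": i} for i in range(3))
saved = []
count = drain_orders(orders, saved.append)
```

With real SQS, the Auto Scaling group would scale consumers on queue depth, absorbing traffic spikes without dropping orders.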

NEW QUESTION 16
A company has migrated a two-tier application from its on-premises data center to the AWS Cloud. The data tier is a Multi-AZ deployment of Amazon RDS for Oracle with 12 TB of General Purpose SSD Amazon Elastic Block Store (Amazon EBS) storage. The application is designed to process and store documents in the database as binary large objects (blobs) with an average document size of 6 MB.
The database size has grown over time, reducing the performance and increasing the cost of storage. The company must improve the database performance and needs a solution that is highly available and resilient.
Which solution will meet these requirements MOST cost-effectively?

  • A. Reduce the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Magnetic.
  • B. Increase the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Provisioned IOPS.
  • C. Create an Amazon S3 bucket. Update the application to store documents in the S3 bucket. Store the object metadata in the existing database.
  • D. Create an Amazon DynamoDB table. Update the application to use DynamoDB. Use AWS Database Migration Service (AWS DMS) to migrate data from the Oracle database to DynamoDB.

Answer: C

NEW QUESTION 17
A company runs an on-premises application that is powered by a MySQL database. The company is migrating the application to AWS to increase the application's elasticity and availability.
The current architecture shows heavy read activity on the database during times of normal operation. Every 4 hours, the company's development team pulls a full export of the production database to populate a database in the staging environment. During this period, users experience unacceptable application latency. The development team is unable to use the staging environment until the procedure completes.
A solutions architect must recommend replacement architecture that alleviates the application latency issue. The replacement architecture also must give the development team the ability to continue using the staging environment without delay.
Which solution meets these requirements?

  • A. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.
  • B. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Use database cloning to create the staging database on demand.
  • C. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Use the standby instance for the staging database.
  • D. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.

Answer: B
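Aurora's database cloning is exposed through the point-in-time restore API with a copy-on-write restore type, so a staging clone is available in minutes without a full export. A sketch of the request kwargs for boto3's `rds.restore_db_cluster_to_point_in_time` (cluster identifiers are placeholders):

```python
def clone_cluster(source_cluster_id, clone_id):
    """Kwargs for rds.restore_db_cluster_to_point_in_time: a fast copy-on-write clone for staging."""
    return {
        "SourceDBClusterIdentifier": source_cluster_id,
        "DBClusterIdentifier": clone_id,
        "RestoreType": "copy-on-write",   # clone shares storage pages with the source until modified
        "UseLatestRestorableTime": True,  # clone from the current state of production
    }

req = clone_cluster("prod-aurora-mysql", "staging-clone")
```

Because the clone shares unmodified pages with production, creating it places almost no load on the production cluster, which is what fixes the 4-hourly latency spikes.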

NEW QUESTION 18
A gaming company has a web application that displays scores. The application runs on Amazon EC2 instances behind an Application Load Balancer. The application stores data in an Amazon RDS for MySQL database. Users are starting to experience long delays and interruptions that are caused by database read performance. The company wants to improve the user experience while minimizing changes to the application's architecture.
What should a solutions architect do to meet these requirements?

  • A. Use Amazon ElastiCache in front of the database.
  • B. Use RDS Proxy between the application and the database.
  • C. Migrate the application from EC2 instances to AWS Lambda.
  • D. Migrate the database from Amazon RDS for MySQL to Amazon DynamoDB.

Answer: A
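The ElastiCache option follows the cache-aside pattern: check the cache first, fall back to the database on a miss, then populate the cache so repeat reads skip the database. A self-contained sketch with a dict standing in for Redis and a stub for the MySQL query (all names are illustrative):

```python
cache = {}        # stands in for ElastiCache (Redis/Memcached)
db_queries = []   # records how often the "database" is actually hit

def query_database(player_id):
    """Stub for the expensive MySQL read."""
    db_queries.append(player_id)
    return {"player": player_id, "score": 9001}

def get_score(player_id):
    """Cache-aside: serve from cache, fall back to the database, populate the cache."""
    if player_id not in cache:
        cache[player_id] = query_database(player_id)
    return cache[player_id]

first = get_score("p1")
second = get_score("p1")  # served from cache; no second database query
```

A production version would also set a TTL on cached entries so score updates eventually become visible.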

NEW QUESTION 19
A company wants to run a gaming application on Amazon EC2 instances that are part of an Auto Scaling group in the AWS Cloud. The application will transmit data by using UDP packets. The company wants to ensure that the application can scale out and in as traffic increases and decreases.
What should a solutions architect do to meet these requirements?

  • A. Attach a Network Load Balancer to the Auto Scaling group
  • B. Attach an Application Load Balancer to the Auto Scaling group.
  • C. Deploy an Amazon Route 53 record set with a weighted policy to route traffic appropriately
  • D. Deploy a NAT instance that is configured with port forwarding to the EC2 instances in the Auto Scaling group.

Answer: A
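Among the Elastic Load Balancing types, only the Network Load Balancer supports UDP listeners; Application Load Balancers operate at layer 7 for HTTP/HTTPS only. A sketch of the parameters for boto3's `elbv2.create_listener` (the ARNs and port are placeholders):

```python
def udp_listener(nlb_arn, target_group_arn, port=7777):
    """Kwargs for elbv2.create_listener: a UDP listener forwarding to the game servers."""
    return {
        "LoadBalancerArn": nlb_arn,
        "Protocol": "UDP",  # only Network Load Balancers accept UDP traffic
        "Port": port,
        "DefaultActions": [{"Type": "forward", "TargetGroupArn": target_group_arn}],
    }

listener = udp_listener(
    "arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/net/game/abc123",
    "arn:aws:elasticloadbalancing:us-east-1:111122223333:targetgroup/game/def456",
)
```

Attaching the NLB's target group to the Auto Scaling group lets new instances receive UDP traffic automatically as the group scales out and in.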

NEW QUESTION 20
A company has an AWS Glue extract, transform, and load (ETL) job that runs every day at the same time. The job processes XML data that is in an Amazon S3 bucket.
New data is added to the S3 bucket every day. A solutions architect notices that AWS Glue is processing all
the data during each run.
What should the solutions architect do to prevent AWS Glue from reprocessing old data?

  • A. Edit the job to use job bookmarks.
  • B. Edit the job to delete data after the data is processed
  • C. Edit the job by setting the NumberOfWorkers field to 1.
  • D. Use a FindMatches machine learning (ML) transform.

Answer: A
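Job bookmarks (option A) are switched on through a job argument, so Glue tracks which S3 objects it has already processed between runs. A sketch of the parameters for boto3's `glue.start_job_run` (the job name is a placeholder):

```python
def run_with_bookmarks(job_name):
    """Kwargs for glue.start_job_run with bookmarks so only new S3 data is processed."""
    return {
        "JobName": job_name,
        "Arguments": {
            # Glue persists state between runs and skips already-processed objects
            "--job-bookmark-option": "job-bookmark-enable",
        },
    }

run = run_with_bookmarks("daily-xml-etl")
```

The same argument can also be set once in the job definition instead of on every run.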

NEW QUESTION 21
......

Recommend!! Get the Full SAA-C03 dumps in VCE and PDF From Dumps-files.com, Welcome to Download: https://www.dumps-files.com/files/SAA-C03/ (New 0 Q&As Version)