Examcollection offers a free demo for the SAA-C03 exam. "AWS Certified Solutions Architect - Associate (SAA-C03)", also known as the SAA-C03 exam, is an Amazon Web Services certification. This set of posts, Passing the Amazon Web Services SAA-C03 exam, will help you answer those questions. The SAA-C03 Questions & Answers covers all the knowledge points of the real exam. 100% real Amazon Web Services SAA-C03 exam questions, revised by experts!

Online Amazon Web Services SAA-C03 free dumps demo below:

NEW QUESTION 1
A company is deploying a new application to Amazon Elastic Kubernetes Service (Amazon EKS) with an AWS Fargate cluster. The application needs a storage solution for data persistence. The solution must be highly available and fault tolerant. The solution also must be shared between multiple application containers.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Create Amazon Elastic Block Store (Amazon EBS) volumes in the same Availability Zones where the EKS worker nodes are placed. Register the volumes in a StorageClass object on the EKS cluster. Use EBS Multi-Attach to share the data between containers.
  • B. Create an Amazon Elastic File System (Amazon EFS) file system. Register the file system in a StorageClass object on the EKS cluster. Use the same file system for all containers.
  • C. Create an Amazon Elastic Block Store (Amazon EBS) volume. Register the volume in a StorageClass object on the EKS cluster. Use the same volume for all containers.
  • D. Create Amazon Elastic File System (Amazon EFS) file systems in the same Availability Zones where the EKS worker nodes are placed. Register the file systems in a StorageClass object on the EKS cluster. Create an AWS Lambda function to synchronize the data between the file systems.

Answer: B
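
The shared EFS file system in option B starts with provisioning the file system itself; the EFS CSI driver's StorageClass on the EKS cluster then references its ID. A minimal boto3 sketch, assuming placeholder subnet and security group IDs:

```python
import boto3

efs = boto3.client("efs", region_name="us-east-1")  # region assumed

# Create a regional EFS file system (spans Availability Zones by default).
fs = efs.create_file_system(
    CreationToken="eks-shared-data",   # idempotency token (hypothetical name)
    PerformanceMode="generalPurpose",
    Encrypted=True,
)
fs_id = fs["FileSystemId"]

# One mount target per Availability Zone used by the cluster (placeholder IDs).
for subnet_id in ["subnet-aaa111", "subnet-bbb222"]:
    efs.create_mount_target(
        FileSystemId=fs_id,
        SubnetId=subnet_id,
        SecurityGroups=["sg-0123456789abcdef0"],
    )

# The EFS CSI driver StorageClass on the EKS cluster would then reference fs_id
# as its fileSystemId parameter so every pod mounts the same file system.
print(fs_id)
```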

NEW QUESTION 2
A company needs to ingest and handle large amounts of streaming data that its application generates. The application runs on Amazon EC2 instances and sends data to Amazon Kinesis Data Streams, which is configured with default settings. Every other day, the application consumes the data and writes the data to an Amazon S3 bucket for business intelligence (BI) processing. The company observes that Amazon S3 is not receiving all the data that the application sends to Kinesis Data Streams.
What should a solutions architect do to resolve this issue?

  • A. Update the Kinesis Data Streams default settings by modifying the data retention period.
  • B. Update the application to use the Kinesis Producer Library (KPL) to send the data to Kinesis Data Streams.
  • C. Update the number of Kinesis shards to handle the throughput of the data that is sent to Kinesis Data Streams.
  • D. Turn on S3 Versioning within the S3 bucket to preserve every version of every object that is ingested in the S3 bucket.

Answer: A
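
The default retention period for a Kinesis data stream is 24 hours, so data that is consumed only every other day expires before it is read. A hedged boto3 sketch of the fix in option A, with the stream name assumed:

```python
import boto3

kinesis = boto3.client("kinesis")

# Default retention is 24 hours; raise it so records survive until the
# every-other-day consumer runs (stream name is a placeholder).
kinesis.increase_stream_retention_period(
    StreamName="app-stream",
    RetentionPeriodHours=72,
)
```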

NEW QUESTION 3
A company wants to manage Amazon Machine Images (AMIs). The company currently copies AMIs to the same AWS Region where the AMIs were created. The company needs to design an application that captures AWS API calls and sends alerts whenever the Amazon EC2 CreateImage API operation is called within the company's account.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Create an AWS Lambda function to query AWS CloudTrail logs and to send an alert when a CreateImage API call is detected.
  • B. Configure AWS CloudTrail with an Amazon Simple Notification Service (Amazon SNS) notification that occurs when updated logs are sent to Amazon S3. Use Amazon Athena to create a new table and to query on CreateImage when an API call is detected.
  • C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for the CreateImage API call. Configure the target as an Amazon Simple Notification Service (Amazon SNS) topic to send an alert when a CreateImage API call is detected.
  • D. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue as a target for AWS CloudTrail logs. Create an AWS Lambda function to send an alert to an Amazon Simple Notification Service (Amazon SNS) topic when a CreateImage API call is detected.

Answer: B
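
For reference, the EventBridge rule described in option C would match the CloudTrail-recorded CreateImage call and forward it to an SNS topic. A minimal boto3 sketch, with the rule name and topic ARN assumed:

```python
import json
import boto3

events = boto3.client("events")

# Match the CreateImage call as recorded by CloudTrail.
pattern = {
    "source": ["aws.ec2"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["ec2.amazonaws.com"],
        "eventName": ["CreateImage"],
    },
}

events.put_rule(
    Name="alert-on-create-image",          # hypothetical rule name
    EventPattern=json.dumps(pattern),
    State="ENABLED",
)

# Send matching events to an SNS topic (ARN is a placeholder).
events.put_targets(
    Rule="alert-on-create-image",
    Targets=[{"Id": "sns-alert", "Arn": "arn:aws:sns:us-east-1:111122223333:ami-alerts"}],
)
```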

NEW QUESTION 4
A company has a web application that runs on Amazon EC2 instances. The company wants end users to authenticate themselves before they use the web application. The web application accesses AWS resources, such as Amazon S3 buckets, on behalf of users who are logged on.
Which combination of actions must a solutions architect take to meet these requirements? (Select TWO).

  • A. Configure AWS App Mesh to log on users.
  • B. Enable and configure AWS Single Sign-On in AWS Identity and Access Management (IAM).
  • C. Define a default IAM role for authenticated users.
  • D. Use AWS Identity and Access Management (IAM) for user authentication.
  • E. Use Amazon Cognito for user authentication.

Answer: BE
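
Amazon Cognito handles the end-user sign-in in option E: a user pool authenticates users and an app client issues the tokens the web tier verifies. A hedged boto3 sketch with placeholder names:

```python
import boto3

cognito = boto3.client("cognito-idp")

# User pool that authenticates end users of the web application.
pool = cognito.create_user_pool(PoolName="web-app-users")  # placeholder name
pool_id = pool["UserPool"]["Id"]

# App client the web tier uses to exchange credentials for tokens.
client = cognito.create_user_pool_client(
    UserPoolId=pool_id,
    ClientName="web-app-client",
    GenerateSecret=False,
)
print(pool_id, client["UserPoolClient"]["ClientId"])
```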

NEW QUESTION 5
A solutions architect is designing a two-tier web application. The application consists of a public-facing web tier hosted on Amazon EC2 in public subnets. The database tier consists of Microsoft SQL Server running on Amazon EC2 in a private subnet. Security is a high priority for the company.
How should security groups be configured in this situation? (Select TWO )

  • A. Configure the security group for the web tier to allow inbound traffic on port 443 from 0.0.0.0/0.
  • B. Configure the security group for the web tier to allow outbound traffic on port 443 from 0.0.0.0/0.
  • C. Configure the security group for the database tier to allow inbound traffic on port 1433 from the security group for the web tier.
  • D. Configure the security group for the database tier to allow outbound traffic on ports 443 and 1433 to the security group for the web tier.
  • E. Configure the security group for the database tier to allow inbound traffic on ports 443 and 1433 from the security group for the web tier.

Answer: AC

Explanation:
"Security groups create an outbound rule for every inbound rule." Not completely right. Statefull does NOT mean that if you create an inbound (or outbound) rule, it will create an outbound (or inbound) rule. What it does mean is: suppose you create an inbound rule on port 443 for the X ip. When a request enters on port 443 from X ip, it will allow traffic out for that request in the port 443. However, if you look at the outbound rules, there will not be any outbound rule on port 443 unless explicitly create it. In ACLs, which are stateless, you would have to create an inbound rule to allow incoming requests and an outbound rule to allow your application responds to those incoming requests.
https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html#SecurityGroupRules
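
A minimal boto3 sketch of the two selected rules (A and C), with the security group IDs assumed as placeholders:

```python
import boto3

ec2 = boto3.client("ec2")
web_sg = "sg-web0123456789"   # placeholder web-tier security group ID
db_sg = "sg-db0123456789"     # placeholder database-tier security group ID

# A: web tier accepts HTTPS from anywhere.
ec2.authorize_security_group_ingress(
    GroupId=web_sg,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# C: database tier accepts SQL Server traffic only from the web tier's security group.
ec2.authorize_security_group_ingress(
    GroupId=db_sg,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 1433, "ToPort": 1433,
        "UserIdGroupPairs": [{"GroupId": web_sg}],
    }],
)
```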

NEW QUESTION 6
A company wants to migrate a Windows-based application from on premises to the AWS Cloud. The application has three tiers: an application tier, a business tier, and a database tier with Microsoft SQL Server. The company wants to use specific features of SQL Server, such as native backups and Data Quality Services. The company also needs to share files between the tiers for processing.
How should a solutions architect design the architecture to meet these requirements?

  • A. Host all three tiers on Amazon EC2 instances. Use Amazon FSx File Gateway for file sharing between the tiers.
  • B. Host all three tiers on Amazon EC2 instances. Use Amazon FSx for Windows File Server for file sharing between the tiers.
  • C. Host the application tier and the business tier on Amazon EC2 instances. Host the database tier on Amazon RDS. Use Amazon Elastic File System (Amazon EFS) for file sharing between the tiers.
  • D. Host the application tier and the business tier on Amazon EC2 instances. Host the database tier on Amazon RDS. Use a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume for file sharing between the tiers.

Answer: B
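
FSx for Windows File Server provides the shared SMB file system that answer B relies on. A hedged boto3 sketch, assuming placeholder subnet, security group, and Active Directory IDs:

```python
import boto3

fsx = boto3.client("fsx")

# Multi-AZ FSx for Windows File Server file system shared by all tiers
# (subnet, security group, and directory IDs are placeholders).
fsx.create_file_system(
    FileSystemType="WINDOWS",
    StorageCapacity=300,                     # GiB
    StorageType="SSD",
    SubnetIds=["subnet-aaa111", "subnet-bbb222"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    WindowsConfiguration={
        "ActiveDirectoryId": "d-1234567890",
        "DeploymentType": "MULTI_AZ_1",
        "PreferredSubnetId": "subnet-aaa111",
        "ThroughputCapacity": 32,            # MB/s
    },
)
```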

NEW QUESTION 7
An ecommerce company wants to launch a one-deal-a-day website on AWS. Each day will feature exactly one product on sale for a period of 24 hours. The company wants to be able to handle millions of requests each hour with millisecond latency during peak hours.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Use Amazon S3 to host the full website in different S3 buckets. Add Amazon CloudFront distributions. Set the S3 buckets as origins for the distributions. Store the order data in Amazon S3.
  • B. Deploy the full website on Amazon EC2 instances that run in Auto Scaling groups across multiple Availability Zones. Add an Application Load Balancer (ALB) to distribute the website traffic. Add another ALB for the backend APIs. Store the data in Amazon RDS for MySQL.
  • C. Migrate the full application to run in containers. Host the containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use the Kubernetes Cluster Autoscaler to increase and decrease the number of pods to process bursts in traffic. Store the data in Amazon RDS for MySQL.
  • D. Use an Amazon S3 bucket to host the website's static content. Deploy an Amazon CloudFront distribution. Set the S3 bucket as the origin. Use Amazon API Gateway and AWS Lambda functions for the backend APIs. Store the data in Amazon DynamoDB.

Answer: D
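
In answer D, the backend API is a Lambda function behind API Gateway that reads the current deal from DynamoDB. A hedged sketch, with the table name and key schema assumed:

```python
import json
from datetime import date

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("daily-deals")  # placeholder table name

def handler(event, context):
    # Partition key is the ISO date of the deal (assumed schema).
    item = table.get_item(Key={"deal_date": date.today().isoformat()})
    return {
        "statusCode": 200,
        "body": json.dumps(item.get("Item", {}), default=str),
    }
```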

NEW QUESTION 8
A company maintains a searchable repository of items on its website. The data is stored in an Amazon RDS for MySQL database table that contains more than 10 million rows. The database has 2 TB of General Purpose SSD storage. There are millions of updates against this data every day through the company's website.
The company has noticed that some insert operations are taking 10 seconds or longer. The company has determined that the database storage performance is the problem.
Which solution addresses this performance issue?

  • A. Change the storage type to Provisioned IOPS SSD
  • B. Change the DB instance to a memory optimized instance class
  • C. Change the DB instance to a burstable performance instance class
  • D. Enable Multi-AZ RDS read replicas with MySQL native asynchronous replication.

Answer: A

Explanation:
https://aws.amazon.com/ebs/features/
"Provisioned IOPS volumes are backed by solid-state drives (SSDs) and are the highest performance EBS volumes designed for your critical, I/O intensive database applications. These volumes are ideal for both IOPS-intensive and throughput-intensive workloads that require extremely low latency."

NEW QUESTION 9
A company hosts its multi-tier applications on AWS. For compliance, governance, auditing, and security, the company must track configuration changes on its AWS resources and record a history of API calls made to these resources.
What should a solutions architect do to meet these requirements?

  • A. Use AWS CloudTrail to track configuration changes and AWS Config to record API calls
  • B. Use AWS Config to track configuration changes and AWS CloudTrail to record API calls
  • C. Use AWS Config to track configuration changes and Amazon CloudWatch to record API calls
  • D. Use AWS CloudTrail to track configuration changes and Amazon CloudWatch to record API calls

Answer: B
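
AWS Config records configuration changes and AWS CloudTrail records the history of API calls (answer B). A hedged boto3 sketch enabling both, with bucket names and the role ARN assumed:

```python
import boto3

config = boto3.client("config")
cloudtrail = boto3.client("cloudtrail")

# AWS Config: track configuration changes on supported resources.
config.put_configuration_recorder(ConfigurationRecorder={
    "name": "default",
    "roleARN": "arn:aws:iam::111122223333:role/config-role",  # placeholder role
    "recordingGroup": {"allSupported": True, "includeGlobalResourceTypes": True},
})
config.put_delivery_channel(DeliveryChannel={
    "name": "default",
    "s3BucketName": "config-history-bucket",  # placeholder bucket
})
config.start_configuration_recorder(ConfigurationRecorderName="default")

# AWS CloudTrail: record a history of API calls made to these resources.
cloudtrail.create_trail(
    Name="account-activity",
    S3BucketName="cloudtrail-logs-bucket",     # placeholder bucket
    IsMultiRegionTrail=True,
)
cloudtrail.start_logging(Name="account-activity")
```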

NEW QUESTION 10
A company wants to build a scalable key management infrastructure to support developers who need to encrypt data in their applications.
What should a solutions architect do to reduce the operational burden?

  • A. Use multifactor authentication (MFA) to protect the encryption keys.
  • B. Use AWS Key Management Service (AWS KMS) to protect the encryption keys
  • C. Use AWS Certificate Manager (ACM) to create, store, and assign the encryption keys
  • D. Use an IAM policy to limit the scope of users who have access permissions to protect the encryption keys

Answer: B
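
With answer B, AWS KMS creates and protects the keys, so developers only call the encryption APIs and never handle raw key material. A hedged boto3 sketch with a placeholder alias:

```python
import boto3

kms = boto3.client("kms")

# KMS owns and protects the key material.
key = kms.create_key(Description="application data key")
key_id = key["KeyMetadata"]["KeyId"]
kms.create_alias(AliasName="alias/app-data", TargetKeyId=key_id)  # placeholder alias

# Developers encrypt small payloads (or data keys) through the API.
ciphertext = kms.encrypt(KeyId="alias/app-data", Plaintext=b"secret value")
print(ciphertext["CiphertextBlob"][:16])
```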

NEW QUESTION 11
A company has an application that loads documents into an Amazon S3 bucket and converts the documents into another format. The application stores the converted documents in another S3 bucket and saves the document name and URLs in an Amazon DynamoDB table. The DynamoDB entries are used during subsequent days to access the documents. The company uses a DynamoDB Accelerator (DAX) cluster in front of the table.
Recently, traffic to the application has increased. Document processing tasks are timing out during the scheduled DAX maintenance window. A solutions architect must ensure that the documents continue to load during the maintenance window.
What should the solutions architect do to accomplish this goal?

  • A. Modify the application to write to the DAX cluster. Configure the DAX cluster to write to the DynamoDB table when the maintenance window is complete.
  • B. Enable Amazon DynamoDB Streams for the DynamoDB table. Modify the application to write to the stream. Configure the stream to load the data when the maintenance window is complete.
  • C. Convert the application to an AWS Lambda function. Configure the Lambda function runtime to be longer than the maintenance window. Create an Amazon CloudWatch alarm to monitor Lambda timeouts.
  • D. Modify the application to write the document name and URLs to an Amazon Simple Queue Service (Amazon SQS) queue. Create an AWS Lambda function to read the SQS queue and write to DynamoDB.

Answer: C

NEW QUESTION 12
A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of terabytes. The application data must be stored in a standard file system structure. The company wants a solution that scales automatically, is highly available, and requires minimum operational overhead.
Which solution will meet these requirements?

  • A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.
  • B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.
  • C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.
  • D. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.

Answer: C

NEW QUESTION 13
A company's application integrates with multiple software-as-a-service (SaaS) sources for data collection. The company runs Amazon EC2 instances to receive the data and to upload the data to an Amazon S3 bucket for analysis. The same EC2 instance that receives and uploads the data also sends a notification to the user when an upload is complete. The company has noticed slow application performance and wants to improve the performance as much as possible.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Create an Auto Scaling group so that EC2 instances can scale out. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
  • B. Create an Amazon AppFlow flow to transfer data between each SaaS source and the S3 bucket. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
  • C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for each SaaS source to send output data. Configure the S3 bucket as the rule's target. Create a second EventBridge (CloudWatch Events) rule to send events when the upload to the S3 bucket is complete. Configure an Amazon Simple Notification Service (Amazon SNS) topic as the second rule's target.
  • D. Create a Docker container to use instead of an EC2 instance. Host the containerized application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon CloudWatch Container Insights to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.

Answer: D
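
Several of the options hinge on S3 notifying an SNS topic when an upload completes (the mechanism options A and B describe). A minimal boto3 sketch of that wiring, with the bucket name and topic ARN assumed:

```python
import boto3

s3 = boto3.client("s3")

# Notify an SNS topic whenever an object upload completes
# (bucket name and topic ARN are placeholders; the topic's access policy
# must allow Amazon S3 to publish to it).
s3.put_bucket_notification_configuration(
    Bucket="saas-ingest-bucket",
    NotificationConfiguration={
        "TopicConfigurations": [{
            "TopicArn": "arn:aws:sns:us-east-1:111122223333:upload-complete",
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)
```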

NEW QUESTION 14
A company wants to manage Amazon Machine Images (AMIs). The company currently copies AMIs to the same AWS Region where the AMIs were created. The company needs to design an application that captures AWS API calls and sends alerts whenever the Amazon EC2 CreateImage API operation is called within the company's account.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Create an AWS Lambda function to query AWS CloudTrail logs and to send an alert when a CreateImage API call is detected.
  • B. Configure AWS CloudTrail with an Amazon Simple Notification Service (Amazon SNS) notification that occurs when updated logs are sent to Amazon S3. Use Amazon Athena to create a new table and to query on CreateImage when an API call is detected.
  • C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for the CreateImage API call. Configure the target as an Amazon Simple Notification Service (Amazon SNS) topic to send an alert when a CreateImage API call is detected.
  • D. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue as a target for AWS CloudTrail logs. Create an AWS Lambda function to send an alert to an Amazon Simple Notification Service (Amazon SNS) topic when a CreateImage API call is detected.

Answer: B

NEW QUESTION 15
A company wants to run applications in containers in the AWS Cloud. The applications are stateless and can tolerate disruptions.
What should a solutions architect do to meet these requirements?

  • A. Use Spot Instances in an Amazon EC2 Auto Scaling group to run the application containers
  • B. Use Spot Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group
  • C. Use On-Demand Instances in an Amazon EC2 Auto Scaling group to run the application containers
  • D. Use On-Demand Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.

Answer: A

NEW QUESTION 16
A bicycle sharing company is developing a multi-tier architecture to track the location of its bicycles during peak operating hours. The company wants to use these data points in its existing analytics platform. A solutions architect must determine the most viable multi-tier option to support this architecture. The data points must be accessible from the REST API.
Which action meets these requirements for storing and retrieving location data?

  • A. Use Amazon Athena with Amazon S3
  • B. Use Amazon API Gateway with AWS Lambda
  • C. Use Amazon QuickSight with Amazon Redshift.
  • D. Use Amazon API Gateway with Amazon Kinesis Data Analytics

Answer: D

Explanation:
https://aws.amazon.com/solutions/implementations/aws-streaming-data-solution-for-amazon-kinesis/

NEW QUESTION 17
A company needs to move data from an Amazon EC2 instance to an Amazon S3 bucket. The company must ensure that no API calls and no data are routed through public internet routes. Only the EC2 instance can have access to upload data to the S3 bucket.
Which solution will meet these requirements?

  • A. Create an interface VPC endpoint for Amazon S3 in the subnet where the EC2 instance is located. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
  • B. Create a gateway VPC endpoint for Amazon S3 in the Availability Zone where the EC2 instance is located. Attach appropriate security groups to the endpoint. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
  • C. Run the nslookup tool from inside the EC2 instance to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
  • D. Use the AWS-provided, publicly available ip-ranges.json file to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.

Answer: B
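
A hedged sketch of answer B's private path: a gateway endpoint for S3 attached to the VPC, plus a bucket policy that restricts uploads to the instance's IAM role (all IDs, names, and ARNs are placeholders):

```python
import json
import boto3

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

# Gateway endpoint keeps S3 traffic on the AWS network via the VPC route table.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],
)

# Bucket policy that only allows the instance's IAM role to upload.
role_arn = "arn:aws:iam::111122223333:role/ec2-upload-role"  # placeholder role
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyAllExceptInstanceRole",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::upload-bucket/*",
        "Condition": {"StringNotEquals": {"aws:PrincipalArn": role_arn}},
    }],
}
s3.put_bucket_policy(Bucket="upload-bucket", Policy=json.dumps(policy))
```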

NEW QUESTION 18
A company is developing a new machine learning (ML) model solution on AWS. The models are developed as independent microservices that fetch approximately 1 GB of model data from Amazon S3 at startup and load the data into memory. Users access the models through an asynchronous API. Users can send a request or a batch of requests and specify where the results should be sent.
The company provides models to hundreds of users. The usage patterns for the models are irregular. Some models could be unused for days or weeks. Other models could receive batches of thousands of requests at a time.
Which design should a solutions architect recommend to meet these requirements?

  • A. Direct the requests from the API to a Network Load Balancer (NLB). Deploy the models as AWS Lambda functions that are invoked by the NLB.
  • B. Direct the requests from the API to an Application Load Balancer (ALB). Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from an Amazon Simple Queue Service (Amazon SQS) queue. Use AWS App Mesh to scale the instances of the ECS cluster based on the SQS queue size.
  • C. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as AWS Lambda functions that are invoked by SQS events. Use AWS Auto Scaling to increase the number of vCPUs for the Lambda functions based on the SQS queue size.
  • D. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue. Deploy the models as Amazon Elastic Container Service (Amazon ECS) services that read from the queue. Enable AWS Auto Scaling on Amazon ECS for both the cluster and copies of the service based on the queue size.

Answer: C

NEW QUESTION 19
A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of terabytes. The application data must be stored in a standard file system structure.
The company wants a solution that scales automatically, is highly available, and requires minimum operational overhead.
Which solution will meet these requirements?

  • A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.
  • B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.
  • C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.
  • D. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.

Answer: C

NEW QUESTION 20
A company needs the ability to analyze the log files of its proprietary application. The logs are stored in JSON format in an Amazon S3 bucket. Queries will be simple and will run on demand. A solutions architect needs to perform the analysis with minimal changes to the existing architecture.
What should the solutions architect do to meet these requirements with the LEAST amount of operational overhead?

  • A. Use Amazon Redshift to load all the content into one place and run the SQL queries as needed
  • B. Use Amazon CloudWatch Logs to store the logs. Run SQL queries as needed from the Amazon CloudWatch console.
  • C. Use Amazon Athena directly with Amazon S3 to run the queries as needed
  • D. Use AWS Glue to catalog the logs. Use a transient Apache Spark cluster on Amazon EMR to run the SQL queries as needed.

Answer: C

Explanation:
Amazon Athena can be used to query JSON in S3
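
A hedged sketch of answer C: define an external table over the JSON logs and query it with Athena, all through boto3. The database, bucket locations, and column names are assumptions:

```python
import boto3

athena = boto3.client("athena")
results = {"OutputLocation": "s3://athena-results-bucket/"}  # placeholder bucket

# One-time DDL: map the JSON log files to an external table (schema assumed).
ddl = """
CREATE EXTERNAL TABLE IF NOT EXISTS app_logs (
  log_time string,
  level string,
  message string
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://app-log-bucket/logs/'
"""
athena.start_query_execution(
    QueryString=ddl,
    QueryExecutionContext={"Database": "default"},
    ResultConfiguration=results,
)

# Ad hoc, on-demand query against the logs.
athena.start_query_execution(
    QueryString="SELECT level, count(*) FROM app_logs GROUP BY level",
    QueryExecutionContext={"Database": "default"},
    ResultConfiguration=results,
)
```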

NEW QUESTION 21
......

100% Valid and Newest Version SAA-C03 Questions & Answers shared by 2passeasy, Get Full Dumps HERE: https://www.2passeasy.com/dumps/SAA-C03/ (New 0 Q&As)