It is faster and easier to pass the Amazon AWS-Certified-Big-Data-Specialty exam by using real Amazon AWS Certified Big Data - Specialty questions and answers. Get immediate access to the updated AWS-Certified-Big-Data-Specialty exam, find the same core-area AWS-Certified-Big-Data-Specialty questions with professionally verified answers, and PASS your exam with a high score.

Free AWS-Certified-Big-Data-Specialty Demo Online For Amazon Certification:

NEW QUESTION 1
The operations team and the development team want a single place to view both operating system
and application logs.
How should you implement this using AWS services? Choose two answers

  • A. Using AWS CloudFormation, create a CloudWatch Logs LogGroup and send the operating system and application logs of interest using the CloudWatch Logs Agent
  • B. Using AWS CloudFormation and configuration management, set up remote logging to send events via UDP packets to CloudTrail
  • C. Using configuration management, set up remote logging to send events to Amazon Kinesis and insert these into Amazon CloudSearch or Amazon Redshift, depending on available analytic tools
  • D. Using AWS CloudFormation, create a CloudWatch Logs LogGroup. Because the CloudWatch Logs agent automatically sends all operating system logs, you only have to configure the application logs for sending off-machine
  • E. Using AWS CloudFormation, merge the application logs with the operating system logs, and use IAM Roles to allow both teams to have access to view console output from Amazon EC2

Answer: AC
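For reference, a minimal boto3 sketch of the LogGroup half of answer A; the group names are hypothetical, and the CloudWatch Logs agent on each instance would be configured separately to ship its OS and application log files into these groups.

```python
# Minimal sketch (hypothetical group names): create the shared log groups that
# the CloudWatch Logs agent on each instance is then configured to target.
import boto3

logs = boto3.client("logs")

for group in ("/myapp/os/syslog", "/myapp/app/server"):
    try:
        logs.create_log_group(logGroupName=group)
    except logs.exceptions.ResourceAlreadyExistsException:
        pass  # safe to re-run; the group already exists
```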

NEW QUESTION 2
You have been asked to handle a large data migration from multiple Amazon RDS MySQL instances to
a DynamoDB table. You have been given a short amount of time to complete the data migration. What will allow you to complete this complex data processing workflow?

  • A. Create an Amazon Kinesis data stream, pipe in all of the Amazon RDS data, and direct the data toward a DynamoDB table
  • B. Write a script in your language of choice, install the script on an Amazon EC2 instance, and then use Auto Scaling groups to ensure that the latency of the migration pipelines never exceeds four seconds in any 15-minute period.
  • C. Write a bash script to run on your Amazon RDS instance that will export data into DynamoDB
  • D. Create a data pipeline to export Amazon RDS data and import the data into DynamoDB

Answer: D
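For answer D, a hedged boto3 sketch of driving AWS Data Pipeline; the pipeline name is hypothetical, and the actual copy activities (RDS source, DynamoDB target) are elided.

```python
# Hedged sketch: create, define, and activate a pipeline. Only the Default
# object is shown; the RDS-to-DynamoDB copy activities would be added to
# pipeline_objects in the same id/name/fields format.
import boto3

dp = boto3.client("datapipeline")

pipeline_id = dp.create_pipeline(
    name="rds-to-dynamodb-migration",    # hypothetical name
    uniqueId="rds-to-dynamodb-2024-01",  # idempotency token
)["pipelineId"]

pipeline_objects = [
    {
        "id": "Default",
        "name": "Default",
        "fields": [
            {"key": "scheduleType", "stringValue": "ondemand"},
            {"key": "role", "stringValue": "DataPipelineDefaultRole"},
            {"key": "resourceRole", "stringValue": "DataPipelineDefaultResourceRole"},
        ],
    },
    # ... RdsDatabase, SqlDataNode, DynamoDBDataNode, and CopyActivity objects
]

dp.put_pipeline_definition(pipelineId=pipeline_id, pipelineObjects=pipeline_objects)
dp.activate_pipeline(pipelineId=pipeline_id)
```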

NEW QUESTION 3
A company has reproducible data that they want to store on Amazon Web Services. The company may want to retrieve the data on a frequent basis. Which Amazon Web Services storage option allows the customer to optimize storage costs and still achieve high availability for their data?

  • A. Amazon S3 Reduced Redundancy Storage
  • B. Amazon EBS Magnetic Volume
  • C. Amazon Glacier
  • D. Amazon S3 Standard Storage

Answer: A
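A minimal boto3 sketch of answer A; bucket and key names are hypothetical. Reduced Redundancy trades durability for cost, which is acceptable here because the data can be regenerated.

```python
import boto3

s3 = boto3.client("s3")

with open("frame-0001.png", "rb") as body:
    s3.put_object(
        Bucket="example-reproducible-data",  # hypothetical bucket
        Key="renders/frame-0001.png",
        Body=body,
        StorageClass="REDUCED_REDUNDANCY",   # lower cost, lower durability
    )
```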

NEW QUESTION 4
Your customers located around the globe require low-latency access to private video files. Which
configuration meets these requirements?

  • A. Use Amazon CloudFront with signed URLs
  • B. Use Amazon EC2 with provisioned IOPS Amazon EBS volumes
  • C. Use Amazon S3 with signed URLs
  • D. Use Amazon S3 with access control lists

Answer: A
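A hedged sketch of answer A using botocore's CloudFrontSigner, mirroring the documented signing pattern; the key-pair ID, key file, and distribution domain are hypothetical.

```python
from datetime import datetime, timedelta

import rsa
from botocore.signers import CloudFrontSigner

def rsa_signer(message: bytes) -> bytes:
    # Sign with the CloudFront key pair's private key; CloudFront expects
    # SHA-1 RSA signatures for signed URLs.
    with open("private_key.pem", "rb") as f:
        key = rsa.PrivateKey.load_pkcs1(f.read())
    return rsa.sign(message, key, "SHA-1")

signer = CloudFrontSigner("APKAEXAMPLEKEYID", rsa_signer)  # hypothetical key-pair ID
url = signer.generate_presigned_url(
    "https://d111111abcdef8.cloudfront.net/videos/intro.mp4",
    date_less_than=datetime.utcnow() + timedelta(hours=1),  # short-lived access
)
print(url)
```

CloudFront edge locations then serve the private video close to each viewer, while the signature keeps the files private.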

NEW QUESTION 5
A customer has an Amazon S3 bucket. Objects are uploaded simultaneously by a cluster of servers from multiple streams of data. The customer maintains a catalog of objects uploaded in Amazon S3 using an Amazon DynamoDB table. This catalog has the following fields: StreamName, TimeStamp, and ServerName, from which ObjectName can be obtained.
The customer needs to define the catalog to support querying for a given stream or server within a defined time range.
Which DynamoDB table scheme is most efficient to support these queries?

  • A. Define a Primary Key with ServerName as Partition Key and TimeStamp as Sort Key. Do NOT define a Secondary Index or Global Secondary Index.
  • B. Define a Primary Key with StreamName as Partition Key and TimeStamp followed by ServerName as Sort Key. Define a Global Secondary Index with ServerName as Partition Key and TimeStamp followed by StreamName as Sort Key.
  • C. Define a Primary Key with ServerName as Partition Key. Define a Local Secondary Index with StreamName as Partition Key. Define a Global Secondary Index with TimeStamp as Partition Key.
  • D. Define a Primary Key with ServerName as Partition Key. Define a Local Secondary Index with TimeStamp as Partition Key. Define a Global Secondary Index with StreamName as Partition Key and TimeStamp as Sort Key.

Answer: A
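For reference, a hedged boto3 sketch showing how a composite primary key plus a Global Secondary Index are declared; this illustrates the option-B layout (query by stream via the table, by server via the GSI), and the table name and composite-attribute encoding are assumptions, not the graded answer.

```python
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.create_table(
    TableName="UploadCatalog",  # hypothetical name
    AttributeDefinitions=[
        {"AttributeName": "StreamName", "AttributeType": "S"},
        {"AttributeName": "TimeStampServerName", "AttributeType": "S"},  # e.g. "2024-01-01T00:00:00Z#web-07"
        {"AttributeName": "ServerName", "AttributeType": "S"},
        {"AttributeName": "TimeStampStreamName", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "StreamName", "KeyType": "HASH"},            # partition key
        {"AttributeName": "TimeStampServerName", "KeyType": "RANGE"},  # sort key
    ],
    GlobalSecondaryIndexes=[{
        "IndexName": "ByServer",
        "KeySchema": [
            {"AttributeName": "ServerName", "KeyType": "HASH"},
            {"AttributeName": "TimeStampStreamName", "KeyType": "RANGE"},
        ],
        "Projection": {"ProjectionType": "ALL"},
        "ProvisionedThroughput": {"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
    }],
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)
```

Time-range queries then use between or begins_with conditions on the composite sort key, since it is prefixed with the timestamp.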

NEW QUESTION 6
A user has provisioned 2000 IOPS to an EBS volume. The application hosted on that EBS volume is experiencing fewer IOPS than provisioned. Which of the options mentioned below does not affect the IOPS of the volume?

  • A. The application does not have enough IO for the volume
  • B. The instance is EBS optimized
  • C. The EC2 instance has 10 Gigabit Network connectivity
  • D. The volume size is too large

Answer: D
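For context, provisioned IOPS are requested at volume-creation time; a minimal boto3 sketch (the Availability Zone is hypothetical). Whether the application actually sees 2000 IOPS then depends on factors such as the I/O the application drives and EBS-optimized instance bandwidth, not on the volume size.

```python
import boto3

ec2 = boto3.client("ec2")

volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",  # hypothetical zone
    Size=200,                       # GiB; io1 allows up to a 50:1 IOPS-to-size ratio
    VolumeType="io1",
    Iops=2000,
)
print(volume["VolumeId"])
```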

NEW QUESTION 7
A photo sharing service stores pictures in Amazon Simple Storage Service (S3) and allows application sign-in using an OpenID Connect-compatible identity provider. Which AWS Security Token Service approach to temporary access should you use for the Amazon S3 operations?

  • A. SAML-based identity Federation
  • B. Cross-Account Access
  • C. AWS identity and Access Management roles
  • D. Web identity Federation

Answer: A
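The question contrasts STS federation approaches; as a concrete point of reference, the web identity variant used with an OpenID Connect-compatible provider looks like this hedged sketch (the role ARN and token value are hypothetical placeholders).

```python
import boto3

oidc_id_token = "eyJ..."  # placeholder: the ID token returned by the identity provider

sts = boto3.client("sts")
creds = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::123456789012:role/PhotoAppS3Access",  # hypothetical role
    RoleSessionName="photo-app-user",
    WebIdentityToken=oidc_id_token,
)["Credentials"]

# Temporary credentials scoped by the role's policy are then used for S3.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```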

NEW QUESTION 8
When an Auto Scaling group is running in Amazon Elastic Compute Cloud (EC2), your application
rapidly scales up and down in response to load within a 10-minute window; however, after the load peaks, you begin to see problems in your configuration management system where previously terminated Amazon EC2 resources are still showing as active.
What would be a reliable and efficient way to handle the cleanup of Amazon EC2 resources with your configuration management systems? Choose 2 answers

  • A. Write a script that is run by a daily cron job on an Amazon EC2 instance and that executes API Describe calls of the EC2 Auto Scaling group and removes terminated instances from the configuration management system
  • B. Configure an Amazon Simple Queue Service (SQS) queue for Auto Scaling actions that has a script that listens for new messages and removes terminated instances from the configuration management system
  • C. Use your existing configuration management system to control the launching and bootstrapping of instances to reduce the number of moving parts in the automation
  • D. Write a small script that is run during Amazon EC2 instance shutdown to de-register the resource from the configuration management system
  • E. Use Amazon Simple Workflow Service (SWF) to maintain an Amazon DynamoDB database that contains a whitelist of instances that have been previously launched, and allow the Amazon SWF worker to remove information from the configuration management system

Answer: AD
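A hedged sketch of the reconciliation cron job from answer A; the configuration-management client (`my_cms_client`) and its methods are hypothetical stand-ins for whatever CMS API is in use.

```python
import boto3

from my_cms_client import cms  # hypothetical configuration-management API

autoscaling = boto3.client("autoscaling")

groups = autoscaling.describe_auto_scaling_groups(
    AutoScalingGroupNames=["web-app-asg"]  # hypothetical group name
)["AutoScalingGroups"]

# Instances the Auto Scaling group still considers active.
active_ids = {i["InstanceId"] for g in groups for i in g["Instances"]}

for node in cms.list_nodes():             # hypothetical call
    if node.instance_id not in active_ids:
        cms.deregister(node.instance_id)  # hypothetical call
```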

NEW QUESTION 9
In AWS, which security aspects are the customer’s responsibility? Choose 4 answers

  • A. Life-Cycle management of IAM credentials
  • B. Security Group and ACL settings
  • C. Controlling physical access to compute resources
  • D. Path management on the EC2 instance’s operating system
  • E. Encryption of EBS volumes
  • F. Decommissioning storage devices

Answer: ABDE

NEW QUESTION 10
You run a small online consignment marketplace. Interested sellers complete an online application in order to be allowed to sell their products on your website. Once approved, they can post their products using a custom interface. From that point, you manage the shopping cart process so that when a buyer decides to buy a product, you handle the billing and coordinate the shipping. Part of this process requires sending emails to the buyer and the seller at different stages. Your system has been running on AWS for a few months. Occasionally, products are shipped before payment has cleared, and emails are sent out of order. Furthermore, sometimes credit cards are charged twice.
How can you resolve these problems?

  • A. Use the Amazon Simple Queue Service (SQS), and use a different set of workers for each task
  • B. Use the Amazon Simple Workflow Service (SWF), and use a different set of workers for each task.
  • C. Use the Simple Email Service (SES) to control the correct order of email delivery
  • D. Use the AWS Data Pipeline service to control the process flow of the various tasks
  • E. Use the Amazon Simple Queue Service (SQS), and use a single set of workers for each task

Answer: E

NEW QUESTION 11
A systems engineer for a company proposes digitization and backup of large archives for customers. The systems engineer needs to provide users with secure storage that ensures data can never be tampered with once it has been uploaded. How should this be accomplished?

  • A. Create an Amazon Glacier Vault. Specify a “Deny” Vault Lock policy on this vault to block “glacier:DeleteArchive”.
  • B. Create an Amazon S3 bucket. Specify a “Deny” bucket policy on this bucket to block “s3:DeleteObject”.
  • C. Create an Amazon Glacier Vault. Specify a “Deny” vault access policy on this vault to block “glacier:DeleteArchive”.
  • D. Create a secondary AWS account containing an Amazon S3 bucket. Grant “s3:PutObject” to the primary account.

Answer: A
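A hedged boto3 sketch of answer A: initiate and complete a Vault Lock policy that denies archive deletion. The vault name and account-scoped ARN are hypothetical.

```python
import json

import boto3

glacier = boto3.client("glacier")
glacier.create_vault(accountId="-", vaultName="customer-archives")  # "-" = calling account

lock_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyDelete",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "glacier:DeleteArchive",
        "Resource": "arn:aws:glacier:us-east-1:123456789012:vaults/customer-archives",
    }],
}

lock = glacier.initiate_vault_lock(
    accountId="-",
    vaultName="customer-archives",
    policy={"Policy": json.dumps(lock_policy)},
)
# Completing the lock makes the policy immutable; until then it can be aborted.
glacier.complete_vault_lock(
    accountId="-",
    vaultName="customer-archives",
    lockId=lock["lockId"],
)
```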

NEW QUESTION 12
When attached to an Amazon VPC, which two components provide connectivity to external networks? Choose 2 answers

  • A. Elastic IPs (EIP)
  • B. NAT Gateway (NAT)
  • C. Internet Gateway (IGW)
  • D. Virtual Private Gateway (VGW)

Answer: CD
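A minimal boto3 sketch of the IGW case (a VGW attachment is the analogous create_vpn_gateway/attach_vpn_gateway pair); the VPC and route table IDs are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId="vpc-0abc1234")

# A default route through the IGW is what makes the subnet public.
ec2.create_route(
    RouteTableId="rtb-0abc1234",
    DestinationCidrBlock="0.0.0.0/0",
    GatewayId=igw_id,
)
```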

NEW QUESTION 13
A game company needs to properly scale its game application, which is backed by DynamoDB.
Amazon Redshift has the past two years of historical data. Game traffic varies throughout the year based on various factors such as seasons, movie releases, and holidays. An administrator needs to calculate how much read and write throughput should be provisioned for the DynamoDB table for each week in advance.
How should the administrator accomplish this task?

  • A. Feed the data into Amazon Machine Learning and build a regression model
  • B. Feed the data into Spark MLlib and build a random forest model
  • C. Feed the data into Apache Mahout and build a multi-classification model
  • D. Feed the data into Amazon Machine Learning and build a binary classification model

Answer: B
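A hedged PySpark sketch of answer B: fit a random forest regressor to the historical demand exported from Redshift. The S3 path, feature columns, and label column are hypothetical.

```python
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import RandomForestRegressor
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("throughput-forecast").getOrCreate()

# Hypothetical Parquet export of the two years of Redshift history.
history = spark.read.parquet("s3://example-bucket/redshift-export/")

assembler = VectorAssembler(
    inputCols=["week_of_year", "is_holiday_week", "movie_release_count"],  # hypothetical features
    outputCol="features",
)
training = assembler.transform(history)

# Fit one regressor per target; repeat with labelCol="write_throughput".
model = RandomForestRegressor(featuresCol="features", labelCol="read_throughput").fit(training)
```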

NEW QUESTION 14
You are working with a customer who has 10 TB of archival data that they want to migrate to Amazon Glacier. The customer has a 1 Mbps connection to the Internet. Which service or feature provides the fastest method of getting the data into Amazon Glacier?

  • A. Amazon Glacier multipart upload
  • B. AWS Storage Gateway
  • C. VM Import/Export
  • D. AWS Import/Export

Answer: D
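The arithmetic behind the answer: at 1 Mbps, uploading 10 TB would take years, so shipping physical media via AWS Import/Export is the only fast option. A quick back-of-the-envelope check:

```python
# Rough check of why online upload is impractical at this link speed.
terabytes = 10
bits_total = terabytes * 10**12 * 8  # 8e13 bits
seconds = bits_total / 1_000_000     # 1 Mbps = 1e6 bits per second
print(seconds / 86_400)              # ~926 days, i.e. roughly 2.5 years
```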

NEW QUESTION 15
An Amazon EMR cluster using EMRFS has access to petabytes of data on Amazon S3, originating from multiple unique data sources. The customer needs to query common fields across some of the data sets to be able to perform interactive joins and then display results quickly.
Which technology is most appropriate to enable this capability?

  • A. Presto
  • B. MicroStrategy
  • C. Pig
  • D. R Studio

Answer: A

NEW QUESTION 16
A user is planning to set up infrastructure on AWS for the Christmas sales. The user is planning to use schedule-based Auto Scaling for proactive scaling. What advice would you give to the user?

  • A. It is good to schedule now because if the user forgets later on it will not scale up
  • B. The scaling should be setup only one week before Christmas
  • C. Wait till end of November before scheduling the activity
  • D. It is not advisable to use schedule-based scaling

Answer: C

NEW QUESTION 17
A photo-sharing service stores pictures in Amazon Simple Storage Service (S3) and allows application sign-in using an OpenID Connect-compatible identity provider. Which AWS Security Token Service approach to temporary access should you use for the Amazon S3 operations?

  • A. Cross-Account Access
  • B. AWS identity and Access Management roles
  • C. SAML-based Identity Federation
  • D. Web identity Federation

Answer: C

NEW QUESTION 18
An online photo album app has a key design feature to support multiple screens (e.g., desktop, mobile phone, and tablet) with high-quality displays. Multiple versions of the image must be saved in different resolutions and layouts.
The image processing Java program takes an average of five seconds per upload, depending on the image size and format. Each image upload captures the following image metadata: user, album, photo label, and upload timestamp.
The app should support the following requirements:
• Hundreds of user image uploads per second
• Maximum image size of 10 MB
• Maximum image metadata size of 1 KB
• Image displayed in optimized resolution on all supported screens no later than one minute after image upload
Which strategy should be used to meet these requirements?

  • A. Write images and metadata to Amazon Kinesis. Use a Kinesis Client Library (KCL) application to run the image processing and save the image output to Amazon S3 and metadata to the app repository DB
  • B. Write image and metadata to RDS with BLOB data type. Use AWS Data Pipeline to run the image processing and save the image output to Amazon S3 and metadata to the app repository DB
  • C. Upload image with metadata to Amazon S3, use a Lambda function to run the image processing, and save the image output to Amazon S3 and metadata to the app repository DB
  • D. Write image and metadata to Amazon Kinesis. Use Amazon Elastic MapReduce (EMR) with Spark Streaming to run image processing and save image output to Amazon S3
Answer: C
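A hedged sketch of the chosen approach (upload to S3, process with Lambda): an S3-triggered handler that runs the resize step and records metadata. The resize helper, output bucket, and table name are hypothetical.

```python
import urllib.parse

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("PhotoMetadata")  # hypothetical app repository table

def lambda_handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        original = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        for label, width in (("desktop", 1920), ("tablet", 1024), ("mobile", 480)):
            resized = resize(original, width)  # hypothetical image-processing helper
            s3.put_object(
                Bucket="example-photos-derived",  # hypothetical output bucket
                Key=f"{label}/{key}",
                Body=resized,
            )

        table.put_item(Item={"photo_key": key, "status": "processed"})
```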

NEW QUESTION 19
You have been asked to use your department's existing continuous integration (CI) tool to test a three-tier web architecture defined in an AWS CloudFormation template. The tool already supports AWS APIs and can launch new AWS CloudFormation stacks after polling version control. The CI tool reports on the success of the AWS CloudFormation stack creation by using the DescribeStacks API to look for the CREATE_COMPLETE status.
The architecture tiers defined in the template consist of:
• One load balancer
• Five Amazon EC2 instances running the web application
• One Multi-AZ Amazon RDS instance
How would you implement this? Choose 2 answers

  • A. Define a WaitCondition and a WaitConditionHandle for the output of a UserData command that does sanity checking of the application’s post-install state
  • B. Define a CustomResource and write a script that runs architecture-level integration tests through the load balancer to the application and database for the state of multiple tiers
  • C. Define a WaitCondition and use a WaitConditionHandle that leverages the AWS SDK to run the DescribeStacks API call until the CREATE_COMPLETE status is returned
  • D. Define a CustomResource that leverages the AWS SDK to run the DescribeStacks API call until the CREATE_COMPLETE status is returned
  • E. Define a UserDataHandle for the output of a UserData command that does sanity checking of the application’s post-install state and runs integration tests on the state of multiple tiers through load balancer to the application
  • F. Define a UserDataHandle for the output of a CustomResource that does sanity checking of the application’s post-install state

Answer: AF
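For context, a minimal boto3 sketch of the DescribeStacks polling the CI tool performs (the stack name is hypothetical); the WaitCondition and handle options above control when the stack actually reaches CREATE_COMPLETE.

```python
import time

import boto3

cfn = boto3.client("cloudformation")

def wait_for_create(stack_name: str, poll_seconds: int = 15) -> None:
    while True:
        status = cfn.describe_stacks(StackName=stack_name)["Stacks"][0]["StackStatus"]
        if status == "CREATE_COMPLETE":
            return
        if status in ("CREATE_FAILED", "ROLLBACK_IN_PROGRESS", "ROLLBACK_COMPLETE"):
            raise RuntimeError(f"stack creation failed: {status}")
        time.sleep(poll_seconds)

wait_for_create("three-tier-web-test")  # hypothetical stack name
```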

NEW QUESTION 20
A large oil and gas company needs to provide near real-time alerts when peak thresholds are exceeded in its pipeline system. The company has developed a system to capture pipeline metrics such as flow rate, pressure, and temperature using millions of sensors. The sensors deliver data to AWS IoT.
What is a cost-effective way to provide near real-time alerts on the pipeline metrics?

  • A. Create an AWS IoT rule to generate an Amazon SNS notification
  • B. Store the data points in an Amazon DynamoDB table and poll peak metrics data from an Amazon EC2 application
  • C. Create an Amazon Machine Learning model and invoke with AWS Lambda
  • D. Use Amazon Kinesis Streams and a KCL-based application deployed on AWS Elastic Beanstalk

Answer: BD
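In the spirit of answer D, a hedged consumer sketch: read metrics from a Kinesis stream and publish an SNS alert when a threshold is exceeded. A production deployment would use the KCL for sharding and checkpointing; the stream, topic, and threshold here are hypothetical.

```python
import json
import time

import boto3

kinesis = boto3.client("kinesis")
sns = boto3.client("sns")

shard_iterator = kinesis.get_shard_iterator(
    StreamName="pipeline-metrics",        # hypothetical stream
    ShardId="shardId-000000000000",
    ShardIteratorType="LATEST",
)["ShardIterator"]

while True:
    batch = kinesis.get_records(ShardIterator=shard_iterator, Limit=100)
    for record in batch["Records"]:
        metric = json.loads(record["Data"])
        if metric["pressure"] > 900:      # hypothetical peak threshold
            sns.publish(
                TopicArn="arn:aws:sns:us-east-1:123456789012:pipeline-alerts",
                Message=json.dumps(metric),
            )
    shard_iterator = batch["NextShardIterator"]
    time.sleep(1)  # stay under the per-shard read limits
```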

NEW QUESTION 21
You have launched an Amazon Elastic Compute Cloud (EC2) instance into a public subnet with a primary private IP address assigned, an Internet gateway is attached to the VPC, and the public route table is configured to send all Internet-bound traffic to the Internet gateway. Why is the Internet unreachable from this instance?

  • A. The Internet gateway security group must allow all outbound traffic
  • B. The instance does not have a public IP address
  • C. The instance “Source/Destination check” property must be enabled
  • D. The instance security group must allow all inbound traffic

Answer: B

NEW QUESTION 22
Company A operates in Country X and maintains a large dataset of historical purchase orders that contains personal data of its customers in the form of full names and telephone numbers. The dataset consists of five text files, 1 TB each. Currently the dataset resides on-premises due to legal requirements for storing personal data in-country. The research and development department needs to run a clustering algorithm on the dataset and wants to use the Elastic MapReduce service in the closest AWS region. Due to geographic distance, the minimum latency between the on-premises system and the closest AWS region is 200 ms.
Which option allows Company A to do clustering in the AWS Cloud and meet the legal requirement of maintaining personal data in-country?

  • A. Anonymize the personal data portions of the dataset and transfer the data files into Amazon S3 in the AWS region. Have the EMR cluster read the dataset using EMRFS.
  • B. Establish a Direct Connect link between the on-premises system and the AWS region to reduce latency. Have the EMR cluster read the data directly from the on-premises storage system over Direct Connect.
  • C. Encrypt the data files according to encryption standards of Country X and store them in the AWS region in Amazon S3. Have the EMR cluster read the dataset using EMRFS.
  • D. Use an AWS Import/Export Snowball device to securely transfer the data to the AWS region and copy the files onto an EBS volume. Have the EMR cluster read the dataset using EMRFS.

Answer: A

NEW QUESTION 23
A user is trying to set up a recurring Auto Scaling process. The user has set up one process to scale up every day at 8 AM and scale down at 7 PM. The user is trying to set up another recurring process which scales up on the 1st of every month at 8 AM and scales down the same day at 7 PM. What will Auto Scaling do in this scenario?

  • A. Auto Scaling will execute both processes but will add just one instance on the 1st
  • B. Auto Scaling will add two instances on the 1st of the month
  • C. Auto Scaling will schedule both the processes but execute only one process randomly
  • D. Auto Scaling will throw an error since there is a conflict in the schedule of two separate Auto Scaling Processes

Answer: D
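A minimal boto3 sketch of the two recurring schedules described above, expressed as cron recurrences (UTC); the group name, action names, and capacities are hypothetical.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Daily: scale up at 8 AM, down at 7 PM.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-asg",
    ScheduledActionName="daily-scale-up",
    Recurrence="0 8 * * *",
    DesiredCapacity=10,
)
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-asg",
    ScheduledActionName="daily-scale-down",
    Recurrence="0 19 * * *",
    DesiredCapacity=2,
)
# Monthly: 8 AM on the 1st -- this fires at the same minute as the daily
# action, which is exactly the scheduling conflict the question probes.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-asg",
    ScheduledActionName="monthly-scale-up",
    Recurrence="0 8 1 * *",
    DesiredCapacity=15,
)
```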

NEW QUESTION 24
......

100% Valid and Newest Version AWS-Certified-Big-Data-Specialty Questions & Answers shared by Simply pass. Get the full dumps here: https://www.simply-pass.com/Amazon-exam/AWS-Certified-Big-Data-Specialty-dumps.html (New 243 Q&As)