Amazon ECS with Fargate launch MCQ


Hello friends, if you are looking for Amazon ECS with Fargate launch multiple choice questions | Amazon ECS with Fargate launch MCQ with answers | Amazon ECS with Fargate launch objective type questions | Amazon ECS with Fargate launch questions with answers, then here you will get the right answers.

Join our Telegram channel for daily updates on Accenture exams: https://t.me/+U0BWlikjjG5jNDVl

Please use the 'Find in page' option in Google Chrome to find any question.

1. A company recently implemented hybrid cloud connectivity using AWS Direct Connect and is migrating data to Amazon S3. The company is looking for a fully managed solution that will automate and accelerate the replication of data between the on-premises storage systems and AWS storage services.
Which solution should a solutions architect recommend to keep the data private?

A. Deploy an AWS Storage Gateway volume gateway for the on-premises environment. Configure it to store data locally, and asynchronously back up point-in-time snapshots to AWS.
B. Deploy an AWS Storage Gateway file gateway for the on-premises environment. Configure it to store data locally, and asynchronously back up point-in-time snapshots to AWS.
C. Deploy an AWS DataSync agent for the on-premises environment. Configure a sync job to replicate the data and connect it with an AWS service endpoint.
D. Deploy an AWS DataSync agent for the on-premises environment. Schedule a batch job to replicate point-in-time snapshots to AWS.

Ans: c
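
For context, a minimal boto3 (Python) sketch of how a DataSync task between an on-premises NFS share and S3 might be wired up. The agent ARN, role ARN, hostnames, and bucket names are placeholders, not details from the question.

```python
import boto3

datasync = boto3.client("datasync", region_name="us-east-1")

# Hypothetical ARNs for an already-activated agent and an IAM role (placeholders).
AGENT_ARN = "arn:aws:datasync:us-east-1:111122223333:agent/agent-EXAMPLE"
BUCKET_ROLE_ARN = "arn:aws:iam::111122223333:role/datasync-s3-access"

# Source: the on-premises NFS share, reached through the DataSync agent.
source = datasync.create_location_nfs(
    ServerHostname="files.onprem.example.com",
    Subdirectory="/export/data",
    OnPremConfig={"AgentArns": [AGENT_ARN]},
)

# Destination: the S3 bucket (traffic can stay private via a DataSync VPC endpoint).
destination = datasync.create_location_s3(
    S3BucketArn="arn:aws:s3:::example-migration-bucket",
    S3Config={"BucketAccessRoleArn": BUCKET_ROLE_ARN},
)

# Create and start the replication task.
task = datasync.create_task(
    SourceLocationArn=source["LocationArn"],
    DestinationLocationArn=destination["LocationArn"],
    Name="onprem-to-s3",
)
datasync.start_task_execution(TaskArn=task["TaskArn"])
```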

2. Your website has been suffering performance issues, and you have been able to determine that this is due to a spike in traffic to your servers. The servers are behind an ELB, and the CPU on both Amazon EC2 instances hovers around 95% during this time frame. Your boss has asked you to find a way to improve performance without impacting cost any more than is absolutely necessary. What should you do?

A. Create an EC2 Auto Scaling group and have Amazon CloudTrail trigger an autoscale event to scale up when the CPU reaches 80% and scale down when the CPU drops to 40%.
B. Create an EC2 Auto Scaling group and have Amazon CloudWatch trigger an autoscale event to scale up when the CPU reaches 80% and scale down when the CPU drops to 40%.
C. Create an EC2 Auto Scaling group and have Amazon CloudWatch trigger an autoscale event to scale up when the CPU reaches 95% and scale down when the CPU drops to 40%.
D. Create an EC2 Auto Scaling group and have Amazon CloudWatch trigger an autoscale event to scale up when the CPU reaches 80% and scale down when the CPU drops to 75%.

Ans: b
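
As a rough illustration of answer B, a boto3 sketch that wires a CloudWatch alarm at 80% CPU to a simple scale-out policy on an Auto Scaling group. The group and policy names are made up for the example; a matching scale-in policy and alarm at 40% would complete the picture.

```python
import boto3

autoscaling = boto3.client("autoscaling")
cloudwatch = boto3.client("cloudwatch")

ASG_NAME = "web-asg"  # hypothetical Auto Scaling group name

# Simple scaling policy: add one instance when triggered.
policy = autoscaling.put_scaling_policy(
    AutoScalingGroupName=ASG_NAME,
    PolicyName="scale-out-on-high-cpu",
    PolicyType="SimpleScaling",
    AdjustmentType="ChangeInCapacity",
    ScalingAdjustment=1,
    Cooldown=300,
)

# CloudWatch alarm that fires when average CPU >= 80% and invokes the policy.
cloudwatch.put_metric_alarm(
    AlarmName="web-asg-cpu-high",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "AutoScalingGroupName", "Value": ASG_NAME}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=[policy["PolicyARN"]],
)
```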

3.A Developer wants to debug an application by searching and filtering log data. The application logs are stored in Amazon CloudWatch Logs. The Developer creates a new metric filter to count exceptions in the
application logs. However, no results are
returned from the logs.
What is the reason that no filtered results
are being returned?

  1. A setup of the Amazon CloudWatch interface VPC endpoint is required for filtering the CloudWatch Logs in the VPC
  2. CloudWatch Logs only publishes metric data for events that happen after the filter is created
  3. The log group for CloudWatch Logs should be first streamed to Amazon Elasticsearch Service before metric filtering returns the results
  4. Metric data points for log groups can be filtered only after they are exported to an Amazon S3 bucket

Ans: 2
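
For reference, a minimal boto3 sketch of creating such a metric filter; the log group and metric names are placeholders. Only events ingested after this call produce metric data points, which is why answer 2 explains the empty result.

```python
import boto3

logs = boto3.client("logs")

# Count occurrences of the word "Exception" in a hypothetical application log group.
logs.put_metric_filter(
    logGroupName="/my-app/application",
    filterName="ExceptionCount",
    filterPattern="Exception",
    metricTransformations=[
        {
            "metricName": "ExceptionCount",
            "metricNamespace": "MyApp",
            "metricValue": "1",
            "defaultValue": 0.0,
        }
    ],
)
```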

4. When a developer calls the Amazon CloudWatch API, he receives HTTP 400: ThrottlingException errors sporadically. When a call is not successful, no data is obtained. Which best practice should be implemented first in order to remedy this issue?

A. Contact AWS Support for a limit increase.
B. Use the AWS CLI to get the metrics.
C. Analyze the application and remove the API call.
D. Retry the call with exponential backoff.

Ans: d
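
A small illustration of answer D: retrying a throttled CloudWatch call with exponential backoff and jitter. The specific metric query is a placeholder.

```python
import random
import time

import boto3
from botocore.exceptions import ClientError

cloudwatch = boto3.client("cloudwatch")


def list_metrics_with_backoff(max_attempts=5):
    """Call ListMetrics, backing off exponentially when throttled."""
    for attempt in range(max_attempts):
        try:
            return cloudwatch.list_metrics(Namespace="AWS/EC2")
        except ClientError as err:
            if err.response["Error"]["Code"] != "ThrottlingException":
                raise  # not a throttle; surface the real error
            # Exponential backoff with jitter: ~1s, 2s, 4s, ... plus randomness.
            time.sleep((2 ** attempt) + random.random())
    raise RuntimeError("Still throttled after retries")
```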

5. Your company has a set of resources hosted on the AWS Cloud. As part of a new governance model, there is a requirement that all activity on AWS resources should be monitored. What is the most efficient way to have this implemented?

Options are:
A. Use VPC Flow Logs to monitor all activity in your VPC.
B. Use AWS Trusted Advisor to monitor all of your AWS resources.
C. Use Amazon Inspector to inspect all of the resources in your account.
D. Use AWS CloudTrail to monitor all API activity.

Answer: Use AWS CloudTrail to monitor all API activity.
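
As a quick sketch of the CloudTrail answer, a boto3 snippet that creates a multi-Region trail delivering API activity logs to an S3 bucket. The trail and bucket names are placeholders, and the bucket policy must already allow CloudTrail to write.

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Create a trail that records API activity in all Regions of the account.
cloudtrail.create_trail(
    Name="org-activity-trail",
    S3BucketName="example-cloudtrail-logs",  # bucket policy must grant CloudTrail access
    IsMultiRegionTrail=True,
    IncludeGlobalServiceEvents=True,
)

# Start delivering events for the new trail.
cloudtrail.start_logging(Name="org-activity-trail")
```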

6.A data-processing application runs on an i3.large EC2 instance with a single 100 GB EBS gp2 volume. The
application stores temporary data in a small database (less than 30 GB) located on the EBS root volume.
The application is struggling to process the data fast enough, and a Solutions Architect has determined
that the I/O speed of the temporary database is the bottleneck.
What is the MOST cost-efficient way to improve the database response times?

A. Enable EBS optimization on the instance and keep the temporary files on the existing volume.
B. Move the temporary database onto instance storage.
C. Put the temporary database on a new 50-GB EBS io1 volume with 3,000 IOPS provisioned.
D. Put the temporary database on a new 50-GB EBS gp2 volume.

Ans: b

7. A media company asked a Solutions Architect to design a highly available
storage solution to serve as a centralized document store for their Amazon EC2
instances. The storage solution needs to be POSIX-compliant, scale dynamically, and be able to serve up to 100 concurrent EC2 instances.

Ans – Create an Amazon Elastic File System (Amazon EFS) to store and share the documents.
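
A minimal boto3 sketch of provisioning such an EFS file system and one mount target. The subnet and security group IDs are placeholders; in practice you would create a mount target per Availability Zone and then NFS-mount the file system from each instance.

```python
import boto3

efs = boto3.client("efs")

# Create the shared, POSIX-compliant file system.
fs = efs.create_file_system(
    CreationToken="shared-documents",
    PerformanceMode="generalPurpose",
    Encrypted=True,
)

# Expose it in a subnet so EC2 instances there can mount it over NFS.
efs.create_mount_target(
    FileSystemId=fs["FileSystemId"],
    SubnetId="subnet-0123456789abcdef0",        # placeholder subnet
    SecurityGroups=["sg-0123456789abcdef0"],    # must allow NFS (TCP 2049)
)
```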

8.A company has 500 TB of data in an on-premises file share that needs to
be moved to Amazon S3 Glacier. The migration must not saturate the
company’s low-bandwidth internet connection, and the company must
complete it within a few weeks. What is the MOST cost-effective solution?

  1. Upload the files to Amazon S3 Glacier using the available bandwidth.
  2. Use AWS Global Accelerator to accelerate upload and optimize usage of the available bandwidth.
  3. Order 7 AWS Snowball appliances and select an S3 Glacier vault as the destination.
  4. Order 7 AWS Snowball appliances and select an Amazon S3 bucket as the destination. Create a lifecycle policy to transition the S3 objects to Amazon S3 Glacier.

Ans: 4
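
For answer 4, a boto3 sketch of the lifecycle rule that transitions the imported objects to S3 Glacier as soon as they land in the bucket. The bucket name is a placeholder.

```python
import boto3

s3 = boto3.client("s3")

# Transition every object in the bucket to the Glacier storage class right away.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-snowball-import",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "to-glacier",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to all objects
                "Transitions": [{"Days": 0, "StorageClass": "GLACIER"}],
            }
        ]
    },
)
```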

9.A Solutions Architect is designing a stateful web application that will run for one year (24/7) and then be decommissioned. Load on this platform will be constant, using a number of r4.8xlarge instances. Key drivers for this system include high availability, but elasticity is not required. What is the MOST cost-effective way to purchase compute for this platform?

Ans – Standard Reserved Instances

10. A web application requires a minimum of six Amazon Elastic Compute Cloud (EC2) instances running at all times. You are tasked to deploy the application to three Availability Zones in the EU (Ireland) Region (eu-west-1a, eu-west-1b, and eu-west-1c). It is required that the system is fault-tolerant up to the loss of one Availability Zone. Which of the following setups is the most cost-effective solution that also maintains the fault tolerance of your system?

A. 6 instances in eu-west-1a, 6 instances in eu-west-1b, and 6 instances in eu-west-1c
B. 3 instances in eu-west-1a, 3 instances in eu-west-1b, and 3 instances in eu-west-1c
C. 6 instances in eu-west-1a, 6 instances in eu-west-1b, and no instances in eu-west-1c
D. 2 instances in eu-west-1a, 2 instances in eu-west-1b, and 2 instances in eu-west-1c

Ans: b (3 instances per AZ gives 9 in total; losing any one AZ still leaves the required 6 running)

11. A business hosts an ecommerce application on Amazon EC2. The application is composed of a stateless web layer that needs a minimum of 10 instances and a maximum of 250 instances to run. 80% of the time, the application needs 50 instances.
Which solution should be adopted in order to keep expenses down?

A. Purchase Reserved Instances to cover 250 instances.
B. Purchase Reserved Instances to cover 80 instances. Use Spot Instances to cover the remaining
instances.
C. Purchase On-Demand Instances to cover 40 instances. Use Spot Instances to cover the
remaining instances.
D. Purchase Reserved Instances to cover 50 instances. Use On-Demand and Spot Instances to
cover the remaining instances.

Ans: d

12. A solutions architect is designing a two-tier web application. The application consists of a public-facing web tier hosted on Amazon EC2 in public subnets. The database tier consists of Microsoft SQL Server running on Amazon EC2 in a private subnet. Security is a high priority for the company. How should security groups be configured in this situation? (Select TWO.)

A. Configure the security group for the database tier to allow inbound traffic on ports 443 and 1433 from the security group for the web tier.
B. Configure the security group for the web tier to allow inbound traffic on port 443 from 0.0.0.0/0.
C. Configure the security group for the database tier to allow inbound traffic on port 1433 from the security group for the web tier.
D. Configure the security group for the web tier to allow outbound traffic on port 443 from 0.0.0.0/0.
E. Configure the security group for the database tier to allow outbound traffic on ports 443 and 1433 to the security group for the web tier.

Ans: BC
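
A boto3 sketch of answers B and C, assuming the web-tier and database-tier security groups already exist. The group IDs are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

WEB_SG = "sg-0aaa111122223333a"  # hypothetical web-tier security group
DB_SG = "sg-0bbb444455556666b"   # hypothetical database-tier security group

# B: web tier accepts HTTPS from anywhere.
ec2.authorize_security_group_ingress(
    GroupId=WEB_SG,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# C: database tier accepts SQL Server traffic only from the web tier's security group.
ec2.authorize_security_group_ingress(
    GroupId=DB_SG,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 1433, "ToPort": 1433,
        "UserIdGroupPairs": [{"GroupId": WEB_SG}],
    }],
)
```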

13. A company developed a set of APIs that are being served through Amazon API Gateway. The API calls need to be authenticated based on OpenID identity providers such as Amazon or Facebook. The APIs should allow access based on a custom authorization model.
Which is the simplest and MOST secure design to use to build an authentication and authorization model for the APIs?

A. Use Amazon Cognito user pools and a custom authorizer to authenticate and authorize users based on JSON Web Tokens.
B. Build an OpenID token broker with Amazon and Facebook. Users will authenticate with these identity providers and pass the JSON Web Token to the API to authenticate each API call.
C. Store user credentials in Amazon DynamoDB and have the application retrieve temporary credentials from AWS STS. Make API calls by passing user credentials to the APIs for authentication and authorization.
D. Use Amazon RDS to store user credentials and pass them to the APIs for authentication and authorization.

Ans: A

14. When an enterprise migrates an application to the cloud as is, without making any modifications, what is this called?

Ans – Rehost

15. A solutions architect is designing a high performance computing (HPC) workload on Amazon EC2. The EC2 instances need to communicate with each other frequently and require network performance with low latency and high throughput. Which EC2 configuration meets these requirements?

A. Launch the EC2 instances in a cluster placement group in one Availability Zone
B. Launch the EC2 instances in a spread placement group in one Availability Zone
C. Launch the EC2 instances in an Auto Scaling group in two Regions and peer the VPCs
D. Launch the EC2 instances in an Auto Scaling group spanning multiple Availability Zones

Ans: A
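
A short boto3 sketch of answer A: creating a cluster placement group and launching instances into it. The AMI ID and instance type are placeholders chosen for the example.

```python
import boto3

ec2 = boto3.client("ec2")

# Cluster placement groups pack instances close together for low-latency,
# high-throughput networking within a single Availability Zone.
ec2.create_placement_group(GroupName="hpc-cluster", Strategy="cluster")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="c5n.18xlarge",       # example network-optimized instance type
    MinCount=4,
    MaxCount=4,
    Placement={"GroupName": "hpc-cluster"},
)
```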

16. A company’s legacy application is currently relying on a single-instance Amazon RDS MySQL database without encryption. Due to new compliance requirements, all existing and new data in this database must be encrypted. How should this be accomplished?

A. Take a snapshot of the RDS instance. Create an encrypted copy of the snapshot. Restore the RDS instance from the encrypted snapshot.
B. Enable RDS Multi-AZ mode with encryption at rest enabled. Perform a failover to the standby instance to delete the original instance.
C. Create an RDS read replica with encryption at rest enabled. Promote the read replica to master and switch the application over to the new master. Delete the old RDS instance.
D. Create an Amazon S3 bucket with server-side encryption enabled. Move all the data to Amazon S3. Delete the RDS instance.

Ans: A
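
A boto3 sketch of answer A. The instance, snapshot, and KMS key identifiers are placeholders; in practice you would wait for each step to finish and then repoint the application at the new instance.

```python
import boto3

rds = boto3.client("rds")

# 1. Snapshot the existing unencrypted instance.
rds.create_db_snapshot(
    DBInstanceIdentifier="legacy-mysql",
    DBSnapshotIdentifier="legacy-mysql-unencrypted",
)

# 2. Copy the snapshot with encryption enabled (supplying a KMS key encrypts the copy).
rds.copy_db_snapshot(
    SourceDBSnapshotIdentifier="legacy-mysql-unencrypted",
    TargetDBSnapshotIdentifier="legacy-mysql-encrypted",
    KmsKeyId="alias/aws/rds",  # placeholder key
)

# 3. Restore a new, encrypted instance from the encrypted snapshot.
rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="legacy-mysql-enc",
    DBSnapshotIdentifier="legacy-mysql-encrypted",
)
```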

17. A company needs to ingest terabytes of data each hour from thousands of sources that are delivered almost continually throughout the day. The volume of messages generated varies over the course of the day. Messages must be delivered in real time for fraud detection and live operational dashboards. Which approach will meet these requirements?

Ans – Use Amazon Kinesis Data Streams with Kinesis Client Library to ingest and deliver messages
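
As a small illustration of the Kinesis Data Streams answer, a producer-side boto3 sketch that pushes one message onto a stream. The stream name and payload are placeholders; the Kinesis Client Library would sit on the consumer side.

```python
import json

import boto3

kinesis = boto3.client("kinesis")

# A single transaction event pushed to the stream in near real time.
event = {"transaction_id": "tx-12345", "amount": 42.50, "country": "US"}

kinesis.put_record(
    StreamName="transactions-stream",          # placeholder stream name
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["transaction_id"],      # spreads records across shards
)
```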

18. A company is planning to migrate a business-critical dataset to Amazon S3. The current solution design uses a single S3 bucket in the us-east-1 Region with versioning enabled to store the dataset. The company’s disaster recovery policy states that all data must be stored in multiple AWS Regions.
How should a solutions architect design the S3 solution?

A. Create an additional S3 bucket in another Region and configure cross-Region replication.
B. Create an additional S3 bucket with versioning in another Region and configure cross-Region replication.
C. Create an additional S3 bucket with versioning in another Region and configure cross-origin resource sharing (CORS).
D. Create an additional S3 bucket in another Region and configure cross-origin resource sharing (CORS).

Ans: b
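
A boto3 sketch of answer B: enable versioning on the destination bucket and add a cross-Region replication rule to the source bucket. The bucket names and replication role ARN are placeholders; the source bucket already has versioning per the question.

```python
import boto3

s3 = boto3.client("s3")

DEST_BUCKET = "example-dataset-dr-eu-west-1"                      # bucket created in another Region
REPLICATION_ROLE = "arn:aws:iam::111122223333:role/s3-crr-role"   # placeholder role

# Replication requires versioning on both the source and destination buckets.
s3.put_bucket_versioning(
    Bucket=DEST_BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Replicate every new object version from the source bucket to the DR bucket.
s3.put_bucket_replication(
    Bucket="example-dataset-us-east-1",
    ReplicationConfiguration={
        "Role": REPLICATION_ROLE,
        "Rules": [{
            "ID": "dr-copy",
            "Status": "Enabled",
            "Prefix": "",
            "Destination": {"Bucket": f"arn:aws:s3:::{DEST_BUCKET}"},
        }],
    },
)
```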

19. A solutions architect needs to ensure that API calls to Amazon DynamoDB from Amazon EC2 instances in a VPC do not traverse the internet. What should the solutions architect do to accomplish this? (Select TWO.)

A. Create a route table entry for the endpoint
B. Create a gateway endpoint for DynamoDB
C. Create a new DynamoDB table that uses the endpoint
D. Create an ENI for the endpoint in each of the subnets of the VPC
E. Create a security group entry in the default security group to provide access

Ans: a b
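
A boto3 sketch of answers A and B together: creating a DynamoDB gateway endpoint and attaching it to the VPC's route table, which adds the required route entry. The VPC, route table, and Region are placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A gateway endpoint for DynamoDB; passing RouteTableIds makes EC2 add the
# prefix-list route for DynamoDB to those route tables automatically.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",                 # placeholder VPC
    ServiceName="com.amazonaws.us-east-1.dynamodb",
    RouteTableIds=["rtb-0123456789abcdef0"],       # placeholder route table
)
```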

20. A company’s production application runs online transaction processing (OLTP) transactions on an Amazon RDS MySQL DB instance. The company is launching a new reporting tool that will access the same data. The reporting tool must be highly available and not impact the performance of the production application. How can this be achieved?

A. Create a Multi-AZ RDS Read Replica of the production RDS DB instance.
B. Create hourly snapshots of the production RDS DB instance.
C. Create a Single-AZ RDS Read Replica of the production RDS DB instance. Create a second Single-AZ RDS Read Replica from the replica.
D. Create multiple RDS Read Replicas of the production RDS DB instance. Place the Read Replicas in an Auto Scaling group.

Ans: A
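
A boto3 sketch of answer A, creating a Multi-AZ read replica that the reporting tool can query without touching the primary. The instance identifiers are placeholders.

```python
import boto3

rds = boto3.client("rds")

# The reporting tool points at this replica; the production writer is untouched.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="prod-mysql-reporting-replica",
    SourceDBInstanceIdentifier="prod-mysql",
    MultiAZ=True,  # the replica itself is highly available
)
```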

21. An application is using an Amazon SQS queue. The processing layer that is retrieving messages from the queue is not able to keep up with the number of messages being placed in the queue. What is the FIRST step the developer should take to increase the number of messages the application receives?

1: Use the API to update the WaitTimeSeconds parameter to a value other than 0
2: Add additional Amazon SQS queues and have the application poll those queues
3: Use the ReceiveMessage API to retrieve up to 10 messages at a time
4: Configure the queue to use short polling


Ans: 3
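
A boto3 sketch of answer 3: a consumer loop that pulls up to 10 messages per ReceiveMessage call. The queue URL and handler are placeholders; long polling via WaitTimeSeconds is shown as a common companion setting.

```python
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/111122223333/example-queue"  # placeholder


def process(body: str) -> None:
    """Stand-in for the real message handler."""
    print("processing:", body)


while True:
    # Retrieve up to 10 messages per API call instead of the default of 1.
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling also reduces empty responses
    )
    for message in response.get("Messages", []):
        process(message["Body"])
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
```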

22. You are managing an online platform which allows people to easily buy, sell, spend, and manage their cryptocurrency. To meet the strict IT audit requirements, each of the API calls on all of your AWS resources should be properly captured and recorded. You used CloudTrail in your VPC to help you in the compliance, operational auditing, and risk auditing of your AWS account.
In this scenario, where does CloudTrail store all of the logs that it creates?

Ans – Amazon S3

23. A company’s web application is running on Amazon EC2 instances behind an Application Load Balancer. The company recently changed its policy, which now requires the application to be accessed from one specific country only.
Which configuration will meet this requirement?

A. Configure the security group for the EC2 instances.
B. Configure the network ACL for the subnet that contains the EC2 instances.
C. Configure the security group on the Application Load Balancer.
D. Configure AWS WAF on the Application Load Balancer in a VPC.

Ans: d
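
A boto3 sketch of answer D, assuming WAFv2: a regional web ACL that allows requests only from one country (US in this example) and blocks everything else, then associates the ACL with the ALB. The ALB ARN is a placeholder.

```python
import boto3

wafv2 = boto3.client("wafv2")

acl = wafv2.create_web_acl(
    Name="single-country-only",
    Scope="REGIONAL",                      # regional scope is used for ALBs
    DefaultAction={"Block": {}},           # block everything not explicitly allowed
    Rules=[{
        "Name": "allow-from-us",
        "Priority": 0,
        "Statement": {"GeoMatchStatement": {"CountryCodes": ["US"]}},
        "Action": {"Allow": {}},
        "VisibilityConfig": {
            "SampledRequestsEnabled": True,
            "CloudWatchMetricsEnabled": True,
            "MetricName": "AllowFromUS",
        },
    }],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "SingleCountryOnly",
    },
)

# Attach the web ACL to the Application Load Balancer (placeholder ARN).
wafv2.associate_web_acl(
    WebACLArn=acl["Summary"]["ARN"],
    ResourceArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/web/abc123",
)
```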

24. A company runs multiple Amazon EC2 Linux instances in a VPC with applications that use a hierarchical directory structure. The applications need to rapidly and concurrently read and write to shared storage. How can this be achieved?

A. Create file systems on Amazon EBS volumes attached to each EC2 instance. Synchronize the Amazon EBS volumes across the
different EC2 instances.
B. Create a file system on an Amazon EBS Provisioned IOPS SSD (io1) volume. Attach the volume to all the EC2 instances.
C. Create an Amazon S3 bucket and permit access from all the EC2 instances in the VPC.
D. Create an Amazon EFS file system and mount it from each EC2 instance.

Ans: d

25. A company recently expanded globally and wants to make its application accessible to users in those geographic locations. The application is deployed on Amazon EC2 instances behind an Application Load Balancer in an Auto Scaling group. The company needs the ability to shift traffic from resources in one Region to another.
What should a solutions architect recommend?

A. Configure an Amazon Route 53 geoproximity routing policy.
B. Configure an Amazon Route 53 geolocation routing policy.
C. Configure an Amazon Route 53 latency routing policy.
D. Configure an Amazon Route 53 multivalue answer routing policy.

Ans: a (geoproximity routing with a bias value is what lets you shift traffic from resources in one Region to another)

26. A company offers an online product brochure that is delivered from a static website running on Amazon S3. The company’s customers are mainly in the United States, Canada, and Europe. The company is looking to cost-effectively reduce the latency for users in these regions.
What is the most cost-effective solution to meet these requirements?

A. Create an Amazon CloudFront distribution and use Lambda@Edge to run the website’s data processing closer to the users
B. Create an Amazon CloudFront distribution that uses origins in U.S, Canada and Europe
C. Create an Amazon CloudFront distribution and set the price class to use all Edge Locations for best performance
D. Create an Amazon CloudFront distribution and set the price class to use only U.S, Canada and Europe.

Ans: d

27. A Solutions Architect must select the most appropriate database service for two use cases. A team of data scientists performs complex queries on a data warehouse that take several hours to complete. Another team of scientists needs to run fast, repeat queries and update dashboards for customer support staff.
Which solution delivers these requirements MOST cost-effectively?

Answer: Redshift for both use cases.

28. A Solutions Architect must design a storage solution for incoming billing reports in CSV format. The data does not need to be scanned frequently and is discarded after 30 days.
Which service will be MOST cost-effective in meeting these requirements?

A. Use AWS Data Pipeline to import the logs into a DynamoDB table.
B. Write the files to an S3 bucket and use Amazon Athena to query the data.
C. Import the logs into an RDS MySQL instance.
D. Import the logs to an Amazon Redshift cluster.

Ans: B

29. An application hosted on AWS is experiencing performance problems, and the application vendor wants to perform an analysis of the log file to troubleshoot further. The log file is stored on Amazon S3 and is 10 GB in size. The application owner will make the log file available to the vendor for a limited time.
What is the MOST secure way to do this?

A. Enable public read on the S3 object and provide the link to the vendor.
B. Upload the file to Amazon WorkDocs and share the public link with the vendor.
C. Generate a presigned URL and have the vendor download the log file before it expires.
D. Create an IAM user for the vendor to provide access to the S3 bucket and the application. Enforce multi-
factor authentication.

Ans: C
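
A boto3 sketch of answer C, generating a time-limited presigned URL for the log object. The bucket and key are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# The URL is valid for one hour; after that the vendor can no longer download the file.
url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": "example-app-logs", "Key": "logs/app-2024-01-01.log"},
    ExpiresIn=3600,
)
print(url)  # share this URL with the vendor
```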

30. A company runs an application on a group of Amazon Linux EC2 instances. The application writes log files using standard API calls. For compliance reasons, all log files must be retained indefinitely and will be analyzed by a reporting tool that must access all files concurrently.
Which storage service should a solutions architect use to provide the MOST cost-effective solution?

A. Amazon EBS
B. Amazon EFS
C. Amazon EC2 instance store
D. Amazon S3

Ans: D

31. A company’s web application is using multiple Linux Amazon EC2 instances and storing data on Amazon EBS volumes. The company is looking for a solution to increase the resiliency of the application in case of a failure and to provide storage that complies with atomicity, consistency, isolation, and durability (ACID).
What should a solutions architect do to meet these requirements?

A. Launch the application on EC2 instances in each Availability Zone. Attach EBS
volumes to each EC2 instance.
B. Create an Application Load Balancer with Auto Scaling groups across multiple
Availability Zones. Mount an instance store on each EC2 instance.
C. Create an Application Load Balancer with Auto Scaling groups across multiple
Availability Zones. Store data on Amazon EFS and mount a target on each instance.
D. Create an Application Load Balancer with Auto Scaling groups across multiple
Availability Zones. Store data using Amazon S3 One Zone-Infrequent Access (S3
One Zone-IA).

Ans: C

32. A company hosts a static website within an Amazon S3 bucket. A solutions architect needs to ensure that data can be recovered in case of accidental deletion.
Which action will accomplish this?

A. Enable an Amazon S3 lifecycle policy
B. Enable Amazon S3 Intelligent-Tiering.
C. Enable Amazon S3 cross-Region replication.
D. Enable Amazon S3 versioning

Ans: D

33. A marketing company is storing CSV files in an Amazon S3 bucket for statistical analysis. An application on an Amazon EC2 instance needs permission to efficiently process the CSV data stored in the S3 bucket.
Which action will MOST securely grant the EC2 instance access to the S3 bucket?

A. Attach a resource-based policy to the S3 bucket
B. Create an IAM user for the application with specific permissions to the S3 bucket
C. Store AWS credentials directly on the EC2 instance for applications on the instance to use for API calls
D. Associate an IAM role with least privilege permissions to the EC2 instance profile

Ans: D
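
A boto3 sketch of answer D: a least-privilege role scoped to the one bucket, wrapped in an instance profile and attached to the instance. The role name, bucket, and instance ID are placeholders.

```python
import json

import boto3

iam = boto3.client("iam")
ec2 = boto3.client("ec2")

# Role that EC2 can assume.
assume_role_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}
iam.create_role(RoleName="csv-reader", AssumeRolePolicyDocument=json.dumps(assume_role_policy))

# Least-privilege inline policy: read-only access to the one bucket.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::example-csv-bucket", "arn:aws:s3:::example-csv-bucket/*"],
    }],
}
iam.put_role_policy(RoleName="csv-reader", PolicyName="read-csv-bucket",
                    PolicyDocument=json.dumps(bucket_policy))

# The instance profile ties the role to the EC2 instance.
iam.create_instance_profile(InstanceProfileName="csv-reader-profile")
iam.add_role_to_instance_profile(InstanceProfileName="csv-reader-profile", RoleName="csv-reader")
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "csv-reader-profile"},
    InstanceId="i-0123456789abcdef0",  # placeholder instance
)
```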

34. A solutions architect is deploying a distributed database on multiple Amazon EC2 instances. The database stores all data on multiple instances so it can withstand the loss of an instance. The database requires block storage with latency and throughput to support several million transactions per second per server. Which storage solution should the solutions architect use?

A. Amazon EBS
B. Amazon EC2 instance store
C. Amazon EFS
D. Amazon S3

Ans: B (only instance store delivers the required millions of IOPS per server, and the database already replicates data across instances)

35. What steps do you need to take to share the encrypted EBS snapshot with the Prod account? (Choose 2)

Answer – Modify the permissions on the encrypted snapshot to share it with the Prod account. Share the custom key used to encrypt the volume.
