Exam Question Analysis | Centralized Cross-Account Logging with Kinesis Data Streams, Kinesis Data Firehose, and Amazon ES


  Question

A company is setting up a centralized logging solution on AWS and has several requirements. The company wants its Amazon CloudWatch Logs and VPC Flow logs to come from different sub accounts and to be delivered to a single auditing account. However, the number of sub accounts keeps changing. The company also needs to index the logs in the auditing account to gather actionable insight.
How should a DevOps Engineer implement the solution to meet all of the company’s requirements?

A. Use AWS Lambda to write logs to Amazon ES in the auditing account. Create an Amazon CloudWatch subscription filter and use Amazon Kinesis Data Streams in the sub accounts to stream the logs to the Lambda function deployed in the auditing account.
B. Use Amazon Kinesis Streams to write logs to Amazon ES in the auditing account. Create a CloudWatch subscription filter and use Kinesis Data Streams in the sub accounts to stream the logs to the Kinesis stream in the auditing account.
C. Use Amazon Kinesis Firehose with Kinesis Data Streams to write logs to Amazon ES in the auditing account. Create a CloudWatch subscription filter and stream logs from sub accounts to the Kinesis stream in the auditing account.
D. Use AWS Lambda to write logs to Amazon ES in the auditing account. Create a CloudWatch subscription filter and use Lambda in the sub accounts to stream the logs to the Lambda function deployed in the auditing account.

  Answer

C

  Explanation

Tip: eliminate the options with obvious flaws first, then choose the most reasonable option among those that remain.

The requirement in this question is to set up a centralized logging solution on AWS in which Amazon CloudWatch Logs and VPC Flow Logs from different sub accounts are delivered to a single auditing account, while the number of sub accounts keeps changing. The company also needs to index the logs in the auditing account in order to gather actionable insight.
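As a rough sketch of the plumbing this setup implies (Python/boto3; every account ID, organization ID, role, and resource name below is a hypothetical placeholder), the auditing account typically creates a Kinesis data stream, fronts it with a CloudWatch Logs destination, and attaches a destination policy that grants subscription rights broadly, for example to the whole organization, so the policy does not have to be edited every time the number of sub accounts changes:

```python
import json

import boto3

# --- Hypothetical placeholders: replace with real account / organization IDs. ---
AUDIT_ACCOUNT_ID = "111111111111"
ORG_ID = "o-exampleorgid"
REGION = "us-east-1"

kinesis = boto3.client("kinesis", region_name=REGION)
logs = boto3.client("logs", region_name=REGION)

# 1) Kinesis data stream in the auditing account that will receive log events
#    from every sub account.
kinesis.create_stream(StreamName="central-audit-logs", ShardCount=2)
kinesis.get_waiter("stream_exists").wait(StreamName="central-audit-logs")

# 2) CloudWatch Logs destination that fronts the stream. The role is assumed by
#    CloudWatch Logs and must be allowed to call kinesis:PutRecord on the stream.
destination = logs.put_destination(
    destinationName="central-audit-destination",
    targetArn=f"arn:aws:kinesis:{REGION}:{AUDIT_ACCOUNT_ID}:stream/central-audit-logs",
    roleArn=f"arn:aws:iam::{AUDIT_ACCOUNT_ID}:role/CWLtoKinesisRole",
)

# 3) Destination policy. Rather than listing individual sub accounts (whose
#    number keeps changing), grant subscription rights to every account in the
#    organization via the aws:PrincipalOrgID condition.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "*"},
            "Action": "logs:PutSubscriptionFilter",
            "Resource": destination["destination"]["arn"],
            "Condition": {"StringEquals": {"aws:PrincipalOrgID": ORG_ID}},
        }
    ],
}
logs.put_destination_policy(
    destinationName="central-audit-destination",
    accessPolicy=json.dumps(policy),
)
```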

A. Incorrect. Use AWS Lambda to write the logs to Amazon ES (Elasticsearch Service) in the auditing account; create Amazon CloudWatch subscription filters in the sub accounts and use Amazon Kinesis Data Streams to stream the logs to the Lambda function deployed in the auditing account. This option relies on Lambda to write into Amazon ES, which is not the standard ingestion path, and it still needs extra configuration to make sure the logs flow correctly from the sub accounts to the auditing account.
B. Incorrect. Use Amazon Kinesis Streams to write the logs to Amazon ES (Elasticsearch Service) in the auditing account; create CloudWatch subscription filters in the sub accounts and use Kinesis Data Streams to stream the logs to the Kinesis stream in the auditing account. This option uses Kinesis Data Streams as the transport, but Kinesis Data Streams cannot write logs into ES by itself; an additional step or service would still be needed to move the data from the stream into ES.
C. Correct. Use Amazon Kinesis Firehose together with Kinesis Data Streams to write the logs to Amazon ES (Elasticsearch Service) in the auditing account; create CloudWatch subscription filters in the sub accounts and stream the logs from the sub accounts to the Kinesis stream in the auditing account. This option works: a CloudWatch Logs subscription filter can send log data to Kinesis Data Streams, and Kinesis Firehose can then read from the stream and write the data directly into Amazon ES, where the logs can be indexed and searched. The architecture handles logs from multiple sub accounts flexibly, and when the number of sub accounts changes, only the subscription filters and the Kinesis Firehose configuration need to be adjusted, as shown in the sketch below.
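A companion sketch of the two remaining pieces, again in Python/boto3 and again with placeholder names and ARNs: each sub account points a subscription filter at the destination exposed by the auditing account, and the auditing account adds a Kinesis Data Firehose delivery stream that reads from the Kinesis stream and indexes the records into Amazon ES.

```python
import boto3

# --- Hypothetical placeholders shared with the previous sketch. ---
AUDIT_ACCOUNT_ID = "111111111111"
REGION = "us-east-1"

# In each sub account: point a subscription filter at the destination that the
# auditing account exposes. An empty filter pattern forwards every log event.
sub_logs = boto3.client("logs", region_name=REGION)
sub_logs.put_subscription_filter(
    logGroupName="/vpc/flow-logs",  # e.g. the log group receiving VPC Flow Logs
    filterName="to-audit-account",
    filterPattern="",
    destinationArn=(
        f"arn:aws:logs:{REGION}:{AUDIT_ACCOUNT_ID}"
        ":destination:central-audit-destination"
    ),
)

# In the auditing account: a Kinesis Data Firehose delivery stream reads from
# the Kinesis stream and indexes the records into the Amazon ES domain.
firehose = boto3.client("firehose", region_name=REGION)
firehose.create_delivery_stream(
    DeliveryStreamName="audit-logs-to-es",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": (
            f"arn:aws:kinesis:{REGION}:{AUDIT_ACCOUNT_ID}:stream/central-audit-logs"
        ),
        "RoleARN": f"arn:aws:iam::{AUDIT_ACCOUNT_ID}:role/FirehoseReadKinesisRole",
    },
    ElasticsearchDestinationConfiguration={
        "RoleARN": f"arn:aws:iam::{AUDIT_ACCOUNT_ID}:role/FirehoseToESRole",
        "DomainARN": f"arn:aws:es:{REGION}:{AUDIT_ACCOUNT_ID}:domain/audit-logs",
        "IndexName": "centralized-logs",
        "IndexRotationPeriod": "OneDay",
        "S3BackupMode": "FailedDocumentsOnly",
        "S3Configuration": {  # Firehose requires an S3 backup location for ES delivery
            "RoleARN": f"arn:aws:iam::{AUDIT_ACCOUNT_ID}:role/FirehoseToESRole",
            "BucketARN": "arn:aws:s3:::audit-logs-firehose-backup",
        },
    },
)
```

In practice a processing (Lambda transform) step is usually added to the delivery stream as well, because CloudWatch Logs delivers subscription data as compressed, batched records that need to be unpacked before indexing; that detail is omitted here for brevity.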
D. Incorrect. Use AWS Lambda to write the logs to Amazon ES in the auditing account; create CloudWatch subscription filters and use Lambda in the sub accounts to stream the logs to the Lambda function deployed in the auditing account. This option also depends on Lambda to push data into ES, which is not the recommended way to write data directly to ES. It also places a Lambda function in every sub account to forward the logs, which adds complexity and management overhead to each account; with a large and changing number of sub accounts, this approach is impractical.