Intermediate Big Data on AWS introduces you to cloud-based big data solutions such as Amazon Elastic MapReduce (EMR), Amazon Redshift, Amazon Kinesis, and the rest of the AWS Data Analytics platform.
In this course, we will show you how to use Amazon EMR to process data using the broad ecosystem of Hadoop tools like Hive and Hue. We will also teach you how to create big data environments; work with Amazon DynamoDB, Amazon Redshift, Amazon QuickSight, Amazon Athena, and Amazon Kinesis; and apply best practices to design big data environments for security and cost-effectiveness.
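As a taste of the kind of cluster provisioning the course covers, the sketch below assembles an Amazon EMR cluster definition with Hive and Hue installed, in the shape expected by the boto3 `run_job_flow` API. All names (cluster name, S3 bucket, region) are placeholders, and the actual AWS call is left commented out so the snippet runs without credentials; this is an illustrative sketch, not course material.

```python
# Sketch: an EMR cluster definition with Hive and Hue, as a boto3
# run_job_flow request. Bucket, cluster name, and region are
# hypothetical placeholders; the AWS call is commented out so the
# snippet runs locally without credentials.
cluster_config = {
    "Name": "example-hive-cluster",              # placeholder cluster name
    "ReleaseLabel": "emr-6.15.0",                # an EMR release bundling Hive/Hue
    "Applications": [{"Name": "Hive"}, {"Name": "Hue"}],
    "Instances": {
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,                      # 1 master + 2 core nodes
        "KeepJobFlowAliveWhenNoSteps": False,    # terminate when steps finish
    },
    "JobFlowRole": "EMR_EC2_DefaultRole",        # default EC2 instance profile
    "ServiceRole": "EMR_DefaultRole",            # default EMR service role
    "LogUri": "s3://example-bucket/emr-logs/",   # placeholder log bucket
}

# With AWS credentials configured, this would be submitted as:
# import boto3
# emr = boto3.client("emr", region_name="us-east-1")
# response = emr.run_job_flow(**cluster_config)

print([app["Name"] for app in cluster_config["Applications"]])
```

The same cluster shape can also be expressed through the AWS CLI (`aws emr create-cluster`) or CloudFormation, both of which the course touches on when building big data environments.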
What you will learn
- Making decisions based on the AWS-recommended architectural principles and best practices
- Leveraging AWS services to make your infrastructure scalable, reliable, and highly available
- Leveraging AWS managed services to enable greater flexibility and resiliency in an infrastructure
- Making an AWS-based infrastructure more efficient in order to increase performance and reduce costs
- Using the Well-Architected Framework to improve architectures with AWS solutions
Who should attend
This course is intended for:
- Solutions architects
- Solution design engineers
Course outline
1. Core AWS Knowledge
2. Designing Your Environment
3. Making Your Environment Highly Available
4. Forklifting an Existing Application onto AWS
5. Event-Driven Scaling
6. Automating and Decoupling Your Infrastructure
7. Designing Storage at Scale
8. Hosting a New Web Application on AWS
9. The Four Pillars of the Well-Architected Framework
10. Disaster Recovery and Failover Strategies
11. Troubleshooting Your Environment
12. Large-Scale Design Patterns and Case Studies