AWS - Big Data on AWS
Course Description
Big Data on AWS training introduces you to cloud-based big data solutions and Amazon Elastic MapReduce (EMR), the AWS big data platform. In this course, we show you how to use Amazon EMR to process data using the broad ecosystem of Hadoop tools like Pig and Hive. We also teach you how to create big data environments, work with Amazon DynamoDB and Amazon Redshift, understand the benefits of Amazon Kinesis, and leverage best practices to design big data environments for security and cost-effectiveness.
This course prepares you for the AWS Certified Data Analytics - Specialty certification.
Note: Lab time is available only for the duration of the class, not beyond. Additional lab charges apply for "repeat students".
Course Objectives
- Understand Apache Hadoop in the context of Amazon EMR
- Understand the architecture of an Amazon EMR cluster
- Launch an Amazon EMR cluster using an appropriate Amazon Machine Image and Amazon EC2 instance types (a brief sketch follows this list)
- Choose appropriate AWS data storage options for use with Amazon EMR
- Ingest, transfer, and compress data for use with Amazon EMR
- Use common programming frameworks available for Amazon EMR, including Hive, Pig, and Streaming
- Work with Amazon Redshift to implement a big data solution
- Leverage big data visualization software
- Choose appropriate security options for Amazon EMR and your data
- Perform in-memory data analysis with Spark and Shark on Amazon EMR
- Identify options to manage your Amazon EMR environment cost-effectively
- Understand the benefits of using Amazon Kinesis for big data
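As a taste of the hands-on work, the following minimal sketch shows one way to launch an Amazon EMR cluster with Hive and Pig installed, using the AWS SDK for Python (boto3). The region, release label, instance types, S3 bucket, and cluster name are illustrative assumptions, not values prescribed by the course.

# Minimal sketch: launching an EMR cluster with Hive and Pig via boto3.
# All names and values below are illustrative placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")  # assumed region

response = emr.run_job_flow(
    Name="bigdata-course-lab",                      # hypothetical cluster name
    ReleaseLabel="emr-6.15.0",                      # assumed EMR release
    Applications=[{"Name": "Hive"}, {"Name": "Pig"}],
    Instances={
        "MasterInstanceType": "m5.xlarge",          # assumed instance types
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,
        "KeepJobFlowAliveWhenNoSteps": True,        # keep cluster alive for interactive labs
    },
    LogUri="s3://example-bucket/emr-logs/",         # hypothetical S3 bucket for logs
    JobFlowRole="EMR_EC2_DefaultRole",              # default EMR roles
    ServiceRole="EMR_DefaultRole",
)
print("Cluster ID:", response["JobFlowId"])

Remember to terminate lab clusters when you are finished (for example, with emr.terminate_job_flows) to avoid unnecessary charges.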
Who Should Attend?
This course is intended for:
- Individuals responsible for designing and implementing big data solutions, namely Solutions Architects and SysOps Administrators
- Data Scientists and Data Analysts interested in learning about big data solutions on AWS
Prerequisites
Required
- Familiarity with big data technologies, including Apache Hadoop and HDFS
- Working knowledge of core AWS services and public cloud implementation
- Completion of the AWS Technical Essentials course or equivalent experience
- Basic understanding of data warehousing, relational database systems, and database design
Recommended
- Knowledge of big data technologies such as Pig, Hive, and MapReduce (helpful but not required)
- CCC Big Data Foundation (BDF)
Course Outline:
Note: The curriculum below comprises activities typically covered in a class at this skill level. The instructor may, at their discretion, adjust the lesson plan to meet the needs of the class based on regional location and/or the language in which the class is delivered.
- Overview of Big Data
- Data Ingestion, Transfer, and Compression
- AWS Data Storage Options
- Using DynamoDB with Amazon EMR
- Using Kinesis for Near Real-Time Big Data Processing
- Introduction to Apache Hadoop and Amazon EMR
- Using Amazon Elastic MapReduce
- The Hadoop Ecosystem
- Using Hive for Advertising Analytics
- Using Streaming for Life Sciences Analytics
- Using Hue with Amazon EMR
- Running Pig Scripts with Hue on Amazon EMR
- Spark on Amazon EMR
- Running Spark and Spark SQL Interactively on Amazon EMR
- Using Spark and Spark SQL for In-Memory Analytics
- Managing Amazon EMR Costs
- Securing your Amazon EMR Deployments
- Data Warehouses and Columnar Datastores
- Introduction to Amazon Redshift
- Optimizing Your Amazon Redshift Environment
- The Big Data Ecosystem on AWS
- Visualizing and Orchestrating Big Data
- Using Tibco Spotfire to Visualize Big Data
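To illustrate the kind of exercise covered in the Spark modules listed above (for example, Running Spark and Spark SQL Interactively on Amazon EMR), here is a minimal PySpark sketch. The S3 path, dataset, and column names are hypothetical; on an EMR cluster this would typically be run from the pyspark shell or a notebook.

# Minimal sketch: interactive in-memory analysis with Spark SQL.
# The S3 location and schema are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("emr-spark-sql-sketch").getOrCreate()

# Load a hypothetical clickstream dataset from S3 into a DataFrame.
clicks = spark.read.json("s3://example-bucket/clickstream/")
clicks.createOrReplaceTempView("clicks")

# Run an in-memory aggregation with Spark SQL.
top_pages = spark.sql("""
    SELECT page, COUNT(*) AS views
    FROM clicks
    GROUP BY page
    ORDER BY views DESC
    LIMIT 10
""")
top_pages.show()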