Designs, modifies, develops, and implements solutions for ingesting, provisioning, and securing data in our Lake (S3), Warehouse (Redshift), and Marts (Redshift and MySQL) using Scala, Apache Spark, Hive, Apache Pig, MapReduce, AWS Data Pipeline, AWS Glue, AWS EMR, and Apache Kafka; exposure to or experience with Informatica and Talend is nice to have. Participates in the testing process through test review and analysis, test witnessing, and certification of software. Requires 5-7 years of experience in the field or in a related area, along with knowledge of commonly used concepts, practices, and procedures within that field.
- Hands-on experience with Java, Scala, and Spark
- Hands-on experience with Kafka, NiFi, and AWS
- Hands-on experience with Maven, Jira, Stash, and Bamboo
- Hands-on experience writing MapReduce jobs
- Experience with Talend is a plus
- AWS or Spark certification is a plus