- Senior Big Data (Cloud) Developer – (this role requires several years of experience with both Hadoop and AWS, with a work focus on data ingestion and management)
- The Senior Big Data (Cloud) Developer shall have a minimum of ten (10) years of IT experience with demonstrated hands-on expertise in:
- Development of complex Python scripts (at least 5 years of experience).
- Development and support activities in cloud-based environments using big data technologies, programming languages, and tools such as Scala, Spark, PySpark, Presto, Ranger, Hadoop, and Hive (at least 3 years of experience).
- Working in the Amazon Web Services (AWS) cloud environment utilizing tools such as S3, EMR, Databricks, data lakes, AWS Glue, Amazon SageMaker, and Amazon Redshift.
- Design, development, and tuning of complex ETL/ELT processes involving massive data volumes (on the order of terabytes).
- Design, development, and support of web services (RESTful/SOAP) and web scraping (HTTP, CSS, and HTML) ETL/ELT processes.
- Working with complex file formats for structured, unstructured, and semi-structured data, including JSON, XML, CSV, Avro, and Parquet.
- Data modeling using RDBMSs such as Amazon Relational Database Service (RDS) and PostgreSQL.
- Development of clear and concise technical documentation, including data models, architecture and data workflow diagrams, design documents, and deployment documents.
- Working with Git and GitLab.
- AWS developer certification is a plus.
- Bachelor's degree in computer programming or equivalent training is required.
- 10+ years of IT experience required.