Who We Are
Role Description
Skills: Matillion, AWS, and DevSecOps
Required Experience and Skills:
- 3-6 years of relevant experience
- Solid experience with AWS services (S3, IAM, Redshift, SageMaker, Glue, Lambda, Step Functions, CloudWatch)
- Experience with platforms such as Databricks and Dataiku
- Proficient in Python and/or Java
- Proficient with SQL – Redshift preferred
- Proficient in Jenkins, CloudFormation, Terraform, Git, Docker
- 2-3 years of experience with Spark (PySpark)
- Ability to work in cross-functional teams
Primary Responsibilities:
- Design, develop, and maintain data pipelines that extract data from a variety of sources and populate the data lake and data warehouse
- Develop data transformation rules
- Collaborate with product analysts, data scientists, and engineers to identify and transform data so it is easy to understand
- Work with the data governance team to implement data quality checks and maintain data catalogs
- Use orchestration, logging, and monitoring tools to build resilient pipelines
- Use a test-driven development methodology when building ELT/ETL pipelines
- Analyze data
- Use Git for version control and understand various branching strategies
- Work as part of an agile team
- Create technical documentation as needed
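The transformation and data quality responsibilities above can be sketched in plain Python. This is a minimal, hypothetical example (the function names, field names, and quality rule are illustrative, not part of any actual pipeline for this role); in practice such steps would run inside an orchestrated job (e.g. Glue or Step Functions) with logs shipped to CloudWatch:

```python
# Minimal illustrative sketch: one transformation rule plus one data
# quality check. All names here are hypothetical.

def clean_orders(rows):
    """Apply a simple transformation rule: cast amounts to float and
    normalize the currency code."""
    cleaned = []
    for row in rows:
        cleaned.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "currency": row["currency"].strip().upper(),
        })
    return cleaned

def quality_check(rows):
    """Data quality check: every amount must be non-negative.
    Returns (passed, failing_rows) so failures can be logged."""
    failures = [r for r in rows if r["amount"] < 0]
    return len(failures) == 0, failures
```

Under test-driven development, assertions on functions like these would be written first and run on every commit via the CI pipeline (e.g. Jenkins).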
Education:
- B.Tech. or a higher degree in computer science or an equivalent field is required
We Expect You to Have: