About the Role
Tasks, duties and responsibilities
Building data pipelines
Designing and implementing ETL transformations
Provisioning data processing infrastructure
Optimizing existing data processing architectures
Designing data warehouses
Sourcing data from REST APIs, S3, etc.
Data modelling and database management
Requirements and tech stack
At least 2 years' experience with Python
Experience with modern ETL tools (e.g., Airflow, FiveTran, Glue, Matillion)
Cloud data warehouse experience (e.g., Snowflake, BigQuery/BigTable, RedShift)
Experience with Git
Tech Stack
Python, ETL, Airflow, FiveTran, Glue, Matillion, Snowflake, BigQuery, RedShift, SQL, REST APIs, Git