About the Role
Who are we looking for?
Requirements:
At least 3 years of commercial experience as a Data Engineer or in a related role.
Very good knowledge of SQL and relational databases (PostgreSQL, MS SQL Server).
Practical experience in designing, building, and orchestrating ETL/ELT processes, including task scheduling.
Good knowledge of Python for data processing and transformation.
Experience working with BI tools (e.g. Power BI, Apache Superset), including feeding data models and reports.
Understanding of data quality issues, schema versioning, and basic CI/CD practices in data projects.
Ability to collaborate effectively with data analysts and business stakeholders, strong communication skills, and independent problem-solving.
Additional advantages:
Experience working with Mage.ai or other data orchestration tools.
Experience in cloud environments.
What will you be doing?
Responsibilities:
Design, implementation, and development of ETL/ELT processes loading data from multiple source systems into the Data Warehouse staging layer.
Design and maintenance of data architecture and schemas in the data staging layer, with a focus on performance, scalability, and reliability.
Data integration from source systems using APIs and other data exchange mechanisms.
Creation, optimization, and maintenance of advanced SQL queries used in data loading processes and by analytical teams.
Utilization of Python for data transformation, validation, and processing, process automation, and building components supporting ETL pipelines.
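To give a flavor of the Python transformation and validation work described above, here is a minimal, illustrative sketch of a record-cleaning step such as might feed a staging layer. All function and field names here are hypothetical, not part of the actual codebase:

```python
from datetime import datetime

def transform_records(raw_records):
    """Validate and normalize raw API records before loading to staging.

    Rows missing required fields are dropped (hypothetical rule for
    illustration); dates are normalized to ISO 8601 strings.
    """
    required = {"id", "amount", "created_at"}
    clean = []
    for rec in raw_records:
        if not required.issubset(rec):
            continue  # skip incomplete rows; a real pipeline would log these
        clean.append({
            "id": int(rec["id"]),
            "amount": float(rec["amount"]),
            # source dates assumed to arrive as DD/MM/YYYY (illustrative)
            "created_at": datetime.strptime(
                rec["created_at"], "%d/%m/%Y"
            ).date().isoformat(),
        })
    return clean

rows = [
    {"id": "1", "amount": "10.50", "created_at": "03/01/2024"},
    {"id": "2", "amount": "5"},  # missing created_at -> dropped
]
print(transform_records(rows))
```

In a real pipeline, a step like this would sit between the API extraction and the SQL load into the staging tables.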
Tech Stack
SQL · PostgreSQL · MS SQL Server · ETL · ELT · Python · Power BI · Apache Superset · Data Warehousing · APIs