About the Role
Senior SQL Engineer with BigQuery
Date: Mar 13, 2026
Location: Kraków, PL 30-302; Warszawa, PL 00-839; Łódź, PL 90-118; Poznań, WP, PL 61-569
Working place: Hybrid
Type of contract: employment contract
Salary range: 14,500 – 22,300 PLN gross/month
We are looking for a hands-on Senior SQL Engineer with a strong BigQuery background and experience working with large data platforms. The role centers on expert SQL skills, modern DataOps, and building data pipelines in the cloud. You will focus on designing reliable data models and creating efficient ETL/ELT workflows using Data Vault, joining our agile team to ensure our data solutions are secure, stable, and ready for business use.
Location & Model: Hybrid. You can work from one of our offices in Kraków, Warsaw, Łódź, Poznań, or Wrocław. Candidates based outside Kraków are required to visit the Kraków office for 2 days once a month.
Your tasks
Design, build, and deploy data models and transformations in BigQuery (complex SQL, stored procedures, partitioning, clustering)
Create and manage ETL/ELT pipelines to move and transform data into Data Vault models
Optimize SQL queries and data jobs to make them faster and more cost-effective
Connect data from different sources (APIs, SFTP) and ensure it is accurate
Manage code and CI/CD pipelines using tools like Git, Jenkins, and Ansible
Use Google Secret Manager to keep applications and credentials secure
Monitor data pipelines, fix bugs, and suggest improvements
Talk to business stakeholders to understand their needs and create technical plans
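For candidates curious what the Data Vault work above looks like in practice: hubs and links are typically keyed on deterministic hash keys derived from business keys. A minimal Python sketch follows; the normalization rules (trim, upper-case, `||` delimiter) and the sample key are illustrative assumptions, not our production standard.

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Derive a deterministic Data Vault hash key: normalize each
    business key (trim, upper-case), join with a delimiter, hash.
    Normalization rules here are illustrative assumptions."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Incidental whitespace or casing differences do not change the key:
assert hub_hash_key(" cust-001 ") == hub_hash_key("CUST-001")
```

The delimiter matters: without it, the composite keys `("A", "B")` and `("AB",)` would collide.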
Your skills
At least 3–4 years of experience optimizing SQL and working with complex data in BigQuery
Experience in Data Vault modeling and usage
Good knowledge of Python and Terraform for automation
Hands-on experience with Cloud Composer (Airflow), Cloud Run, and Pub/Sub
Practical skills with Git for version control
Experience with CI/CD tools (Ansible, Jenkins) for cloud applications
Ability to work in a DataOps model and Agile environment
Strong problem-solving and analytical skills
Willingness to learn new things and work well in a team
English level B2+
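The BigQuery partitioning and clustering skills listed above come down to DDL like the statement rendered below. This is a hedged sketch only: the dataset, table, and column names are invented for illustration.

```python
def partitioned_table_ddl(table: str, partition_col: str,
                          cluster_cols: list[str]) -> str:
    """Render a BigQuery CREATE TABLE statement with date
    partitioning and clustering. The fixed schema columns are
    placeholders for illustration."""
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        f"  event_id STRING,\n"
        f"  {partition_col} DATE,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

print(partitioned_table_ddl("analytics.events", "event_date", ["event_id"]))
```

Partitioning on a date column plus clustering on frequently filtered columns is the usual first step toward the faster, cheaper queries the tasks section mentions.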
Nice to have
Experience with GCP Data Fusion or CDAP (ingesting and parsing CSV, JSON, XML data)
Experience with Dataproc and handling data from RESTful/SOAP APIs or SFTP servers
Understanding of Data Contract best practices
Basic Java development skills (e.g., for custom Data Fusion plugins)
Experience with automated testing tools for cloud data solutions
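Ingesting and parsing CSV or JSON feeds, as mentioned in the nice-to-have list, usually starts with a small normalization step. A minimal, stdlib-only Python sketch, assuming simple well-formed payloads (field names are made up; real Data Fusion/CDAP pipelines add schema validation and error handling):

```python
import csv
import io
import json

def parse_feed(payload: str, fmt: str) -> list[dict]:
    """Parse a CSV or JSON payload into a list of row dicts."""
    if fmt == "csv":
        # DictReader uses the first line as the header row.
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

rows = parse_feed("id,amount\n1,9.99\n2,4.50\n", "csv")
# CSV values stay strings until a typed load step casts them.
```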
We offer
Hybrid work in one of our locations: Łódź, Poznań, Kraków, Warszawa, Wrocław (2 office days per week)
Working in a highly experienced and dedicated team
Benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
Online training and certifications fitted to your career path
Access to an e-learning platform
Mindgram - a holistic mental health and wellbeing platform
Work From Anywhere (WFA) - the temporary option to work remotely outside of Poland for up to 140 days per year (including Italy, Spain, the UK, Germany, Portugal, and Bulgaria)
Tech Stack
SQL · BigQuery · Data Vault · ETL · ELT · Python · Terraform · Cloud Composer · Airflow · Git · CI/CD · GCP