About the Role
Senior SQL Consultant with BigQuery
Date: Mar 13, 2026
Location:
Kraków, PL 30-302; Łódź, PL 90-118; Poznań, WP, PL 61-569; Warszawa, PL 00-839
Working place: Hybrid
Type of contract: B2B contract
Salary range: 125 - 163 net/h
We are looking for a hands-on Senior SQL Consultant with a strong BigQuery background and experience with large data platforms. The role centres on expert SQL, modern DataOps, and building data pipelines in the cloud. You will design reliable data models and create efficient ETL/ELT workflows using Data Vault, joining our agile team to ensure our data solutions are secure, stable, and ready for business use.
Location & Model: Hybrid. You can work from one of our offices in Krakow, Warsaw, Lodz, Poznan, or Wroclaw. Candidates based outside Krakow are required to visit the Krakow office for two days each month.
Your tasks
Design, build, and deploy data models and transformations in BigQuery (complex SQL, stored procedures, partitioning, clustering)
Create and manage ETL/ELT pipelines to move and transform data into Data Vault models
Optimize SQL queries and data jobs to make them faster and more cost-effective
Connect data from different sources (APIs, SFTP) and ensure it is accurate
Manage code and CI/CD pipelines using tools like Git, Jenkins, and Ansible
Use Google Secret Manager to keep applications and credentials secure
Monitor data pipelines, fix bugs, and suggest improvements
Work with business stakeholders to understand their needs and translate them into technical plans
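To give a flavour of the BigQuery work listed above, here is a minimal sketch of a partitioned, clustered table definition. Dataset, table, and column names are purely illustrative, not from this posting:

```sql
-- Hypothetical example: a partitioned, clustered BigQuery table.
-- All names here are illustrative assumptions.
CREATE TABLE IF NOT EXISTS analytics.page_events (
  event_id STRING NOT NULL,
  user_id  STRING,
  event_ts TIMESTAMP NOT NULL,
  country  STRING,
  payload  JSON
)
PARTITION BY DATE(event_ts)    -- prune scans to the queried dates
CLUSTER BY country, user_id    -- co-locate rows for common filters
OPTIONS (partition_expiration_days = 365);
```

Partitioning limits the bytes a query scans (and so its cost), while clustering speeds up queries that filter or aggregate on the clustered columns.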
Your skills
At least 3–4 years of experience in SQL optimization and complex data work in BigQuery
Experience in Data Vault modeling and usage
Good knowledge of Python and Terraform for automation
Hands-on experience with Cloud Composer (Airflow), Cloud Run, and Pub/Sub
Practical skills with Git for version control
Experience with CI/CD tools (Ansible, Jenkins) for cloud applications
Ability to work in a DataOps model and Agile environment
Strong problem-solving and analytical skills
Willingness to learn new things and work well in a team
English level B2+
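For candidates less familiar with Data Vault modeling mentioned above, a minimal sketch of its core structures, a hub and a satellite, might look like the following. Table and column names are hypothetical:

```sql
-- Minimal Data Vault sketch (illustrative names only):
-- a hub holding the business key and a satellite holding
-- descriptive attributes, both keyed by a hash of the business key.
CREATE TABLE IF NOT EXISTS raw_vault.hub_customer (
  hub_customer_hk STRING NOT NULL,   -- hash of the business key
  customer_bk     STRING NOT NULL,   -- business key from the source system
  load_ts         TIMESTAMP NOT NULL,
  record_source   STRING NOT NULL
);

CREATE TABLE IF NOT EXISTS raw_vault.sat_customer_details (
  hub_customer_hk STRING NOT NULL,   -- references hub_customer
  load_ts         TIMESTAMP NOT NULL,
  hash_diff       STRING NOT NULL,   -- change-detection hash of attributes
  customer_name   STRING,
  email           STRING,
  record_source   STRING NOT NULL
);
```

Hubs capture stable business keys, links (not shown) capture relationships between hubs, and satellites capture history of descriptive attributes.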
Nice to have
Experience with GCP Data Fusion or CDAP (ingesting and parsing CSV, JSON, XML data)
Experience with Dataproc and handling data from RESTful/SOAP APIs or SFTP servers
Understanding of Data Contract best practices
Basic Java development skills (e.g., for custom Data Fusion plugins)
Experience with automated testing tools for cloud data solutions
Tech Stack
SQL · BigQuery · Data Vault · Python · Terraform · Git · Jenkins · Ansible · Cloud Composer · Airflow · GCP · CI/CD