Data Engineer and Data Science Jobs · 330+ Offers

Find data engineering, data science and AI job offers. Machine learning, analytics and big data.

Senior Data Scientist (GenAI / Applied AI)

AVENGA · Gdańsk, Poland, PL

<p><strong><em>This is us</em></strong><em><br><br></em>At Avenga, we believe that human creativity empowers technology that matters. Operating globally, our 6000+ specialists provide a full spectrum of services, including business and tech advisory, enterprise solutions, CX, UX and UI design, managed services, product development, and software development. <br></p><p><strong><em>This is the job</em></strong><br></p><p>Within the telecommunications domain, we are actively seeking a Senior Data Scientist to strengthen our team focused on building AI-driven solutions that enhance customer experience and operational efficiency.</p><p>You will be working on advanced Generative AI use cases, including conversational AI (chatbots and voice bots), as well as internal AI tools that support business teams in their daily operations. The role involves designing and scaling solutions such as RAG-based systems and intelligent assistants, enabling both customer-facing and internal automation scenarios.</p><p><strong><em>This is the team</em></strong><em><br></em></p><p>In this role, you'll become a vital member of a diverse and international Data Science team within the IT organization.</p><p>The team works closely with multiple business units (B2C, B2B, Finance, Technology), delivering data-driven solutions that go beyond proof of concept and are designed for production from the very beginning.</p><p>You will collaborate with data engineers, product stakeholders, and non-technical partners, contributing to end-to-end AI solution development - from ideation and prototyping to deployment and scaling.<br><strong><em><br>This is you</em></strong></p><ul><li><p>Master’s degree in AI, Machine Learning, Data Science, or a related field</p></li><li><p>3–6 years of experience as a Data Scientist (preferably at Senior level)</p></li><li><p>Strong expertise in AI/ML and practical experience delivering business-impacting solutions</p></li><li><p>Hands-on experience with Generative AI, 
especially LLMs</p></li><li><p>Experience with NLP use cases (e.g. chatbots, voice bots, text processing)</p></li><li><p>Solid programming skills in Python and experience working with APIs, SQL, and Git</p></li><li><p>Experience designing and implementing production-ready solutions</p></li><li><p>Ability to communicate complex technical concepts to non-technical stakeholders</p></li></ul><p>Nice-to-have skills:</p><ul><li><p>Experience with telecom domain or customer-facing digital products</p></li><li><p>Knowledge of Agentic AI, RAG systems, vector databases, and chunking strategies</p></li><li><p>Familiarity with LLM frameworks such as LangChain, LlamaIndex, LangGraph, AutoGen, or similar</p></li><li><p>Experience with AWS and/or on-premise deployments</p></li><li><p>Knowledge of Docker, Airflow, CI/CD pipelines (e.g. GitHub Actions/Runners)</p></li><li><p>Experience with Google AI/NLP ecosystem</p></li></ul><p><strong><em>This is your role</em></strong></p><ul><li><p>Design, develop, and scale AI/ML solutions with a strong focus on Generative AI</p></li><li><p>Build and optimize NLP-based systems, including chatbots and voice assistants</p></li><li><p>Develop and productionize RAG-based solutions and AI-powered internal tools</p></li><li><p>Collaborate with cross-functional teams to translate business needs into technical solutions</p></li><li><p>Deliver proof-of-concepts and evolve them into scalable, production-ready systems</p></li><li><p>Work closely with data engineers to ensure robust deployment and integration</p></li><li><p>Contribute to building a data-driven culture by demonstrating measurable business impact</p></li></ul><p>&nbsp;</p><p><strong><em>What awaits you at Avenga?</em></strong></p><p><em><br>At Avenga, everyone matters. We provide equal opportunities in recruitment, career development, and leadership, regardless of race, ethnicity, gender identity, sexual orientation, disability, age, religion, or any other characteristic. 
We are committed to fostering a work environment where our diverse community of employees, candidates, and business partners actively shapes our growth. By bringing together people from different backgrounds and experiences, we build a workplace where everyone feels free to be themselves while honoring the boundaries of others.</em></p>
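The RAG-based systems this posting mentions follow a simple pattern: retrieve the document chunks most relevant to a query, then pass them to an LLM as context. A minimal sketch of the retrieval step, using bag-of-words cosine similarity in place of learned dense embeddings (the chunk texts and function names here are illustrative, not from the posting):

```python
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts. Real RAG systems use
    # dense vectors from a learned embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Rank chunks by similarity to the query and keep the top k; in a
    # full RAG pipeline these chunks become the LLM's context window.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

chunks = [
    "roaming charges apply when using mobile data abroad",
    "invoices are issued on the first day of each month",
    "customers can upgrade their data plan in the mobile app",
]
top = retrieve("how do I upgrade my mobile data plan", chunks, k=1)
```

Production systems replace the toy similarity with a vector database and add chunking strategies, which is exactly the nice-to-have knowledge the posting lists.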

Full Time · direct · Data & AI
Salary not disclosed · 1 month ago

Senior Data Engineer

As Senior Data Engineer, you will be responsible for developing and maintaining data platforms and guiding our team of Data Engineers through challenges. This key role involves participating in the entire engineering lifecycle and ensuring successful data platform implementation. The ideal candidate will design and build data platform frameworks and data pipelines, act as a trusted advisor on Databricks and Airflow technology for clients, and enhance process effectiveness. A data engineer collaborates with the project team on project delivery and standards, with expertise in data warehouse and lakehouse development for large-scale customers. Essential skills include communication, collaboration, problem-solving, a data-driven mindset, and a passion for automation.

What makes you a great fit?

Key technologies: Databricks | Data Platforms | SQL | Python | CI/CD | Azure | Data Services | ETL

5+ years of experience in configuration/release management, deployment, and maintenance, and familiarity with various data platforms and practices in the data area. Strong, hands-on experience in the Databricks ecosystem is a must. Very good coding skills with knowledge of best practices in SQL and Python. Experience with Git-based repositories (Bitbucket, GitHub, GitLab). Familiarity with tools like Spark, Hadoop, Azure Data Factory, Apache Kafka/Event Hub, Airflow, dbt, or other data platform tools. Conceptual and technical knowledge of CI/CD pipelines, automation practices, and technical operations monitoring. Strong sense of ownership, flexibility, and independence.

Nice to have: Hands-on experience with automated testing frameworks (Great Expectations, custom-made, etc.). Familiarity with the Oracle or Apache ecosystem.

Soft skills:
Analytical Thinking: Ability to interpret data, identify trends, and provide actionable insights to drive business decisions.
Problem-Solving Skills: Proactively identify and resolve issues related to BI systems, ensuring smooth operations and data accuracy.
Effective Communication: Ability to explain technical concepts to non-technical stakeholders clearly and concisely, fostering understanding and collaboration.
Attention to Detail: Meticulous in data validation, report generation, and ensuring data quality across BI platforms.
Team Collaboration: Strong interpersonal skills to work effectively in cross-functional teams, coordinating with IT, business analysts, and other stakeholders.
Adaptability and Continuous Learning: Open to learning new tools and technologies in the rapidly evolving BI landscape and adjusting to changing business needs.

What will you do? Design and build cloud-based data platform frameworks and architectures. Integrate data from various structured and unstructured sources and formats. Implement automation for data workflows and support CI/CD processes. Optimize performance and resource usage of cloud data solutions. Maintain up-to-date knowledge of new technologies in data and cloud engineering. Monitor, troubleshoot, and optimize data systems for performance and cost. Support deployment of analytics and machine learning solutions on cloud platforms. Document processes, infrastructure, and best practices for future scalability.

Our benefits

Your journey with us starts here:
1. Initial Screening: If you meet our requirements, our recruiter will reach out to you for a chat about your motivations and expectations. Get ready to share your passion!
2. Technical Interview: Next, you'll be invited to showcase your skills in an interview with one of our technical experts or team members. This is your chance to shine and demonstrate your expertise.
3. Final Interview: Finally, you'll have the opportunity to meet your future Team Lead. This is the perfect moment to learn more about the role, the team, and to ask any questions you might have.

Full Time · direct · Data & AI
Salary not disclosed · 1 week ago

Data Scientist - Consumer Analytics · India

Growth through diversity, equity, and inclusion. As an ethical business, we do what is right, including ensuring equal opportunities and fostering a safe, respectful workplace for each of us. We believe diversity fuels both personal and business growth. We're committed to building an inclusive community where all our people thrive regardless of their backgrounds, identities, or other personal characteristics.

Tasks: Running end-to-end initiatives (business understanding, data understanding/preparation, modeling, evaluation, and deployment). Analyzing and interpreting the findings. Drawing conclusions and recommendations, including expected benefits and measuring ROI for enhancing business processes. Pre-sales activities (at senior consultant level).

What We're Looking For: Commercial experience with various classical data science and Machine Learning (ML) models (e.g. decision trees, ensemble-based tree models, linear regression, etc.). Familiarity with customer analytics DS concepts or advanced forecasting. Model hyperparameter tuning. Model validation frameworks. Experience with gathering business requirements, transforming them into a technical plan, data processing, feature engineering, and model evaluation. Previous experience in an analytical role supporting business will be a plus. Fluency in Python, basic working knowledge of SQL. Knowledge of specific DS/ML libraries. Solid experience in one of the cloud computing platforms (Databricks, GCP, or Azure).

Nice To Have: Understanding of causal machine learning. Experience working with big data and distributed environments would be a plus. Commercial experience proven by multiple successful projects in the area of forecasting would be a big plus. Experience with OOP in Python. Experience with MLOps. Familiarity with other languages (R, Scala) would be a plus.

General: Basic computer programming skills and familiarity with programming concepts. Strong business acumen. Experience with deep learning, reinforcement learning, or other advanced modeling concepts in classical Data Science problems. Ability to come up with creative solutions to address customer problems.

Missing one or two of these qualifications? We still want to hear from you! If you bring a positive mindset, we'll provide an environment where you feel valued and empowered to learn and grow.

We offer: Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites. “Office as an option” model. You can choose to work remotely or in the office, depending on your location. Flexibility regarding working hours and your preferred form of contract. Comprehensive online onboarding program with a “Buddy” from day 1. Cooperation with top-tier engineers and experts. Unlimited access to the Udemy learning platform from day 1. Certificate training programs. Lingarians earn 500+ technology certificates yearly. Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly. Internal Gallup Certified Strengths Coach to support your growth. Grow as we grow as a company. 76% of our managers are internal promotions. A diverse, inclusive, and values-driven community. Autonomy to choose the way you work. We trust your ideas. Create our community together. Refer your friends to receive bonuses. Activities to support your well-being and health. Plenty of opportunities to donate to charities and support the environment. Modern office equipment. Purchased for you or available to borrow, depending on your location.

Full Time · direct · Data & AI
Salary not disclosed · 3 months ago

AI Engineer

hireforyou.pro · Poland

For our client - a fast-growing AI-driven fitness and wellness product - we are looking for an AI Engineer who will design and build intelligent AI coaching systems that plan, adapt, and personalise workouts in real time. You will develop AI agents and recommendation engines that combine structured user data, health signals, and fitness science principles to deliver safe, effective, and highly personalised training experiences. This includes building multi-step reasoning pipelines, retrieval systems, evaluation frameworks, monitoring, and guardrails, ensuring the AI coach performs reliably in production and continuously improves over time.

Responsibilities: Building AI agents that plan workouts, adapt in real time, and coach users, including tool use, multi-step reasoning, memory management, and context pipelines. Building and improving recommendation systems (exercise selection, weight recommendations, intensity scaling) using structured user data. Integrating health and recovery signals (Apple Health, body scans, fitness tests) into the coach's decision-making process. Working with the Fitness Science team to define what "good" looks like for subjective tasks, such as whether a workout fits a user's age, injury history, and goals, then building the evaluation frameworks that enforce those standards end-to-end. Analysing where agents fail, identifying patterns in those failures, and addressing the ones that matter most. Designing monitoring and guardrails so agents don't fail silently in production.

Requirements: Strong Python skills. Solid software engineering fundamentals: you ship production code, not just prototypes. Experience building AI agents: tool orchestration, multi-step workflows, context management, retrieval systems. Experience building recommendation or scoring systems on structured data: collaborative filtering, calibration, rule-based systems, or hybrid approaches. Strong analytical instincts: you can define what "correct" means for a subjective recommendation, design metrics that reflect user experience, and spot when a metric is lying to you. You've fine-tuned or distilled language models to hit cost and latency targets. You know how to make retrieval work: embeddings, vector search, and semantic similarity. Experience with classification and regression on structured data (user health signals are messy and continuous). Ready to learn a new domain deeply: you'll pick up exercise physiology, recovery science, and training principles on the job.

Nice to have: Experience building multi-agent systems. Background in health, fitness, or wellness. Experience with active learning or human-in-the-loop workflows. Experience with time-series analysis or forecasting (fitness data is mostly sequences over time). Background in ML ops: model versioning, serving infrastructure, pipeline orchestration.

What we offer: Work on a fast-growing AI-driven fitness and lifestyle product. AI and LLMs are core not only to the product, but also to day-to-day engineering work. Unique intersection of B2C scale and large B2B partnerships. Growth comes from tackling complex technical challenges, deep ownership, and broad influence. Small, people-oriented team with a strong engineering culture.

We look forward to receiving your CV and learning more about your experience! Dear Candidates, due to a high volume of applications, only selected candidates will be contacted for interviews. We appreciate your understanding. Thank you for considering a career with us!

Work type: Full-time · Location: Remote

Full Time · direct · Data & AI
Salary not disclosed · 1 week ago

Senior/Staff Data Analyst

Zoolatech · Poland

Location: Other, Central Europe · Seniority: Senior · Technologies: Data, Product/Analyst

We are looking for a Data Analyst to help ensure the quality and reliability of data used across our client’s data platform. Our partner is a leading European online fashion & beauty retailer that uses modern data technologies to power personalized shopping experiences and data-driven decision-making. In this role, you will work with Databricks and analytical tools to compare existing and new data streams, identify discrepancies, investigate data quality issues, and support the reliability of key business metrics. You will collaborate with data engineers and platform teams to troubleshoot issues, track improvements, and present insights to stakeholders in a clear and understandable way. And then there's Zoolatech! Just imagine a workplace and a team environment that you never want to leave once you have found it. Sound enticing? Apply to our position today and we can get you there.

Responsibilities: Build the required Databricks notebooks/visualisations to compare existing streams and new streams as defined in the Data Comparison Playbook. Schedule them so they can run daily and present fresh data every morning. Troubleshoot and identify the root cause of missing data and volume discrepancies. Open data quality issues with the respective teams and follow up on the fixes. Report data quality improvements and current comparisons in an understandable format to all stakeholders and leadership.

Requirements: 8+ years of commercial experience. Strong SQL skills and experience working with large datasets. Experience with Databricks. Solid analytical and problem-solving skills with the ability to investigate data discrepancies. Experience with data quality checks, data validation, or data comparison. Ability to work with data pipelines and event-based data streams. Experience creating dashboards or data visualizations (Looker, Tableau, Power BI, or similar). Ability to communicate findings clearly to technical and non-technical stakeholders. Attention to detail and a proactive approach to troubleshooting data issues.
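The stream-comparison work this posting describes (daily volume checks between an existing and a new data stream) can be sketched as a small pure-Python check. `compare_volumes` and the tolerance threshold are hypothetical names for illustration; a real implementation would query Databricks tables rather than in-memory dicts:

```python
def compare_volumes(existing: dict[str, int], new: dict[str, int],
                    tolerance: float = 0.01) -> dict[str, str]:
    """Compare per-day row counts of two streams and flag discrepancies.

    Keys are dates, values are row counts; a day is flagged when the
    relative difference exceeds `tolerance` or the day is missing from
    either stream. Hypothetical helper, not from the posting.
    """
    issues = {}
    for day in sorted(set(existing) | set(new)):
        if day not in new:
            issues[day] = "missing in new stream"
        elif day not in existing:
            issues[day] = "missing in existing stream"
        else:
            old_n, new_n = existing[day], new[day]
            diff = abs(new_n - old_n) / max(old_n, 1)
            if diff > tolerance:
                issues[day] = f"volume off by {diff:.1%}"
    return issues

existing = {"2024-06-01": 1000, "2024-06-02": 1000, "2024-06-03": 1000}
new      = {"2024-06-01": 1000, "2024-06-02": 870}
report = compare_volumes(existing, new)
```

Scheduling such a check to run daily and surfacing the `report` dict on a dashboard matches the "fresh data every morning" requirement in the posting.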

Full Time · direct · Data & AI
Salary not disclosed · 1 week ago

AI Engineer

GFT Technologies SE · Kraków, PL

Date: Mar 20, 2026
Location: Kraków, PL, 30-302
Working place: Hybrid
Type of contract: Employment contract
Salary range: 15 870–24 330 PLN gross/month

What will you do?

You will join a team of engineers developing cutting-edge AI and Machine Learning solutions within the Credit & Lending domain in an international banking environment. The project focuses on leveraging state-of-the-art AI technologies to build scalable systems that support business processes, data-driven decision-making, and operational efficiency. You will work across the full lifecycle of AI/ML development, from discovery and prototyping, through model and system design, to production deployment and maintenance, using LLMs, RAG, modern data platforms, containerized environments, and cloud-native solutions in Azure or GCP.

Your tasks

LLM engineering and RAG development. Designing, testing, and optimizing prompts for LLM-based applications. Developing and integrating RAG systems and vector search solutions. Implementing responsible AI/ML solutions with fairness, transparency, and accountability. Designing and building scalable architectures leveraging LLMs and ML. Deploying AI systems using Azure or GCP with containerization and Kubernetes. Building applications following engineering best practices and clean code principles. Conducting data exploration, preparation, and modeling. Collaborating with domain experts on data pipelines and domain-specific datasets. Applying deep learning and NLP techniques to business problems. Working closely with engineering, product, and business stakeholders. Presenting technical solutions, constraints, risks, and recommendations. Participating in the full AI product lifecycle from idea to production.

Your skills

Python (3+ years); Java is a plus. Experience with LangChain, FastAPI, data containers, Spring. Knowledge of microservices architecture, observability, monitoring, API design, concurrency models. Hands-on experience with Azure or GCP, containers, Kubernetes, CI/CD (Jenkins, Azure DevOps, GCP Cloud Build). Practical experience with TDD/BDD. Strong understanding of LLM architectures and domain-specific data modeling. Experience with RAG, prompt engineering, agentic architectures, deep learning, NLP, ML lifecycle. Ability to design, develop, and scale AI-driven systems. Strong software engineering fundamentals and problem-solving skills. Ability to experiment quickly and iterate toward production-quality solutions. Strong communication and presentation skills in English. Ability to explain technical solutions and business implications clearly. Comfort working with cross-functional and multicultural teams.

We offer

Hybrid work in one of our locations: Lodz, Poznan, Krakow, Warszawa, Wroclaw (2 office days per week). Working in a highly experienced and dedicated team. Benefit package tailored to your needs (medical, sport, lunch subsidy, life insurance, etc.). Online training and certifications. Access to an e-learning platform. Mindgram wellbeing platform. Work From Anywhere (up to 140 days/year abroad). Social events.

Full Time · Remote · direct · Data & AI
PLN 15,870 - 24,330/month · 1 month ago

Data Engineer

Visa Technology Europe sp. z o.o. · Poland

Visa is accelerating the delivery of data analytics and AI powered products to support client growth and strategic decision-making across regions. We are seeking a Data Engineer to execute on the design, delivery and evolution of scalable data engineering capabilities that underpin Data Science, AI and client facing products for all European markets. The role requires understanding and translating business needs into data models, creating robust data pipelines, and developing and maintaining databases. The candidate should be able to define and manage data load procedures, implement data strategies, and ensure robust operational data management systems. Collaborating with stakeholders across the organization to understand their data needs and deliver solutions is also a key part of this role. The ideal candidate will be proficient in big data tools like Hadoop, Hive, and Spark, programming languages such as Python and SQL and have strong analytical skills related to working with structured and unstructured datasets. 
Primary Responsibilities:

Requirement Analysis: Understand and translate business needs into data models supporting long-term solutions. Build and manage large-scale ETL processes to generate data assets for the region. Build modular and reusable code, considering configurability and scalability while adhering to low-level design. Perform thorough unit testing of development tasks and document the test results using standard defined templates. Build, schedule, and manage DAGs in Apache Airflow efficiently. Monitor data processing tasks using Airflow. Ensure quality control of data assets, reconciling data loaded across different stages in the data pipeline. Utilize strong data analytics skills to identify, discuss, and promptly fix data issues. Apply debugging skills to quickly rectify execution errors, ensuring minimal delays and impact on business operations. Collaborate and communicate with stakeholders for requirement understanding and clarifications. Maintain the highest level of quality and a detail-oriented approach in daily tasks.

This is a hybrid position, requiring 3 days per week attendance in the office.

Qualifications

Basic Qualifications: 5 or more years of relevant work experience with a Bachelor's Degree, or at least 2 years of work experience with an Advanced degree (e.g. Masters, MBA, JD, MD), or 0 years of work experience with a PhD.

Preferred Qualifications: 6 or more years of work experience with a Bachelor's Degree or 4 or more years of relevant experience with an Advanced Degree (e.g. 
Masters, MBA, JD, MD) or up to 3 years of relevant experience with a PhD 2-4 years development experience in building data pipelines and writing ETL code using Hive, PySpark, SQL and Unix Experience in writing and optimizing SQL queries in a big data environment Experience working in Linux/Unix environment and exposure to command line utilities Experience creating/supporting production software/systems and a proven track record of identifying and resolving performance bottlenecks for production systems Exposure to code version control systems (git) Experience working with cloud services (e.g. AWS, GCP, Azure) Strong communication skills Ability to understand business requirements of the broader business Good understanding of agile working practices and related program management skills Good communication and presentation skills with ability to interact with different cross-functional team members at varying levels Advanced degree in technical field (e.g. Computer Science, statistics, etc.) Experience with visualization tools like Tableau and Power BI Exposure to Financial Services or the Payments Industry
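The Airflow responsibilities above boil down to declaring task dependencies as a DAG and letting the scheduler run tasks in dependency order. A minimal sketch of that ordering idea using Python's standard-library `graphlib` (the task names are illustrative, not from the posting; a real Airflow DAG would declare `airflow.DAG` objects and operators instead):

```python
from graphlib import TopologicalSorter

# Toy pipeline dependency graph (task -> set of upstream tasks), mirroring
# how an Airflow DAG wires extract -> transform -> load steps.
pipeline = {
    "extract_transactions": set(),
    "extract_reference":    set(),
    "transform_join":       {"extract_transactions", "extract_reference"},
    "quality_checks":       {"transform_join"},
    "load_warehouse":       {"quality_checks"},
}

# static_order() yields tasks so every task appears after its upstreams,
# which is exactly the order a scheduler must respect.
order = list(TopologicalSorter(pipeline).static_order())
```

Airflow additionally handles retries, backfills, and monitoring on top of this ordering; the reconciliation checks the posting mentions would typically run as downstream tasks in the same DAG.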

Full Time · Remote · direct · Data & AI
Salary not disclosed · 2 months ago

Experienced Azure Data Engineer (Mid/Senior)

Future Mind · Warsaw, Tychy OR REMOTE

Experienced Azure Data Engineer (Mid/Senior)
B2B: 14.000 - 22.000 PLN
Location: Warsaw, Tychy OR REMOTE

Future Mind is one of the most awarded digital advisory and delivery companies in the region. A brilliant, inspiring team. We value proactive professionals who enjoy teamwork, solving problems, and sharing knowledge. Together we create high-quality solutions that fulfill the business needs of our clients and impact the lives of their customers every day. We strive for continuous professional development with the active support of experienced mentors and team leaders along a well-structured professional development path: we always keep learning. As a team, we have received several industry awards for delivering some of the best eCommerce applications in Poland, listed among the most popular and best-reviewed applications in app stores. Our expert engineers, testers, designers, project managers, and analysts work on projects ranging from top mobile commerce apps used daily by millions of customers to IoT experiments. In 2023, we joined forces with Solita, a Finnish tech powerhouse with a vibrant community of over 2000 specialists across Europe that combines data, business, and technology skills to build and improve digital services for leading organizations in manufacturing, medical, shipping, and other major industries, including clients such as NATO, Nokian Tyres, Pfizer, and many others. Together we are dedicated to delivering cutting-edge, data-driven solutions.

Get hired in 3 steps! No one likes long recruitment processes. You can expect our recruitment process to be brief and effective. We'll tell you about us as much as we want to learn about you.
01. We get to know each other on a brief call.
02. You complete a short recruitment task.
03. We discuss your area of expertise.

Role description: As an Azure Data Engineer, your role involves designing and implementing ingestion and transformation pipelines and data models while collaborating closely with business stakeholders. You have worked with the Azure data stack and cloud for several years and understand the value created for customers.

This job is all about: Building end-to-end ELT data pipelines on top of Azure Data Services; Designing data models based on large, complex data sets that meet both functional and non-functional business requirements; Taking a holistic platform view (source systems, networking, orchestration, etc.) with the ability to deliver business value; Collaborating within our project teams to meet client needs and deliver high-quality solutions.

Here’s what we’re looking for: Vast experience with Azure Data Services; Deep understanding of dimensional data modeling techniques (data vault would be a plus); Proficiency in combining SQL and Python with a modern ELT toolset (Fabric, Synapse, Spark, or equivalent); Experience with ADF, Airflow, or other orchestration tools; Effective communication skills, proficient in Polish and English. You also must have the legal right to work in the EU to apply for this position.

…and here’s what we offer: A dynamic work environment where innovation and collaboration are valued. Access to cutting-edge projects and technologies in a variety of industries. A supportive community of experts to foster your professional growth and development. Competitive compensation, comprehensive benefits, and a focus on work-life balance. Opportunities for continuous learning and career advancement, including specialized training in big data technologies and Snowflake certifications. The ability to work fully remotely or check into one of our offices whenever you like. Fully paid private health insurance, subsidized sports membership, mental health support, and language courses.

Full Time · Remote · direct · Data & AI
PLN 14,000 - 22,000/month · 1 month ago

Machine Learning/AI Software Engineering Intern

Pathway · Palo Alto, California

About Pathway

At Pathway we are shaking the foundations of artificial intelligence by introducing the world’s first post-transformer model that adapts and thinks just like humans. Our breakthrough architecture outperforms the Transformer and provides the enterprise with full visibility into how the model works. Combining the foundational model with the fastest data processing engine on the market, Pathway enables enterprises to move beyond incremental optimization and toward truly contextualized, experience-driven intelligence. We are trusted by organizations such as NATO, La Poste, and Formula 1 racing teams. Pathway is led by co-founder & CEO Zuzanna Stamirowska, a complexity scientist who created a team consisting of AI pioneers, including CTO Jan Chorowski, who was the first person to apply Attention to speech and worked with Nobel laureate Geoff Hinton at Google Brain, as well as CSO Adrian Kosowski, a leading computer scientist and quantum physicist who obtained his PhD at the age of 20. The company is backed by leading investors and advisors, including Lukasz Kaiser, co-author of the Transformer (“the T” in ChatGPT) and a key researcher behind OpenAI’s reasoning models. Pathway is headquartered in Palo Alto, California.

The Opportunity

We are currently seeking Machine Learning/AI Software Engineering interns with a strong track record in research on machine learning models.

You will: Support training of LLMs. Benchmark LLMs. Prepare and evaluate training datasets. Support the core Pathway Research Team. The results of your work will play an important role in advancing the AI field.

Cover letter: It's always a pleasure to say hi! If you could leave us 2-3 lines, we'd really appreciate that.

You are expected to meet at least one of the following criteria: You were an ICPC World Finalist, or an IOI, IMO, IOAI or IPhO medalist in high school. You have published a research paper at an A-rated or A*-rated venue (according to ICORE). You have completed coding projects, ideally with a GitHub repository showcasing previous work. You were an intern at a leading Machine Learning research center (e.g. Google Brain / DeepMind, Apple, Meta, Anthropic, Nvidia, MILA). You can get a warm recommendation from your university faculty member.

You: Have some experience in deep learning research, with a track record in Language Models and/or RL (candidates with a Vision or Robotics ML background are also welcome to apply). Are interested in improving foundational architectures and creating new benchmarks. Are experienced at hands-on experiments and model training (PyTorch, JAX, or TensorFlow). Have a good understanding of GPU architecture, memory design, and communication. Have a good understanding of graph algorithms. Have some familiarity with model monitoring, git, build systems, and CI/CD. Are respectful of others. Are fluent in English.

Why You Should Apply: Join an intellectually stimulating work environment. During your internship, you will collaborate on a cutting-edge research project. Be a pioneer: you get to work with a new type of “Live AI” challenges. Be part of an early-stage AI startup that believes in impactful research and foundational changes.

Further details: Preferred joining date: August 2025. The positions are open until filled, so please apply immediately. Duration: 3-6 months. Compensation: based on profile and location. Location: Hybrid; regular presence in our office in Palo Alto, CA is required. Possibility to work or meet with other team members in one of our other offices: Paris, France or Wroclaw, Poland. As a general rule, permanent residence will be required in the EU, UK, US, or Canada. If you meet our broad requirements but are missing some experience, don’t hesitate to reach out to us.

Full Time · Remote · direct · Data & AI
Salary not disclosed · 3 months ago

Data Analyst

Synerise S.A. · Cracow, Poland

Having such great solutions, we are looking for a Data Analyst to join our brave and brilliant Business Development Team.

What will you do on a daily basis?
- Work with Synerise's ever-evolving data platform, enhanced by AI.
- Turn complex data into clear insights through advanced analyses, reports, and dashboards.
- Translate data findings into practical conclusions and recommendations based on real use cases.
- Ensure high quality, accuracy, and consistency across all analyses.
- Suggest improvements and new analytical capabilities to support platform development.
- Prepare insights and actively participate in meetings with key customers.
- Cooperate with internal teams: Customer Success & Implementation, Product Development, Marketing and Business Development.

What will make us a perfect match?
- Minimum 2-3 years of experience in data analytics.
- Experience working with e‑commerce data, including product, transactional, and behavioral datasets.
- Strong analytical skills with the ability to dissect complex data and extract actionable insights.
- Knowledge of web analytics.
- Very good knowledge of and experience in MS Excel.
- The ability to infer and synthesize data and to combine data from various sources - knowledge of Power Query, Power BI.
- The ability to tell stories with data, making insights accessible to non-technical teams.
- Teamwork skills, the ability to easily establish relations, and motivate others to collaborate.
- Time management skills, completing projects and tasks within deadlines, accuracy, curiosity, and perceptiveness.
- Ability to prioritize tasks and find solutions.
- Higher technical education, preferably in computer science, cognitive science, statistics, or mathematics.
- Motivation to develop yourself and acquire new knowledge.
- Fluency in Polish (C2) and English (C1).

What will convince us even more?
- Basic programming skills in languages like Python or R to automate tasks and perform advanced data analysis.
- Experience in data cleaning and pre-processing techniques to ensure data accuracy and consistency.
- Knowledge of statistical analysis.
- Knowledge of and experience with Google Analytics 4 (GA4).

Full Time · direct · Data & AI
Salary not disclosed · 2 months ago

AI Engineer / Researcher

Creotech Instruments · Poland

AI Engineer / Researcher – Warsaw

Tasks
- Analyze business problems and select suitable ML/LLM approaches for specific use cases.
- Build proofs of concept, run model benchmarks, and evaluate quality, cost, and scalability.
- Design and maintain data pipelines for preparation, cleaning, anonymization, and feature engineering.
- Build dataset repositories and data versioning processes for experiments and production workloads.
- Develop the internal AI platform: inference APIs, prediction monitoring, and automated training/deployment.
- Integrate language models with company systems, including RAG layers and prompt management.
- Support product teams with architecture guidance, reusable AI components, documentation, and training.

Requirements
- At least 3 years of experience in AI/ML or Data Science Engineering.
- Hands-on experience with Python and the ML/LLM ecosystem (e.g., OpenAI, Hugging Face).
- Experience building data pipelines and working with production-scale datasets.
- Understanding of MLOps practices and model deployment automation.
- Ability to design APIs and integrate AI services with business systems.
- Knowledge of data security, access control, and AI governance.
- Strong analytical thinking and technical communication skills.
- Good command of English for technical documentation and collaboration.

Nice to Have
- Experience with cloud platforms and GPU workloads.
- Knowledge of RAG solutions, vector databases, and LLM response quality evaluation.
- Experience in regulated or mission-critical environments.
- Familiarity with model drift monitoring and AI ethics risk management.

We Offer
- A key role in building an internal AI platform with real organizational impact.
- Work with a modern tool stack and high technical autonomy.
- Growth opportunities in GenAI, MLOps, and data platform architecture.
- Stable employment terms and a collaborative work environment.

Full Time · direct · Data & AI
Salary not disclosed · 1 week ago

Analytics Engineer

deepsense.ai Sp. z o.o. · Poland

Must have: Cloud, Kafka, Kubernetes, Pandas, Polars, PostgreSQL, Pulumi, Python, Spark, SQL, Terraform
Employment type: B2B
Operating mode: Hybrid, Remote
Location: Poland

About deepsense.ai
At deepsense.ai, we don’t just build AI solutions – we shape how companies around the world use them. By joining us, you’ll:
- Work with partners like OpenAI, NVIDIA, Anyscale, LangChain, Crusoe, and ElevenLabs.
- Explore and apply the newest tech: LLMs & RAG, MLOps, Edge Solutions, Computer Vision, Predictive Analytics.
- Tackle challenges in software & tech, pharma & healthcare, manufacturing, retail, telecoms & media.
- Contribute to open-source projects – just take a look at our latest solution, ragbits, an agentic RAG framework with over 1.6k stars on GitHub.

And the best part of working at deepsense.ai?
- Spread your wings with clear career paths, technical or leadership.
- Collaborate with 100+ AI experts with 15+ years of applied AI experience, as well as PhD-level researchers with academic backgrounds.
- Tap into domain expertise and knowledge sharing whenever you need it.

What do we believe in?
- Team Strength – sharing and exchanging knowledge is key to our daily work.
- Accountability – we take responsibility for the tasks entrusted to us so that the client ultimately receives the best possible quality.
- Balance – we value work-life balance.
- Commitment – we want you to be fully part of the team.
- Openness – we don’t want you to be locked into one solution; we look for alternatives and explore new possibilities.

Responsibilities
- Designing, developing, and maintaining data pipelines (ETL/ELT).
- Working with a variety of data sources (such as streaming data, batch data, data from production systems, data from ML models, and time-series data).
- Ensuring the stability and performance of data pipelines, including monitoring and troubleshooting.
- Collaborating with Data Scientists, Software Engineers, and other specialists to implement data-driven solutions.
- Analyzing new potential data sources and integrating them into existing pipelines.
- Assisting DevOps/MLOps in configuring system infrastructure in an "Infrastructure as Code" manner (including deployment of your code solution).
- Creating meaningful insights and visualizations from the processed data.
- Gathering requirements, planning your own work, and prioritizing.

You must have
- Interest in the AI/ML area.
- Good understanding of Python and SQL.
- Experience with data-wrangling libraries such as Pandas or Polars.
- Good understanding of statistics and data analysis.
- Experience with any of the major SQL databases (PostgreSQL preferred).
- Experience with cloud computing platforms (Azure/GCP/AWS).
- Experience with containerization technologies (Docker/Kubernetes).
- Experience with ETL and Big Data components (Spark, Kafka, etc.).
- Basic experience in DevOps (Terraform/CloudFormation/Pulumi).
- Ability to work both in a team and independently, and a proactive approach to problem-solving.

You may have
- Experience with hierarchical/graph datasets.
- Experience with Google BigQuery or Redshift/Glue/S3/Lambda/Athena.
- Familiarity with Snowflake or Databricks.
- Any experience with data integration or orchestration platforms, such as Airbyte, Fivetran, Airflow, or Dagster.
- Experience with machine learning.
- Experience with CI/CD pipelines such as Jenkins.

We offer
- The opportunity to work on cutting-edge AI projects with a diverse range of clients and industries, driving solutions from development to production.
- A collaborative and supportive work environment, where you can grow and learn from a team of talented professionals.
- An opportunity to participate in conferences and workshops around the world.
- An opportunity to participate in Tech Talks (internal training and seminar sessions).
- Flexible working hours and remote work options.
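The responsibilities in this posting revolve around ETL/ELT pipeline design. As a rough, dependency-free illustration of the extract–transform–load pattern (the record layout, field names, and toy data below are invented for the example, not taken from the ad):

```python
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    amount: float

def extract(raw_rows):
    """Extract: parse raw CSV-like rows into typed records, skipping malformed ones."""
    events = []
    for row in raw_rows:
        parts = row.split(",")
        if len(parts) != 2:
            continue  # drop malformed rows instead of failing the whole batch
        try:
            events.append(Event(user_id=parts[0].strip(), amount=float(parts[1])))
        except ValueError:
            continue  # non-numeric amount: also dropped
    return events

def transform(events):
    """Transform: aggregate spend per user (a typical batch aggregation step)."""
    totals = {}
    for e in events:
        totals[e.user_id] = totals.get(e.user_id, 0.0) + e.amount
    return totals

def load(totals, sink):
    """Load: write aggregates to a sink (a dict standing in for a warehouse table)."""
    sink.update(totals)
    return sink

raw = ["alice,10.0", "bob,5.5", "alice,2.5", "broken-row", "carol,notanumber"]
warehouse = {}
load(transform(extract(raw)), warehouse)
print(warehouse)  # {'alice': 12.5, 'bob': 5.5}
```

In production the same three stages would typically map onto Spark or Kafka consumers for extraction, distributed transformations, and a warehouse writer, with monitoring around each stage.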
We Offer

Impactful AI projects
- Tackle industry-grade challenges: from LLMs for drug discovery to GenAI on edge devices, AI voicebots, and open-source initiatives with global reach.
- Collaborate directly with our partners for early access to tools before public release, testing in production, and bringing know-how from AI leaders into our projects.
- Contribute to open-source initiatives like ragbits (1.6k+ stars on GitHub), adopted and appreciated by the ML community.

Growth & Knowledge Sharing
- Join AI specialists who share expertise through Tech Talks, workshops, and internal trainings.
- Present your work at conferences, run experiments, and stay ahead of the curve.
- Choose your own career path and get support for your development.

Flexibility & Culture
- Work fully remote, from one of our two offices (Warsaw, Bydgoszcz), or from coworking spaces in Poznań, Łódź, Wrocław, and Gdańsk.
- Enjoy flexible working hours.
- Benefit from a culture that prevents burnout and supports balance in daily work.
- Start with onboarding – from day one you are matched with a buddy.
- Get high-end equipment (laptops, dual monitors, pro peripherals).
- Access a premium AI development suite: OpenAI ChatGPT, Claude, Gemini Advanced, GitHub Copilot, Cursor AI IDE, Claude Code, NotebookLM, plus the latest emerging AI tools to support your daily work.

Full Time · Remote · direct · Data & AI
Salary not disclosed · 1 month ago

Senior Big-Data Engineer

ControlUp · Poland

ControlUp is a leader in Digital Employee Experience (DEX), evolving IT operations with agentic AI to deliver Autonomous Endpoint Management (AEM) and a digital workspace that runs itself. Our platform transforms millions of real-time signals into intelligent action, bridging the gap between deep visibility and automated remediation. We turn "IT headaches" and employee frustration into self-healing operations, allowing organizations to move beyond reactive troubleshooting toward a future where technology works seamlessly in the background. We’re here to ensure the workday runs without disruptions. No tool sprawl, no wasted time, and no friction. Just technology that works for people, not against them, so they can stay focused on what they do best. One platform. One powerful shift in how work flows. About the Role: We’re looking for an experienced Data Engineer to join our DEX Platform and help drive the next stage of our growth. In this role, you’ll design, build, and scale data infrastructure and advanced processing pipelines that power real-time insights and intelligent automation across our platform. As a senior data engineer, you’ll own the full service lifecycle—from architecture and development to deployment, monitoring, and scaling—while ensuring data quality, reliability, and compliance. You’ll work closely with backend, AI, and data teams to deliver robust, production-ready systems that support our AI-first strategy and our mission to make the workplace run itself. 
What you'll bring:
- 5+ years of backend/data engineering experience.
- Proficiency with Java (Vert.x, Spring or similar) - a must.
- Hands-on experience with Kafka and streaming frameworks like Kafka Streams, Flink, Spark, or Beam - a must.
- Solid understanding of microservices architecture and cloud platforms (Azure/AWS/GCP).
- Familiarity with AI-first development tools (GitHub Copilot, Cursor) – an advantage.
- Experience working in production-aware, observability-driven environments - monitoring, troubleshooting, and optimizing.
- Knowledge of Postgres, Redis, MySQL or similar, and ClickHouse.
- Experience in Node.js (NestJS/Express) - a big advantage.
- Strong foundation in object-oriented design, design patterns, and clean code practices.
- Comfortable designing, deploying, and maintaining backend services and data flows.
- Passion for learning, experimenting with new tech, and building reliable systems at scale.

Why you'll love it here:
- Global, Not Isolated: You’ll work with a diverse team spanning the US, EMEA, and Israel. We’re known for our vibrant social culture, and once a year the entire global company comes together for a major offsite to reconnect in person.
- Meaningful Impact: Make a real impact from day one, working with cutting-edge technology in a fast-paced, global environment.
- Collaborative & Growth-Driven: Join a collaborative, empowering culture where learning and professional growth are encouraged.
- True Flexibility: We offer a hybrid model. We care about the output of your expertise, not the hours you spend in a specific chair.

Ready to join? Apply now and help us shape the future of IT and employee experience.

Full Time · direct · Data & AI
Salary not disclosed · 2 months ago

Data Engineer (Big Data · BI & Analytics)

Addepto · Warsaw, Poland

Addepto is a leading AI consulting (https://addepto.com/ai-consulting/) and data engineering (https://addepto.com/data-engineering-services/) company that builds scalable, ROI-focused AI solutions for some of the world’s largest enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. With an exclusive focus on Artificial Intelligence and Big Data, Addepto helps organizations unlock the full potential of their data through systems designed for measurable business impact and long-term growth. The company’s work extends beyond client engagements. Drawing from real-world challenges and insights, Addepto has developed its own product – ContextClue – and actively contributes open-source solutions to the AI community. This commitment to transforming practical experience into scalable innovation has earned Addepto recognition by Forbes as one of the top 10 AI consulting companies worldwide. As part of KMS Technology, a US-based global technology group, Addepto combines deep AI specialization with enterprise-scale delivery capabilities—enabling the partnership to move clients from AI experimentation to production impact, securely and at scale. As a Data Engineer, you will have the opportunity to support and further develop an Azure-based data integration solution built primarily around Azure Data Factory (ADF). The current environment includes Azure Functions and ingestion components, but daily delivery is strongly centered on ADF pipeline design, orchestration, monitoring, and continuous improvement. The project focuses on expanding and stabilizing the existing data platform, including the warehouse layer (Azure SQL Database or Synapse), optional dbt-based transformations, and occasional Power BI reporting support. You will work closely with the client-side Product Owner and Architect, proactively aligning business needs with technical implementation decisions and ensuring high-quality, scalable solutions. 
Additionally, you will contribute to Azure DevOps CI/CD pipelines and release processes to maintain reliable deployments across environments.

Discover our perks and benefits:
- Work in a supportive team of passionate enthusiasts of AI & Big Data.
- Engage with top-tier global enterprises and cutting-edge startups on international projects.
- Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
- Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
- Choose your preferred form of cooperation: B2B or a contract of mandate, and make use of 20 fully paid days off.
- Participate in team-building events and utilize the integration budget.
- Celebrate work anniversaries, birthdays, and milestones.
- Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
- Get full work equipment for optimal productivity, including a laptop and other necessary devices.
- With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
- Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.

In this position, you will:
- Design, develop, and maintain Azure Data Factory pipelines, including orchestration, parameterization, and trigger management.
- Configure and manage linked services and datasets within ADF.
- Monitor, troubleshoot, and optimize ADF pipelines to ensure performance and reliability.
- Develop and maintain ETL/ELT processes and support the evolution of the data warehouse layer (Azure SQL Database or Synapse).
- Translate business requirements into technical solutions in close collaboration with the Product Owner and Architect.
- Develop and maintain Python-based components (e.g., Azure Functions, API integrations, automation scripts).
- Contribute to CI/CD processes in Azure DevOps, including pipelines, releases, and environment promotion.
- Support occasional Power BI reporting and dashboarding needs.
- Ensure proactive communication, stakeholder alignment, and visibility of risks and impacts.
- Take ownership of assigned tasks and actively contribute to continuous platform improvement.

What you’ll need to succeed in this role:
- At least 2 years of hands-on experience with Azure Data Factory (must-have), including: pipeline and orchestration design; linked services, datasets, triggers, and parameterization; operational monitoring, troubleshooting, and performance tuning; and deployment-aware ADF development in enterprise environments.
- Excellent knowledge of Python (Azure Functions, REST APIs, automation).
- Strong SQL skills and solid understanding of data modeling for ETL/ELT and warehouse workloads.
- Experience with CI/CD processes in Azure DevOps (pipelines, releases, multi-environment deployments).
- Solid understanding of Azure services and cloud-based data solutions (e.g., Azure SQL Database, Azure Key Vault).
- Experience with Power BI for occasional dashboarding and reporting.
- Experience working with modern development practices and tools.
- Consulting mindset with proactive communication and strong stakeholder alignment skills.
- Ability to collaborate effectively with the Product Owner and Architect during planning and delivery.
- Independent and responsible approach to delivering high-quality solutions.
- Excellent command of English (at least C1 level).

Full Time · Remote · direct · Data & AI
PLN 12,600 - 21,000/month · 1 month ago

Lead Data Engineer (Databricks)

SOLID.Jobs · Wrocław

Who are we looking for?

Must have:
- 7+ years of experience in Data Engineering (including min. 5 years with Databricks)
- Strong experience with the Databricks Data Platform for distributed data processing
- Excellent programming skills in Python and SQL
- Strong understanding of data modeling, data lakehouse architecture, and ELT/ETL patterns
- Experience designing scalable cloud-based data platforms (AWS / Azure)
- Knowledge of data governance, security, and access control best practices (Unity Catalog, dbt)
- Experience leading or mentoring engineers is a strong advantage
- Strong analytical thinking and problem-solving skills
- Excellent communication and collaboration skills
- Fluency in English (at least B2)

Nice to have:
- Data streaming: Kafka
- Databases: MS SQL (SSIS, SSAS), PostgreSQL, MySQL
- BI tools: Power BI

This is a hybrid role, which means we'd like you to work in the office occasionally, especially during client visits or other important company meetings. We'd also like you to be willing to take occasional short business trips to Warsaw (approximately four times a year).

What will you be doing?

Responsibilities
- Design and implement scalable data platforms and pipelines using Apache Spark on Databricks
- Lead the development of distributed data processing pipelines using PySpark and SparkSQL
- Build and manage Databricks Workflows for orchestration, scheduling, monitoring, and error handling
- Optimize Spark workloads by applying join strategies, shuffle optimization, caching, and partitioning techniques
- Design and maintain Delta Lake architectures, including schema evolution, ACID transactions, and performance tuning
- Implement data governance and access control using Unity Catalog, including permissions, lineage, and secure data sharing
- Collaborate with architects and engineering teams to design cloud-native data platforms
- Ensure data quality, observability, and reliability across pipelines and data products
- Lead performance optimization of large-scale data processing workloads
- Mentor and support other Data Engineers, contributing to engineering standards and best practices
- Participate in architecture discussions and contribute to the evolution of the company’s data engineering practices
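The Delta Lake work this role describes includes MERGE-style upserts. As a toy, dependency-free model of those semantics (keys and payloads are invented for illustration; a real Databricks pipeline would express this as a Delta `MERGE INTO` statement rather than Python dictionaries):

```python
def merge_upsert(target, updates, key="id"):
    """Toy model of MERGE semantics: update rows whose key matches,
    insert rows whose key is new, and leave everything else untouched."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in by_key:
            by_key[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            by_key[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
print(merge_upsert(target, updates))
# [{'id': 1, 'status': 'open'}, {'id': 2, 'status': 'closed'}, {'id': 3, 'status': 'open'}]
```

The value of doing this inside Delta Lake rather than by hand is that the matched/not-matched branches run as a single ACID transaction over partitioned files.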

Full Time · Remote · direct · Data & AI
Salary not disclosed · 2 months ago

Senior Data Engineer

CatchTheGeek · Poland

CLIENT ZONE

Senior Data Engineer – Wrocław
Senior Data Engineer | Microsoft Fabric – global data platform modernization project

We are currently looking for a Senior Data Engineer for a long-term project modernizing a technology stack based on the Microsoft ecosystem, with particular emphasis on Microsoft Fabric. The role requires full availability (full time) and offers real influence on architecture and technology decisions.

📌 About the project
- Client: a Scandinavian manufacturing company with branches around the world
- Duration: approx. 3 years
- Scope: comprehensive modernization of the data platform
- Main stream: Data (Microsoft Fabric)

The project is delivered on a global scale and involves rebuilding existing data solutions toward a modern, scalable platform based on Microsoft Fabric and Azure services. We are looking for an expert who will bring hands-on production experience and act as a partner in conversations with both IT and the business.

🎯 Who we are looking for
- A Senior Data Engineer with Microsoft Fabric experience in production environments
- Strong technical competence combined with the ability to gather and refine business requirements
- Very good knowledge of Microsoft Fabric's constraints, assumptions, and architectural decisions
- Readiness for full commitment (full time)

🛠 Responsibilities
- Designing and implementing end-to-end data pipelines based on Microsoft Fabric and Azure services
- Building and maintaining Lakehouse and Warehouse architectures
- Collaborating with BI teams, data architects, and business stakeholders
- Participating in architectural decisions and standardization of the data platform

✅ Technical requirements
Experience in the following areas:
- Lakehouse Architecture
- Data Pipelines
- Dataflows Gen2
- semantic models
- Power BI integration
- integrating Microsoft Fabric with Azure Data Factory, Azure Data Lake, and external systems

Technical skills:
- SQL
- Spark / PySpark
- data modeling
- performance optimization and design of scalable architectures
- knowledge of data platform governance principles

🔍 What we offer
- Participation in a strategic, multi-year project with global reach
- Real influence on the architecture and direction of the data platform
- Work with a modern Microsoft stack (Fabric, Azure)
- Collaboration with a mature business and international teams
- An open budget for the position - tell us your expectations, and if they match your skills, we'll come to an agreement :)

KRAZ 19622

Full Time · direct · Data & AI
Salary not disclosed · 2 months ago

ML/AI Engineer - Architect
Poland

Growth through diversity, equity, and inclusion. As an ethical business, we do what is right — including ensuring equal opportunities and fostering a safe, respectful workplace for each of us. We believe diversity fuels both personal and business growth. We're committed to building an inclusive community where all our people thrive regardless of their backgrounds, identities, or other personal characteristics.

Tasks:
- Building high-performing, scalable, enterprise-grade LLM/AI applications in a cloud environment.
- Working with Data Science teams to analyze requirements, build architecture concepts, and lead the implementation of Generative AI and Machine Learning models into production (Tech Lead).
- Practical and innovative implementations of LLM/ML/AI automation, for scale and efficiency.
- Design, delivery, and management of industrialized processing pipelines.
- Defining and implementing best practices in the GenAI/ML model life cycle and ML operations/LLM operations.
- Implementing AI/MLOps/LLMOps frameworks and supporting Data Science teams in best practices.
- Gathering and applying knowledge of modern techniques, tools, and frameworks in the area of ML architecture and operations.
- Gathering technical requirements and estimating planned work.
- Presenting solutions, concepts, and results to internal and external clients.
- Creating technical documentation, including diagrams.

What We're Looking For:
- At least 8 years of experience in production-ready Python code development (e.g., microservices, APIs, etc.).
- At least 8 years of experience in production-ready ML-related code development.
- At least 2 years of experience in production-ready LLM-related code development, preferably based on the Retrieval Augmented Generation (RAG) concept.
- At least 3 years of experience in Cloud Architecture.
- Good understanding of and experience with Generative AI model APIs (Large Language Models/Large Multimodal Models).
- Good understanding of and experience with LLM orchestrators (e.g., LangChain) and concepts (RAG, in-context learning, fine-tuning).
- Good understanding of LLM evaluators, validators, and guardrails.
- Good understanding of LLMOps concepts such as GenAI operationalization/scaling (e.g., LLM serving, performance & API gateways, LLM tracking & monitoring).
- Experience in developing GenAI apps in rapid frameworks (e.g., Streamlit).
- Experience with MLOps/LLMOps tools like AzureML/AzureAI or GCP Vertex AI.
- Good understanding of ML/AI concepts: types of algorithms, machine learning frameworks, model efficiency metrics, model life cycle, AI architectures.
- Good understanding of cloud concepts and architectures, as well as working knowledge of selected cloud services, preferably Azure or GCP.
- UML notation.
- Good communication skills.
- Ability to work in a team and support others.
- Taking responsibility for tasks and deliverables.
- Great problem-solving skills and critical thinking.
- Fluency in written and spoken English.

What Will Set You Apart:
- Experience in designing and programming ML algorithms and data processing pipelines using Python.
- Good understanding of CI/CD and DevOps concepts, and experience with selected tools (preferably GitHub Actions, GitLab, or Azure DevOps).
- Experience in productizing ML solutions using technologies like Spark/Databricks or Docker/Kubernetes.
- BPMN, Archimate.
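Since this role centers on Retrieval Augmented Generation, a minimal, library-free sketch of the RAG flow may help: retrieve the most relevant documents, then assemble them into the model prompt. The corpus, overlap scoring, and prompt template below are invented for illustration; production systems use embedding similarity against a vector database rather than word overlap:

```python
def retrieve(query, corpus, k=2):
    """Rank documents by naive word overlap with the query (a stand-in for
    embedding similarity search against a vector database)."""
    q_words = set(query.lower().split())
    scored = [(len(q_words & set(doc.lower().split())), doc) for doc in corpus]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:k] if score > 0]

def build_prompt(query, context_docs):
    """Assemble the augmented prompt: retrieved context plus the user question."""
    context = "\n".join(f"- {doc}" for doc in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Invoices are stored for seven years.",
    "Support tickets are answered within one business day.",
    "Invoices can be downloaded from the billing portal.",
]
docs = retrieve("can invoices be downloaded", corpus)
prompt = build_prompt("can invoices be downloaded", docs)
print(prompt)
```

The prompt produced this way is what gets sent to the LLM; evaluators and guardrails, also mentioned in the ad, then operate on the model's answer relative to the retrieved context.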

Full Time · direct · Data & AI
Salary not disclosed · 1 month ago

Senior Applied AI Engineer (Europe)

Kalepa · EUROPE (REMOTE)

Senior Applied AI Engineer (Europe)
ENGINEERING · EUROPE (REMOTE) · FULL TIME

About Kalepa:
Commercial insurance is a trillion-dollar industry still run out of Microsoft Outlook. Kalepa is changing that. Kalepa is an AI Underwriting Platform built to deliver Professional-Grade AI in production - helping the world's most important insurers centralize and prioritize submission data, surface critical risk insights, and make faster, more confident decisions. Customers see meaningful improvements to both speed and portfolio quality as soon as they implement Kalepa - so much so that clients call it “truly an underwriter’s dream.” Kalepa's team members bring experience from top technology companies, including Facebook, Google, Amazon, Mastercard, and Uber. Kalepa is backed by leading VCs like IA Ventures and Inspired Capital.

Our Values (This is important):
Many organizations have a dusty list of corporate values that no one ever follows. Kalepa is not one of those companies. Our values are designed to unlock the potential of our employees. Success at Kalepa is bred from five core principles:
- Hustle and Determination - We hire people who take full ownership of their craft and relentlessly pursue excellence with speed and determination. We choose the hard problems and do not give up. This is the foundation of how we work and how we win.
- Deliver Customer Impact - We're obsessed with customer impact. Every feature, every line of code, every decision is measured against one question: does this help insurers make better decisions? If it doesn't drive speed, accuracy, or profitability for our customers, we don't build it.
- Meritocracy - We're building something exceptional, and that requires exceptional people and ideas. We have a high bar, but we reward excellence with rapid growth. If you're the best at what you do, you'll thrive here.
- Transparency - We value honest, transparent communication over politics. We challenge ideas, not people.
We say what needs to be said, even when it's hard. This is how we solve problems fast and find the truth.
- Experiment Relentlessly - Many of the problems we face have never been solved before. We tackle them by testing quickly, measuring rigorously, and iterating until we find the path forward. Speed of learning is our advantage.

In addition to our NY office, we have remote employees speaking 10+ languages across the globe. But we invest in bringing our people together both for regional meetups and global offsites (2021 - Playa Del Carmen | 2022 - Rome | 2023 - Buenos Aires | 2024 - Lisbon | 2025 - Cartagena). Kalepa’s culture isn’t for everyone, and that’s ok. But for the people who are a fit, they can’t imagine working elsewhere.

About the role:
Salary range: $85k – $165k USD*

Kalepa is looking for a Senior Applied AI Engineer with 5+ years of experience to lead the framing, development, and deployment at scale of machine learning models to understand the risk of various classes of businesses. You will be turning vast amounts of structured and unstructured data from many sources (web data, geolocation, satellite imaging, etc.) into novel insights about behavior and risk. Team members are given full ownership over their projects and are expected to drive the project’s direction and maintain focus. The team works in two-week sprints, and ML Engineers work closely with Product Management and Software Engineers.

About you:
You have 5+ years of experience in engineering and data science. You love to hustle: finding ways to get things done, destroying obstacles, and never taking no for an answer. The words “it can’t be done” don’t exist in your vocabulary.
- You have an in-depth understanding of applied machine learning algorithms, especially NLP, and statistics.
- You are experienced in Python and its major data science libraries, and have deployed models and algorithms in production.
- You are comfortable with data science as well as with the engineering required to bring your models to production.
- You are excited about using a wide set of technologies, ultimately focused on finding the right tool for the job.
- You value open, frank, and respectful communication.

As a plus:
- You have experience with AWS.
- You have hands-on experience with data analytics and data engineering.

What you’ll get:
- Competitive salary (based on experience level)*.
- Significant equity options package.
- Work with an ambitious, smart, global, and fun team to transform a $1T global industry.
- 20 days of PTO a year.
- Global team offsites.
- Phone reimbursement.
- Gym reimbursement.
- Student stipend.

*The salary range listed is an estimate and will vary based on a variety of factors. Final compensation will be determined during the offer stage based on relevant experience, performance during the interview process, and geographic location, and may therefore differ from the posted range.

Full Time · Remote · direct · Data & AI
USD 85,000 - 165,000/year · 3 months ago

Team Lead - Data Engineer

Clouds on Mars · Warsaw

Looking for stable projects with modern tech and room to grow?

We are Clouds on Mars, an IT consulting company specializing in architecting and delivering solutions that exceed current technological and domain standards. From cloud and data engineering through AI-driven analytics to web apps and automation, we manage the full project cycle.

How we work

Our 120+ experts, organized into 4 Business Units, manage projects end-to-end - from technical presales and solution design to final delivery. Each team combines specialists from different fields - Data Engineers, BI Consultants, AI Engineers, Solution Architects, Power Platform Consultants, and Project Managers - enabling us to deliver truly comprehensive solutions. Every role is end-to-end: we don't just execute tasks, we act as trusted advisors, guiding clients and leading projects from A to Z.

What we stand for

You'll always have a team to count on and space to try out new ideas. We believe in ownership: your work will drive real change, not just tick boxes. Here, learning never stops - we grow together and raise the bar step by step. And because we work with respect and passion, you won't just deliver to clients but build real solutions along with them. For those who share our mindset, we offer not just projects but a full-time, long-term partnership.

Technologies we work with

Data Platforms: Microsoft Fabric, Databricks, and Snowflake. Power BI; Power Platform: Power Automate, Power Apps, Copilot Studio. Azure Machine Learning, AI Foundry, Cognitive Services, MCP Servers. Python, React.js, App Services, Container Apps.

Who we are looking for

We are looking for a Team Lead - Data Engineer with proven expertise in leading consulting projects for external clients across various industries, delivering advanced data solutions using Microsoft Azure, Fabric, Databricks, and modern data engineering technologies. In this role, you will work directly with external clients on modern data platform implementations, while also supporting the development, performance, and engagement of a team of engineers working across multiple projects. We are currently looking for a Team Lead - Data Engineer with experience in AI/ML or a strong interest in developing their skills in this area. Candidates with relevant expertise or growth potential in this field will be given priority during the selection process.

What you'll do

Design and deliver scalable ETL/ELT pipelines, data models, and reporting solutions using technologies such as Azure Data Factory, Synapse, Fabric, Databricks, and Power BI. Architect modern data platforms (data warehouse, data lake, or lakehouse) with performance, scalability, and reliability in mind. Collaborate with technical stakeholders to translate business requirements into robust, production-ready data solutions. Contribute to CI/CD processes, code quality standards, automated testing, and documentation. Act as a senior technical contributor across one or more client projects in a consulting setting. Act as direct people manager for a team of data engineers, responsible for performance reviews, development planning, and career coaching. Conduct regular 1:1s, provide feedback, and support skills growth through mentoring and learning opportunities. Balance and allocate team members across multiple projects, ensuring workload clarity and capacity planning. Foster a healthy, collaborative team culture by promoting open communication, continuous improvement, and psychological safety. Monitor team morale and resolve delivery blockers that affect individual or team performance. Work closely with Project Managers and Technical Leads to align team priorities and remove conflicts or inefficiencies. Ensure consistency in delivery standards, team engagement, and cross-project coordination. Support team members in navigating overlapping priorities or challenges across multiple engagements.

What you bring

5+ years of experience delivering data engineering solutions in a consulting or multi-client project environment. Strong technical expertise with the Microsoft Azure ecosystem (Data Factory, Synapse, Fabric), Databricks, and Power BI. Solid background in ETL/ELT design, data integration, data modeling, and performance optimization. Proven people management experience, including performance evaluation, coaching, and career development. Excellent communication skills and the ability to engage with both technical and business stakeholders. Fluency in English and Polish.

Full Time · direct · Data & AI
Salary not disclosed · 1 month ago

AI Engineer / Researcher

Creotech · Poland

AI Engineer / Researcher - Warszawa

Responsibilities

Analyze business problems and select ML/LLM approaches for specific use cases. Build proofs of concept and model benchmarks, and evaluate the quality, cost, and scalability of solutions. Design and maintain pipelines for data preparation, cleaning, anonymization, and feature engineering. Build a dataset repository and version data for experiments and deployments. Create and develop an internal AI platform: inference APIs, prediction monitoring, and automated training and deployment. Integrate language models with company systems, build the RAG layer, and manage prompts. Collaborate with product teams: architectural consulting, building reusable AI components, documentation, and training.

Requirements

At least 3 years of experience in AI/ML or Data Science Engineering. Practical knowledge of Python and the ML/LLM ecosystem (e.g., OpenAI, Hugging Face). Experience building data pipelines and working with data at production scale. Knowledge of MLOps principles and automated model deployment. Ability to design APIs and integrate AI services with business systems. Knowledge of data security, access control, and AI governance. Analytical thinking and technical communication skills. A good command of English, sufficient for working with technical documentation.

Nice to have

Experience with cloud platforms and GPU workloads. Knowledge of RAG solutions, vector databases, and LLM answer-quality evaluation. Experience on regulated or mission-critical projects. Familiarity with tools for monitoring model drift and ethical risks.

We offer

Participation in building an internal AI platform with real impact on the organization. Work with a modern tool stack and a high degree of technical autonomy. Growth opportunities in GenAI, MLOps, and data architecture. Stable employment terms and a supportive, collaborative environment.

Full Time · direct · Data & AI
Salary not disclosed · 1 week ago

Senior Data Engineer

fusionSpan · Poland

fusionSpan is a fast-growing company with development centers in the USA, Canada, El Salvador, Poland, and India. fusionSpan's focus is on software development, enterprise CRM implementations, digital strategy, and enterprise middleware.

A data engineer at fusionSpan will be part of the cross-functional Data & Integrations team, which handles all data analysis and ETL issues across multiple projects. The ideal candidate needs to be able to work autonomously and adapt to an evolving work structure. If you love working with data, this is the perfect position for you. This is a hybrid role with 2-3 days on-site in our Rockville, MD office.

Responsibilities

Utilize extract/transform/load (ETL) technologies using Snowflake and other cloud data platforms. Interpret data, analyze results using statistical techniques, and provide ongoing reports. Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality. Acquire data from primary or secondary data sources and maintain databases/data systems. Evaluate and optimize data structures. Identify, analyze, and interpret trends or patterns in complex data sets. Filter and "clean" data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems. Monitor, troubleshoot, and improve pipeline transparency, performance, scalability, and reliability using Snowflake OpenFlow and related ELT/ETL tools. Ensure AI/ML readiness of data by preparing and maintaining semantic models, ensuring robust data quality, and establishing and enforcing data access controls. Produce field mapping and translation documentation for use in both manual and scripted migrations. Work within Agile methodology, managing tasks and tickets as assigned. Communicate with clients and team members for requirements gathering, clarification, and planning for data conversions. Document work and work processes for use by team members.

Qualifications

2+ years of experience with an ETL tool such as Talend, Fivetran, Matillion, or Airbyte (required). 2+ years of experience using SQL and an RDBMS (required). 1+ years of experience with AWS/Azure cloud databases/data lakes. 1+ years of active and frequent coding experience in Python (required). Experience with ERP solutions such as SAP and CRM solutions such as Salesforce (preferred). Mastery of Excel.

Compensation: The anticipated salary range for this role is $60,000 - $100,000 annually. Actual compensation will be determined based on factors such as experience, skills, and qualifications, and may vary accordingly.

What We Offer

Health (PPO), dental & vision plan - 100% covered for employee. Long/short-term disability insurance - 100% covered for employee. Life and AD&D insurance - 100% covered for employee. 401K with up to 4% matching contribution. 15 days of paid vacation - increases with tenure. 10 paid federal holidays. 12 weeks of parental leave. Find out more about our benefits here.

About fusionSpan

fusionSpan is a fast-paced, high-energy global firm with a highly motivated team. This role will experience high work demands under tight timelines, requiring a flexible and adaptable approach to daily priorities. We are open to qualified candidates worldwide even though our job opportunities are posted for a specific region. Check out our Great Place to Work Certified Badge here.

Our Company Values

Trust: We believe trust is the foundation of success, and we build it through unwavering integrity, transparency, and open communication. We deliver on promises, address challenges directly, and hold ourselves accountable to excellence in all interactions with clients, partners, and teammates.

Innovation: We understand that excellence and innovation go hand in hand, and we are committed to developing forward-looking, creative solutions that meet our clients' evolving needs and move the industry forward. We embrace change, celebrate creativity, and prioritize quality to create a new standard of performance.

Community: We are a community-first organization committed to creating a culture of collaboration, inclusivity, and respect, where each voice is heard and all contributions are valued. We prioritize responsible and sustainable practices on our path to positively impacting those we serve.

Full Time · direct · Data & AI
USD 60,000 - 100,000/year · 3 months ago

AI Engineer

Tooploox · Warszawa, PL

<h2>Hi there! 👋🏻</h2><p>We’re <strong>Tooploox 💎</strong>, a <strong>Solvd Inc</strong>. company.</p><p>We create<strong> AI-powered products</strong> and services that make a real difference. Our team of nearly 200 specialists - including a 40+ person R&amp;D team with many PhDs - has pioneered AI solutions across industries from healthcare to e-commerce. We’ve published research in top-tier venues like NeurIPS and ICML, while delivering real-world applications that help our clients innovate and scale.</p><p><strong>Now, as part of Solvd Inc.</strong>, we’re building on our strengths with even greater global reach and resources — while keeping the creative, human-scale culture that defines us here in Poland. If you want to work on meaningful AI projects, with space to grow and the freedom to innovate, you’ll feel right at home.</p><p>We are seeking a dynamic and motivated <strong>AI Engineer </strong>with a strong focus on developing <strong>Production-Grade AI Agents.</strong> If you have a deep understanding of modern AI stacks and thrive on moving beyond simple demos to build reliable, evaluation-driven solutions, we want to hear from you. 
This role involves not only technical expertise but also collaboration with clients to understand and refine project requirements.<br></p><h3>What you will do:</h3><ul><li><p><span>Collaborate with cross-functional teams to design and build stateful, multi-agent workflows using modern orchestration patterns.</span></p></li><li><p><span>Engage with clients to define requirements and oversee the full software development lifecycle, from proof of concept through evaluation and implementation to production.</span></p></li><li><p><span>Implement rigorous evaluation pipelines (e.g., LLM-as-a-Judge) to quantitatively measure agent performance and reliability before deployment.</span></p></li><li><p><span>Proactively drive productivity and adapt to evolving demands, taking initiative consistently.</span></p></li><li><p><span>Manage backend tasks while contributing to frontend development to ship functional end-to-end prototypes.</span></p></li><li><p><span>Stay informed about rapid advancements in the field (such as reasoning models, SLMs, or new prompting paradigms), continuously learning and integrating new knowledge.</span></p></li><li><p><span>Employ a pragmatic, evaluation-driven approach to adopting innovative AI solutions.</span></p></li></ul><h3>Experience and skills you need to join us:</h3><ul><li><p><span>Solid full-stack expertise in <strong>Python</strong>, with a focus on writing clean, modular code, complemented by experience with <strong>TypeScript</strong> or <strong>Node.js</strong>.</span></p></li><li><p><span>Deep practical experience with modern orchestration and state-management frameworks like <strong>LangGraph</strong> or <strong>LlamaIndex Workflows</strong>.</span></p></li><li><p><span>Strong understanding of <strong>Retrieval-Augmented Generation (RAG)</strong> including advanced techniques like hybrid search, re-ranking, and graph-based retrieval.</span></p></li><li><p><span>Experience implementing tracing and monitoring for complex agent flows 
using tools like <strong>LangSmith</strong>, <strong>LangFuse</strong>, or Arize Phoenix.</span></p></li><li><p><span>Proven ability to design <strong>Evals</strong> and test pipelines to prevent regression and hallucinations in production apps.</span></p></li><li><p><span>Experience with <strong>API/SDKs</strong> of major model providers (OpenAI, Anthropic, Gemini) as well as Open Source models.</span></p></li><li><p><span>Experience configuring services on <strong>cloud platforms</strong> (e.g., AWS, Azure, GCP) and containerization (Docker/K8s).</span></p></li><li><p><span>Proficient in automating CI/CD processes and understanding <strong>DevOps</strong> practices.</span></p></li><li><p><span>Openness to client collaboration, fostering clear communication and effective partnership.</span></p></li><li><p><span>Fluency in English (you will attend meetings with English speaking clients).</span></p></li></ul><h3>It would be great if you also have:</h3><ul><li><p><span>Experience running and serving <strong>local LLMs</strong> (e.g., via Ollama, vLLM, or TGI).</span></p></li><li><p><span>Familiarity with programmatic prompting optimization tools like <strong>DSPy</strong>.</span></p></li><li><p><span>Hands-on experience with the <strong>Model Context Protocol (MCP)</strong>.</span></p></li><li><p><span>Proficiency in <strong>Vector Stores</strong> (Pinecone, Qdrant, Weaviate, or pgvector).</span></p></li><li><p><span>Familiarity with Voice Agents concepts and frameworks like Pipecat.</span></p></li><li><p><span>Experience with graph databases (e.g., Neo4j) or GraphRAG approaches.</span></p></li><li><p><span>Fundamental knowledge of machine learning principles or model fine-tuning.</span></p></li></ul><h3><strong>How we work:</strong></h3><p><span>At </span><strong>Tooploox</strong><span> 💎, you have the </span><strong>flexibility to choose your working hours ⏰ and location </strong><span>📍. 
While we value </span><strong>remote work</strong><span>, we also believe in </span><strong>building relationships </strong><span>🤝 and invite you to join us in our </span><strong>Warsaw and Wrocław offices </strong><span>🏢. Enjoy </span><strong>a relaxed atmosphere </strong><span>🍃 and try some “home-made” pizza 🍕 from our office pizza oven. We </span><strong>love having pets 🐶 in the office, </strong><span>so feel free to bring yours along. 😁</span></p><p><strong>Join us and shape the future of AI while working the way you like!</strong></p>

Full Time · Remote · direct · Data & AI
Salary not disclosed · 4 months ago

Lead Data Engineer

Elitmind · Poland

Are you passionate about solving complex data challenges while making a real impact for clients? We're looking for a Lead Data Engineer who combines technical excellence with a client-first mindset.

What Makes This Role Unique?

This isn't just about writing code: it's about shaping the future of data platforms while working directly with clients to turn their challenges into opportunities. You'll be the trusted advisor who bridges business needs with cutting-edge technical solutions, helping organizations unlock the full potential of their data.

What makes you a great fit?

Technical Expertise: Deep experience with modern data platforms: Databricks and/or Snowflake, Microsoft Fabric. Strong proficiency in SQL and Python, with a solid grasp of software engineering best practices. Hands-on knowledge of cloud-native services, ETL/ELT frameworks, and orchestration tools (ADF, Airflow, dbt). Experience building automation frameworks, testing tools, and reusable components that improve team productivity. Understanding of GenAI applications in data engineering for automation, code generation, and workflow optimization.

The Mindset That Sets You Apart: Continuous learner: you actively track industry news, product announcements, and emerging technologies, always eager to explore new tools and patterns. Technical enthusiast: you thrive on diving deep into complex problems and finding elegant solutions. Collaborative leader: you excel at working with other technical enthusiasts, sharing knowledge, and elevating the entire team's capabilities. Innovation driver: you look for opportunities to leverage GenAI and automation to eliminate repetitive tasks and focus on strategic initiatives. Positive energy: you bring optimism and constructive thinking to every challenge, even in fast-paced, changing environments. Client-focused problem solver: your ultimate goal is fixing client problems and delivering tangible business value. Flexible and adaptable: you excel in dynamic environments where priorities shift and requirements evolve. Natural communicator: you're comfortable presenting technical decisions, leading architectural discussions, and building trust with stakeholders.

What will you do?

Design and build scalable data platforms using Databricks and/or Snowflake, Microsoft Fabric. Create reusable frameworks and automation tools that accelerate delivery and improve code quality across projects. Leverage GenAI to streamline workflows, automate code generation, enhance documentation, and optimize data pipeline development. Architect modern data platforms that transform how clients work with their data. Lead and collaborate with fellow technical enthusiasts, mentoring team members and driving best practices across engineering teams. Stay ahead of the curve by continuously learning and implementing the latest data engineering innovations. Lead technical conversations with clients, explaining not just the "how" but the "why" behind every solution. Adapt quickly to evolving project requirements and shifting client priorities with a positive, solution-oriented approach. Solve real-world problems with hands-on engineering while fostering a culture of knowledge sharing and technical excellence.

Why This Opportunity?

This role is designed for engineers who want more than just technical growth: it's for those who want to influence outcomes, co-create solutions, and be truly visible and impactful in client relationships. You'll work on diverse projects, constantly learn new technologies (including GenAI applications), build innovative frameworks that scale across teams, and see the direct impact of your work on client success. You'll be surrounded by technical enthusiasts who share your passion for innovation, continuous improvement, and solving meaningful problems together. If you're ready to combine your technical passion with meaningful client collaboration and technical leadership, let's talk.

Your journey with us starts here:

1. Initial Screening: If you meet our requirements, our recruiter will reach out for a chat about your motivations and expectations. Get ready to share your passion!
2. Technical Interview: Next, you'll be invited to showcase your skills in an interview with one of our technical experts or team members. This is your chance to shine and demonstrate your expertise.
3. Final Interview: Finally, you'll meet your future Team Lead. This is the perfect moment to learn more about the role and the team, and to ask any questions you might have.

Full Time · direct · Data & AI
Salary not disclosed · 1 week ago