Accelerator - Data & AI Engineering - Paris
Role details
Job description
- Design and implement end-to-end data pipelines that transform raw data into valuable insights, ensuring scalability and reliability in cloud environments
- Develop and optimize data models with a focus on query performance and efficient workloads
- Collaborate with cross-functional teams to translate business requirements into technical solutions, defining clear interface contracts between data products and applications
- Ensure data quality, standardization, observability, and governance across systems, aligning with industry compliance standards and data privacy requirements
- Automate data ingestion processes and monitoring systems to track operational KPIs, troubleshoot issues, and maintain pipeline health
- Build and maintain ETL processes and machine learning workflows, providing clean, AI-ready data for downstream applications
- Actively contribute to transversal data engineering best practices, including design patterns, CI/CD integrations, and containerized deployments, and participate in peer review of code and technical documentation
You are a dynamic Data & AI Engineer who challenges the status quo to ensure the seamless creation and operation of the data pipelines needed for enterprise data and analytics initiatives, following industry-standard practices and tools for the benefit of our global patients and customers.
You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business needs. You have a keen eye for improvement opportunities while remaining fully compliant with all data quality, security, and governance standards.
Requirements
Experience
- Proven track record in designing and implementing data pipelines and data warehouse solutions in cloud environments
- Hands-on experience with data modeling, ETL/ELT processes, and pipeline orchestration in production settings
- Experience working in cross-functional teams, translating business needs into technical solutions
- Experience working within compliance (e.g., quality, regulatory, data privacy, GxP) and cybersecurity requirements
- Experience in the healthcare industry is a strong plus
Soft Skills
- Collaborative mindset with strong problem-solving abilities
- Self-motivated and able to take initiative in a fast-paced environment
- Effective communication skills to work with both technical and business stakeholders
Technical Skills
- Strong proficiency in SQL and data warehousing platforms (Snowflake preferred)
- Hands-on experience with ETL tools (IICS or equivalent), data transformation tools (dbt preferred), and Python for data processing
- Strong experience with cloud platforms (AWS preferred), including orchestration frameworks (Airflow preferred)
- Advanced knowledge of CI/CD practices (GitHub Actions preferred), version control (Git), and infrastructure-as-code principles (Terraform preferred)
- Experience in designing and implementing engineering patterns and technical standards
- Good knowledge of logging and monitoring tools such as Datadog and Grafana
- Familiarity with containerization (Docker) and container orchestration (Kubernetes/OpenShift a plus)
- Relevant cloud certifications (AWS, Snowflake, IICS) are a plus
Education
Master's degree or equivalent in Computer Science, Engineering, or a relevant field
Languages
English, French