
Last updated: August 10, 2025

Senior Data Engineer

🌍 100% Remote 💬 English ✈️ International position 🧓🏽 Senior

Via Rysconnect

Compensation

$6,500.00

USD / Monthly

About

We are looking for a talented Senior Data Engineer to join the team that builds the essential data infrastructure at the heart of our product suite. You should be a quick learner who is enthusiastic about new technologies and a passionate engineer who empowers fellow engineers to write clean and efficient data pipelines. You will implement technical solutions that serve internal and external customers and scale the existing data platform. You will work closely with end users to define key business questions, lead fellow engineers in designing technical solutions, and partner with Product Managers to build the data sets that answer those questions. You will work with huge datasets and build ETLs that bring data together to answer business questions, power reporting, and drive growth.

Responsibilities

  • Design and implement ETL pipelines in Apache Airflow, BigQuery, Python, and Spark to transform data from various upstream sources into curated data assets for use across our vast array of data-driven products (a minimal sketch follows this list)
  • Stay up to date with and advocate internally for Data Engineering best practices, technologies and testing frameworks that will help shape our technical future
  • Architect and plan projects involving highly distributed, high-performance data platform systems
  • Provide technical guidance to other engineers to help grow their knowledge and technical capabilities
  • Communicate complex concepts and analysis results clearly and effectively to Product and Engineering Managers
  • Identify areas for improvement in existing pipelines and processes
  • Understand business requirements and convert them to technical solutions
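For context, here is a minimal sketch of the kind of pipeline the first responsibility describes, written with Airflow's TaskFlow API: a daily DAG that extracts raw records, applies a light Python transform, and hands them to a load step aimed at a curated BigQuery asset. The DAG, task, and field names are illustrative placeholders and not part of the posting.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["etl"])
def upstream_to_curated():
    @task
    def extract() -> list:
        # Placeholder for pulling raw records from an upstream source
        # (an API, an operational database, a landing bucket, ...).
        return [{"user_id": 1, "event": "signup"}, {"user_id": 2, "event": "purchase"}]

    @task
    def transform(records: list) -> list:
        # Light normalization; a real pipeline would push heavy lifting
        # into Spark or BigQuery SQL.
        return [{**r, "event": r["event"].upper()} for r in records]

    @task
    def load(records: list) -> None:
        # Stand-in for the load step; a real DAG would write to BigQuery,
        # e.g. via the google-cloud-bigquery client or a GCS-to-BigQuery operator.
        print(f"loading {len(records)} curated rows")

    load(transform(extract()))


upstream_to_curated()
```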

Qualifications and Skills

  • Strong SQL and Spark experience (see the sketch after this list)
  • Expert in at least one data pipeline orchestration framework (Airflow, Luigi, etc.)
  • Experience building ETL pipelines
  • Strong software engineering skills in an Object Oriented language such as Java or Python
  • Experience building and optimizing a data warehouse on a major cloud platform (BigQuery preferred but not required)
  • Experience with and deep understanding of big data technologies such as Hadoop and Spark
  • Experience with building and optimizing large scale and high-performance systems
  • Extensive knowledge of data related tools and architecture concepts in a major cloud platform
  • Knowledge of Kubernetes, Docker, Airflow, Git, and CI/CD best practices
  • Strong collaboration and communication skills within and across teams
  • B.S. or M.S. in Computer Science or a related technical field
  • At least 4 years of software engineering experience, ideally in data engineering
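As a rough illustration of the SQL-and-Spark skill set listed above (not a requirement stated in the posting itself), the sketch below registers a small PySpark DataFrame as a view and aggregates it with Spark SQL; the view and column names are invented for the example.

```python
from pyspark.sql import SparkSession

# Local Spark session; in production this would run on a managed cluster.
spark = SparkSession.builder.appName("sql-spark-sketch").getOrCreate()

# Toy event data standing in for a large upstream dataset.
events = spark.createDataFrame(
    [(1, "signup"), (1, "purchase"), (2, "signup")],
    ["user_id", "event"],
)
events.createOrReplaceTempView("events")

# Spark SQL aggregation producing the kind of curated summary a report consumes.
summary = spark.sql(
    "SELECT user_id, COUNT(*) AS event_count FROM events GROUP BY user_id"
)
summary.show()
```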
