Data Engineer III

Expedia Group

Job Overview

Expedia Group (EG) revolutionizes travel through the power of technology. Across the globe, our brands make travel easier for millions of people who want to step out of their homes and into the world. We’re a hardworking and focused team from all over the world, with 20,000+ employees in over 30 countries. Together we seek new ideas, innovative ways of thinking, and diverse backgrounds and approaches, because averages can lie and sameness is dangerous.

The Cost Transparency Team is looking for an experienced Data Engineer to build and support our Cloud Cost Platform, which powers cloud cost analytics, governance, and optimization for all Expedia brands. Your main responsibility will be working with the team and partners to define, design, and implement a next-generation cloud finance management solution, while maintaining and growing support for existing business needs.

What will you do

  • You will build/extend our full stack cloud cost management suite which includes a large scale data pipeline, APIs, Web UI and analytics dashboards
  • Build solutions that improve unit economics for Expedia Group by providing deeper visibility into the cost of our platforms at the transaction level
  • Interact with our platform teams and peer groups to understand technical and business requirements and implement end-to-end solutions for them
  • Drive data investigations across the organization, inside and outside the team, to resolve technical, procedural, and operational issues
  • You will influence large scale projects to set the future direction for cloud finance management and analytics across Expedia Group

Who you are

  • 2+ years of hands-on experience designing and operating data pipelines using Spark, Hive, MySQL, Airflow or similar frameworks
  • Ability to work with Java, Scala, Python, or similar programming languages, and experience with Maven, SBT, Git, and Jenkins
  • Basic understanding of OOP, SOLID, and other commonly used design principles, with the ability to write clean, unit-testable code
  • Expertise in SQL, with a good understanding of data warehousing and ETL
  • Good understanding of big data technologies such as Hadoop, Spark, Hive, and Kafka
  • Prior experience in API or UI development in any language or framework is a huge plus
  • Knowledge of Qubole, Airflow, Dremio, Presto, or BI tools (Tableau, Looker, Superset) is a plus
  • Experience implementing SDLC methodologies and working in Agile Scrum
  • BS in Computer Science, Mathematics, Statistics, or related field
  • Background in cloud billing, financial systems, e-commerce, or a comparable reporting and analytics role would be a plus