Data Engineer III

Expedia Group

Job Overview

Expedia Group (EG) revolutionizes travel through the power of technology. Across the globe, our brands make travel easier for millions of people who want to step out of their homes and into the world. We're a hardworking and focused team from all over the world – with 20,000+ employees in over 30 countries. Together we seek new ideas, innovative ways of thinking, and diverse backgrounds and approaches, because averages can lie and sameness is dangerous.

The People Technology Team is looking for an experienced Data Engineer to support our People data warehouse and build the People Data Lake powering people analytics and data science for Expedia Corporate and all Expedia brands. Your main responsibility will be working with the team and our customers to define, design, and implement our next-generation data warehouse and Data Lake on AWS, in addition to maintaining and growing the existing People data warehouse to meet business needs.

What you’ll do:

  • You will optimize and automate ingestion and export processes for a variety of data sources, primarily Workday and other HR-related systems, into the EG Data Lake.
  • You will build/extend data products in the EG Data Lake to enable people analytics and data science.
  • You will provide engineering leadership for our redesign and migration of the legacy HR data warehouse from MS SQL to the EG Data Lake.
  • You will build/extend toolsets, create/maintain batch jobs, and create systems documentation.
  • You will drive data investigations across organizations inside and outside of HR to resolve technical, procedural, and operational issues.
  • You will influence large-scale projects to set the future direction for people data and analytics across Expedia.
Who you are:

  • 5+ years of hands-on experience designing and operating large data platforms
  • 2+ years of experience with Java, Scala, or similar programming languages, and experience with Maven, SBT, and the Spring Framework
  • 2+ years leading data warehousing and analytics projects, including using AWS technologies such as Redshift, S3, and EC2
  • Expert in SQL, with a solid understanding of data warehousing and ETL
  • Strong understanding of big data technologies such as Hadoop, Spark, and Hive
  • Experience implementing SDLC methodologies and working in Agile Scrum
  • BS in Computer Science, Mathematics, Statistics, or related field
  • Background in business measurement, Human Resources, e-commerce, or a comparable reporting and analytics role