Data Engineer

Kforce Technology Staffing

  • US
  • Post Date: 25 August 2020
Job Overview

RESPONSIBILITIES

Kforce has a client that is seeking a Data Engineer in Beaverton, OR.

Summary

Our client has embraced big data technologies to enable data-driven decisions, and we're looking to expand our Data Engineering team to keep pace. As a Senior Data Engineer, you will work with a variety of talented teammates and be a driving force for building first-class solutions for Technology and its business partners, working on development projects related to supply chain, commerce, consumer behavior, and web analytics, among others.

Role responsibilities

  • Design and implement data products and features in collaboration with product owners, data analysts, and business partners using Agile Scrum methodology
  • Contribute to overall architecture, frameworks, and patterns for processing and storing large data volumes
  • Translate product backlog items into engineering designs and logical units of work
  • Profile and analyze data for the purpose of designing scalable solutions
  • Define and apply appropriate data acquisition and consumption strategies for given technical scenarios
  • Design and implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem
  • Build utilities, user-defined functions, libraries, and frameworks to better enable data flow patterns
  • Implement complex automated routines using workflow orchestration tools
  • Work with architecture, engineering leads, and other teams to ensure quality solutions are implemented and engineering best practices are defined and adhered to
  • Anticipate, identify, and solve issues concerning data management to improve data quality
  • Build and incorporate automated unit tests and participate in integration testing efforts
  • Utilize and advance continuous integration and deployment frameworks
  • Troubleshoot data issues and perform root cause analysis
  • Work across teams to resolve operational performance issues

REQUIREMENTS

  • MS/BS in Computer Science or a related technical discipline
  • 5 years of experience in large-scale software development, including 3 years of big data experience
  • Strong programming experience; Python preferred
  • Extensive experience working with Hadoop and related processing frameworks such as Spark, Hive, etc.
  • Experience with messaging/streaming/complex event processing tooling and frameworks, with an emphasis on Spark Streaming or Structured Streaming and Apache NiFi
  • Good understanding of file formats including JSON, Parquet, Avro, and others
  • Familiarity with data warehousing, dimensional modeling, and ETL development
  • Experience with RDBMS systems, SQL, and SQL analytical functions
  • Experience with workflow orchestration tools like Apache Airflow
  • Experience with performance and scalability tuning

The following skills and experience are also relevant to our overall environment, and nice to have:

  • Experience with Scala or Java
  • Experience working in a public cloud environment, particularly AWS, and with services like EMR, S3, Lambda, ElastiCache, DynamoDB, SNS, SQS, etc.
  • Familiarity with cloud warehouse tools like Snowflake
  • Experience building RESTful APIs to enable data consumption
  • Familiarity with build tools such as Terraform or CloudFormation and automation tools such as Jenkins or CircleCI
  • Familiarity with practices like Continuous Development, Continuous Integration, and Automated Testing
  • Experience in Agile/Scrum application development

Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
