Data Engineer – Architect Level – W2/1099/Self Corp

Onspotserve

  • US
  • Post Date: 27 August 2020
Job Overview

Lead Data Engineer

Duration: 12 months
Location: Richmond, VA

Top Skills:
1. AWS (GCP or Azure experience may be transferable)
2. Spark
3. HDFS
4. Microservices background
5. APIs

Who you are:

  • You yearn to be part of cutting-edge, high-profile projects and are motivated by delivering world-class solutions on an aggressive schedule.
  • You are not intimidated by challenges, thrive even under pressure, are passionate about your craft, and are hyper-focused on delivering exceptional results.
  • You love to learn new technologies and mentor junior engineers to raise the bar on your team.
  • It would be awesome if you have a robust portfolio on GitHub and/or open source contributions you are proud to share.
  • You are passionate about intuitive and engaging user interfaces, as well as new and emerging concepts and techniques.

The Job:

  • Collaborating as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art, next-generation Big Data and Fast Data applications.
  • Building efficient storage for structured and unstructured data.
  • Developing and deploying distributed Big Data applications using open source frameworks such as Apache Spark, Apex, Flink, NiFi, Storm, and Kafka on the AWS cloud (a brief illustrative sketch follows the qualifications below).
  • Using programming languages such as Java, Scala, and Python; open source RDBMS and NoSQL databases; and cloud-based data warehousing services such as Redshift.
  • Using Hadoop modules such as YARN and MapReduce, and related Apache projects such as Hive, HBase, Pig, and Cassandra.
  • Leveraging DevOps practices such as Continuous Integration, Continuous Deployment, test automation, build automation, and Test-Driven Development to enable rapid delivery of working code, using tools such as Jenkins, Maven, Nexus, Chef, Terraform, Ruby, Git, and Docker.
  • Performing unit tests and conducting reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance.

Basic Qualifications:

  • Bachelor's degree or military experience
  • At least 3 years of professional work experience in data warehousing / analytics
  • At least 3 years of experience with open source programming languages for large-scale data analysis
  • At least 2 years of Java development for modern data engineering
  • At least 2 years of data modeling development
  • At least 1 year of experience working with cloud data capabilities

Preferred Qualifications:

  • Master's degree or PhD
  • 3 years of Java development experience
  • 3 years of experience in Python, Scala, or R for large-scale data analysis
  • 3 years of experience with relational database systems and SQL (PostgreSQL or Redshift)
  • 3 years of UNIX/Linux experience
  • 2 years of Agile engineering experience
  • 2 years of experience with the Hadoop stack
  • 2 years of experience with cloud computing (AWS)
  • 1 year of experience with supervised machine learning
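For illustration only, below is a minimal sketch of the kind of Spark-on-AWS batch work this role describes. The bucket names, schema, and aggregation are hypothetical placeholders, not details from this posting or the client's environment.

# Hypothetical PySpark batch job: read raw events from S3, aggregate, write Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-clickstream-rollup").getOrCreate()

# Read raw, semi-structured events landed in S3 (placeholder path).
events = spark.read.json("s3a://example-raw-bucket/clickstream/2020-08-27/")

# A simple structured transformation: page views per user per page.
rollup = (
    events
    .filter(F.col("event_type") == "page_view")
    .groupBy("user_id", "page")
    .agg(F.count("*").alias("views"))
)

# Write the result back to S3 as Parquet for downstream consumers
# (for example, a Redshift COPY or a Hive external table).
rollup.write.mode("overwrite").parquet("s3a://example-curated-bucket/clickstream_rollup/")

spark.stop()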
