Data Engineer with AWS, Spark, Scala and Java

The application deadline for this job has passed.
I-Link Solutions

Job Overview

Job description

Must have: AWS (S3, Redshift, EMR, EC2, Lambda, SNS), Unix shell scripting, Python, Spark, Scala

Prefer to have: Snowflake, Presto, Arrow, Airflow, Hadoop, Hive

You'll bring solid experience in emerging and traditional technologies such as Node.js, Java, AngularJS, React, Python, REST, JSON, XML, Ruby, HTML/HTML5, CSS, NoSQL databases, relational databases, Hadoop, Chef, Maven, iOS, Android, and AWS cloud infrastructure, to name a few.

You will:
- Work with product owners to understand desired application capabilities and testing scenarios
- Continuously improve software engineering practices
- Work within and across Agile teams to design, develop, test, implement, and support technical solutions across a full stack of development tools and technologies
- Lead the craftsmanship, availability, resilience, and scalability of your solutions
- Bring a passion to stay on top of tech trends, experiment with and learn new technologies, participate in internal and external technology communities, and mentor other members of the engineering community
- Encourage innovation, implementation of cutting-edge technologies, inclusion, outside-of-the-box thinking, teamwork, self-organization, and diversity

Basic Qualifications:
- Bachelor's degree
- At least 3 years of SDLC experience using Java technologies
- At least 3 years of experience with leading big data technologies such as Cassandra, Accumulo, Python, HBase, Scala, Hadoop, HDFS, Avro, MongoDB, or ZooKeeper
- At least 1 year of experience with one of the following cloud technologies: AWS, Azure, OpenStack, Docker, Ansible, Chef, or Terraform

Preferred Qualifications:
- Master's degree
- 2+ years of experience with Spark
- 3+ years of experience developing software solutions to solve complex business problems
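For context, a minimal sketch of the kind of Spark-on-AWS work this role describes: a Scala Spark job that reads raw data from S3, aggregates it, and writes the result back to S3 for downstream consumers. The bucket paths, column names, and application name are hypothetical, not taken from the posting.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object S3EventAggregator {
  def main(args: Array[String]): Unit = {
    // On EMR, the SparkSession typically picks up S3 credentials from the instance role.
    val spark = SparkSession.builder()
      .appName("s3-event-aggregator")
      .getOrCreate()

    // Hypothetical input: JSON event files with userId and amount columns.
    val events = spark.read.json("s3://example-bucket/raw/events/")

    // Aggregate total spend per user.
    val totals = events
      .groupBy("userId")
      .agg(sum("amount").as("totalAmount"))

    // Write the result back to S3 as Parquet for downstream consumers (e.g. Redshift Spectrum).
    totals.write.mode("overwrite").parquet("s3://example-bucket/curated/user_totals/")

    spark.stop()
  }
}
```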
