AI Cloud DevOps Engineer – Data Science, Senior Consultant with Security Clearance

Guidehouse

Job Overview

Overview

Guidehouse is a leading management consulting firm serving the public and commercial markets. We guide our clients toward new futures that build trust in society and in your professional skills along the journey. Join us at Guidehouse.

Responsibilities

As part of Guidehouse's Advanced Data Analytics team, you will work on high-impact, high-visibility projects, helping to shape not only Guidehouse's current business but also its long-term strategy. Build the future of Data Science as part of the Artificial Intelligence Center of Excellence (CoE). The CoE is a unique team within Guidehouse focused on solving our clients' most critical challenges using data and advanced analytics, AI, and automation. The CoE works on a wide variety of projects, from predictive analytics models that support our healthcare, financial, and energy services divisions, to open-source analysis for federal agencies, to applying deep learning models (e.g., NLP, image recognition) to more complex problems. This role involves working in a multi-functional, Agile team environment with other data scientists, engineers, and UI/UX developers to develop and productionize analytics solutions. The Cloud DevOps Engineer is involved in many aspects of customer engagement, from collaborating with team members and customers to helping stakeholders discover the information hidden in their vast amounts of data, supporting data-driven decision-making, and ultimately delivering better products.

* Lead team initiatives to continuously refine our AWS deployment practices for improved reliability, repeatability, and security. You'll create and contribute to plans and collaborate with other team members. These high-visibility initiatives will help increase service levels, lower costs, and deliver features more quickly.
* Work closely with the Data Science team to automate the deployment and configuration of infrastructure supporting the rollout of data products and projects on the AWS data stack. This includes building machine learning workflows in AWS that span the full stack from front end to back end.
* Design effective monitoring and alerting (for conditions such as application errors or high memory usage) and log aggregation approaches (to quickly access logs for troubleshooting or generate reports for trend analysis) to proactively notify business stakeholders of issues and communicate metrics, working closely with those stakeholders and using tools including AWS CloudWatch, SageMaker, EMR, and Glue.
* Write code and scripts to automate the provisioning and configuration of AWS services, using tools and languages including the AWS CLI / API, Terraform, Ansible, Chef, Python, Bash, and Git.
* Configure build pipelines to support automated testing and deployments using tools including Jenkins, CircleCI, and AWS CodeDeploy. You'll configure these pipelines for specific products and help optimize them for performance and scalability.
* Help refine DevSecOps security practices (including regular security patching, minimum-permission accounts and policies, and encrypt-everything) in compliance with Health IT, government, and other standards and regulations, then implement and verify them using tools such as SonarQube and Veracode to analyze and confirm compliance.
* Document and diagram deployment-specific aspects of architectures and environments, working closely with Software Engineers, Data Scientists, Software Engineers in Test, and others in DevOps.
* Troubleshoot issues in production and other environments, applying debugging and problem-solving techniques (e.g., log analysis, non-invasive tests), working closely with development and product teams.
* Suggest improvements to deployment patterns and practices based on learnings from past deployments and production issues; collaborate with the DevOps team to implement them.
* Promote a DevOps culture, including building relationships with other technical and business teams.
* Work closely with InterOps to deploy and configure the platform to onboard clinics.
* Work to ensure that system and data security is maintained at a high standard, ensuring that the confidentiality, integrity, and availability of Navigating Cancer's applications are not compromised.

Qualifications

Minimum Security Clearance: None
Minimum Years of Experience: 4
Minimum Education: Advanced degree

* Ability to automate away manual interactions and a passion for helping developers write code that works
* A strong understanding of Linux administration, including Bash scripting
* An understanding of automation and RPA orchestration tools such as UiPath
* Networking expertise, including VPCs, SDNs (e.g., Amazon / Azure), VLANs, routers, and firewalls
* Familiarity with at least one IaC / CM tool such as Terraform or Ansible
* Familiarity with at least one code build / deploy tool such as Jenkins or CircleCI
* Familiarity with database setup, configuration, and monitoring
* Works in terms of enabling capabilities through a blend of process and technology

Minimum Qualifications:

* Bachelor's degree in Computer Science, Engineering, Applied Mathematics, Statistics, Data Management, or a related field
* 2+ years of AWS administration experience / training, including provisioning EC2 instances, VPCs, Lambda functions, RDS databases, S3 storage, IAM security, ECS containers, CloudWatch metrics and logs, and AWS Cognito pools
* 2+ years of experience developing and / or deploying serverless functions using AWS Lambda, Azure Functions, or Google Cloud Functions
* 1+ years of experience operating and administering Kubernetes deployments, clusters, or configurations
* 1+ years of experience using infrastructure-as-code tools such as Terraform or Ansible
* 1+ years of experience with SQL; adept in using an RDBMS such as PostgreSQL
* 1+ years of experience designing and deploying machine learning experiments
* 1+ years of experience analyzing large and complex data sets, including a demonstrated thorough aptitude for conducting quantitative and qualitative analysis
* Experience with monitoring / alerting tools such as New Relic, Grafana, Prometheus, or Sysdig
* Experience with log aggregation tools such as Datadog, ELK, or Splunk
* Experience in Python as well as at least one other programming language such as Ruby, Java, Scala, JavaScript / Node.js, Go, C#, or C/C++
* AWS Certified DevOps Engineer

Desired Experience

* 2+ years of experience building cloud data lakes to support data analytics and machine learning tasks
* 1+ years of experience with AWS RDS, schema design, system performance and optimization, and capacity planning; AWS Big Data Architect Certification or equivalent preferred
* Demonstrable in-depth understanding of data structures and ETL processes (including SSIS)
* Experience with structured and unstructured data, including relational databases (SQL Server), graph databases (Neo4j), and NoSQL databases (MongoDB)
* Experience working with big data (Scala, Spark, Pig)
* Experience with the operationalization and maintenance of analytics APIs using Plumber, Flask, Swagger, and similar tools
* Experience in data analytics, business intelligence, or data science

Additional Requirements

* The successful candidate must not be subject to employment restrictions from a former employer (such as a non-compete) that would prevent the candidate from performing the job responsibilities as described.

Disclaimer

About Guidehouse

Guidehouse is an Equal Employment Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, national origin, ancestry, citizenship status, military status, protected veteran status, religion, creed, physical or mental disability, medical condition, marital status, sex, sexual orientation, gender, gender identity or expression, age, genetic information, or any other basis protected by law, ordinance, or regulation.

Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance, including the Fair Chance Ordinance of Los Angeles and San Francisco.

If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at 1-571-633-1711 or via email at . All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation.

Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse, and Guidehouse will not be obligated to pay a placement fee.

Rewards and Benefits

Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace. Benefits include:

* Medical, Rx, Dental & Vision Insurance
* Personal and Family Sick Time & Company Paid Holidays
* Parental Leave and Adoption Assistance
* 401(k) Retirement Plan
* Basic Life & Supplemental Life
* Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
* Short-Term & Long-Term Disability
* Tuition Reimbursement, Personal Development & Learning Opportunities
* Skills Development & Certifications
* Employee Referral Program
* Corporate Sponsored Events & Community Outreach
* Emergency Back-Up Childcare Program
