Lead Big Data & Cloud Operations Engineer

Austin, TX, USA

Ranked #12 on Forbes’ List of 25 Fastest Growing Public Tech Companies for 2017, EPAM is committed to providing our global team of over 24,000 people with inspiring careers from day one. EPAMers lead with passion and honesty, and think creatively. Our people are the source of our success; we value collaboration, always seek to understand our customers’ business, and strive for the highest standards of excellence. No matter where you are located, you’ll join a dedicated, diverse community that will help you discover your fullest potential.

DESCRIPTION

You are curious, persistent, logical and clever – a true techie at heart. You enjoy living by the code of your craft and developing elegant solutions for complex problems. If this sounds like you, this could be the perfect opportunity to join EPAM as a Lead Big Data & Cloud Operations Engineer. Scroll down to learn more about the position’s responsibilities and requirements.

We are looking for a Lead Big Data & Cloud Operations Engineer with a background in supporting on-premises and cloud data warehouse platforms as well as the associated tools and frameworks for ETL processing, job scheduling, code deployment, end-user analytics and process automation. We seek highly skilled, passionate individuals who are quick learners, excited about new technologies, and willing to support, maintain, and implement solutions on our Hadoop and cloud platforms.

Responsibilities

  • Lead the monitoring and troubleshooting of jobs and of capacity, availability, and performance problems;
  • Support platforms and tools for analytics on Hadoop and in AWS (Druid, HUE);
  • Work closely with infrastructure and development teams on capacity planning;
  • Work with infrastructure teams on evaluating new types of hardware and software to improve system performance and capacity;
  • Partner with developers on evaluating or developing new tools to simplify and speed up data pipeline development and delivery;
  • Help automate and support code deployment processes;
  • Provide timely communication to stakeholders and users on issue status and resolution;
  • Work closely with the offshore operations team, delegating routine and project tasks.

Requirements

  • Bachelor's Degree in computer science, computer engineering, or a related field;
  • 5+ years of experience in a Big Data operations engineering role at a large organization;
  • 2+ years of experience leading junior operations engineers or contractors;
  • Excellent written and verbal communication skills;
  • Ability to multitask;
  • Highly agile and a quick learner;
  • Experience in monitoring large-scale Big Data ETL processes on Hadoop and in the cloud (AWS);
  • Expert in the Hadoop ecosystem (HDFS, Hive, HBase, HUE, Sqoop, Spark, Oozie, Cassandra);
  • Expert in AWS (Redshift, S3) and in Druid;
  • Experience in writing scripts for operations and monitoring (Python is preferred);
  • Ability to thrive in a fast-paced environment, and the passion to make a difference.
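
To give a concrete flavor of the operations scripting this role involves, below is a minimal sketch of the kind of Python monitoring script described above: it polls the YARN ResourceManager REST API for recently failed applications. The host name and look-back window are placeholders for illustration only, not details from this posting.

  #!/usr/bin/env python3
  """Minimal sketch: report YARN applications that failed in the last hour,
  using the ResourceManager REST API. The ResourceManager host below is a
  placeholder and would be replaced with a real cluster endpoint."""

  import time

  import requests

  RM_URL = "http://resourcemanager.example.com:8088"  # placeholder ResourceManager
  LOOKBACK_MS = 60 * 60 * 1000                        # look back one hour


  def failed_apps(lookback_ms=LOOKBACK_MS):
      """Return YARN applications that finished in a FAILED state recently."""
      params = {
          "states": "FAILED",
          "finishedTimeBegin": int(time.time() * 1000) - lookback_ms,
      }
      resp = requests.get(f"{RM_URL}/ws/v1/cluster/apps", params=params, timeout=30)
      resp.raise_for_status()
      # The API returns {"apps": null} when nothing matches, so guard against None.
      return (resp.json().get("apps") or {}).get("app", [])


  if __name__ == "__main__":
      for app in failed_apps():
          # Print enough context (owner, queue, diagnostics) to start troubleshooting.
          print(f"{app['id']}  {app['name']}  user={app['user']}  "
                f"queue={app['queue']}  diag={app.get('diagnostics', '').strip()[:120]}")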