DATE ADDED: Thu 05/12/2019

Big Data Administrator

San Francisco, CA, US

COMPANY: CAPSTONE TECHNOLOGY RESOURCES, INC.

JOB TYPE: Permanent, Full-Time

Note: This is a 4-6 month engagement with the possibility of extension (no guarantee).

Bottom Line: We need a Hadoop / Hortonworks (or Cloudera) administrator who has deployed Hadoop as part of a broader ecosystem. Good AWS experience is REQUIRED. Experience with physical data centers, EMR clusters, and working in a production support role is required.

What you'll do: In this position, you'll maintain and improve the Big Data (Hortonworks / Cloudera) platform housed in 2 physical data centers as we migrate it to the cloud (AWS).
· You will support the Big Data platform in our data centers and, eventually, in AWS.
· You will improve the efficiency, reliability, and security of our Big Data ecosystem while making sure that our developers & analysts have a smooth experience with it.
· You will automate day-to-day operational tasks (a brief automation sketch follows this list).
· You will be responsible for setting the standards for our production environment.
· You will take part in the 24x7 on-call rotation with the rest of the team and respond to pages and alerts to investigate issues in our platform.
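
As a rough illustration of the kind of day-to-day automation described above, here is a minimal Python sketch of a routine NameNode health check against the HDFS JMX servlet. The hostname, port, and alert thresholds are assumptions for illustration only, not details from this posting.

```python
#!/usr/bin/env python3
"""Minimal sketch of a routine HDFS health check (illustrative only).

Assumptions, not from the posting: the NameNode JMX servlet is reachable at
NAMENODE_JMX_URL (port 9870 on Hadoop 3; older HDP/CDH releases use 50070),
and "alerting" is just a non-zero exit code for cron or Jenkins to act on.
"""
import json
import sys
import urllib.request

# Hypothetical NameNode host; query the FSNamesystem MBean only.
NAMENODE_JMX_URL = (
    "http://namenode.example.com:9870/jmx"
    "?qry=Hadoop:service=NameNode,name=FSNamesystem"
)


def fetch_fsnamesystem_metrics(url: str) -> dict:
    """Fetch the FSNamesystem bean from the NameNode JMX endpoint."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        payload = json.load(resp)
    return payload["beans"][0]


def main() -> int:
    metrics = fetch_fsnamesystem_metrics(NAMENODE_JMX_URL)
    missing = metrics.get("MissingBlocks", 0)
    under_replicated = metrics.get("UnderReplicatedBlocks", 0)
    used_pct = 100.0 * metrics.get("CapacityUsed", 0) / max(metrics.get("CapacityTotal", 1), 1)

    problems = []
    if missing > 0:
        problems.append(f"{missing} missing blocks")
    if under_replicated > 0:
        problems.append(f"{under_replicated} under-replicated blocks")
    if used_pct > 80.0:  # threshold chosen arbitrarily for the example
        problems.append(f"HDFS {used_pct:.1f}% full")

    if problems:
        print("HDFS health check FAILED: " + "; ".join(problems))
        return 1
    print("HDFS health check OK")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

A script along these lines would typically be scheduled from cron or a Jenkins job and wired into the existing paging/alerting system.
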
What you'll have:

· Hands-on experience with deploying, managing, and monitoring Hadoop ecosystems (HDFS, YARN, Hive, Oozie) in a production environment
· Proficiency in Linux (Bash) & Jenkins
· Strong foundation in core Big Data technologies – HDFS, YARN, MapReduce, Hive
· Foundation in Hadoop security, including SSL/TLS, Kerberos, and role-based authorization
· Knowledge of RDBMS (Oracle, MySQL) and NoSQL (Cassandra, MongoDB) databases
· Experience with building and managing EMR clusters is a plus (see the sketch after this list)
· Good understanding of SQL basics
· Experience supporting one of the following: Splunk, Wavefront, Graphite, ELK, etc.
· Ability to conduct post-mortems and provide detailed RCAs
· Strong communication skills
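
As a rough sketch of what "building EMR clusters" can look like programmatically, the snippet below uses boto3 to launch a small transient cluster. The region, release label, roles, key pair, log bucket, and instance types are placeholders rather than requirements from this posting, and the default EMR service/instance roles are assumed to already exist.

```python
"""Minimal sketch of launching a transient EMR cluster with boto3 (illustrative only)."""
import boto3

# Region is an assumption; use whatever your account/platform runs in.
emr = boto3.client("emr", region_name="us-west-1")

response = emr.run_job_flow(
    Name="bigdata-admin-sketch",
    ReleaseLabel="emr-5.36.0",                 # pick a release matching your stack
    LogUri="s3://my-emr-logs/",                # hypothetical log bucket
    Applications=[{"Name": "Hadoop"}, {"Name": "Hive"}, {"Name": "Spark"}],
    Instances={
        "InstanceGroups": [
            {"Name": "Master", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "Core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,  # terminate once steps finish
        "Ec2KeyName": "my-keypair",            # hypothetical key pair
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
    VisibleToAllUsers=True,
)
print("Started cluster:", response["JobFlowId"])
```
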
Nice to have:

· Hands-on experience with Spark, Presto, and similar tools for SQL-like exploration of large-scale data sets (see the sketch after this list)
· Experience with AWS and Terraform
· Hands-on experience with Ranger or other role/policy-based authorization tools
· Hands-on experience with Qubole and Airflow
· Hands-on experience with Hortonworks / Cloudera flavor of Hadoop
· Experience with Python or Java
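
As a rough sketch of the SQL-like exploration mentioned in the Spark/Presto bullet above, the PySpark snippet below queries a hypothetical Hive table; the database, table, and date filter are made up for illustration.

```python
"""Minimal sketch of SQL-style exploration with PySpark (illustrative only)."""
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("exploration-sketch")
    .enableHiveSupport()   # read Hive metastore tables, if the cluster is configured for it
    .getOrCreate()
)

# Explore a hypothetical events table with plain SQL.
daily_counts = spark.sql("""
    SELECT event_date, COUNT(*) AS events
    FROM analytics.events          -- hypothetical Hive table
    WHERE event_date >= '2019-12-01'
    GROUP BY event_date
    ORDER BY event_date
""")
daily_counts.show(10)

spark.stop()
```
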
APPLY NOW