
Big Data Developer

Job ID: 219043

Location: San Francisco, California, US, 94110

Summary:

Responsible for designing, developing, and maintaining software solutions on the Hadoop cluster

Project Details:

  • Responsible for designing, developing, and maintaining software solutions on the Hadoop cluster
  • Participate in design reviews, code reviews, unit testing, and integration testing
  • Assume ownership of and accountability for assigned deliverables, both individually and for a small (3-5 member) team

Job Experience:

  • 10-15 years of IT experience, including at least 3-5 years on Big Data projects
  • Strong technical knowledge of the Hadoop ecosystem
  • Extensive hands-on experience designing, developing, and maintaining software solutions on the Hadoop cluster
  • Strong experience with UNIX shell scripting, Sqoop, Eclipse, HCatalog, Pig scripts, and HiveQL, including UDF design and development
  • Hands-on design and development experience with Spark and Scala
  • Experience with Kafka
  • Demonstrated strong knowledge of Hadoop best practices, troubleshooting, and performance tuning
  • Demonstrated broad knowledge of technical solutions, design patterns, and code for medium/complex applications deployed in Hadoop production
  • Experience with change management / DevOps tools (GitHub, Jenkins, etc.)
  • Experience with SDLC methodologies (Agile / Scrum / iterative development)
  • Experience mentoring/leading a team and providing technical guidance
  • Good communication skills; will need to communicate with client IT PMs, client leads, and architects
  • Working experience developing MapReduce programs in Java/Python that run on the Hadoop cluster
  • Experience with NoSQL databases such as HBase, MongoDB, or Cassandra
  • Experience using Talend with Hadoop technologies
  • Working experience with data warehousing and business intelligence systems
  • Business requirements management and systems change/configuration management; familiarity with JIRA
  • Experience with ZENA (or any other scheduling tool)