About the job
Job Summary
We are seeking a skilled Hadoop Developer to join our dynamic Hadoop/Big Data team. The ideal candidate will play a key role in developing innovative solutions for complex, strategic projects. The position requires advanced programming and analytical skills applied to specific systems assignments.
Responsibilities:
- Design, implement, and deploy custom applications on the Hadoop platform.
- Troubleshoot and resolve production issues within the Hadoop environment.
- Optimize the performance of Hadoop processes and applications.
- Collaborate with senior technology leaders and communicate effectively across teams.
Requirements:
- A Bachelor’s degree in Computer Science, Management Information Systems, or Computer Information Systems, or equivalent experience.
- Minimum of 5 years of experience in building Java applications.
- At least 2 years of experience with Hadoop components such as HDFS, HBase, Hive, Sqoop, and Flume.
- A minimum of 2 years of proficiency with Java MapReduce, Python, Pig, Hadoop Streaming, and HiveQL.
- Strong understanding of traditional ETL tools and Data Warehousing architecture, with 4 years of relevant experience.
- Experience with Teradata and other RDBMS is advantageous.
- Excellent SQL/HiveQL skills are essential.
- Familiarity with Agile Methodology and the Scrum process.
- Ability to deliver high-quality work independently with minimal supervision.
- Strong critical thinking and analytical abilities are a must.
