Hadoop Developer

Job Location: 900 Cottage Grove Rd, Bloomfield, CT 06002.

Job Description: 

  • Develop tools and libraries, and maintain processes for other engineers to access data and write MapReduce programs.
  • Work with Software Developers, Data Scientists, Architects, and Data Warehouse Developers on the architecture, design, and development of applications built on the Hadoop environment.
  • Interact with stakeholders, business owners, vendors and users to understand their functional requirements and convert them into technical solutions.
  • Develop custom ETL solutions, batch processing, and real-time data ingestion pipelines to move data in and out of Hadoop using PySpark.
  • Build, operate, monitor, and troubleshoot Hadoop infrastructure. Apply HDFS file formats and structures such as Parquet and Avro to speed up analytics.
  • Build distributed, reliable and scalable data pipelines to ingest and process data in real-time.
  • Build new Hadoop clusters and maintain their privacy and security.
  • Develop efficient Pig and Hive scripts that join datasets using techniques such as map-side and bucketed joins.
  • Create project documentation and maintain the project lifecycle needed to operate Hadoop infrastructure.
  • Work on real-time streaming, perform transformations on data using Kafka and Flume, and develop Spark Streaming jobs in Scala to consume data from Kafka topics.
  • Use Jenkins for continuous integration and continuous delivery. Store code in Git repositories hosted on GitHub for version control.
  • Utilize Hive tables and HQL queries for daily and weekly reports. Work with complex Hive data types such as structs and maps.
  • Research and provide insights on application infrastructure. Understand security and information assurance requirements to establish coding standards and group procedures.
  • Troubleshoot and debug runtime issues across the Hadoop ecosystem. Fine-tune Hadoop applications for high performance and throughput.
  • Work on incident management, change management and release management of applications using tools like Clarity and ServiceNow.
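
The MapReduce programming model referenced in these duties can be illustrated with a minimal in-process sketch in plain Python. This is illustrative only, not a Hadoop API; all function names here are hypothetical, and a real job would use the Java MapReduce API or PySpark against a cluster:

```python
# Minimal sketch of the MapReduce model: map -> shuffle -> reduce,
# applied to a word count. Hypothetical names; not a Hadoop API.
from collections import defaultdict


def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1


def shuffle_phase(pairs):
    """Shuffle: group intermediate values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups


def reduce_phase(groups):
    """Reduce: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}


def word_count(lines):
    """Run the three phases end to end on an in-memory dataset."""
    return reduce_phase(shuffle_phase(map_phase(lines)))
```

For example, `word_count(["hadoop hive", "hive pig hive"])` returns `{"hadoop": 1, "hive": 3, "pig": 1}`; on a cluster, the map and reduce phases would run in parallel across HDFS blocks.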

Minimum Education Required:

This is a professional position, and as such, we require, at minimum, a Bachelor’s degree or its working equivalent in computer science, computer information systems, information technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects.