Job Location: 103 John F Kennedy Pkwy, Short Hills, NJ 07078
Job Description:
- Design, develop, test, deploy, and support big data applications on a Hadoop cluster in the Amazon Web Services (AWS) cloud.
- Gather and manage all information and requirements needed for the project.
- Analyze data sources, productionize and automate them wherever possible, and ingest them into the platform.
- Manage and review Hadoop log files and fix existing issues to maximize process efficiency.
- Orchestrate workflows to automate successive steps and incorporate appropriate quality checks within the process.
- Automate all jobs, using Oozie workflows to pull data from the File Transfer Protocol (FTP) server into the desired destination as required.
- Generate audits and metadata in the desired format, keeping all asset code in source control and following best-in-class release management and source code practices.
- Perform code reviews, bug fixes, and production support activities as required.
- Perform controlled releases to implement code and asset changes for standardized delivery to end customers.
- Develop and document design/implementation of requirements based on business needs.
- Reduce the number of open tickets and guide the offshore programming team on process monitoring in daily WebEx team meetings.
Minimum Education Required: This position requires a candidate with a minimum of a Bachelor’s degree in computer science, computer information systems, or information technology, or a combination of education and experience equating to the U.S. equivalent of a Bachelor’s degree in one of the aforementioned subjects.