A Hadoop Developer is responsible for the design, development, and operation of systems that store and manage large amounts of data. Most Hadoop developers have a software background and hold a degree in information systems, software engineering, computer science, or mathematics.
IT Developers are responsible for developing, programming, and coding Information Technology solutions. They document detailed system specifications, participate in unit testing and maintenance of planned and unplanned internally developed applications, and evaluate and performance-test purchased products. They are responsible for including IT controls to protect the confidentiality, integrity, and availability of the application and of the data it processes or outputs. IT Developers are assigned to moderately complex development projects.
• Write code for moderately complex system designs. Write programs that span platforms. Code and/or create Application Programming Interfaces (APIs).
• Write code for enhancing existing programs or developing new programs.
• Review code developed by other IT Developers.
• Provide input to and drive programming standards.
• Write detailed technical specifications for subsystems. Identify integration points.
• Report missing elements found in the system and functional requirements, and explain their impact on the subsystem to team members.
• Consult with other IT Developers, Business Analysts, Systems Analysts, Project Managers and vendors.
• Scope the time, resources, and other requirements needed to complete programming projects. Seek review of estimates from other IT Developers, Business Analysts, Systems Analysts, or Project Managers.
• Perform unit testing and debugging. Set test conditions based upon code specifications. May need assistance from other IT Developers and team members to debug more complex errors.
• Support the transition of applications throughout the Product Development life cycle. Document what has to be migrated. May require more coordination points for subsystems.
• Research vendor products and alternatives. Conduct vendor product gap analyses and comparisons.
• Be accountable for including IT controls and following standard corporate practices to protect the confidentiality, integrity, and availability of the application and of the data it processes or outputs.
• The essential functions listed represent the major duties of this role; additional duties may be assigned.
• Experience with and understanding of unit testing, release procedures, coding design and documentation protocols, and change management procedures
• Proficiency using versioning tools
• Thorough knowledge of Information Technology fields and computer systems
• Demonstrated organizational, analytical and interpersonal skills
• Flexible team player
• Ability to manage tasks independently and take ownership of responsibilities
• Ability to learn from mistakes and apply constructive feedback to improve performance
• Must demonstrate initiative and effective independent decision-making skills
• Ability to communicate technical information clearly and articulately
• Ability to adapt to a rapidly changing environment
• In-depth understanding of the systems development life cycle
• Proficiency programming in more than one object-oriented programming language
• Proficiency using standard desktop applications such as the Microsoft Office suite and flowcharting tools such as Visio
• Proficiency using debugging tools
• High critical thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy
Specific Tools/Languages Required:
• 5–8 years of related work experience, or an equivalent combination of transferable experience and education
- Hadoop: 4 years of experience
- ETL / Data Warehousing: 7+ years of experience
• Ab Initio conversion to Spark, while also maintaining Ab Initio (7+ years of Ab Initio experience)
• GuideWell Data Platform: building the claims process used for analytical needs, specifically for maintaining and migrating the Enterprise Data Warehouse and migrating users
• Experience with Agile methodology
Bachelor's degree in an IT-related field, or relevant work experience
Comments for Suppliers: The resource should have extensive experience in data warehousing and in building complex ETL pipelines using Spark and Scala.
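For context, the kind of Spark Scala ETL work described above might look like the following minimal sketch. All paths, column names, and table layouts here are hypothetical illustrations, not details taken from the actual GuideWell platform:

```scala
// Minimal Spark Scala ETL sketch: extract raw claim records,
// transform them, and load them into a warehouse location.
// Every path and column name below is a hypothetical example.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ClaimsEtlSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("claims-etl-sketch")
      .getOrCreate()

    // Extract: read raw claim records from a landing zone (hypothetical path)
    val raw = spark.read
      .option("header", "true")
      .csv("/data/raw/claims")

    // Transform: cast types, drop rows missing a key, stamp the load date
    val cleansed = raw
      .withColumn("claim_amount", col("claim_amount").cast("decimal(12,2)"))
      .filter(col("claim_id").isNotNull)
      .withColumn("load_date", current_date())

    // Load: write to the warehouse area, partitioned by load date
    cleansed.write
      .mode("overwrite")
      .partitionBy("load_date")
      .parquet("/data/warehouse/claims")

    spark.stop()
  }
}
```

This is only a sketch of the extract–transform–load pattern; a production job for the migration described above would add schema enforcement, data-quality checks, and change-management controls.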