Hadoop / Java Developer

Charlotte, North Carolina

Job Description:

The Hadoop/Java Developer will be part of the Data Science & Analytics Platform Services (DSAPS) group in the Chief Data Office (CDO) organization and will play a vital role in establishing the Assurance strategy for the distributed computing platform. The selected person will be responsible for development, test planning, and orchestration of large-scale, multi-project/multi-system test efforts. The role requires a deep understanding of test analysis, planning, execution, dependency and defect management, and the orchestration of other testing activities that lead to high-quality, scalable product delivery. We look for you to be results-oriented, business-focused, and successful at interfacing across multiple organizational units at various levels. Knowledge of or experience with Big Data technologies, including the design, development, testing, implementation, or governance of Big Data practices and solutions, will be very helpful in this role. You should be prepared to demonstrate your knowledge of Big Data technologies and your project management skills.

Required Skills:

  • Experience with Java and shell scripting
  • Proven understanding of Cloudera Hadoop, Impala, Hive, Flume, HBase, Sqoop, Spark, and Kafka (an illustrative sketch follows this list)
  • Able to analyze existing shell scripts and Python code to debug issues
  • Sound knowledge of relational databases (SQL) and experience with large SQL-based systems
  • Ability to identify, analyze, and address problems, resolving issues whenever possible in a way that minimizes negative impact and risk to the organization
  • Develops prototypes of the system design and works with database, operations, technical support, and other IT areas as appropriate throughout the development and testing processes
  • Works closely with architects, developers, and testers to ensure that requirements and functional designs are translated accurately into working technical designs and that test plans and scripts serve customer needs
  • Experience with developer tools for code management, ticket management, performance monitoring, and automated testing
  • Provides an in-depth understanding of data modeling and how design decisions impact both the Hadoop platform and downstream applications
  • Software development experience in an agile environment
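
To give a rough sense of the hands-on pipeline work these skills cover (referenced in the Spark/Kafka bullet above), the sketch below is a minimal Spark Structured Streaming job in Java that reads a Kafka topic and lands the raw events as Parquet for downstream Hive/Impala queries. The broker address, topic name, and paths are illustrative placeholders, not details from this posting.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class KafkaToWarehouseSketch {
        public static void main(String[] args) throws Exception {
            // Spark session with Hive support so landed data can be exposed as a Hive/Impala table.
            SparkSession spark = SparkSession.builder()
                    .appName("KafkaToWarehouseSketch")
                    .enableHiveSupport()
                    .getOrCreate();

            // Stream a Kafka topic; broker and topic names are placeholders.
            Dataset<Row> events = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "broker1:9092")
                    .option("subscribe", "events_topic")
                    .load()
                    .selectExpr("CAST(key AS STRING) AS event_key",
                                "CAST(value AS STRING) AS event_payload",
                                "timestamp AS event_time");

            // Append raw events as Parquet files that a Hive external table can sit on top of.
            events.writeStream()
                    .format("parquet")
                    .option("path", "/data/warehouse/events_raw")
                    .option("checkpointLocation", "/data/checkpoints/events_raw")
                    .outputMode("append")
                    .start()
                    .awaitTermination();
        }
    }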

Desired Skills:

  • Knowledge of and/or experience working within the Hadoop ecosystem (HDFS tools including Cloudera Hive, Impala, Sqoop, Kafka, and HBase) or another distributed big data ecosystem
  • BS/MS in Computer Science, Engineering, or any quantitative discipline
  • Good understanding of Linux/VM platforms
  • Knowledge of cloud computing or distributed computing

Top 3 required technical skill-sets:

Shell Scripting, Python, Hadoop

Critical Skills: 5+ years of Java experience; 3+ years with Cloudera Hadoop and Apache tools (Impala, Hive, Sqoop, Spark, etc.)

Enterprise Role Summary:

Responsible for designing and developing complex requirements to accomplish business goals. Ensures that software is developed to meet functional, non-functional, and compliance requirements. Ensures solutions are well designed, with maintainability, ease of integration, and testing built in from the outset. Possesses strong proficiency in development and testing practices common to the industry and has extensive experience using design and architectural patterns. At this level, specializations start to form in Architecture, Test Engineering, or DevOps. Contributes to story refinement and defining requirements. Participates in and guides the team in estimating the work necessary to realize a story/requirement through the delivery lifecycle. Performs spikes/proofs of concept as necessary to mitigate risk or implement new ideas. Codes solutions and unit tests to deliver a requirement/story per the defined acceptance criteria and compliance requirements. Utilizes multiple architectural components (across data, application, and business) in the design and development of client requirements. Assists the team with resolving technical complexities involved in realizing story work. Designs, develops, and modifies architecture components, application interfaces, and solution enablers while ensuring principal architecture integrity is maintained. Designs, develops, and maintains automated test suites (integration, regression, performance). Sets up and develops a continuous integration/continuous delivery (CI/CD) pipeline. Automates manual release activities. Mentors other Software Engineers and coaches the team on CI/CD practices and the automation tool stack. Individual contributor role.
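
As a hedged illustration of the "codes solutions and unit tests" and automated-test-suite expectations above, the sketch below shows a minimal JUnit 5 unit test for a hypothetical record-parsing helper. The EventRecord type and its parse method are assumptions made for the example only; they are not part of this role description.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertThrows;

    import org.junit.jupiter.api.Test;

    class EventRecordTest {

        // Hypothetical helper under test: parses a pipe-delimited ingest record.
        record EventRecord(String id, String type, long amountCents) {
            static EventRecord parse(String line) {
                String[] parts = line.split("\\|");
                if (parts.length != 3) {
                    throw new IllegalArgumentException("Expected 3 fields, got " + parts.length);
                }
                return new EventRecord(parts[0], parts[1], Long.parseLong(parts[2]));
            }
        }

        @Test
        void parsesWellFormedRecord() {
            EventRecord rec = EventRecord.parse("42|PAYMENT|1999");
            assertEquals("42", rec.id());
            assertEquals("PAYMENT", rec.type());
            assertEquals(1999L, rec.amountCents());
        }

        @Test
        void rejectsMalformedRecord() {
            assertThrows(IllegalArgumentException.class, () -> EventRecord.parse("42|PAYMENT"));
        }
    }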

Job Band:

H5

Shift: 

1st shift (United States of America)

Hours Per Week:

40

Weekly Schedule:

Referral Bonus Amount:

0

Full time

JR-21012570

Manages People: No

Travel: No

Manager:

Talent Acquisition Contact:

Referral Bonus: