
Data Flow Expert - NiFi & Kafka, CTI - Digital Solutions & Automation Services

Charlotte, North Carolina

Job Description:

The Digital Solutions and Automation Services team is looking for top talent to design and build a best-in-class Data Management and Integration Services capability over Infrastructure/ITSM data using a Hadoop architecture.

The Data Flow Expert will innovate and transform the systems integration landscape for the Technology Infrastructure organization, following industry best practices and maturing capabilities in support of Enterprise Data Management standards.

The ideal candidate has solid experience in Data Warehousing and Master Data Management design and development, a strong understanding of data management concepts, and applied DW/MDM development of database-level routines and objects. The candidate should have experience migrating a traditional Relational Database Management System (RDBMS) to a Hadoop-based architecture, along with hands-on development experience in many of the Apache Hadoop-based tools. The role involves hands-on development and support of integrations with multiple systems, ensuring the accuracy and quality of data by implementing business and technical reconciliations. The candidate needs to be able to translate macro-level requirements into actionable tasks to deliver a technically sound product, and should work collaboratively in teams.

Responsibilities include:

  • Analyze the current RDBMS Master Data Management platform, including orchestrations, workflows, and transformations, and help design a scalable Hadoop-based platform for structured and semi-structured big data.
  • Reengineer traditional batch-based database systems and stored procedures as Big Data services, using expert-level skills in Apache NiFi and Apache Kafka.
  • Develop and implement data processes using Spark Streaming, including Spark DataFrames, Spark SQL, and Spark MLlib (a sketch follows this list).
  • Deploy Apache HBase and apply its capabilities to support OLTP applications.
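
To ground the Kafka and Spark responsibilities above, here is a minimal PySpark Structured Streaming sketch: a job that reads a Kafka topic, parses JSON events, and runs a reconciliation-style aggregate with Spark SQL. The broker address, topic name (itsm.changes), and event schema are hypothetical placeholders, not details from this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = SparkSession.builder.appName("itsm-event-ingest").getOrCreate()

    # Hypothetical schema for ITSM change events; the real feed would differ.
    schema = StructType([
        StructField("ci_id", StringType()),
        StructField("status", StringType()),
        StructField("updated_at", TimestampType()),
    ])

    # Read the Kafka topic as a streaming DataFrame and parse the JSON payload.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
              .option("subscribe", "itsm.changes")               # hypothetical topic
              .load()
              .select(from_json(col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Reconciliation-style count by status, expressed in Spark SQL.
    events.createOrReplaceTempView("changes")
    counts = spark.sql("SELECT status, COUNT(*) AS n FROM changes GROUP BY status")

    # Console sink for illustration; production would write to HBase or HDFS.
    (counts.writeStream
           .outputMode("complete")
           .format("console")
           .start()
           .awaitTermination())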

Required Job Skills:

  • 10+ years of total IT experience
  • At least 5 years of experience developing for Data Warehousing, Data Marts, and/or Master Data Management
  • Deep experience with Hadoop, including NiFi, Kafka, Flink, Spark, HBase, Hive, and HDFS
  • Solid experience implementing trained machine learning models using Spark Streaming and Apache Flink (a sketch follows this list)
  • Programming experience in Python, PySpark, and Spark SQL
  • Exposure to Relational Database Management Systems such as Oracle, DB2, or SQL Server
  • Demonstrated deep knowledge of the Hadoop ecosystem
  • Object-oriented programming concepts
  • Expert SQL skills
  • Experience in SDLC and best practices for development
  • Ability to work against mid-level design documentation, take it to a low-level design, and deliver a solution that meets the success criteria
  • Knowledge of packaging and promotion practices for maintaining code in development, test, and production
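
As one illustration of the streaming-model-scoring skill listed above, the following sketch loads a Spark MLlib PipelineModel trained offline and applies it to a Kafka stream with Structured Streaming. The model path, topic, and feature schema are assumptions for the example; a real pipeline would need feature columns matching those it was trained on.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import from_json, col
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType
    from pyspark.ml import PipelineModel

    spark = SparkSession.builder.appName("incident-scoring").getOrCreate()

    # Hypothetical feature schema; real columns must match the trained pipeline.
    schema = StructType([
        StructField("description", StringType()),
        StructField("priority", DoubleType()),
    ])

    incidents = (spark.readStream
                 .format("kafka")
                 .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
                 .option("subscribe", "itsm.incidents")             # hypothetical topic
                 .load()
                 .select(from_json(col("value").cast("string"), schema).alias("i"))
                 .select("i.*"))

    # Load a pipeline trained offline with Spark MLlib and score the stream.
    model = PipelineModel.load("hdfs:///models/incident_classifier")  # placeholder path
    scored = model.transform(incidents)

    (scored.select("description", "prediction")
           .writeStream
           .outputMode("append")
           .format("console")  # an HBase or Kafka sink would replace this in practice
           .start()
           .awaitTermination())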

Desired Job Skills: 

  • Experience with Jira & Bitbucket
  • Data science experience

Core Technology Infrastructure Organization:

  • Is committed to building a workplace where every employee is welcomed and given the support and resources to perform their jobs successfully.
  • Wants to be a great place for people to work and strives to create an environment where all employees have the opportunity to achieve their goals.
  • Believes diversity makes us stronger so we can reflect, connect and meet the diverse needs of our clients and employees around the world.
  • Provides continuous training and development opportunities to help employees achieve their career goals, whatever their background or experience.
  • Is committed to advancing our tools, technology, and ways of working to better serve our clients and their evolving business needs.
  • Believes in responsible growth and is dedicated to supporting our communities by connecting them to the lending, investing and giving they need to remain vibrant and vital.

LOB Job Profile:

Responsible for designing and developing complex requirements to accomplish business goals. Ensures that software is developed to meet functional, non-functional, and compliance requirements, and that solutions are well designed, with maintainability, ease of integration, and testing built in from the outset. Possesses strong proficiency in development and testing practices common to the industry and extensive experience using design and architectural patterns. At this level, specializations start to form in Architecture, Test Engineering, or DevOps. Contributes to story refinement and defining requirements. Participates in and guides the team in estimating the work necessary to realize a story/requirement through the delivery lifecycle. Performs spikes/proofs of concept as necessary to mitigate risk or implement new ideas. Codes solutions and unit tests to deliver a requirement/story per the defined acceptance criteria and compliance requirements. Utilizes multiple architectural components (across data, application, and business) in the design and development of client requirements. Assists the team with resolving technical complexities involved in realizing story work. Designs, develops, and modifies architecture components, application interfaces, and solution enablers while ensuring principal architecture integrity is maintained. Designs, develops, and maintains automated test suites (integration, regression, performance). Sets up and develops a continuous integration/continuous delivery pipeline and automates manual release activities. Mentors other Software Engineers and coaches the team on CI/CD practices and automating the tool stack. Individual contributor.

Job Band:

H5

Shift: 

1st shift (United States of America)

Hours Per Week:

40

Weekly Schedule:

Referral Bonus Amount:

0

Full time

JR-21060090

Band: H5

Manages People: No

Travel: No

Manager:

Talent Acquisition Contact:

Kathleen Jones-Griffith

Referral Bonus:

0