
Data Engineer - Core Technology Infrastructure

Charlotte, North Carolina

Job Description:

The Trusted Data Fabric (TDF) Control Function is seeking a Data Engineer. The Engineer will be responsible for understanding how all TDF Control Functions align within Data Platform Services platforms, both on-prem (Teradata, Hadoop) and off-prem (MS Azure Public Cloud, Collibra). The control functions are Metadata, Privacy, Retention, Data Security, and Data Quality. The Engineer will be expected to acquire a comprehensive understanding of the technical and business aspects of the Control Functions and to strategically plan, execute, and test all future enhancements. They will also help shape the external cloud governance framework by providing technical and business guidance, and will support reporting, special, and ad hoc projects as requested. The Engineer will play a critical role in shaping the future of the Control Functions and will maintain relationships with Enterprise Privacy, Compliance, and Application Managers. This role has high visibility across many other departments at the Bank and is vital to maintaining a healthy Control Function.

This role is responsible for developing and delivering data solutions that accomplish technology and business goals. Key responsibilities include code design and delivery tasks associated with the integration, cleaning, transformation, and control of data in operational and analytics data systems. The Engineer works with stakeholders, Product Owners, and Software Engineers to aid in the implementation of data requirements, analyze performance, conduct research, and troubleshoot issues, and is familiar with the data engineering practices of the bank.

Primary Skill

  • Teradata
  • Hadoop
  • Jira

Required Skills

  • Knowledge of JIRA and Agile Methodology
  • SQL Queries
  • Teradata
  • Hadoop HDFS and HIVE
  • Azure cloud experience; Ansible Tower experience preferred

Enterprise Role Overview:

Responsible for developing and delivering data solutions to accomplish technology and business goals. Codes design and delivery tasks associated with the integration, cleaning, transformation, and control of data in operational and analytics data systems. Works with stakeholders, Product Owners, and Software Engineers to aid in the implementation of data requirements, performance analysis, research, and troubleshooting. Familiar with the data engineering practices of the bank. Key activities:

  • Contributes to story refinement and defining requirements.
  • Participates in estimating the work necessary to realize a story/requirement through the delivery lifecycle.
  • Understands and utilizes basic architecture components in solution development.
  • Codes solutions to integrate, clean, transform, and control data in operational and/or analytics data systems per the defined acceptance criteria.
  • Works across development teams to understand and aid in the delivery of data requirements.
  • Assembles large, complex data sets that meet functional and non-functional requirements.
  • Builds processes supporting data transformation, data structures, metadata, data quality controls, dependency, and workload management.
  • Defines and builds data pipelines that enable faster, better, data-informed decision-making within the business.
  • Contributes to existing test suites (integration, regression, performance), analyzes test reports, identifies test issues/errors, and triages the underlying cause.
  • Documents and communicates required information for deployment, maintenance, support, and business functionality.
  • Adheres to the team delivery/release process and cadence pertaining to code deployment and release.
  • Identifies gaps in data management standards adherence and works with appropriate partners to develop plans to close gaps.
  • Individual contributor role.
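The overview is descriptive rather than technical, but the core pattern it names (integrating, cleaning, transforming, and controlling data, with data quality controls) can be sketched in miniature. The snippet below is a hypothetical illustration only, not part of the posting: it uses Python's standard-library sqlite3 as a stand-in for the Teradata/Hive platforms named above, and all table and column names are invented.

```python
import sqlite3

# Hypothetical sketch of the integrate/clean/transform/control pattern,
# using SQLite in place of the Teradata/Hive platforms named in the posting.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Integrate: land raw records from a source system.
cur.execute("CREATE TABLE raw_customers (id INTEGER, email TEXT, state TEXT)")
cur.executemany(
    "INSERT INTO raw_customers VALUES (?, ?, ?)",
    [
        (1, "  Ada@Example.com ", "NC"),
        (2, None, "nc"),  # missing email -> should fail the quality control
        (3, "bob@example.com", "SC"),
    ],
)

# Clean/transform: normalize casing and whitespace into a curated table.
cur.execute(
    """
    CREATE TABLE customers AS
    SELECT id,
           LOWER(TRIM(email)) AS email,
           UPPER(state)       AS state
    FROM raw_customers
    """
)

# Control: a simple data quality check flagging rows with missing emails.
failed = cur.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL OR email = ''"
).fetchone()[0]
print(f"rows failing email completeness check: {failed}")  # prints 1
```

In a real pipeline the "control" step would feed a data quality dashboard or block promotion of the curated table; here it simply counts the failing rows.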

Job Band:

H5

Shift: 

1st shift (United States of America)

Hours Per Week:

40

Weekly Schedule:

Referral Bonus Amount:

0

Full time

JR-21049351

Manages People: No

Travel: Yes, 5% of the time

Manager:

Talent Acquisition Contact:

Kathleen Jones-Griffith
