
Technical Data Architect

Charlotte, North Carolina

Job Description:

Position Summary

  • Serve as a Technical Architect within the Data, Analytics & Insight Technology organization. The architect will support data solutions development and the application teams delivering those solutions on the data lake.
  • The Technical Architect will have the following responsibilities:
  • Provide technical leadership, architecture and design guidance, and development support to teams onboarding to the data lake under the self-serve model
  • Create, publish, and enforce Hadoop Spark/Hive/Impala best practices, design patterns, and cookbooks for the developer community
  • Design APIs and frameworks for reuse
  • Perform architecture, design and code reviews for complex applications
  • Ensure development teams are adhering to strategic architecture principles
  • Serve as a fully seasoned, technically proficient resource, routinely accountable for technical knowledge and capabilities as a team member or individual contributor. This role has no direct reports but will influence and direct the activities of teams on special initiatives or operations.
  • Influence, negotiate, and lead evaluations and implementations of technology alternatives across the Technology and Line of Business organizations

Required Skills

  • Minimum 5-7 years of solid experience with the big data programming stack (Spark, Python, Scala), including hands-on experience architecting, designing, developing, and implementing Apache Spark applications at scale
  • Minimum 3-5 years of proven experience integrating business intelligence technologies with Hadoop using Hive or Spark connectors
  • Hands-on experience with ETL tools, technologies, and development methodologies
  • Experience with application performance optimization techniques and best practices (the do's and don'ts) for Hive, Impala, and Spark applications
  • Working knowledge of the data science technology stack and the automation of analytical products
  • Working knowledge of big data in the cloud
  • Experience collaborating with big data partners such as Cloudera and Hortonworks on upgrades, issue/escalation management, etc.
  • Understanding of large-scale batch processing
  • Understanding of large analytical environments with many stakeholders and users, and with different (sometimes conflicting) sets of requirements and technology stacks
  • Understanding of conceptual, logical and physical data modeling
  • Ability to provide technical recommendations and trade-offs to address business needs and timelines
  • Understanding of and experience with web services, APIs, and microservices design and architecture
  • Experience leading or consulting with a team of big data architects, data engineers, and information analysts
  • Excellent written and verbal communication skills
  • Demonstrated ability to adapt to changes
  • Effective negotiation and team facilitation skills

Desired Skills

  • Broad Bank of America technical experience (3+ years), a good understanding of existing testing/operational processes, and an open mind about how to enhance them
  • Systems technology implementation
  • Web services / APIs / microservices
  • Technical expertise with Spark, MapReduce, HDFS, Kafka, Sqoop, Flume, Oozie, Hive, HBase, Avro, Parquet, and other Hadoop technologies
  • Understanding of industry trends and vendor positioning
  • Solid understanding of banking/finance industry domain and applications
  • Understanding of architecture methods
  • Understanding of service-oriented architecture
  • Understanding of relational databases, data modeling techniques
  • Understanding of relevant application technologies and development life cycles


1st shift (United States of America)




Full time


Manages People: No

Travel: Yes, 5% of the time

