Software Engineer II - Hadoop

Addison, Texas

Job Description:

Position Summary

This is a Platform Engineering role supporting a new initiative at Bank of America in Addison, TX. The team is responsible for building and managing a Graph Data Platform, using emerging technologies to serve consumer and global wealth applications as well as fraud and AML (anti-money laundering) tracking. The candidate must have a passion for producing high-quality software and solutions, support the platform, be ready to jump in and solve complex problems, and mentor junior team members.

Required Skills

  • 5+ years of hands-on experience in software development, primarily with Hadoop.

  • Good understanding of data architecture and Big Data platforms.

  • Support the company’s commitment to protect the integrity and confidentiality of systems and data.

  • Experience managing and leading small development teams in an Agile environment.

  • Drive and maintain a culture of quality, innovation, and experimentation.

  • Collaborate with product teams, data analysts, and data scientists to design and build data-forward solutions.

  • Provide prescriptive point-solution architectures and guide the descriptive architectures within assigned modules.

  • Own technical decisions for the solution and guide application developers in creating architectural designs and artifacts.

  • Accountable for the availability, stability, scalability, security, and recoverability enabled by these designs.

  • Ability to communicate clearly with the team and stakeholders.

Desired Skills

  • Bachelor’s Degree in Computer Science or a related engineering field.

  • 3+ years of programming experience in Java or Scala.

  • Hands-on experience designing, developing, and maintaining software frameworks using Kafka, Spark Streaming, and Spark batch processing.

  • Hands-on experience building big data pipelines using Hadoop components such as Apache Hive, Spark, and HBase.

  • Ability to understand API specs and identify relevant API calls and implementations.

  • Understanding of distributed file formats such as Apache Avro and Apache Parquet, and of common data transformation methods.

  • Good debugging, critical thinking, and interpersonal skills: the ability to interact and work well with members of other functional groups on a project team, and a strong sense of project ownership.

  • Well versed in processing and deployment technologies such as YARN.

  • Proficient in Unix environments and shell scripting.

  • Hands-on experience implementing CI/CD and automation using the Atlassian ecosystem.

  • Proven understanding of source control software such as Git or Bitbucket.

Job Band:

H5

Shift: 

1st shift (United States of America)

Hours Per Week:

40

Weekly Schedule:

Referral Bonus Amount:

0

Full time

JR-21028965

Manages People: No

Travel: Yes, 5% of the time

Manager:

Talent Acquisition Contact:

Jessica Kreiselmaier

Street Address

Primary Location:
16001 N Dallas Pkwy, Addison, TX 75001