Bank of America is one of the world’s leading financial institutions, serving individual consumers, small and middle-market businesses and large corporations with a full range of banking, investing, asset management and other financial and risk management products and services.
We are committed to attracting and retaining top talent across the globe to ensure our continued success. Along with taking care of our customers, we want to be the best place for people to work, and we aim to create a work environment where all employees have the opportunity to achieve their goals.
We are part of Global Business Services, which delivers technology and operations capabilities to all Bank of America lines of business (LOBs) and enterprise functions.
Our employees help our customers and clients at every stage of their financial lives, helping them connect to what matters most. This purpose defines and unites us. Every day, we are focused on delivering value, convenience, expertise and innovation for the individuals, businesses and institutional investors we serve worldwide.
BA Continuum is a nonbank subsidiary of Bank of America, part of Global Business Services in the bank.
Shared Technology Services (STS) is a technology group providing solutions to the Global Business Services group, which in turn caters to multiple businesses such as Consumer Wealth Solutions and Global Markets. Major offerings of STS include providing technical experts spanning ETL, Hadoop, .NET, Mainframe, Java and Database, with varying competency levels as per need.
Application Development - The individual will be part of Shared Technology Services (STS), providing Hadoop platform-based solutions to multiple lines of business. Following established written procedures, guidelines and techniques, the developer develops, enhances, tests, supports, maintains and debugs software applications that support business units or support functions. This is an individual contributor role: fully competent to work under general direction on most low- to moderate-complexity projects, seeking guidance and technical direction from more senior associates or leads as needed. The role meets functional and non-functional requirements and is often responsible for the completion of a phase of a project.
• Work as a Hadoop developer as part of an Agile team.
• Understand the data model used by the project team.
• Develop solutions using Hadoop technologies.
• Perform unit testing and provide application support.
• Interact with the Product Owner, Scrum Master and end users to understand new business requirements and enhancement requests.
• Develop Unix/Autosys scripts as per project requirements.
• Resolve data batch-job issues as per requirements.
• Contribute to story refinement and defining requirements.
• Participate in estimating the work necessary to realize a story/requirement through the delivery lifecycle.
• Understand and utilize basic architecture components in solution development.
• Code solutions and unit tests to deliver a requirement/story per the defined acceptance criteria.
• Execute automated test suites (integration, regression, performance); collect results and flag issues.
• Document and communicate required information for deployment, maintenance, support and business functionality.
• Adhere to the team delivery/release process and cadence pertaining to code deployment and release.
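Several of the support duties above (batch-job monitoring, issue triage) come down to small scripted checks. As an illustrative sketch only (the log format and the SUCCESS/FAILED markers are hypothetical assumptions, not an actual Autosys or Hadoop log format), a Python helper for summarizing a batch run's log might look like:

```python
import re

def summarize_batch_log(log_lines):
    """Count SUCCESS/FAILED markers in a batch-job log.

    The markers here are hypothetical; real Autosys or Hadoop job
    logs vary by installation. Returns a (succeeded, failed) tuple
    so a caller can decide whether to rerun or escalate.
    """
    succeeded = sum(1 for line in log_lines if re.search(r"\bSUCCESS\b", line))
    failed = sum(1 for line in log_lines if re.search(r"\bFAILED\b", line))
    return succeeded, failed

# Example: two jobs succeeded, one failed
counts = summarize_batch_log([
    "2024-01-31 02:00 daily_customer_load SUCCESS",
    "2024-01-31 02:10 daily_orders_load SUCCESS",
    "2024-01-31 02:20 daily_txn_load FAILED rc=1",
])
# counts == (2, 1)
```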
Education - B.E. / B.Tech / M.E. / M.Tech / B.Sc / M.Sc / BCA / MCA (IT/CS specialization preferred)
Certifications (if any) - Cloudera Certified Professional (CCP), Cloudera Certified Associate (CCA) or other Big Data-related certifications are preferred
Experience Range - 4 to 6 years
• 3+ years of hands-on experience with Big Data technologies: HDFS, Hive, Spark, Impala, Sqoop
• Python/PySpark/Scala (good to have)
• Experience developing applications aligned with a Big Data strategy and roadmap
• RDBMS (Oracle, Teradata) knowledge for development using SQL/PL/SQL
• Strong Shell scripting experience
• Comfortable with Unix based systems
• Experience with fully integrated SDLC development environments
• Excellent communication skills and attention to detail
• Ability to understand business requirements and develop software solutions
• Knowledge of data warehousing and Extract, Transform, Load (ETL) tools
• Agile project management (Scrum, SAFe) methodologies
• Banking domain knowledge.
• API development experience
• Spark model development/integration experience
• Strong experience with data analysis tools/languages
• Big Data analytics and data science concepts are an added advantage
• Awareness of or experience with a data lake on the Cloudera Hadoop ecosystem
• Data integration and data security on the Hadoop ecosystem (e.g., Kerberos)
• Experience with Agile development, Scrum or Extreme Programming methodologies
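To make the HDFS/Hive items above concrete, here is a minimal sketch (the warehouse location, database, table and partition key names are hypothetical illustrations, not the bank's conventions) of building a date-partitioned table path in the common Hive warehouse layout:

```python
from datetime import date

def hive_partition_path(warehouse: str, db: str, table: str, load_date: date) -> str:
    """Build an HDFS path for one date partition, following the
    common Hive layout: <warehouse>/<db>.db/<table>/<key>=<value>.
    The partition key name 'load_date' is an illustrative choice.
    """
    return f"{warehouse}/{db}.db/{table}/load_date={load_date.isoformat()}"

path = hive_partition_path("/user/hive/warehouse", "sales", "orders", date(2024, 1, 31))
# path == "/user/hive/warehouse/sales.db/orders/load_date=2024-01-31"
```

Sqoop imports and Spark writers typically target partition directories like this, so a small helper keeps load scripts and Hive DDL in agreement.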
General Shift (11:30 AM to 8:30 PM / 12:30 PM to 9:30 PM)