Overview (Bank overview, GBS India overview, Function Overview)
Bank of America is one of the world’s leading financial institutions, serving individual consumers, small and middle-market businesses and large corporations with a full range of banking, investing, asset management and other financial and risk management products and services.
We are committed to attracting and retaining top talent across the globe to ensure our continued success. Along with taking care of our customers, we want to be the best place for people to work, and we aim to create a work environment where all employees have the opportunity to achieve their goals.
We are part of Global Business Services, which delivers technology and operations capabilities to all Bank of America lines of business (LOB) and enterprise functions.
Our employees help our customers and clients at every stage of their financial lives, helping them connect to what matters most. This purpose defines and unites us. Every day, we are focused on delivering value, convenience, expertise and innovation for the individuals, businesses and institutional investors we serve worldwide.
BA Continuum is a nonbank subsidiary of Bank of America and part of Global Business Services in the bank.
Shared Technology Services (STS) is a technology group providing solutions to the Global Business Services group, which in turn caters to multiple businesses such as Consumer Wealth Solutions, Global Markets, etc. Major offerings of STS include providing technical experts across ETL, Hadoop, .NET, Mainframe, Java and database technologies, with varying competency levels as per need.
Application Development - The individual will be part of Shared Technology Services (STS), providing Hadoop platform-based solutions to multiple lines of business. Following established written procedures, guidelines and techniques, the individual develops, enhances, tests, supports, maintains, and debugs software applications that support business units or support functions. This is an individual contributor role; the individual is fully competent to work under general direction on most projects of low to moderate complexity, and seeks guidance and technical direction from more senior associates or leads as needed. Meets functional and non-functional requirements; often responsible for the completion of a phase of a project.
- Work as a Hadoop developer as part of an Agile team.
- Understand the data model used by the project team.
- Develop solutions using Hadoop technologies.
- Perform unit testing and provide application support.
- Interact with the Product Owner, Scrum Master and end users to understand new business requirements and enhancement requests.
- Develop Unix/Autosys scripts as per project requirements.
- Resolve data batch job issues as per requirements.
- Contribute to story refinement and defining requirements.
- Participate in estimating the work necessary to realize a story/requirement through the delivery lifecycle.
- Understand and utilize basic architecture components in solution development.
- Code solutions and unit tests to deliver a requirement/story per the defined acceptance criteria.
- Execute automated test suites (integration, regression, performance); collect results and flag issues.
- Document and communicate required information for deployment, maintenance, support, and business functionality.
- Adhere to the team delivery/release process and cadence pertaining to code deployment and release.
- B.E./B.Tech/M.E./M.Tech/B.Sc./M.Sc./BCA/MCA (IT/CS specialization preferred)
Certifications (If Any)
- Cloudera Certified Professional (CCP), Cloudera Certified Associate (CCA) or other Big Data-related certifications would be preferred.
- 3+ years of Big Data experience developing solutions using Hadoop technologies – HDFS / Hive / Sqoop / MapReduce / Spark / Scala, preferably on the Cloudera distribution.
- 3+ years of strong Unix shell scripting capability, with exposure to a job scheduler such as AutoSys.
- Experience loading disparate datasets and pre-processing them using Hive; able to translate complex technical and functional requirements into detailed designs.
- Develop applications aligned with the Big Data strategy and roadmap.
- Ability to work with top-level stakeholders of Hadoop clusters, gather requirements and perform end-to-end delivery: analysis, design, table modeling, coding, testing and promotion to production.
- Good understanding of the functional testing process, scenarios and what to expect from a testing sequence
- Strong problem identification and problem-solving ability.
- Ability to understand and maintain existing automated scripts per specifications, validate that the delivered solution meets requirements, and provide inputs.
- Excellent communication skills
- Highly committed, with the ability to work in a fast-paced environment against tight deadlines.
- Ability to work in international virtual teams and matrix structures, and to be a good team player.
- Positive problem-solving attitude and willingness to work odd hours to support business demands.
- Banking domain knowledge.
- Working knowledge of Python or Java
- Spark model development/integration experience
- Experience with data analytics tools/languages.
- Big Data analytics and data science concepts are an added advantage.
- Experience with Agile Development, SCRUM, or Extreme Programming methodologies.
General Shift (11:30 AM to 8:30 PM / 12:30 PM to 9:30 PM)
GGM/ HYD/ MUM/ CHN/ GIFT.