Bank of America is one of the world’s leading financial institutions, serving individual consumers, small and middle-market businesses and large corporations with a full range of banking, investing, asset management and other financial and risk management products and services. We are committed to attracting and retaining top talent across the globe to ensure our continued success. Along with taking care of our customers, we want to be the best place for people to work, and we aim to create a work environment where all employees have the opportunity to achieve their goals.
We are part of Global Business Services, which delivers technology and operations capabilities to Bank of America lines of business (LOB) and enterprise functions.
Our employees help our customers and clients at every stage of their financial lives, helping them connect to what matters most. This purpose defines and unites us. Every day, we are focused on delivering value, convenience, expertise and innovation for individuals, businesses and institutional investors we serve worldwide.
* BA Continuum is a nonbank subsidiary of Bank of America, part of Global Business Services in the bank.
Build and evolve a consistent Authorized Data Source within Consumer & Small Business Bank (CSBB), with an organization and operating processes to support both the strategic and tactical analytics needs of the Consumer Bank.
- Hadoop developer for multiple initiatives.
- Develop Big Data Strategy and Roadmap for the Enterprise
- Experience in capacity planning, cluster design, and deployment
- Benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
- Develop a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels
- Continuously evaluate new technologies, innovate, and deliver solutions for business-critical applications
- Assist the team with the design of the architecture layer to ensure re-usable metrics and attributes within the reporting layer
- Create and maintain the necessary documentation (MDR) to ensure audit readiness
- Prototype improvement ideas
- Work effectively with the global team
- Provide technical leadership as an individual contributor
- Articulate challenges, and propose and drive solutions to make the entire India engagement successful
- Education: Any graduate
- Certifications If Any: NA
- Experience Range: 6 to 8 Years
- Mandatory skills
- Knowledge and development experience in shell scripting, Hive, and Autosys
- Strong hands-on experience in Spark (Scala and Python)
- Extensive knowledge of Python/Java
- Extensive knowledge of the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie
- Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
- Desired skills
- Knowledge of visual analytics tools (Tableau)
- Development of complex Tableau reports connecting to HDFS-based tables
- Experience with both Tableau Desktop and Tableau Server for developing and publishing reports
- Experience in real-time streaming (Kafka)
- Experience with NoSQL technologies (Cassandra, HBase)
- Experience in ETL, and banking card domain knowledge
- Experience with Big Data analytics and business intelligence, and with industry-standard tools integrated with the Hadoop ecosystem, using R, Python, and Scala
- Data integration and data security on the Hadoop ecosystem (Kerberos)
- Awareness of or experience with data lakes on the Cloudera ecosystem
Work Timings: 11:00 am – 8:00 pm (general shift); flexible to accommodate specific needs
Job Location: Gurgaon