Required Qualifications:
1. Bachelor's degree in a technical or business-related field, or equivalent education and related training
2. Seven years of experience with data warehousing architectural approaches, including a minimum of three years in big data (Cloudera)
3. Exposure to and strong working knowledge of distributed systems
4. Excellent understanding of client-service models and customer orientation in service delivery
5. Ability to grasp the "big picture" for a solution by considering all potential options in the impacted area
6. Aptitude for understanding and adapting to newer technologies
7. Ability to assist in the evaluation of new solutions for integration into the Hadoop Roadmap/Strategy
8. Ability to motivate internal and external resources to deliver on project commitments
9. Desire to learn new soft and technical skills, and to coach, mentor, and train peers throughout the organization
10. Ability to work collaboratively with teammates to achieve a mission
11. Presentation skills to prepare and present to both large and small groups on technical and functional topics
Essential Duties and Responsibilities:
Following is a summary of the essential functions for this job. Other duties may be performed, both major and minor, which are not mentioned below. Specific activities may change from time to time.
1. Sound understanding of and experience with the Hadoop ecosystem (Cloudera); able to understand and explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the relevant problems at hand
2. Experience working with a Big Data implementation in a production environment
3. Must have experience with Big Data technologies such as Hadoop, Spark, Kafka, and Kudu
4. Must have experience with Spark using Scala/PySpark
5. Experience in Python and Unix shell scripting is mandatory
6. Experience in the banking domain is mandatory
7. Sound knowledge of relational databases (SQL) and experience with large SQL-based systems
8. Experience in query optimization and performance tuning of complex SQL queries
9. Deep understanding of NLP algorithms, data structures, performance optimization techniques, and software development in a team environment
10. Benchmark and debug critical issues with algorithms and software as they arise
11. Lead and assist with the technical design/architecture and implementation of the big data cluster in various environments
12. Able to guide/mentor the development team, for example to create custom common utilities/libraries that can be reused across multiple big data development efforts
13. Provide technical resources to assist in the design, testing, and implementation of software code and infrastructure to support data infrastructure and governance activities
14. Collaborate with cross-functional teams
15. Practice and enforce Agile and Scrum development methodologies
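Several of the duties above center on SQL query optimization and performance tuning. As a minimal, hedged sketch of what that skill looks like in practice (using Python's built-in sqlite3 purely as a stand-in for a warehouse engine such as Hive or Impala; the table and index names are hypothetical), adding an index changes the planner's strategy from a full table scan to an index search:

```python
import sqlite3

# Sketch: inspect a query plan before and after adding an index.
# sqlite3 is only an illustrative stand-in for a production SQL engine.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE txn (id INTEGER PRIMARY KEY, account_id INTEGER, amount REAL)"
)

# Without an index on account_id, the filter forces a scan of txn.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM txn WHERE account_id = 42"
).fetchall()

# After adding an index, the planner can switch to an index search.
cur.execute("CREATE INDEX idx_txn_account ON txn(account_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM txn WHERE account_id = 42"
).fetchall()

# The last column of each plan row is the human-readable strategy.
print(plan_before[0][3])
print(plan_after[0][3])
```

The same explain-then-index workflow applies to the big data engines named in this posting, though the exact EXPLAIN syntax and plan output differ per engine.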
Preferred Qualifications:
1. Previous experience in the financial services industry
2. Broad BofA technical experience, a good understanding of existing testing/operational processes, and an open mind about how to enhance them
3. Understanding of industry trends and relevant application technologies
4. Experience in designing and implementing analytical environments and business intelligence solutions
Shift:
1st shift (United States of America)
Hours Per Week: