This role will lead multiple technical/security architecture and design initiatives across enterprise data science and analytics platforms. The candidate should have excellent communication skills to socialize ideas and collaborate with internal partners and external vendors.
• 5+ years of experience designing, building, and operating large-scale, in-production Big Data and/or stream-processing solutions
• 3+ years of hands-on programming experience with tools/technologies such as HDFS, Sqoop, Hive, HBase, Flume, YARN, MapReduce, Spark, etc.
• Excellent understanding of data security concepts, including implementing authentication, encryption, and authorization
• Aptitude to evaluate and investigate software products from an information security perspective
• Monitor, prevent, and troubleshoot security-related issues
• Collect data to identify the root cause of problems
• Develop metrics that measure performance and identify indicators of future improvement opportunities
• Analyze metrics and parameters to identify the root cause of a problem or to select the best solution
• Identify and develop appropriate controls to ensure security and process compliance
• Track the performance of the solution to ensure it delivers the desired results, and conduct periodic assessments to validate its effectiveness
• Experience working in an Agile/SCRUM environment
• Ability to communicate concepts clearly and apply industry standards and best practices
• Ability to lead cross-functional teams, establish routines, and develop work roadmaps
• Experience building data cloud and computing platforms alongside Hadoop
• Experience with Apache Atlas and Apache Ranger
• Functional knowledge of metadata management and governance
• Risk Management exposure/experience
• Awareness of tools such as process maps, affinity diagrams, root cause analysis (RCA), etc.
Top 3 required technical skill-sets:
Hadoop stack, Data Security, J2EE
Shift: 1st shift (United States of America)
Hours Per Week: 40