Job Description:
• 9+ years of software development experience building large-scale distributed data processing systems/applications (clusters of 50+ nodes)
• At least 3 years of experience architecting Big Data solutions at enterprise scale, including at least one end-to-end implementation
• Ability to articulate the pros and cons of to-be design/architecture decisions across a wide spectrum of factors
• Work closely with the Operations team to size, scale, and tune existing and new architectures
• Experience on core Big Data development projects; must be able to perform hands-on development with HDFS, Hive, HBase, Spark, Scala, and MapReduce. Experience with Hadoop ETL development via tools is a plus.
• Must have strong knowledge of Hadoop security components such as Kerberos, SSL, and encryption using TDE
• Ability to engage in senior-level technology discussions.
• Should have worked in agile environments; exposure to DevOps is a plus
• Excellent oral and written communication skills
Posted 29 Jan 2018