

Hadoop Platform Engineer

Job Details

Location
Melbourne
Salary
Melbourne Location available
Job Type
Full Time
Ref
JO-1911-535836
Contact
Taliya Pagnozzi
Posted
12 months ago
  • Be a part of the world's leading independent, end-to-end IT services company, helping clients harness the power of innovation to thrive on change.
  • Based in Melbourne or Adelaide - you choose!
  • Open to Australian Citizen applicants only
 
Our client is looking for an experienced Hadoop Platform Engineer/Administrator to join their team. The position is responsible for performing platform engineering and administration duties on existing Industry Accounts and new channels, participating in cross-division, multi-function teams around go-to-market initiatives, and providing mentoring and guidance to peers and subordinates. The Hadoop Platform Engineer/Administrator will have subject matter expertise in one or more fields related to Analytics offerings.


The ideal candidate:
 
  • Has deep Analytics and Data Management knowledge including technologies, theories, and techniques.
  • Demonstrates excellent communication and business client presentation skills.
  • Helps clients identify business challenges and implement innovative solutions to deliver the right outcomes.
  • Has the ability to develop or contribute to business and technology solutions to address client needs.
  • Has a proven track record of achieving results and meeting client expectations.
  • Works independently as well as collaboratively with others.
  • Contributes to the development of innovative principles and ideas.
  • Routinely exercises independent judgment in developing methods, techniques, and criteria for achieving objectives.

Mandatory experience:
  • 5 or more years of experience in a Platform Engineering/Data Engineering/Administration role or related area.
  • Hands-on experience with a Hadoop distribution (Cloudera, Hortonworks or MapR).
  • Hands-on experience troubleshooting Java Virtual Machines (JVMs) in a clustered environment.
  • Hands-on experience with distributed file systems (HDFS).
  • Hands-on experience working with the following: Apache Kafka, Apache Spark, Apache Hive.
  • Hands-on experience working with the following: HBase, ZooKeeper.
  • Hands-on experience with TLS/SSL.
  • Hands-on experience with Document Management Systems (wikis, Atlassian Confluence).
  • Hands-on experience with Version Control Systems (Git, GitHub).
  • Exposure to large Linux environments (including day-to-day operations, networking, and security).
  • Exposure to Identity Management software (Red Hat Identity Management, IPA).
  • Exposure to LDAP and LDAPS in a large Linux/Active Directory environment.
  • Degree in Computer Science, Engineering or equivalent real-world work experience.

Due to the nature of the role advertised, only Australian Citizens will be considered.

If this sounds like you, I want to hear from you. Please click "Apply Now" or, for a confidential chat, call Taliya on 82281570.
 
