
Big Data Engineer - GCP

Job Details

Melbourne CBD
Posted by Melissa Haddah about 1 year ago
- 6-month contract + likely extensions
- Melbourne CBD location
- Attractive day rate

Essential skills and experience:
* Proven working experience as a Big Data Engineer (3+ years)
* Experience with multiple Big Data technologies and concepts such as HDFS, Hive, MapReduce, Spark, Spark Streaming and NoSQL databases like HBase
* Experience with specific GCP technologies
* Hands-on experience with Big Data ETL tools such as Informatica Big Data Management or Talend Big Data Management is a strong plus
* Experience in one or more of Java, Scala, Python and Bash
* Ability to work in a team in a diverse, multi-stakeholder environment
* Experience working in a fast-paced Agile environment

Preferable skills and experience:
* Knowledge of and/or experience with Big Data integration and streaming technologies (e.g. Kafka, Flume, etc.)
* Experience building data ingestion frameworks for an enterprise data lake is highly desirable
* Experience with CI/CD pipelines using Jenkins
* Knowledge of building self-contained applications using Docker, Kubernetes or similar technologies

APPLY today  |  (03) 9236 7732

Expired job