Main characteristics
- Location
- Canberra, Australian Capital Territory
- Company
- Rubix Solutions Pty Ltd
- Salary
- $220k+ (depending on experience) + Super + Benefits
- Employment type
- Contract/Temp
- Category
- Information & Communication Technology
Job summary
Lead Data Engineer (DataOps) - NV1

- Responsible for the design, build and management of the information and data management pipeline
- Preparing data for analytical and operational use
- An aptitude for translating business problems into data and infrastructure/resource requirements and solutions
- Design, construct, test and maintain data pipelines to consolidate information from different source systems and pipelines
- Integrate, consolidate, cleanse and monitor the data
- Actively ensure the stability and scalability of systems and data platforms
- Bring the best of DevOps practices to the data world by embracing the emerging practice of DataOps

NV1 Security Clearance is required as a minimum for this position.

Skills, Experience & Requirements
- Tertiary education in Software Engineering, Computer Science, or a relevant industry qualification with a focus on software design and development
- Experience with a range of technical skills that could include knowledge of architecting and engineering cloud-based data solutions with the following products:
  - Teradata
  - Oracle Data Integrator (ODI)
  - Snowflake
  - Cloud platforms with a focus on PaaS or serverless:
    - AWS: Redshift/RDS, S3, EC2, Lambda, Step Functions, EMR, Glue, DynamoDB, Athena, Kinesis, CloudWatch, SQS, SNS, Fargate; or
    - Azure: Blob Storage, Synapse, Data Factory, Functions, Cosmos DB, Databricks, Log Analytics, Event Hubs, Azure Kubernetes Service
  - Big data technologies such as Hadoop, Spark Streaming, Flink, Hudi, Storm, NiFi, HBase, Hive, Zeppelin, Kafka, Ranger, Ambari
  - Programming languages such as Java, Node, C#, Go, Python, Scala, SAS, R
- Experience with DevOps principles and tools, including agile enterprise development environments, CI/CD implementation, continuous testing, cloud resource management (CloudFormation, Terraform, ARM templates, Pulumi, etc.) and automation of environment deployment
- Continuous integration/delivery tools such as Jenkins, AWS Code services, Azure DevOps or other similar industry tools
- Version control and development processes for data, low-level hardware and software configurations, and the code and configuration specific to each tool in the chain
- Experience with SQL-based technologies (e.g. PostgreSQL and MySQL) and NoSQL technologies (e.g. Cassandra and MongoDB)
- Experience with data lake and data warehousing solutions, architectures and low-level design principles
- Data modelling tools (e.g. ERwin, Enterprise Architect and Visio)
- Experience with SQL, Python and specialist ETL tools
- Strong skills in Azure DevOps or other similar industry tools
- Oracle ETL toolset experience is desirable
- Defence experience is desirable

This role is open to Australian citizens with a minimum of NV1 Security Clearance.

Benefits
- This role is based in Canberra
- Full-time permanent or contract opportunity with a flexible work culture
- Development and learning opportunities with progressive benefits
- Start date: Immediately (within 1 to 4 weeks)
- Salary offer: $220,000+ (depending on experience) + Super + Benefits
Apply
Please hit the Apply button and attach your most up-to-date resume, highlighting your relevant background and experience, including your qualifications, tools, technologies and skills.
Alternatively, contact John Murphy for a confidential discussion.
John Murphy, Accounts Director, [email protected]