Job description
- 12-month contract with possible extension
- NSW, VIC
- Must be able to obtain a Baseline security clearance
Must be an Australian Citizen
Locations: NSW, Victoria
The role is responsible for supporting and maintaining data assets on the client's Cloud Data Lake, Cloud EDW, legacy EDW and SAS analytics platforms. It provides operational support for the daily data delivery process and month-end processing, and responds to data-related requests in the BAU pipeline.
Key duties and responsibilities
- Experience with AWS cloud services such as S3 and Glue, or similar tools in a cloud environment
- Provide level 2/3 technical support for AWS, Control-M, Teradata (legacy EDW), Snowflake, and ETL tool-based data integration solutions.
- DevOps: understanding of DevOps processes and the ability to use DevOps tools in accordance with them
- Programming: high level of competency, including knowledge of supplementary programming languages such as Python
- Experience with Control-M or similar scheduling applications.
- Experience with file transfer tools such as GoAnywhere or similar
- Version control: demonstrated knowledge of version control and its appropriate use
- Facilitate continuous improvement measures in delivery principles, coding standards, documentation and provide training sessions to team.
- Prioritise work items and add them to a work queue.
- Understand, analyse and size user requirements.
- Development and maintenance of SQL analytical and ETL code.
- Development and maintenance of system documentation.
- Work within a state-of-the-art, greenfield DevOps environment.
- Collaborate with data consumers, database developers, testers and IT support teams.
- Proficiency in AWS services related to data engineering: AWS Glue, S3, Lambda, EC2, RDS
- Data pipeline design and development using ETL/ELT frameworks
- Proficiency with Snowflake as a cloud EDW and Teradata as an on-prem EDW
- Proficiency in programming languages: Python (preferred)
- Control-M orchestration/monitoring, or similar applications.
- Strong experience in Operational Support processes and working in a BAU environment.
Desirable:
- Experience with infrastructure-as-code tools such as CloudFormation or Terraform
- Experience with at least one ETL tool, e.g. dbt, Talend or Informatica
- Strong SQL proficiency
- SAS (Base, Enterprise Guide, SAS Viya)
To Apply:
If this sounds like the role for you, please submit an updated copy of your resume in MS Word format by hitting APPLY NOW, or contact Humaira Hashmi at HHashmi@dfp.com.au.
Applicants new to DFP may be asked to provide additional information, including work rights status, via a survey link. If requested, we ask that you provide this information to expedite your application.
DFP welcomes applications from Aboriginal and Torres Strait Islander people, people with diverse cultural and linguistic backgrounds and people with disability. In addition, DFP will provide reasonable adjustments for individuals with disability throughout the recruitment process. If you identify as a person with disability and require adjustments to the application, recruitment, selection and/or assessment process, please advise via adjustments@dfp.com.au or 1300 337 000 and indicate your preferred method of communication (email, phone, text) so we can keep in touch and meet your accessibility needs.
By clicking 'apply', you give consent that DFP may use your personal information to process your job application and to contact you for future employment opportunities. For further information on how DFP processes your personal information, please review the DFP Information Collection and Privacy Policy via https://www.dfp.com.au/about-us/policies. Do not submit any sensitive personal information in your resume.