Job title: DataOps Engineer
Job type: Permanent
Emp type: Full-time
Location: Dubai
Job published: 2024-07-23
Job ID: 33001
Contact name: Katie Atkins
Contact email: katie@pinkcamel.ae

Job Description

We are looking for a DataOps Engineer for a Global Trading Company based in Dubai.

Location – Dubai, UAE

Salary – DOE

Preferred Requirements:

  • University degree in computer science or a related field
  • 3+ years of experience in the engineering domain
  • Strong ability to inspect code and actively seek out security issues and vulnerabilities
  • In-depth knowledge of software development, security, operations principles and best practices
  • Extensive experience with programming languages such as Python, Perl, and Bash
  • Knowledge of configuration management tools such as Chef
  • An understanding of Public Cloud Security and its core components, including Compute Engine, Google Kubernetes Engine (GKE), Google Cloud Networking (Virtual Private Cloud, Subnets, Routes, Firewall Rules, VPN), Cloud SQL, Storage (Cloud Storage, Persistent Disk), Single Sign-On (SSO), and Identity and Access Management (IAM)
  • Experience with DevOps tools and best practices such as Git, Jenkins, CircleCI, and Ansible
  • Experience with Docker or similar container technologies
  • Proficiency in security and privacy principles, including best practices such as authentication, authorisation, encryption, and GDPR
  • Excellent spoken and written English communication skills

Key Responsibilities:

  • Work closely with engineering and operations to smoothly integrate security and privacy into all stages of software development.
  • Design, implement, and maintain our secure GCP cloud setup, ensuring it is efficient and reliable. Monitor and improve cloud resources’ performance, scalability, and cost-effectiveness.
  • Create high availability and disaster recovery plans for critical data systems. Quickly troubleshoot and fix issues in data pipelines and infrastructure to minimise downtime.
  • Collaborate with the security team to put strong security measures in place for sensitive data. Stay updated with industry practices and standards, actively addressing potential risks.
  • Safeguard data privacy using access controls, encryption, and other measures. Ensure the handling of confidential information complies with industry regulations.
  • Use automation and DevOps techniques to simplify data pipeline workflows. Develop and implement automated tools to enhance security controls.
  • Respond swiftly to security incidents like data breaches or cyber-attacks. Take immediate action to limit their impact, working alongside relevant teams.
  • Supervise the implementation of security measures in databases. Collaborate with diverse teams to create secure database environments that meet business needs.
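As a purely illustrative sketch of the automated security tooling described above (not part of the role description), the example below scans exported GCP firewall-rule definitions and flags ingress rules open to the whole internet. All rule names and data are hypothetical, and the dicts mimic the JSON shape produced by `gcloud compute firewall-rules list --format=json`.

```python
"""Illustrative only: a minimal security-control check of the kind a
DataOps Engineer might automate. It inspects firewall-rule definitions
(plain dicts here; in practice, exported GCP configuration) and flags
enabled ingress rules that accept traffic from anywhere (0.0.0.0/0)."""

def find_open_ingress_rules(rules):
    """Return the names of enabled INGRESS rules allowing 0.0.0.0/0."""
    flagged = []
    for rule in rules:
        # Skip egress rules and rules that are disabled.
        if rule.get("direction") != "INGRESS" or rule.get("disabled"):
            continue
        # A source range of 0.0.0.0/0 means "open to the entire internet".
        if "0.0.0.0/0" in rule.get("sourceRanges", []):
            flagged.append(rule["name"])
    return flagged

# Hypothetical exported rules for demonstration purposes.
rules = [
    {"name": "allow-ssh-anywhere", "direction": "INGRESS",
     "sourceRanges": ["0.0.0.0/0"], "disabled": False},
    {"name": "allow-internal", "direction": "INGRESS",
     "sourceRanges": ["10.0.0.0/8"], "disabled": False},
    {"name": "egress-all", "direction": "EGRESS",
     "sourceRanges": [], "disabled": False},
]

print(find_open_ingress_rules(rules))  # ['allow-ssh-anywhere']
```

In a real pipeline, a check like this would run on a schedule (e.g. via Airflow or Jenkins, both mentioned above) and alert the security team when a rule is flagged.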

About the Role:

In this pivotal role, you will maintain and optimise our cutting-edge cloud and data infrastructure, leveraging technologies such as Google Cloud Platform (GCP), Airflow, Docker, and PostgreSQL. You will be responsible for the seamless setup, efficiency, and security of data pipelines, databases, and associated systems.