GCP Data Engineer at Endava

Responsibilities

  • Working with Endava’s customers to analyze existing systems and prepare a plan for migrating databases and data processing systems to the cloud
  • Defining and implementing a cloud data strategy, including security posture, target operating model, disaster recovery (DR) strategy, and more
  • Delivering end-to-end data solutions, from early concept and planning, through PoCs and benefits analysis, to production rollouts
  • Enhancing clients’ data landscapes by improving data lineage, quality, and reliability
  • Helping organizations adopt AI/ML-based solutions by introducing an MLOps culture
  • Being part of the technical advisory and cloud infrastructure team responsible for:
    • Secure foundational Data Lake and Data Mesh implementations
    • Automated provisioning of infrastructure and pipelines
    • Cloud-ready ETL/ELT architectures
    • Presentation of analytical findings on cutting-edge BI dashboards

Qualifications and Experience

  • Understanding of the entire software development lifecycle, CI/CD, and the Data/MLOps approach
  • Expert knowledge of SQL and at least one language used in the Data Analytics/Science space (Python, R, SAS)
  • Knowledge of at least one programming language (Java, C#, C++)
  • Knowledge of Big Data and Orchestration tools such as Apache Airflow or Spark
  • Experience working with Relational and NoSQL databases
  • Working experience with BI Tools (Looker, Power BI, Tableau, Data Studio)
  • Basic understanding of Git and various automation servers (Jenkins, CircleCI, GitLab CI)
  • Knowledge of messaging systems
  • Basic knowledge of containers, Docker, and Kubernetes
  • Cloud certifications such as Associate Cloud Engineer will be an asset

Remember to mention that you found this posting on KU Projekt & Job