Tasks
About the Role
We are an emerging, AI-native, product-driven agile start-up under the Abu Dhabi government, and we are seeking a motivated and technically versatile Data Engineer to join our team. You will play a key role in delivering data platforms, pipelines and ML enablement within a Databricks-on-Azure environment.
As part of a stream-aligned delivery team, you'll work closely with Data Scientists, Architects and Product Managers to build scalable, high-quality data solutions for clients. You'll be empowered by a collaborative environment that values continuous learning, Agile best practices and technical excellence.
Ideal candidates have strong hands-on experience with Databricks, Python and ADF, and are comfortable in fast-paced, client-facing consulting engagements.
Skills and Experience Requirements
1. Technical
- Databricks (or similar), e.g. Notebooks (Python, SQL), Delta Lake, job scheduling, cluster and workspace management, and Unity Catalog access-control awareness
- Cloud data engineering, ideally Azure, including storage (e.g. ADLS, S3), compute and secrets management
- Development languages such as Python, SQL, C#, JavaScript, etc., especially for data ingestion, cleaning and transformation
- ETL/ELT, including structured logging, error handling and reprocessing strategies, across APIs, flat files, databases, message queues, event streaming, event sourcing, etc.
- Automated testing (ideally TDD), pairing/mobbing, trunk-based development, continuous deployment and Infrastructure-as-Code (Terraform)
- Git and CI/CD for notebooks, data pipelines and deployments
2. Integration & Data Handling
- Experienced in delivering platforms for clients, including file transfer, APIs (REST, etc.), SQL/NoSQL/graph databases, and formats such as JSON, CSV, XML and Parquet
- Data validation and profiling: assessing incoming data quality and coping with schema drift, deduplication and reconciliation
- Testing and monitoring pipelines: unit tests for transformations, data quality checks and pipeline observability
3. Working Style
- Comfortable leveraging the best of lean, agile and waterfall approaches; can contribute to planning, estimation and documentation as well as collaborative daily re-prioritisation
- Able to explain technical decisions to teammates or clients
- Documents decisions and keeps stakeholders informed
- Comfortable seeking support from other teams on Product, Databricks and Data Architecture matters
- Happy to collaborate with the Data Science team on complex subsystems
Requirements
Nice-to-haves
- MLflow or light MLOps experience (for the data science touchpoints)
- dbt, Dagster, Airflow or similar transformation/orchestration tools
- Understanding of security and compliance (esp. around client data)
- Past experience in consulting or client-facing roles
Candidate Requirements
- 5–8 years of experience (minimum 3–4 years hands-on with cloud/data engineering, 1–2 years with Databricks/Azure, and team/project leadership exposure)
- Bachelor's degree in Computer Science, Data Engineering, Software Engineering or Information Systems
Job Type: Full-time
Benefits
Visa, insurance, yearly flight ticket, bonus scheme, relocation logistics covered
The interviewing process consists of 2 or 3 technical/behavioral interviews.