
Senior Data Engineer

 

Since its foundation in 1982, Aramex has grown to become a world leader in comprehensive logistics and transportation solutions, recognized for its customized services and innovative products for businesses and consumers. Listed on the Dubai Financial Market (DFM) and headquartered in the UAE, we currently have business operations in over 567 cities across 66 countries worldwide and employ over 17,000 transportation professionals. Our breadth of services, including express courier, freight, logistics, supply chain management, e-commerce and records management, also gives us considerable reach. We remain committed to further enhancing our global operations and pursuing more opportunities for business growth.

We are looking for a Senior Data Engineer to join our team. The candidate will work in close collaboration with the solution engineers and the Data Science team, and should be able to fully understand the data pipelines behind our current use cases.

 

They will support the current Data Science use cases by building data pipelines into the data lake and data warehouse for both development and production purposes, and they will mentor and guide junior members of the data engineering team.

 

If you have world-class big data skills and experience as a driving force in building top-notch, enterprise-level big data use cases, we would love to hear from you.


In this role, you will work on a wide variety of problems, including:

  • Address geocoding
  • Delivery risk models
  • Incomplete address auto-completion using natural language processing
  • Automatic time slot prediction
  • Dynamic territory optimization
  • Profiling and predictive analytics for customers, consignees and shippers


The role will cover the following areas of responsibility:
 

  • Own the data design and data pipeline deployment, and establish standards, frameworks and solution governance.
  • Support the data pipelines and data architecture of current data science use cases.
  • Develop and maintain a data architecture that aligns with business needs.
  • Identify ways to improve our data reliability, efficiency and quality.
  • Collaborate with data management stakeholders such as data engineers, data scientists and product managers to identify requirements for complex, loosely defined business problems.
  • Perform data modeling and table design for the AWS Redshift data warehouse and PostgreSQL databases.
  • Create, modify and maintain ETL/ELT jobs that transform input datasets and load them into the data warehouse and production databases (a brief illustrative sketch follows this list).
  • Monitor, maintain and troubleshoot data integrity issues on the data warehouse and databases.
  • Build data pipelines to move data from internal and external data sources into the data lake and data warehouse.
  • Establish the principles of data quality management, including metadata, lineage and business definitions.
  • Prepare data for building machine learning models.
  • Provide technical guidance to team members.
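As a rough illustration of the ETL/ELT responsibilities above, the following minimal Python sketch loads one data lake partition from S3 into a Redshift table with a COPY statement. The table, S3 prefix, IAM role and connection string are hypothetical placeholders for illustration only, not a description of Aramex's actual pipelines.

# Minimal illustrative sketch; all names below are hypothetical placeholders.
import psycopg2

COPY_SQL = """
    COPY analytics.shipment_events                                    -- hypothetical target table
    FROM 's3://example-data-lake/shipments/2024/01/'                  -- hypothetical S3 prefix
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'   -- hypothetical IAM role
    FORMAT AS PARQUET;
"""

def load_shipments(dsn: str) -> None:
    """Run a Redshift COPY to ingest one data lake partition into the warehouse."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(COPY_SQL)
    # COPY ingests the S3 files in parallel; downstream ELT transformations
    # would then run as SQL inside the warehouse.

In production, a load like this would normally be wrapped in an orchestrated pipeline with retries and data quality checks rather than run as a standalone script.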

To be successful in this role, you will need at least the following:
 

  • Bachelor’s or master’s degree in Computer Science or Engineering.
  • 5+ years of experience in data management disciplines, working with cloud technologies.
  • Proficiency in programming languages such as Java and Python.
  • Experience with AWS or other cloud platforms.
  • Experience with big data technologies (e.g. Spark, Hadoop, Hive, Kafka).
  • Experience with data engineering concepts and database design.
  • Strong understanding of data modeling principles, including dimensional modeling and data normalization.
  • Experience with Docker and containerization.
  • Experience with pipeline orchestration tooling (e.g. Airflow, AWS Step Functions, Metaflow, Kubeflow); an illustrative Airflow sketch follows this list.
  • Advanced working SQL knowledge, including performance tuning.
  • Understanding of data engineering tools for data lake services (Glue, Glue Catalog, Redshift, Athena, Kinesis, S3, AWS Database Migration Service, SNS, SQS, Matillion).
  • Experience with continuous integration and deployment (CI/CD) tools such as AWS CodePipeline, Jenkins and Azure DevOps.
  • Experience with version control (Git).
  • Strong interpersonal and communication skills.
  • Excellent conceptual and analytical reasoning competencies.
  • Comfortable working in a fast-paced and highly collaborative environment; a great team player who embraces collaboration but also works well individually while supporting multiple projects in parallel.
  • Experience working on and delivering end-to-end projects independently or leading a team.
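
To give a concrete sense of the orchestration tooling mentioned above, here is a minimal Airflow sketch of a two-step daily pipeline. The DAG id, task ids and callables are hypothetical placeholders; a real pipeline would implement the actual extract and load logic.

# Illustrative Airflow sketch only; DAG id, task ids and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull raw order data from a source system and stage it in S3.
    ...


def load_to_redshift(**context):
    # Placeholder: COPY the staged files from S3 into the warehouse.
    ...


with DAG(
    dag_id="example_orders_pipeline",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_redshift", python_callable=load_to_redshift)

    extract >> load  # the load runs only after extraction succeeds

Equivalent pipelines can be expressed with AWS Step Functions or Kubeflow; Airflow is shown here only as one example of the listed tools.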
