Job Opening for Data Engineer (Job Code RT 1485).
Apply Before: 15 August 2025
Job Title: Data Engineer
Job Location: Remote
Employment Type: Contract
Experience Required: 3+ years
Educational Qualification: Advanced degree in Computer Science, Engineering, Data Science, AI, or a related field
Synopsis:
We are seeking a highly motivated Data Engineer to join our expanding Data & AI team. This role offers the opportunity to design and develop robust, scalable data pipelines and infrastructure, ensuring the delivery of high-quality, timely, and accessible data throughout the organization. As a Data Engineer, you will collaborate across teams to build and optimize data solutions that support analytics, reporting, and business operations. The ideal candidate combines deep technical expertise, strong communication, and a drive for continuous improvement.
Who You Are:
• Experienced in designing and building data pipelines for ingestion, transformation, and loading (ETL/ELT) of data from diverse sources to data warehouses or lakes.
• Proficient in SQL and at least one programming language, such as Python, Java, or Scala.
• Skilled at working with both relational databases (e.g., PostgreSQL, MySQL) and big data platforms (e.g., Hadoop, Spark, Hive, EMR).
• Competent in cloud environments (AWS, GCP, Azure) and in data lake and data warehouse solutions.
• Comfortable optimizing and managing the quality, reliability, and timeliness of data flows.
• Able to translate business requirements into technical specifications and to collaborate effectively with stakeholders, including data scientists, analysts, and engineers.
• Detail-oriented, with strong documentation skills and a commitment to data governance, security, and compliance.
• Proactive, agile, and adaptable to a fast-paced environment with evolving business needs.
What You Will Do:
• Design, build, and manage scalable ETL/ELT pipelines to ingest, transform, and deliver data efficiently from diverse sources to centralized repositories such as lakes or warehouses.
• Implement validation, monitoring, and cleansing procedures to ensure data consistency, integrity, and adherence to organizational standards.
• Develop and maintain efficient database architectures, optimize data storage, and streamline data integration flows for business intelligence and analytics.
• Work closely with data scientists, analysts, and business users to gather requirements and deliver tailored data solutions supporting business objectives.
• Document data models, dictionaries, pipeline architectures, and data flows to ensure transparency and knowledge sharing.
• Implement and enforce data security and privacy measures, ensuring compliance with regulatory requirements and best practices.
• Monitor, troubleshoot, and resolve issues in data pipelines and infrastructure to maintain high availability and performance.
Preferred Qualifications:
• Bachelor’s or higher degree in Computer Science, Information Technology, Engineering, or a related field.
• 3–4 years of experience in data engineering, ETL development, or related areas.
• Strong SQL and data modeling expertise with hands-on experience in data warehousing or business intelligence projects.
• Familiarity with AWS data integration tools (e.g., Glue, Athena), messaging/streaming platforms (e.g., Kafka, AWS MSK), and big data tools (Spark, Databricks).
• Proficiency with version control, testing, and deployment tools for maintaining code and ensuring best practices.
• Experience in managing data security, quality, and operational support in a production environment.
What You Deliver:
• Comprehensive data delivery documentation (data dictionary, mapping documents, models).
• Optimized, reliable data pipelines and infrastructure supporting the organization’s analytics and reporting needs.
• Operations support and timely resolution of data-related issues aligned with service level agreements.
Interdependencies / Internal Engagement:
• Actively engage with cross-functional teams to align on requirements, resolve issues, and drive improvements in data delivery, architecture, and business impact.
• Become a trusted partner in fostering a data-centric culture and ensuring the long-term scalability and integrity of our data ecosystem.