Job Details
Nationality Requirement: Malaysian
Job Description
Job Responsibilities
Data Pipeline Development
Design, develop, and maintain scalable, reliable data pipelines for ingestion, transformation, and storage.
Build and optimize ETL/ELT processes to ensure high data quality, integrity, and consistency.
Automate data workflows and implement monitoring solutions to ensure operational reliability and performance.
Data Storage & Management
Architect and implement efficient data storage solutions, including data warehouses, data lakes, and databases.
Ensure data security, integrity, and compliance with relevant regulatory standards.
Optimize data storage structures and query performance for scalability and efficiency.
Data Infrastructure & Technology
Evaluate, recommend, and implement modern data technologies to enhance data infrastructure capabilities.
Manage and optimize cloud-based data environments (e.g., Google Cloud Platform, BigQuery).
Develop and maintain comprehensive data documentation, including metadata and data lineage.
Collaboration & Communication
Partner with data scientists, analysts, AI engineers, and business stakeholders to gather and translate data requirements into scalable solutions.
Clearly communicate technical concepts to both technical and non-technical audiences.
Participate in code reviews and contribute to knowledge sharing and best practices within the team.
Performance Optimization & Support
Continuously optimize data pipelines and infrastructure for performance, scalability, and reliability.
Troubleshoot and resolve data-related issues in a timely manner.
Monitor system performance and proactively identify opportunities for improvement.
Business Intelligence & Dashboard Development
Prepare curated “gold layer” datasets for business consumption.
Develop and maintain user-friendly dashboards using BI tools (e.g., Looker Studio or similar platforms).
Guide and support business users in leveraging datasets to design or enhance dashboards effectively.
AI-Driven Data Analytics
Collaborate with AI Engineers to co-develop AI-powered analytics solutions, including chatbot interfaces.
Design scalable data models and define structured schemas to ensure context-ready data for AI applications.
Job Requirements
Bachelor’s degree in Computer Science, Data Engineering, or a related discipline (or equivalent practical experience).
Proven experience in a Data Engineer or similar role.
Strong proficiency in SQL and at least one programming language (e.g., Python, Java, or similar).
Hands-on experience with data warehousing and data lake technologies (e.g., BigQuery, Hadoop, Spark).
Experience designing and maintaining ETL/ELT processes and tools.
Familiarity with cloud platforms (e.g., GCP) and their data services.
Solid understanding of data modeling and database design principles.
Strong analytical and problem-solving abilities.
Excellent communication and collaboration skills.
Knowledge of end-to-end supply chain logistics is an added advantage.
Benefits
Hybrid Working Arrangement
EPF, SOCSO & EIS
Entitlement to various types of leave, such as Annual Leave, Medical Leave, Compassionate Leave, Paternity Leave, and Maternity Leave.
Fun, Supportive Working Environment
Career Growth Opportunities