79 Data Pipelines jobs in Saudi Arabia
Data Engineer – FinTech AI / Stock Market Data Pipelines
Posted today
Job Description
Title: Data Engineer – FinTech AI / Stock Market Data Pipelines
Tech Stack: Apache Kafka, PySpark, TimescaleDB, PostgreSQL/MySQL, Snowflake/BigQuery/Redshift, AWS S3 / Data Lakes, Airflow / Prefect / Dagster
Responsibilities
- Develop and maintain high-throughput real-time and batch pipelines for equities and financial market data.
- Implement time-series databases and ensure performance at scale.
- Manage data lakes, warehouses, and relational databases for analytics and reporting.
- Collaborate with the AI/ML team to structure datasets for training and deploying trading models.
- Ensure strong data governance, lineage, and reliability.
- Optimize cloud infrastructure for low-latency trading and market analytics.
We Offer
- Competitive tax-free salary in Saudi Arabia (30k SAR)
- Opportunity to work on cutting-edge fintech + AI projects
- Be part of a CMA-licensed, regulated startup with strong growth prospects
- Career progression in a high-visibility data engineering role
- Work with a global, diverse, and ambitious team shaping the future of investing in the region
Requirements
- 5+ years as a Data Engineer within fintech, trading, or financial services
- Must have experience with stock market equities, tick data, OHLC, fundamentals, APIs, and sentiment analysis
- PySpark (distributed data processing)
- TimescaleDB or other time-series DBs
- Relational Databases (PostgreSQL, MySQL)
- Data Warehouses (Snowflake, BigQuery, Redshift)
- Data Lakes (e.g., AWS S3 + Lake Formation)
- Orchestration tools: Apache Airflow, Prefect, or Dagster
- Solid experience designing scalable ETL/ELT pipelines for structured and unstructured data
- Strong communication skills and ability to thrive in a fast-paced, startup-style environment
Nice to Have
- Experience with Bloomberg, Refinitiv, or other financial market data providers
- Knowledge of machine learning data preparation and feature engineering
- Familiarity with financial compliance, data security, and regional regulations
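As an illustration of the tick-data and OHLC requirement above, here is a minimal sketch of aggregating raw ticks into one-minute OHLC bars with pandas (the timestamps, prices, and column names are invented for the example):

```python
import pandas as pd

# Hypothetical tick stream: one row per trade, indexed by trade time
ticks = pd.DataFrame(
    {
        "price": [100.0, 100.5, 99.8, 100.2, 101.0, 100.7],
        "size": [10, 5, 8, 12, 7, 9],
    },
    index=pd.to_datetime(
        ["2024-01-02 09:30:01", "2024-01-02 09:30:15", "2024-01-02 09:30:40",
         "2024-01-02 09:31:05", "2024-01-02 09:31:30", "2024-01-02 09:31:55"]
    ),
)

# Resample ticks into 1-minute OHLC bars and add traded volume per bar
bars = ticks["price"].resample("1min").ohlc()
bars["volume"] = ticks["size"].resample("1min").sum()
```

A production pipeline would do the same aggregation continuously over a Kafka stream (e.g., with PySpark structured streaming) before persisting bars into a time-series store such as TimescaleDB.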
Data Engineer (Data Modeling & Architecture)
Posted today
Job Description
Contract duration: 6 months (might get extended)
**Experience**:
- 5+ Years
Data Engineer (data modeling, architecture)
Data Integration Expert
Posted 4 days ago
Job Description
This job requires relocation to Saudi Arabia for 12 months.
We are seeking a Data Integration Expert to architect and implement scalable, secure, and robust data platform solutions to support machine learning (ML), business intelligence (BI), and customer value management (CVM) analytics. This role requires deep expertise in data ingestion, ETL pipelines, and on-premise data platforms like Cloudera or Teradata.
Key Responsibilities:- Define and implement scalable architecture for data platforms with a focus on AI and analytics enablement .
- Ensure the platform supports end-to-end ML pipelines , BI dashboards , and CVM analytics .
- Design and maintain reference models for data ingestion, transformation, processing, and access.
- Integrate security, governance , and compliance frameworks (including PDPL and other local regulatory standards).
- Work closely with data scientists, analysts , and IT infrastructure teams to align architecture with business goals.
- Manage and optimize ETL pipelines and ensure data quality, lineage, and metadata management .
- Provide hands-on leadership in setting up, maintaining, and evolving on-premise platforms such as Cloudera or Teradata.
Qualifications
- 6–8 years of experience in building ETL pipelines, data integration, and data platform management.
- Strong understanding of on-premise data ecosystems, preferably Cloudera (Hadoop) or Teradata.
- Proficiency in data ingestion frameworks, data lakes, and batch + real-time data processing.
- Experience in data governance, compliance, and security standards, especially PDPL or similar data privacy laws.
- Strong knowledge of SQL, Spark, Hive, and scripting languages (e.g., Python, Bash).
- Ability to collaborate across cross-functional teams and work independently in a fast-paced environment.
Nice to Have
- Experience working in telecom or large-scale enterprise data environments.
- Familiarity with Kafka, NiFi, and data orchestration tools like Airflow.
- Knowledge of DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes) is a plus.
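As a toy illustration of the ingestion, data-quality, and lineage themes above, here is a minimal batch-validation step (the function, field names, and source name are invented for the example):

```python
import datetime

def ingest_batch(rows, source_name):
    """Validate raw records and attach simple lineage metadata.

    Rows failing validation are quarantined rather than loaded,
    so downstream consumers only see clean data.
    """
    loaded, quarantined = [], []
    batch_ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
    for row in rows:
        # Illustrative quality rules: key must be present, amount must be numeric
        if row.get("customer_id") and isinstance(row.get("amount"), (int, float)):
            row["_lineage"] = {"source": source_name, "ingested_at": batch_ts}
            loaded.append(row)
        else:
            quarantined.append(row)
    return loaded, quarantined

clean, bad = ingest_batch(
    [{"customer_id": "C1", "amount": 42.0}, {"customer_id": None, "amount": "x"}],
    source_name="crm_extract",
)
```

In a real platform the same checks would run inside Spark or NiFi, and the lineage record would land in a metadata catalog rather than in the row itself.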
CIM Data Modelling & Data Integration Expert – Electric Utility
Posted today
Job Description
Company Overview
Elia Grid International (EGI) is a global consultancy company specialized in tackling complex power system challenges. Our multidisciplinary team of experts offers strategic, technical and regulatory advice in all fields related to large power system integration, as well as a range of specialist solutions. We are an international company with offices in Brussels, Berlin, Abu-Dhabi, Riyadh, Kuala Lumpur and Calgary, and we have been successfully delivering projects in over 20 countries worldwide.
Role Summary
We are seeking a highly skilled CIM (Common Information Model) Data Modelling & Data Integration Expert with deep experience in the Electric Utility industry. The ideal candidate will have a strong foundation in IEC CIM standards, data architecture, and system integration, along with proven knowledge of Asset Management processes and practices. You will play a key role in enabling interoperability between enterprise systems (GIS, APM, EAM, ADMS, SCADA, etc.) by defining and managing data models and integration frameworks, aligned with the utility’s asset management and digital transformation strategy.
Key Responsibilities
- Develop and maintain CIM-based data models for transmission, distribution, and generation asset domains using standards such as IEC 61970, IEC 61968, and IEC 62325.
- Lead data integration initiatives between systems such as GIS, EAM (Maximo / SAP), APM platforms, ADMS, SCADA, and data lakes using CIM-based or other semantic data models.
- Design and implement semantic mappings and canonical models to ensure consistent and accurate data exchange between systems.
- Collaborate with asset management teams to understand current and future asset lifecycle management requirements, and ensure data models support use cases such as condition-based maintenance, AHI, risk-based planning, and outage analysis.
- Define and maintain data dictionaries, mapping specifications, and data lineage documentation for enterprise asset data.
- Support ETL / ELT processes, APIs, and real-time data exchange mechanisms as needed for integrating structured and unstructured asset data.
- Advise on data governance, data quality, and master data management practices to ensure asset data integrity across the utility.
- Lead or support workshops with business and IT stakeholders to define requirements and align on CIM data model adaptations / extensions.
- Stay up to date with evolving CIM standards, industry practices, and contribute to internal knowledge sharing.
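To illustrate the canonical-model idea in the responsibilities above, here is a simplified sketch of mapping GIS and EAM records into one common asset representation (all system names, field names, and values are invented for the example; a real CIM-based model would use IEC 61968/61970 classes and identifiers):

```python
def to_canonical_asset(record, source_system):
    """Map a source-system record into a simplified canonical asset model.

    Each source system names the same asset attributes differently; the
    mapping table below translates them into one shared vocabulary.
    """
    mapping = {
        "gis": {"asset_id": "feature_id", "name": "label", "location": "coords"},
        "eam": {"asset_id": "equipment_no", "name": "description", "location": "site"},
    }
    fields = mapping[source_system]
    return {canonical: record[source_key] for canonical, source_key in fields.items()}

# Hypothetical records for the same transformer in two systems
gis_row = {"feature_id": "TX-001", "label": "Main St Transformer", "coords": "24.7,46.7"}
eam_row = {"equipment_no": "TX-001", "description": "Main St Transformer", "site": "Riyadh North"}
```

Once both systems resolve to the same canonical `asset_id`, cross-system use cases such as condition-based maintenance or outage analysis can join their data reliably.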
Candidate Profile
- Bachelor’s or Master’s degree in Electrical Engineering, Computer Science, Information Systems, or related field.
- 5+ years of experience in the Electric Utility sector, with at least 3 years working with CIM standards and data modeling.
- Proven experience in Asset Management processes and systems, such as IBM Maximo, SAP EAM, or similar.
- Hands-on experience integrating systems like GIS (ESRI, Smallworld), APM (IBM, GE, ABB), ADMS, SCADA, and / or data warehouses.
- Strong understanding of power system models, network topology, and asset hierarchies.
- Knowledge of integration frameworks such as ESBs, middleware, or APIs (e.g., MuleSoft, Dell Boomi, Azure Integration Services).
- Familiarity with data governance practices, metadata management, and data quality standards.
We Offer
Join EGI for more than a job—embrace a career filled with growth and learning opportunities in an exciting, professional international setting. We offer a competitive salary package and comprehensive benefits.
Location
Riyadh
Application Process
Interested candidates should submit their CV in English.
ETL Developer
Posted 7 days ago
Job Description
About the Role
We are seeking an experienced ETL/ODS Developer with strong expertise in Informatica to design, develop, and maintain data integration solutions. The candidate will be responsible for building robust ETL processes, managing data flows, and supporting the organization’s operational data store (ODS) and data warehouse environments.
Key Responsibilities
- Design, develop, and optimize ETL processes using Informatica PowerCenter / IICS.
- Build and maintain ODS layers and data pipelines to support reporting and analytics.
- Analyze data requirements, source-to-target mappings, and data transformation rules.
- Ensure data quality, integrity, and consistency across systems.
- Monitor ETL jobs, troubleshoot issues, and provide performance tuning.
- Work closely with data architects, DBAs, and BI teams to deliver end-to-end solutions.
- Develop and maintain technical documentation (ETL workflows, mappings, job schedules).
Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or related field.
- Minimum 2 years of hands-on experience in ETL development using Informatica.
- Strong SQL skills and experience with Oracle, SQL Server, or PostgreSQL.
- Experience with ODS and Data Warehouse design principles.
- Knowledge of data modeling, normalization, and schema design.
- Familiarity with scheduling tools (Control-M, Autosys, etc.).
- Strong analytical and problem-solving skills.
Nice to Have
- Experience with cloud data platforms (AWS Redshift, Azure Synapse, GCP BigQuery, Snowflake).
- Knowledge of Python or Shell scripting for automation.
- Exposure to Agile/Scrum environments.
- Informatica certification is a plus.
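The source-to-target mapping work described above can be sketched in plain SQL (the staging and ODS table names, columns, and values are invented for the example; an Informatica mapping expresses the same transformation rules graphically):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical staging (source) and ODS (target) tables
cur.execute("CREATE TABLE stg_orders (order_id TEXT, amount TEXT, status TEXT)")
cur.execute("CREATE TABLE ods_orders (order_id TEXT PRIMARY KEY, amount REAL, is_open INTEGER)")
cur.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [("O1", "19.99", "OPEN"), ("O2", "5.00", "CLOSED")],
)

# Source-to-target mapping: cast the text amount to REAL, derive an is_open flag
cur.execute(
    """
    INSERT INTO ods_orders (order_id, amount, is_open)
    SELECT order_id,
           CAST(amount AS REAL),
           CASE WHEN status = 'OPEN' THEN 1 ELSE 0 END
    FROM stg_orders
    """
)
rows = cur.execute("SELECT * FROM ods_orders ORDER BY order_id").fetchall()
```

The CAST and CASE expressions here play the role of the transformation rules a mapping document would specify column by column.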
ETL Developer
Posted 8 days ago
Job Description
Overview
IBEA fully automates B2B payments to vendors, addressing the chronic issue of delayed supplier payments prevalent in manual processing systems. Our AP automation solution streamlines operations, enables rapid scalability, accelerates business visibility, and reduces fraud risks. Located in Riyadh, IBEA helps businesses modernize their accounts payable departments and monetize supplier payments efficiently and effectively.
Role Description
This is a full-time hybrid role for an ETL Developer located in Riyadh, with some work-from-home flexibility. The ETL Developer will be responsible for designing, developing, and maintaining ETL processes. Daily tasks include data integration, data modeling, using ETL tools, and ensuring data accuracy and consistency. The developer will also work closely with data analysts and other stakeholders to meet project requirements and timelines.
Qualifications
- Experience with Extract, Transform, Load (ETL) and ETL Tools
- Strong Data Integration and Data Modeling skills
- Excellent Analytical Skills
- Ability to work in a hybrid setting, both remotely and on-site in Riyadh
- Relevant experience in AP automation and financial data management is a plus
- Bachelor's degree in Computer Science, Information Systems, or a related field
- Entry level
- Full-time
- Business Development and Sales
Senior ETL Developer – SSIS/DataStage
Posted today
Job Description
Key Responsibilities
- Provide support on projects including designing, building, and maintaining metadata models and complex ETL packages and OLAP cubes
- Design and develop SQL Server stored procedures, functions, views, and triggers to be used during the ETL process
- Build data transformations with SSIS, including importing data from files and moving data from one database platform to another
- Debug and tune SSIS or other ETL processes to ensure accurate and efficient movement of data
- Analyze and develop strategies and approaches to import and transfer data between source, staging, and ODS/Data Warehouse destinations
- Create processes and frameworks in the DataStage platform to assist with the successful deployment and maintenance of ETL processes
- Configure clustered and distributed scalable parallel environments
- Monitor jobs and identify bottlenecks in the data processing pipeline
- Test and troubleshoot problems in ETL system designs and processes
- Test and prepare ETL processes for deployment to production and non-production environments
- Update data within repositories, data marts, and data warehouses
- Improve existing ETL approaches and solutions used by the company
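The staging-to-ODS transfer mentioned above often boils down to an upsert: update rows that already exist in the target, insert the rest. A minimal sketch (table names and values are invented for the example; SSIS or DataStage would express this as a merge step):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging (id TEXT, qty INTEGER)")
cur.execute("CREATE TABLE ods (id TEXT PRIMARY KEY, qty INTEGER)")

# 'A' already exists in the ODS; the staging batch updates it and adds 'B'
cur.execute("INSERT INTO ods VALUES ('A', 1)")
cur.executemany("INSERT INTO staging VALUES (?, ?)", [("A", 5), ("B", 2)])

# Staging -> ODS upsert: existing keys are updated, new keys inserted.
# (WHERE TRUE avoids a SQLite parsing ambiguity with SELECT + ON CONFLICT.)
cur.execute(
    """
    INSERT INTO ods (id, qty)
    SELECT id, qty FROM staging WHERE TRUE
    ON CONFLICT(id) DO UPDATE SET qty = excluded.qty
    """
)
rows = cur.execute("SELECT * FROM ods ORDER BY id").fetchall()
```

The same pattern scales to incremental warehouse loads, where only changed source rows land in staging each run.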
Qualifications and Experience
- Bachelor's degree in computer science, information systems, or a similar field
- 8–10 years of strong ETL development experience with a mix of SSIS/DataStage
- At least 5 years of data integration (sourcing, staging, mapping, loading) experience, SSIS preferred
- Proficiency in SQL or another relevant coding language
- Demonstrated experience with an enterprise-class integration tool such as SSIS, Informatica, Ab Initio, or DataStage
- Understanding of other ETL tools, such as Informatica, Oracle ETL, or Xplenty
- Previous experience in the Saudi region will be a plus, especially in a financial institution
**Salary**: ﷼17,500.00 - ﷼26,000.00 per month
COVID-19 considerations:
No
Ability to commute/relocate:
- Jeddah: Reliably commute or planning to relocate before starting work (required)
Data Engineering Specialist
Posted today
Job Description
Overview
Data Engineering Specialist role at Takamol Holding.
Key Responsibilities
- Design, build, and manage scalable data pipelines to collect, transform, and store data from various internal and external sources.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data integrity, accuracy, and availability across all systems and platforms.
- Optimize data infrastructure for performance, reliability, and cost-efficiency.
- Develop and maintain processes using modern data tools and platforms.
- Implement data governance, security, and privacy best practices in compliance with regulatory requirements.
- Monitor, troubleshoot, and improve data workflows and performance issues.
- Document data architecture, data flows, and technical processes.
- Support the development of data models and analytical datasets for reporting and AI/ML use cases.
Job Requirements
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 2–5 years of experience in data engineering, data warehousing, or similar roles.
- Strong problem-solving and communication skills.
- Analytical Thinking
- Leadership
- Teamwork and Time Management
- Entry level
- Full-time
- Engineering and Information Technology
- Business Consulting and Services