79 Data Pipelines jobs in Saudi Arabia

Data Engineer – FinTech AI / Stock Market Data Pipelines

Riyadh, Riyadh UMATR

Posted today

Job Description

Data Engineer – FinTech AI / Stock Market Data Pipelines

Title: Data Engineer – FinTech AI / Stock Market Data Pipelines

Tech Stack: Apache Kafka, PySpark, TimescaleDB, PostgreSQL/MySQL, Snowflake/BigQuery/Redshift, AWS S3 / Data Lakes, Airflow / Prefect / Dagster

Responsibilities
  • Develop and maintain high-throughput real-time and batch pipelines for equities and financial market data.
  • Implement time-series databases and ensure performance at scale.
  • Manage data lakes, warehouses, and relational databases for analytics and reporting.
  • Collaborate with the AI/ML team to structure datasets for training and deploying trading models.
  • Ensure strong data governance, lineage, and reliability.
  • Optimize cloud infrastructure for low-latency trading and market analytics.
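Not part of the posting, but as a concrete illustration of the first responsibility above: collapsing raw ticks into OHLC bars is a typical transformation in an equities pipeline. The tick format and the 60-second bar interval below are assumptions for the sketch.

```python
def ticks_to_ohlc(ticks, interval=60):
    """Aggregate (timestamp_seconds, price) ticks into OHLC bars.

    Returns {bar_start_time: {"open", "high", "low", "close"}}, in time order.
    """
    bars = {}
    for ts, price in sorted(ticks):
        bucket = ts - (ts % interval)   # floor the timestamp to the bar boundary
        bar = bars.get(bucket)
        if bar is None:
            bars[bucket] = {"open": price, "high": price, "low": price, "close": price}
        else:
            bar["high"] = max(bar["high"], price)
            bar["low"] = min(bar["low"], price)
            bar["close"] = price        # latest tick in the bucket wins
    return bars

# Four ticks spanning two one-minute bars:
print(ticks_to_ohlc([(0, 100.0), (30, 101.5), (59, 99.0), (61, 102.0)]))
# → {0: {'open': 100.0, 'high': 101.5, 'low': 99.0, 'close': 99.0},
#    60: {'open': 102.0, 'high': 102.0, 'low': 102.0, 'close': 102.0}}
```

In production this logic would live inside the streaming engine (e.g., a windowed aggregation) rather than in-process Python, but the bucketing idea is the same.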
What Is In It For You
  • Competitive tax-free salary in Saudi Arabia (30k SAR)
  • Opportunity to work on cutting-edge fintech + AI projects
  • Be part of a CMA-licensed, regulated startup with strong growth prospects
  • Career progression in a high-visibility data engineering role
  • Work with a global, diverse, and ambitious team shaping the future of investing in the region
Requirements
  • 5+ years as a Data Engineer within fintech, trading, or financial services
  • Must have experience with stock market equities, tick data, OHLC, fundamentals, APIs, and sentiment analysis
  • PySpark (distributed data processing)
  • TimescaleDB or other time-series DBs
  • Relational Databases (PostgreSQL, MySQL)
  • Data Warehouses (Snowflake, BigQuery, Redshift)
  • Data Lakes (e.g., AWS S3 + Lake Formation)
  • Orchestration tools: Apache Airflow, Prefect, or Dagster
  • Solid experience designing scalable ETL/ELT pipelines for structured and unstructured data
  • Strong communication skills and ability to thrive in a fast-paced, startup-style environment
Nice to Have
  • Experience with Bloomberg, Refinitiv, or other financial market data providers
  • Knowledge of machine learning data preparation and feature engineering
  • Familiarity with financial compliance, data security, and regional regulations

This advertiser has chosen not to accept applicants from your region.

Data Engineer (Data Modeling & Architecture)

Riyadh, Riyadh Novel Overseas Corporation

Posted today

Job Description

**Work location**: KSA

Contract duration: 6 months (might get extended)

**Experience**:

- 5+ Years

Data Engineer (data modeling, architecture)

Data Integration Expert

FiftyFive Technologies

Posted 4 days ago

Job Description

This job requires relocation to Saudi Arabia for 12 months.

We are seeking a Data Integration Expert to architect and implement scalable, secure, and robust data platform solutions to support machine learning (ML), business intelligence (BI), and customer value management (CVM) analytics. This role requires deep expertise in data ingestion, ETL pipelines, and on-premise data platforms like Cloudera or Teradata.

Key Responsibilities:
  • Define and implement scalable architecture for data platforms with a focus on AI and analytics enablement.
  • Ensure the platform supports end-to-end ML pipelines, BI dashboards, and CVM analytics.
  • Design and maintain reference models for data ingestion, transformation, processing, and access.
  • Integrate security, governance, and compliance frameworks (including PDPL and other local regulatory standards).
  • Work closely with data scientists, analysts, and IT infrastructure teams to align architecture with business goals.
  • Manage and optimize ETL pipelines and ensure data quality, lineage, and metadata management.
  • Provide hands-on leadership in setting up, maintaining, and evolving on-premise platforms such as Cloudera or Teradata.
Required Skills:
  • 6–8 years of experience in building ETL pipelines, data integration, and data platform management.
  • Strong understanding of on-premise data ecosystems, preferably Cloudera (Hadoop) or Teradata.
  • Proficiency in data ingestion frameworks, data lakes, and batch + real-time data processing.
  • Experience in data governance, compliance, and security standards, especially PDPL or similar data privacy laws.
  • Strong knowledge of SQL, Spark, Hive, and scripting languages (e.g., Python, Bash).
  • Ability to collaborate across cross-functional teams and work independently in a fast-paced environment.
Preferred Qualifications:
  • Experience working in telecom or large-scale enterprise data environments.
  • Familiarity with Kafka, NiFi, and data orchestration tools like Airflow.
  • Knowledge of DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes) is a plus.

CIM Data Modelling & Data Integration Expert – Electric Utility

Riyadh, Riyadh Elia Grid International

Posted today

Job Description

Company Overview

Elia Grid International (EGI) is a global consultancy company specialized in tackling complex power system challenges. Our multidisciplinary team of experts offers strategic, technical and regulatory advice in all fields related to large power system integration, as well as a range of specialist solutions. We are an international company with offices in Brussels, Berlin, Abu-Dhabi, Riyadh, Kuala Lumpur and Calgary, and we have been successfully delivering projects in over 20 countries worldwide.

Role Summary

We are seeking a highly skilled CIM (Common Information Model) Data Modelling & Data Integration Expert with deep experience in the Electric Utility industry. The ideal candidate will have a strong foundation in IEC CIM standards, data architecture, and system integration, along with proven knowledge of Asset Management processes and practices. You will play a key role in enabling interoperability between enterprise systems (GIS, APM, EAM, ADMS, SCADA, etc.) by defining and managing data models and integration frameworks, aligned with the utility’s asset management and digital transformation strategy.

Key Responsibilities
  • Develop and maintain CIM-based data models for transmission, distribution, and generation asset domains using standards such as IEC 61970, IEC 61968, and IEC 62325.
  • Lead data integration initiatives between systems such as GIS, EAM (Maximo / SAP), APM platforms, ADMS, SCADA, and data lakes using CIM-based or other semantic data models.
  • Design and implement semantic mappings and canonical models to ensure consistent and accurate data exchange between systems.
  • Collaborate with asset management teams to understand current and future asset lifecycle management requirements, and ensure data models support use cases such as condition-based maintenance, AHI, risk-based planning, and outage analysis.
  • Define and maintain data dictionaries, mapping specifications, and data lineage documentation for enterprise asset data.
  • Support ETL / ELT processes, APIs, and real-time data exchange mechanisms as needed for integrating structured and unstructured asset data.
  • Advise on data governance, data quality, and master data management practices to ensure asset data integrity across the utility.
  • Lead or support workshops with business and IT stakeholders to define requirements and align on CIM data model adaptations / extensions.
  • Stay up to date with evolving CIM standards, industry practices, and contribute to internal knowledge sharing.
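As an illustration of the semantic-mapping responsibility above (not taken from the posting): a canonical model boils down to renaming and normalizing source-system fields into one agreed vocabulary. The field names on both sides of this sketch are invented; a real CIM mapping would target IEC 61968/61970 classes and attributes.

```python
# Hypothetical source-to-canonical field mapping for an asset record.
GIS_TO_CANONICAL = {
    "asset_id":   "mRID",              # unique identifier
    "volt_kv":    "ratedVoltage",      # rated voltage in kV
    "install_dt": "installationDate",
}

def to_canonical(record, mapping=GIS_TO_CANONICAL):
    """Rename source fields to canonical names; unmapped fields are dropped."""
    return {canon: record[src] for src, canon in mapping.items() if src in record}

gis_row = {"asset_id": "TX-0042", "volt_kv": 132, "install_dt": "2015-06-01", "crew": "A"}
print(to_canonical(gis_row))
# → {'mRID': 'TX-0042', 'ratedVoltage': 132, 'installationDate': '2015-06-01'}
```

Keeping the mapping declarative (data, not code) is what makes it reviewable alongside the data dictionaries and mapping specifications the role calls for.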
Candidate Profile
  • Bachelor’s or Master’s degree in Electrical Engineering, Computer Science, Information Systems, or related field.
  • 5+ years of experience in the Electric Utility sector, with at least 3 years working with CIM standards and data modeling.
  • Proven experience in Asset Management processes and systems, such as IBM Maximo, SAP EAM, or similar.
  • Hands-on experience integrating systems like GIS (ESRI, Smallworld), APM (IBM, GE, ABB), ADMS, SCADA, and / or data warehouses.
  • Strong understanding of power system models, network topology, and asset hierarchies.
  • Knowledge of integration frameworks such as ESBs, middleware, or APIs (e.g., MuleSoft, Dell Boomi, Azure Integration Services).
  • Familiarity with data governance practices, metadata management, and data quality standards.
We Offer

Join EGI for more than a job—embrace a career filled with growth and learning opportunities in an exciting, professional international setting. We offer a competitive salary package and comprehensive benefits.

Location

Riyadh

Application Process

Interested candidates should submit their CV in English.



ETL Developer

Integrated Solutions Tawantech

Posted 7 days ago

Job Description

About the Role

We are seeking an experienced ETL/ODS Developer with strong expertise in Informatica to design, develop, and maintain data integration solutions. The candidate will be responsible for building robust ETL processes, managing data flows, and supporting the organization’s operational data store (ODS) and data warehouse environments.

Key Responsibilities
  • Design, develop, and optimize ETL processes using Informatica PowerCenter / IICS.
  • Build and maintain ODS layers and data pipelines to support reporting and analytics.
  • Analyze data requirements, source-to-target mappings, and data transformation rules.
  • Ensure data quality, integrity, and consistency across systems.
  • Monitor ETL jobs, troubleshoot issues, and provide performance tuning.
  • Work closely with data architects, DBAs, and BI teams to deliver end-to-end solutions.
  • Develop and maintain technical documentation (ETL workflows, mappings, job schedules).
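A toy sketch of the source-to-target mapping idea in the responsibilities above (illustrative only; in this role the real mappings would live in Informatica, and the column names here are made up):

```python
# Each target column is produced from a source column plus a transformation rule,
# mirroring a source-to-target mapping document in miniature.
MAPPING = [
    # (target_col, source_col, transform)
    ("customer_id", "CUST_NO", str.strip),
    ("full_name",   "NAME",    lambda s: s.strip().title()),
    ("balance",     "BAL",     lambda s: round(float(s), 2)),
]

def transform_row(source_row):
    """Apply the source-to-target mapping to one source record."""
    return {tgt: fn(source_row[src]) for tgt, src, fn in MAPPING}

raw = {"CUST_NO": " 1001 ", "NAME": "  jane DOE ", "BAL": "2500.456"}
print(transform_row(raw))
# → {'customer_id': '1001', 'full_name': 'Jane Doe', 'balance': 2500.46}
```

The same mapping table doubles as documentation, which is exactly what the "technical documentation (ETL workflows, mappings)" responsibility asks for.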
Requirements
  • Bachelor’s degree in Computer Science, Information Systems, or related field.
  • Minimum 2 years of hands-on experience in ETL development using Informatica.
  • Strong SQL skills and experience with Oracle, SQL Server, or PostgreSQL.
  • Experience with ODS and Data Warehouse design principles.
  • Knowledge of data modeling, normalization, and schema design.
  • Familiarity with scheduling tools (Control-M, Autosys, etc.).
  • Strong analytical and problem-solving skills.
Nice to Have
  • Experience with cloud data platforms (AWS Redshift, Azure Synapse, GCP BigQuery, Snowflake).
  • Knowledge of Python or Shell scripting for automation.
  • Exposure to Agile/Scrum environments.
  • Informatica certification is a plus.


ETL Developer

IBEA

Posted 8 days ago

Job Description

Overview

IBEA fully automates B2B payments to vendors, addressing the chronic issue of delayed supplier payments prevalent in manual processing systems. Our AP automation solution streamlines operations, enables rapid scalability, accelerates business visibility, and reduces fraud risks. Located in Riyadh, IBEA helps businesses modernize their accounts payable departments and monetize supplier payments efficiently and effectively.

Role Description

This is a full-time hybrid role for an ETL Developer located in Riyadh, with some work-from-home flexibility. The ETL Developer will be responsible for designing, developing, and maintaining ETL processes. Daily tasks include data integration, data modeling, using ETL tools, and ensuring data accuracy and consistency. The developer will also work closely with data analysts and other stakeholders to meet project requirements and timelines.

Qualifications
  • Experience with Extract, Transform, Load (ETL) and ETL Tools
  • Strong Data Integration and Data Modeling skills
  • Excellent Analytical Skills
  • Ability to work in a hybrid setting, both remotely and on-site in Riyadh
  • Relevant experience in AP automation and financial data management is a plus
  • Bachelor's degree in Computer Science, Information Systems, or a related field
Seniority level
  • Entry level
Employment type
  • Full-time
Job function
  • Business Development and Sales



Senior ETL Developer - SSIS/DataStage

Jeddah, Makkah ITC infoTech

Posted today

Job Description

Senior ETL Developer - SSIS/DataStage
- Provide support on projects including designing, building, and maintaining metadata models and complex ETL packages and OLAP cubes
- Design and develop SQL Server stored procedures, functions, views and triggers to be used during the ETL process
- Build data transformations with SSIS, including importing data from files and moving data from one database platform to another
- Debug and tune SSIS or other ETL processes to ensure accurate and efficient movement of data
- Analyze and develop strategies and approaches to import and transfer data between source, staging, and ODS/Data Warehouse destinations
- Create processes and frameworks in the DataStage platform to assist with the successful deployment and maintenance of ETL processes.
- Configuring clustered and distributed scalable parallel environments.
- Monitoring jobs and identifying bottlenecks in the data processing pipeline
- Testing and troubleshooting problems in ETL system designs and processes.
- Test and prepare ETL processes for deployment to production and non-production environments
- Updating data within repositories, data marts, and data warehouses.
- Improving existing ETL approaches and solutions used by the company

Qualifications and Exp
- Bachelor's degree in computer science, information systems, or a similar field
- 8-10 years of strong ETL development experience with a mix of SSIS/DataStage
- At least 5 years of data integration (sourcing, staging, mapping, loading) experience, SSIS preferred
- Proficiency in SQL or another relevant coding language.
- Demonstrated experience with an enterprise-class integration tool such as SSIS, Informatica, Ab Initio, or DataStage
- Understanding of other ETL tools, such as Informatica, Oracle ETL, or Xplenty
- Previous Experience in Saudi Region will be a plus, especially in a Financial Institution


**Salary**: ﷼17,500.00 - ﷼26,000.00 per month

COVID-19 considerations:
No

Ability to commute/relocate:

- Jeddah: Reliably commute or planning to relocate before starting work (required)

Data Engineering Specialist

Riyadh, Riyadh Takamol Holding

Posted today

Job Description

Overview

Data Engineering Specialist role at Takamol Holding.

Key Responsibilities
  • Design, build, and manage scalable data pipelines to collect, transform, and store data from various internal and external sources.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
  • Ensure data integrity, accuracy, and availability across all systems and platforms.
  • Optimize data infrastructure for performance, reliability, and cost-efficiency.
  • Develop and maintain processes using modern data tools and platforms.
  • Implement data governance, security, and privacy best practices in compliance with regulatory requirements.
  • Monitor, troubleshoot, and improve data workflows and performance issues.
  • Document data architecture, data flows, and technical processes.
  • Support the development of data models and analytical datasets for reporting and AI/ML use cases.

Job Requirements
  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
  • 2–5 years of experience in data engineering, data warehousing, or similar roles.
  • Strong problem-solving and communication skills.
  • Analytical Thinking
  • Leadership
  • Teamwork and Time Management
Seniority level
  • Entry level
Employment type
  • Full-time
Job function
  • Engineering and Information Technology
Industries
  • Business Consulting and Services

 
