2,530 Data Professionals jobs in Saudi Arabia
Data Engineer
Posted 1 day ago
Job Description
Job Location
Jeddah
About the Job
We are currently looking for a Data Engineer to join our team in Jeddah, focused on Business Unit specific deliverables. At P&G, our IT teams are the enablers of our business. Together we help achieve increased efficiency, digitization, breakthrough innovation, speed to market, and better protection against security threats for our users and brands. We are all about applied IT. Use your drive and passion! From Day 1 you will be the manager of your domain and will put your skills and ideas into practice to support, develop, and improve the IT solutions for our business. Your contributions will make an impact on business results and help shape the direction of your space to take it to the next level.
Job Description
As a Data Engineer serving the Saudi business, you will move with the speed of business, working on the priorities that matter most and will deliver the most business impact at different points in time. Through this role you will have the opportunity to impact multiple areas of the business:
- Go to Market with Direct Customers and Distributors.
- Retail.
- Product Supply Chain.
- Brand and Digital Marketing.
- Internal Business Planning and Operations.
- Develop business cases within Data & Analytics.
- Build data & analytics solutions in Microsoft Azure, Google Cloud, or AWS, crafting technical solutions from approved architecture to acquire, process, store, and provide insights based on the processed data.
- Develop within the existing designs of various solutions in the Microsoft Azure environment to help the business get valuable insights.
- Work on agile products using cloud solutions.
- Automation: leverage technology for automated data fetching, processing, aggregation, and syndication, leading to real-time information flows and democratization of data access.
- Lead IT projects in the market, liaising with internal teams, external partners, and strategic vendors, ensuring fit-for-use solutions and high usage adoption.
- Manage IT operational excellence across solutions and systems (ERP, SFA, reporting solutions, etc.).
- Lead the thinking and own training for IT and other functions, including documentation of best practices.
- Undergraduate or Master's level qualifications in IT domains
- Proven Python and SQL programming skills
- Experience in Big Data / ETL (Spark and Databricks preferred); see the sketch after this list.
- Experience in implementing projects & solutions in the cloud (Azure / GCP preferred, AWS).
- Knowledge and/or experience with using or building CI/CD tools.
- Previous experience or understanding of Data Models.
- Able to access and manipulate data (KNIME, DAX, and Power BI front end)
- Experience in leading and managing projects (knowledge of Agile, Scrum, and DevOps methodologies preferred)
- Understanding of IT Service Operations Management (ITIL v4 is preferred)
- Knowledge of Privacy and Information security
- Up to 3 years of demonstrated experience in the above fields is preferred.
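The Spark and Databricks requirement above lends itself to a brief illustration. Below is a minimal, hypothetical PySpark ETL sketch; the paths, table, and column names are invented for illustration and are not part of this posting, and it assumes a Spark environment such as Databricks is available.

```python
# Hypothetical PySpark ETL sketch: read raw sales data, clean it,
# aggregate it, and write the result. All paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-etl-sketch").getOrCreate()

# Extract: raw CSV landed by an upstream process (placeholder path).
raw = spark.read.csv("/mnt/raw/sales.csv", header=True, inferSchema=True)

# Transform: drop incomplete rows and aggregate revenue per day and store.
daily = (
    raw.dropna(subset=["store_id", "sale_date", "amount"])
       .withColumn("sale_date", F.to_date("sale_date"))
       .groupBy("sale_date", "store_id")
       .agg(F.sum("amount").alias("revenue"))
)

# Load: write Parquet partitioned by date (placeholder output path).
daily.write.mode("overwrite").partitionBy("sale_date").parquet("/mnt/curated/daily_sales")
```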
Full time
Key Skills: Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Vacancy 1
Data Engineer
Posted 2 days ago
Job Description
- Develop and maintain robust data architectures that support business needs and provide reliable data accessibility.
- Collaborate with cross-functional teams to define data requirements and deliver scalable data solutions.
- Implement ETL processes for data extraction, transformation, and loading, ensuring high data quality and integrity.
- Optimize data storage and access strategies for improved performance and efficiency.
- Monitor and troubleshoot data pipeline performance issues, implementing necessary fixes.
- Create comprehensive documentation for data workflows and system architecture.
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or related roles.
- Proficiency in programming languages such as Python, Java, or Scala.
- Solid experience with SQL databases and NoSQL technologies, such as Cassandra or MongoDB.
- Familiarity with data warehousing solutions and big data technologies (e.g., Hadoop, Spark).
- Strong analytical skills and attention to detail.
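As a rough illustration of the ETL, Python, and SQL requirements above, here is a minimal, self-contained Python sketch using only the standard library; the CSV file, table, and column names are invented placeholders rather than details from this posting.

```python
# Hypothetical ETL sketch: extract rows from a CSV file, transform them,
# and load them into a SQLite table. All names are illustrative placeholders.
import csv
import sqlite3

def run_etl(csv_path: str, db_path: str) -> int:
    # Extract
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: keep complete rows and normalize the amount to a float.
    cleaned = [
        (r["order_id"], r["customer_id"], float(r["amount"]))
        for r in rows
        if r.get("order_id") and r.get("amount")
    ]

    # Load: idempotent upsert into a simple orders table.
    conn = sqlite3.connect(db_path)
    with conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders "
            "(order_id TEXT PRIMARY KEY, customer_id TEXT, amount REAL)"
        )
        conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", cleaned)
    conn.close()
    return len(cleaned)

if __name__ == "__main__":
    print(run_etl("orders.csv", "warehouse.db"), "rows loaded")
```

In a production pipeline the same extract-transform-load shape would typically target a data warehouse rather than SQLite, with the transformation step expressed in Spark or SQL.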
Data Engineer
Posted 2 days ago
Job Description
Get the future you want!
At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities, where you can make a difference and where no two days are the same.
Your Role
We are looking for a passionate and experienced Data Engineer to join our growing team. In this role, you will design, build, and optimize scalable data infrastructure that powers intelligent decision-making across industries. You’ll work with cutting-edge technologies to integrate diverse data sources, build real-time and batch pipelines, and ensure data quality, governance, and performance. You’ll collaborate with cross-functional teams to deliver robust, secure, and high-performance data solutions that drive innovation and business value.
Key Responsibilities
- Design and maintain data pipelines for structured, semi-structured, and unstructured data
- Optimize Apache Spark for distributed processing and scalability
- Manage data lakes and implement Delta Lake for ACID compliance and lineage
- Integrate diverse data sources (APIs, databases, streams, flat files)
- Build real-time streaming pipelines using Apache Kafka
- Automate workflows using Airflow and containerize solutions with Docker
- Leverage cloud platforms (AWS, Azure, GCP) for scalable infrastructure
- Develop ETL workflows to transform raw data into actionable insights
- Ensure compliance with data privacy standards (PII, GDPR, HIPAA)
- Build APIs to serve processed data to downstream systems
- Implement CI/CD pipelines and observability tools (Prometheus, Grafana, Datadog)
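To illustrate the Kafka, Spark, and Delta Lake responsibilities above, here is a minimal, hypothetical Structured Streaming sketch; the broker address, topic name, schema, and paths are invented placeholders, and it assumes an environment where the Spark Kafka connector and the delta-spark package are available (for example, Databricks).

```python
# Hypothetical sketch: consume JSON events from Kafka with Spark Structured
# Streaming and append them to a Delta table. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-to-delta-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .start("/tmp/delta/events")                               # placeholder path
)
# query.awaitTermination()  # block until the stream is stopped
```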
- Bachelor’s or Master’s in Computer Science, Data Engineering, or related field
- 5+ years of experience in data engineering and distributed systems
- Expertise in Apache Spark and Delta Lake
- Hands‑on experience with cloud services (AWS, Azure, GCP)
- Strong skills in SQL and NoSQL databases (PostgreSQL, MongoDB, Cassandra)
- Proficiency in data formats like Parquet, Avro, JSON, XML
- Experience with Airflow, Docker, and CI/CD pipelines
- Familiarity with data governance and compliance frameworks
- Strong understanding of data quality, lineage, and error handling
- Experience developing data APIs and working with MLOps tools
- Experience with Kubernetes for container orchestration
- Knowledge of data warehouses (Snowflake, Redshift, Synapse)
- Familiarity with real‑time analytics platforms (Flink, Druid, ClickHouse)
- Exposure to machine learning pipelines and IoT data integration
- Understanding of graph databases (Neo4j) and data cataloging tools (Apache Atlas, Alation)
- Experience with data versioning tools like DVC
- Flexible work arrangements including remote options and flexible hours
- Career growth programs and diverse opportunities to help you thrive
- Access to certifications in the latest technologies and platforms
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 360,000 team members in more than 50 countries. With its strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported 2022 global revenues of €22 billion.
Apply now!
Data Engineer
Posted 2 days ago
Job Description
Overview
Müller's Solutions is seeking a talented Data Engineer with expertise in Talend to join our growing team. As a Data Engineer, you will play a critical role in building and maintaining data pipelines that enable seamless data integration and processing across various systems. You will work closely with data analysts and data scientists to ensure that data is accessible, reliable, and structured for analysis.
Responsibilities
- Design, develop, and maintain ETL processes using Talend to extract, transform, and load data from various sources.
- Collaborate with stakeholders to understand data integration requirements and ensure data is delivered accurately and on time.
- Implement data quality checks and validation procedures to maintain data integrity.
- Optimize data pipelines for performance and scalability, ensuring efficient data processing.
- Monitor data flow and resolve any issues related to data processing or quality.
- Document data architecture, ETL processes, and data lineage to enhance knowledge sharing within the team.
- Stay updated on best practices and advancements in data engineering, Talend, and related technologies.
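The data quality and validation responsibilities above can be illustrated with a small, hypothetical Python check of the kind that might run downstream of a Talend job; the database, table, and column names are invented and nothing here is prescribed by the posting.

```python
# Hypothetical post-load data quality checks against a SQLite table.
# The table and columns ("orders", "order_id", "amount") are placeholders.
import sqlite3

def check_orders(db_path: str) -> list[str]:
    issues = []
    conn = sqlite3.connect(db_path)
    try:
        # Completeness: no NULL keys.
        null_keys = conn.execute(
            "SELECT COUNT(*) FROM orders WHERE order_id IS NULL"
        ).fetchone()[0]
        if null_keys:
            issues.append(f"{null_keys} rows with NULL order_id")

        # Uniqueness: no duplicate keys.
        dupes = conn.execute(
            "SELECT COUNT(*) FROM (SELECT order_id FROM orders "
            "GROUP BY order_id HAVING COUNT(*) > 1)"
        ).fetchone()[0]
        if dupes:
            issues.append(f"{dupes} duplicate order_id values")

        # Validity: amounts must be non-negative.
        bad_amounts = conn.execute(
            "SELECT COUNT(*) FROM orders WHERE amount < 0"
        ).fetchone()[0]
        if bad_amounts:
            issues.append(f"{bad_amounts} rows with negative amount")
    finally:
        conn.close()
    return issues

if __name__ == "__main__":
    for line in check_orders("warehouse.db") or ["all checks passed"]:
        print(line)
```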
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer, with a strong focus on Talend.
- Expertise in ETL design and implementation using Talend data integration tools.
- Solid understanding of database management systems, SQL, and data warehousing concepts.
- Experience with data modeling and database design best practices.
- Strong analytical and problem-solving skills, with attention to detail.
- Familiarity with cloud data solutions (e.g., AWS, Azure) is a plus.
- Excellent communication skills for collaborating with cross-functional teams and stakeholders.
- A self-starter with the ability to work independently as well as part of a team.
- Knowledge of data governance and data security practices.
Why Join Us:
- Opportunity to work with a talented and passionate team.
- Competitive salary and benefits package.
- Exciting projects and innovative work environment.