
878 Data Engineer jobs in Saudi Arabia

Cloudera Big Data Engineer

SAR 120,000 - SAR 240,000 yearly | MDS for Computer Systems (MDS CS)

Posted today


Job Description

Job Title: Cloudera Big Data Engineer

Employment Type: Full-time

Role Overview

We are seeking a highly skilled Cloudera Big Data Engineer with 5+ years of hands-on experience in building and managing large-scale data pipelines using Apache NiFi, Apache Kafka, and the Cloudera Data Platform (CDP Private Cloud). The ideal candidate will have deep expertise in batch and real-time streaming architectures and will play a critical role in designing, implementing, and optimizing robust ingestion and processing frameworks to power enterprise analytics and AI/ML initiatives.

Key Responsibilities

Data Pipeline Design & Development

  • Design, implement, and maintain end-to-end data pipelines for both batch and real-time data ingestion using Apache NiFi and Kafka.
  • Ingest data from heterogeneous sources (databases, files, APIs, message queues, IoT devices) into the Cloudera ecosystem.
  • Build and manage Kafka topics, producers, and consumers to enable low-latency streaming data flows.
  • Implement complex data routing, transformation, validation, and enrichment logic within NiFi flows and Kafka streams.
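
Inside a Kafka consumer loop or a NiFi scripted processor, the routing, validation, and enrichment logic above usually reduces to small per-record functions. The following is a minimal, hypothetical Python sketch; the field names, topic convention, and dead-letter handling are illustrative only, not taken from the posting:

```python
import json
from datetime import datetime, timezone

def validate(record: dict) -> bool:
    # Reject records missing mandatory fields or with non-positive amounts.
    return "id" in record and "amount" in record and record["amount"] > 0

def enrich(record: dict, source: str) -> dict:
    # Attach lineage metadata so downstream consumers can trace origin.
    return {
        **record,
        "source": source,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

def process(raw: bytes, source: str = "orders-api"):
    """Route one message: return the enriched record, or None to dead-letter it."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError:
        return None  # malformed payload -> dead-letter queue
    if not validate(record):
        return None  # failed validation -> dead-letter queue
    return enrich(record, source)
```

In a real deployment, `process()` would be invoked once per polled Kafka message, with `None` results routed to a dead-letter topic for inspection.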

Platform Engineering & Integration

  • Configure and manage data ingestion components within Cloudera Data Platform (CDP), including NiFi, Kafka, HDFS, and Hive.
  • Integrate pipelines with downstream layers such as Hive, Impala, Kudu, or analytical databases to enable analytics and AI workloads.
  • Develop metadata-driven, reusable ingestion frameworks to accelerate new data source onboarding.

Optimization & Monitoring

  • Optimize NiFi flows and Kafka configurations for high throughput, scalability, and fault tolerance.
  • Implement comprehensive monitoring and alerting using Cloudera Manager, NiFi provenance, and Kafka metrics.
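
One common check behind Kafka-metric alerting is consumer lag: comparing each partition's log end offset with the consumer group's committed offset. A stdlib-only sketch, assuming the offsets have already been fetched (in practice from Kafka's admin API or Cloudera Manager); the threshold is an illustrative default:

```python
def partition_lag(end_offsets: dict, committed: dict) -> dict:
    # Lag per partition = log end offset minus the committed consumer offset.
    # A partition with no committed offset is treated as fully lagging.
    return {p: end_offsets[p] - committed.get(p, 0) for p in end_offsets}

def lag_alerts(end_offsets: dict, committed: dict, threshold: int = 10_000) -> list:
    # Return the partitions whose lag breaches the alerting threshold.
    lags = partition_lag(end_offsets, committed)
    return sorted(p for p, lag in lags.items() if lag > threshold)
```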
  • Troubleshoot complex ingestion and streaming issues across environments (DEV, UAT, PROD).

Governance, Security & Compliance

  • Ensure all pipelines adhere to data governance, lineage, and security standards in line with PDPL, NCA, and organizational frameworks.
  • Apply best practices in access control, encryption, data quality checks, and auditing.

Collaboration & Enablement

  • Work closely with data architects, data modelers, data scientists, and business analysts to deliver clean, reliable, and timely data.
  • Document flows, create reusable templates, and conduct knowledge transfer sessions to upskill internal teams.

Required Qualifications

  • Bachelor's degree in Computer Science, Information Systems, or a related field (Master's is a plus).
  • 5+ years of proven experience as a Data Engineer working on big data ecosystems, preferably on Cloudera CDP (Private Cloud).
  • Expert-level skills in:
      • Apache NiFi: flow design, scheduling, controller services, provenance tracking, error handling, templates.
      • Apache Kafka: topic creation, schema management, producer/consumer implementation, stream processing.
  • Building and maintaining batch & streaming data pipelines in production-grade environments.
  • Solid experience with HDFS, Hive/Impala, Linux, and shell scripting.
  • Good understanding of data security, lineage, and governance frameworks.
  • Strong scripting skills (e.g., Python, Bash, Groovy) for automation and orchestration.
  • Proficiency with Cloudera Manager or similar cluster management tools.

Preferred Skills

  • Cloudera certifications (e.g., CDP Data Engineer, CDP Administrator) are highly desirable.
  • Experience in real-time analytics and IoT ingestion scenarios.
  • Familiarity with CI/CD for data pipelines, Git, and DevOps practices.
  • Domain experience in telecom, financial services, or government is a plus.

Soft Skills

  • Strong analytical and problem-solving mindset.
  • Excellent communication skills (English; Arabic is a plus).
  • Ability to work independently and in cross-functional teams.
  • Proactive, structured, and delivery-focused in complex environments.

Data Engineer

Jeddah, Makkah P&G

Posted today


Job Description

Job Location

Jeddah

About the Job

We are currently looking for a Data Engineer to join our team in Jeddah, focused on Business Unit specific deliverables. At P&G, our IT teams are the enablers of our business. Together we help achieve increased efficiency, digitization, breakthrough innovation, speed to market, and better protection against security threats for our users and brands. We are all about applied IT. Use your drive and passion! From Day 1 you will be the manager of your domain and will put your skills and ideas into practice to support, develop, and improve the IT solutions for our business. Your contributions will make an impact on business results and help shape the direction of your space to take it to the next level.

Job Description

As a Data Engineer serving the Saudi business, you will move at the speed of business, working on the priorities that are most important and will deliver the most business impact at different points in time. Through this role you will have the opportunity to impact multiple areas of the business:

  • Go to Market with Direct Customers and Distributors.
  • Retail.
  • Product Supply Chain.
  • Brand and Digital Marketing.
  • Internal Business Planning and Operations.
Responsibilities
  • Develop business cases within Data & Analytics.
  • Build data & analytics solutions in Microsoft Azure, Google Cloud, or AWS; craft technical solutions from approved architecture to acquire, process, and store data and provide insights based on the processed data.
  • Develop within existing designs of various solutions in the Microsoft Azure environment to help the business get valuable insights.
  • Work on agile products using cloud solutions.
  • Automation: leverage technology for automated data fetching, processing, aggregation, and syndication, leading to real-time information flows and democratization of data access.
  • Lead IT projects in the market, liaising with internal teams, external partners, and strategic vendors, ensuring fit-for-use solutions and high usage adoption.
  • Manage IT operational excellence across solutions and systems (ERP, SFA, reporting solutions, etc.).
  • Lead the thinking and own trainings for IT and other functions, including documentation of best practices.
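
The automation work described above, automated fetching, processing, aggregation, and syndication, often starts with simple roll-ups like the following hypothetical Python sketch (the sales-by-region shape is illustrative only; a real pipeline would pull these rows from an API or warehouse table on a schedule):

```python
from collections import defaultdict

def aggregate_sales(rows):
    """Roll up raw (region, amount) transaction rows into per-region totals,
    the kind of aggregate that would then be syndicated to reporting tools."""
    totals = defaultdict(float)
    for region, amount in rows:
        totals[region] += amount
    return dict(totals)
```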
Qualifications
  • Undergraduate / Master's level qualifications in IT domains.
  • Proven Python and SQL programming skills.
  • Experience in Big Data / ETL (Spark and Databricks preferred).
  • Experience in implementing projects and solutions in the cloud (Azure / GCP preferred, or AWS).
  • Knowledge of and/or experience with using or building CI/CD tools.
  • Previous experience with or understanding of data models.
  • Able to access and manipulate data (KNIME, DAX, and Power BI front end).
  • Experience in leading and managing projects (knowledge of Agile, SCRUM, and DevOps methodologies preferred).
  • Understanding of IT Service Operations Management (ITIL v4 preferred).
  • Knowledge of privacy and information security.
  • Up to 3 years of demonstrated experience in the above fields is preferred.
Job Schedule

Full time

Key Skills

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Vacancy 1


Data Engineer

Master-Works

Posted 1 day ago


Job Description

  • Develop and maintain robust data architectures that support business needs and provide reliable data accessibility.
  • Collaborate with cross-functional teams to define data requirements and deliver scalable data solutions.
  • Implement ETL processes for data extraction, transformation, and loading, ensuring high data quality and integrity.
  • Optimize data storage and access strategies for improved performance and efficiency.
  • Monitor and troubleshoot data pipeline performance issues, implementing necessary fixes.
  • Create comprehensive documentation for data workflows and system architecture.
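
The ETL responsibilities above can be sketched end to end with only the Python standard library. The table names and cleaning rules here are hypothetical, and a production job would target a real warehouse rather than SQLite:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw user rows, transform them (normalize email and country,
    drop rows with malformed emails), and load them into a clean table.
    Returns the number of rows loaded."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS clean_users (id INTEGER, email TEXT, country TEXT)"
    )
    rows = conn.execute("SELECT id, email, country FROM raw_users").fetchall()
    loaded = 0
    for id_, email, country in rows:
        if not email or "@" not in email:
            continue  # data-quality gate: reject malformed or missing emails
        conn.execute(
            "INSERT INTO clean_users VALUES (?, ?, ?)",
            (id_, email.strip().lower(), (country or "").strip().upper()),
        )
        loaded += 1
    conn.commit()
    return loaded
```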
Requirements
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience in data engineering or related roles.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Solid experience with SQL databases and NoSQL technologies, such as Cassandra or MongoDB.
  • Familiarity with data warehousing solutions and big data technologies (e.g., Hadoop, Spark).
  • Strong analytical skills and attention to detail.


Data Engineer

Capgemini Engineering

Posted 1 day ago


Job Description

Work from home

Get the future you want!

At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities, where you can make a difference and where no two days are the same.

Your Role

We are looking for a passionate and experienced Data Engineer to join our growing team. In this role, you will design, build, and optimize scalable data infrastructure that powers intelligent decision-making across industries. You’ll work with cutting-edge technologies to integrate diverse data sources, build real-time and batch pipelines, and ensure data quality, governance, and performance. You’ll collaborate with cross-functional teams to deliver robust, secure, and high-performance data solutions that drive innovation and business value.

Key Responsibilities
  • Design and maintain data pipelines for structured, semi-structured, and unstructured data
  • Optimize Apache Spark for distributed processing and scalability
  • Manage data lakes and implement Delta Lake for ACID compliance and lineage
  • Integrate diverse data sources (APIs, databases, streams, flat files)
  • Build real-time streaming pipelines using Apache Kafka
  • Automate workflows using Airflow and containerize solutions with Docker
  • Leverage cloud platforms (AWS, Azure, GCP) for scalable infrastructure
  • Develop ETL workflows to transform raw data into actionable insights
  • Ensure compliance with data privacy standards (PII, GDPR, HIPAA)
  • Build APIs to serve processed data to downstream systems
  • Implement CI/CD pipelines and observability tools (Prometheus, Grafana, Datadog)
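
For the real-time streaming responsibilities above, the core windowing arithmetic of a streaming job (whether implemented in Kafka Streams or Spark Structured Streaming) can be illustrated in plain Python. Timestamps are epoch seconds and the tumbling-window size is an illustrative default:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Assign each (timestamp, key) event to a tumbling window and count
    occurrences per (window_start, key).

    In production this would be a streaming job with watermarking and state
    stores; this pure-Python version only shows the windowing arithmetic.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs  # floor to window
        counts[(window_start, key)] += 1
    return dict(counts)
```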
Your Profile
  • Bachelor’s or Master’s in Computer Science, Data Engineering, or related field
  • 5+ years of experience in data engineering and distributed systems
  • Expertise in Apache Spark and Delta Lake
  • Hands‑on experience with cloud services (AWS, Azure, GCP)
  • Strong skills in SQL and NoSQL databases (PostgreSQL, MongoDB, Cassandra)
  • Proficiency in data formats like Parquet, Avro, JSON, XML
  • Experience with Airflow, Docker, and CI/CD pipelines
  • Familiarity with data governance and compliance frameworks
  • Strong understanding of data quality, lineage, and error handling
  • Experience developing data APIs and working with MLOps tools
Preferred Skills
  • Experience with Kubernetes for container orchestration
  • Knowledge of data warehouses (Snowflake, Redshift, Synapse)
  • Familiarity with real‑time analytics platforms (Flink, Druid, ClickHouse)
  • Exposure to machine learning pipelines and IoT data integration
  • Understanding of graph databases (Neo4j) and data cataloging tools (Apache Atlas, Alation)
  • Experience with data versioning tools like DVC
What You’ll Love About Working Here
  • Flexible work arrangements including remote options and flexible hours
  • Career growth programs and diverse opportunities to help you thrive
  • Access to certifications in the latest technologies and platforms
About Capgemini

Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 360,000 team members in more than 50 countries. With its strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported 2022 global revenues of €22 billion.

Apply now!


Data Engineer

Müller's Solutions

Posted 1 day ago


Job Description

Overview

Müller's Solutions is seeking a talented Data Engineer with expertise in Talend to join our growing team. As a Data Engineer, you will play a critical role in building and maintaining data pipelines that enable seamless data integration and processing across various systems. You will work closely with data analysts and data scientists to ensure that data is accessible, reliable, and structured for analysis.

Responsibilities
  • Design, develop, and maintain ETL processes using Talend to extract, transform, and load data from various sources.
  • Collaborate with stakeholders to understand data integration requirements and ensure data is delivered accurately and on time.
  • Implement data quality checks and validation procedures to maintain data integrity.
  • Optimize data pipelines for performance and scalability, ensuring efficient data processing.
  • Monitor data flow and resolve any issues related to data processing or quality.
  • Document data architecture, ETL processes, and data lineage to enhance knowledge sharing within the team.
  • Stay updated on best practices and advancements in data engineering, Talend, and related technologies.
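
The data quality checks and validation procedures above, the kind a Talend job wires in as validation components, boil down to rules like required-field presence and key uniqueness. A hypothetical Python sketch (field names illustrative):

```python
def quality_report(rows, required, unique_key):
    """Summarize basic data-quality checks over a batch of dict records:
    how many rows are missing a required field, and how many duplicate
    the supposedly unique key."""
    missing = sum(
        1 for r in rows if any(r.get(f) in (None, "") for f in required)
    )
    keys = [r.get(unique_key) for r in rows]
    duplicates = len(keys) - len(set(keys))
    return {"rows": len(rows), "missing_required": missing, "duplicate_keys": duplicates}
```

A pipeline might fail the load, or route offending rows aside, whenever the report shows non-zero violation counts.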
Requirements
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer, with a strong focus on Talend.
  • Expertise in ETL design and implementation using Talend data integration tools.
  • Solid understanding of database management systems, SQL, and data warehousing concepts.
  • Experience with data modeling and database design best practices.
  • Strong analytical and problem-solving skills, with attention to detail.
  • Familiarity with cloud data solutions (e.g., AWS, Azure) is a plus.
  • Excellent communication skills for collaborating with cross-functional teams and stakeholders.
  • A self-starter with the ability to work independently as well as part of a team.
  • Knowledge of data governance and data security practices.
Benefits
  • Opportunity to work with a talented and passionate team.
  • Competitive salary and benefits package.
  • Exciting projects and an innovative work environment.


Data Engineer

Riyadh, Riyadh Arbete Careers

Posted 2 days ago


Job Description

One of our clients is hiring for the position of Data Engineer in Riyadh.

Job Title: Data Engineer

Location: Riyadh

Key Responsibilities:

  • Build and maintain scalable data pipelines and ETL workflows

  • Integrate multiple data sources, ensuring accuracy and timeliness

  • Support analytics and business intelligence teams with clean, optimized data

  • Work closely with architects and dev teams to enhance performance and data usability

Minimum Qualifications:

  • Bachelor's degree in Data Engineering, Computer Science, or a related field

  • 5+ years of hands-on data engineering experience

  • Strong command of ETL frameworks (e.g., Airflow, Talend, Glue)

  • Experience with big data tools (e.g., Spark, Hadoop, Kafka)

  • Advanced skills in SQL, Python, or Scala

Preferred:

  • Prior experience in government or high-compliance environments

  • Familiarity with Arabic language or translation workflows

  • Cloud certifications (AWS, Azure, GCP)



Data Engineer

Riyadh, Riyadh Denodo

Posted 2 days ago


Job Description

Denodo is always looking for technical, passionate people to join our Customer Success team. We want a professional who will travel, consult, develop, train and troubleshoot to enhance our clients’ journey around Data Virtualization.

Your mission: to help people realize their full potential through accelerated adoption and productive use of Denodo solutions.

In this role, you will successfully employ a combination of high technical expertise and client management skills to conduct on-site and off-site consulting, product implementation, and solutions development in either short or long-term engagements. You will be a critical point of contact for getting things done among Denodo, partners, and client teams.

Job Responsibilities & Duties

  • Obtain and maintain strong knowledge of the Denodo Platform, be able to deliver a superb technical pitch, including overview of our key and advanced features and benefits, service offerings, differentiation, and competitive positioning.
  • Constantly learn new things and maintain an overview of modern technologies.
  • Address technical questions concerning customization, integration, enterprise architecture, and general feature/functionality of our product.
  • Build and/or lead the development of custom deployments based on client requirements.
  • Provide timely, prioritized, and complete customer feedback to Product Management, Sales, Support, and Development regarding client business cases, requirements, and issues.
  • Train and engage clients in the product architecture, configuration, and use of the Denodo Platform.
  • Promote knowledge sharing and best practices while managing deliverables and client expectations.
  • Manage client expectations, establish credibility at all levels within the client organization, and build problem-solving partnerships with clients, partners, and colleagues.
  • Provide technical consulting, training, and support.
  • Develop white papers, presentations, training materials, or documentation on related topics.

Desired Skills & Experience

  • BS or higher degree in Computer Science or a related field, or equivalent experience in a similar role.
  • Several years of demonstrated experience as a Data Engineer or in a similar role, preferably in the data management or analytics software industry.
  • Solid understanding of SQL and good grasp of relational and analytical database management theory and practice.
  • Experience in Java software development, especially in web and database fields.
  • Good knowledge of JDBC, XML, and Web Services APIs.
  • Excellent verbal and written communication skills to interact with technical and business counterparts.
  • Fluent / Native in Arabic and English.
  • Active listener.
  • Strong analytical and problem-solving abilities.
  • Curiosity and eagerness to learn new things.
  • Creativity and ability to propose innovative solutions.
  • Team player with a positive attitude.

Non-Mandatory Skills (Nice to have)

  • Experience with GIT or other version control systems.
  • Experience with Big Data and/or NoSQL environments like Hadoop, MongoDB, etc.
  • Experience with caching technologies such as JCS.
  • Experience with Windows & Linux (and UNIX) server environments.
  • Experience in business software implementation and integration projects (e.g., ETL/Data Warehouse architectures, CEP, BPM).
  • Experience integrating with packaged applications (e.g., relational databases, SAP, Siebel, Oracle Financials, BI tools).
  • Industry experience supporting mission-critical software components.
  • Experience attending customer meetings and writing technical documentation.
  • Foreign language skills are a plus.

Data Engineer

Riyadh, Riyadh Lucidya LLC.

Posted 8 days ago


Job Description

The Data Engineer will play a crucial role in building and maintaining the data infrastructure at Lucidya. This role involves designing, developing, and managing scalable and robust data pipelines that support the analytics and machine learning teams. The Data Engineer will ensure that data is accessible, reliable, and organized to meet the growing analytical needs of the company.

The ideal candidate will have a strong background in data modeling, ETL processes, and database technologies. In addition to technical skills, the Data Engineer should possess excellent problem-solving abilities and a proactive attitude toward collaboration with other teams.

Key responsibilities include:

  • Designing and implementing ETL processes to move and transform data from various sources.
  • Building and maintaining the data infrastructure necessary for data storage, processing, and analysis.
  • Collaborating with data analysts and data scientists to understand their data needs and provide relevant support.
  • Ensuring data quality and integrity throughout the data lifecycle.
  • Optimizing data storage and retrieval processes for performance and scalability.
  • Documenting data flows, transformations, and any changes to data architecture.
Requirements

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • 3+ years of experience in a Data Engineer or similar role.
  • Strong knowledge of SQL and experience with database technologies (e.g., PostgreSQL, MySQL).
  • Proficient in programming languages such as Python, Java, or Scala.
  • Experience with ETL tools and frameworks (e.g., Apache Airflow, Talend, or similar).
  • Familiarity with cloud platforms (e.g., AWS, Google Cloud Platform, Azure) and their data services.
  • Knowledge of big data technologies (e.g., Hadoop, Spark) is a plus.
  • Strong attention to detail and ability to ensure data accuracy and consistency.
  • Excellent communication skills and ability to work collaboratively with cross-functional teams.
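
Orchestrators such as Apache Airflow, mentioned in the requirements above, run ETL steps in dependency order, which is essentially a topological sort of a task DAG. A sketch using Python's standard-library graphlib module (the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract feeds two transforms, which feed a load step.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"extract"},
    "load": {"clean", "aggregate"},
}

def run_order(graph):
    # static_order() yields tasks so that every dependency runs before its dependents.
    return list(TopologicalSorter(graph).static_order())
```

Airflow additionally handles scheduling, retries, and parallelism, but the execution order it computes follows this same principle.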
Why Join Us?

This is more than just an engineering role—it’s an opportunity to shape the infrastructure and technical future of Lucidya . You’ll play a key part in scaling our platform, building a strong engineering culture, and delivering technology that empowers companies across the region and beyond.

We offer Employee Stock Option Plans (ESOP) to give you ownership in the company’s success, along with performance-based bonuses to reward your impact and dedication. As part of a fast-growing team, you’ll have the autonomy to lead, innovate, and grow—both personally and professionally.


Data Engineer

Riyadh, Riyadh Master Works

Posted 8 days ago


Job Description


  • Develop and maintain robust data architectures that support business needs and provide reliable data accessibility
  • Collaborate with cross-functional teams to define data requirements and deliver scalable data solutions
  • Implement ETL processes for data extraction, transformation, and loading, ensuring high data quality and integrity
  • Optimize data storage and access strategies for improved performance and efficiency
  • Monitor and troubleshoot data pipeline performance issues, implementing necessary fixes
  • Create comprehensive documentation for data workflows and system architecture

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 3+ years of experience in data engineering or related roles
  • Proficiency in programming languages such as Python, Java, or Scala
  • Solid experience with SQL databases and NoSQL technologies, such as Cassandra or MongoDB
  • Familiarity with data warehousing solutions and big data technologies (e.g., Hadoop, Spark)
  • Strong analytical skills and attention to detail
  • Seniority level: Mid-Senior level
  • Employment type: Full-time
  • Job function: Other
  • Industries: IT Services and IT Consulting


Data Engineer

Riyadh, Riyadh Master Works

Posted 11 days ago


Job Description

Job Responsibilities:
  1. Develop and maintain robust data architectures that support business needs and provide reliable data accessibility.
  2. Collaborate with cross-functional teams to define data requirements and deliver scalable data solutions.
  3. Implement ETL processes for data extraction, transformation, and loading, ensuring high data quality and integrity.
  4. Optimize data storage and access strategies for improved performance and efficiency.
  5. Monitor and troubleshoot data pipeline performance issues, implementing necessary fixes.
  6. Create comprehensive documentation for data workflows and system architecture.
Minimum Requirements:
  1. Bachelor’s degree in Computer Science, Engineering, or a related field.
  2. 3+ years of experience in data engineering or related roles.
  3. Proficiency in programming languages such as Python, Java, or Scala.
  4. Solid experience with SQL databases and NoSQL technologies, such as Cassandra or MongoDB.
  5. Familiarity with data warehousing solutions and big data technologies (e.g., Hadoop, Spark).
  6. Strong analytical skills and attention to detail.
 
