21 Big Data jobs in Qatar
Big Data Engineer
Posted today
Job Description
Job Role: Big Data Engineer – Dremio Administrator
Location: Qatar/offshore
Job Type: Yearly Renewable Contract
Industry: Banking
Our client is looking for a skilled Big Data Engineer – Dremio Administrator to join their dynamic team. If you have strong hands-on experience with Dremio administration and a passion for optimizing big data environments, this role is perfect for you.
Role Overview:
- Administer and optimize the newly implemented Dremio platform
- Develop and deploy Machine Learning modules, scoring cards, and automated delivery platforms for banking use cases
- Build and manage data pipelines using Python & Airflow (a minimal illustrative sketch follows this list)
- Ensure high-performance SQL query optimization & data integration
- Collaborate with business teams to deliver end-to-end analytics & BI solutions
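For the Python & Airflow pipeline work noted above, here is a minimal, illustrative sketch of a daily DAG. It assumes Airflow 2.4+ and uses hypothetical placeholder tasks; it is not the client's actual pipeline, and in practice the extract step would typically read from Dremio via its JDBC/ODBC or Arrow Flight endpoints.

```python
# Minimal, illustrative Airflow 2.4+ DAG; the tasks and DAG name are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_source(**context):
    """Hypothetical extract step: pull the day's records from a source system."""
    print("extracting data for", context["ds"])


def load_to_warehouse(**context):
    """Hypothetical load step: write the extracted batch to the warehouse."""
    print("loading data for", context["ds"])


with DAG(
    dag_id="daily_banking_pipeline",   # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_source)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> load                    # run extract before load
```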
Must-Have Skills:
- Strong Banking / Financial services domain experience (preferred & highly valued)
- Proven expertise in Python
- Proficiency in MS SQL
- Hands-on experience with Apache Airflow
- Dremio administration (newly implemented – critical skill)
Nice-to-Have:
- Broader experience in Big Data technologies
- Exposure to Cloud platforms (AWS, Azure, GCP)
- Strong background in data pipelines, ETL & BI tools
Intern - Data Management
Posted today
Job Description
We are looking for a motivated and detail-oriented Data Management Intern to join our team. This role is ideal for freshers who have a basic understanding or academic exposure to data management, Excel, and data hygiene practices. The intern will support the team in maintaining accurate, clean, and organized datasets that are essential for business reporting and decision-making.
Key Responsibilities:
- Assist in collecting, organizing, and maintaining large volumes of data.
- Perform data cleaning and validation tasks to ensure accuracy and consistency.
- Use Microsoft Excel (including formulas, filters, pivot tables, etc.) to manage and analyze datasets.
- Regularly audit databases to ensure data hygiene and remove or correct inaccurate records.
- Support the preparation of reports and dashboards by providing clean and reliable data.
- Collaborate with cross-functional teams to gather data requirements and clarify inconsistencies.
- Document data processes and contribute to improving data handling procedures.
Requirements:
- Fresher – recent graduate or final-year student in any discipline (preferably Computer Science, Business, Statistics, or related fields).
- Working knowledge of Microsoft Excel – including formulas, formatting, sorting, and basic data analysis tools.
- Basic understanding of data cleaning, data validation, and data integrity.
- High attention to detail and accuracy.
- Good communication and organizational skills.
- Willingness to learn and take ownership of assigned tasks.
Preferred Skills (Nice to Have):
- Familiarity with tools like Google Sheets, MS Access, or basic SQL.
- Exposure to data visualization tools like Power BI or Tableau (not mandatory).
C-17 Product Data Management Specialist
Posted today
Job Description
At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.
C-17 Product Data Management Specialist
Boeing is hiring for a C-17 PDM specialist in Al-Udeid, Qatar.
Role Responsibilities:
- The position requires the successful applicant to deal directly with the military customer at Al Udeid Air Base and its personnel on a daily basis.
- The applicant must demonstrate the ability to adapt to a different cultural environment and the ability to work independently with minimal supervision.
- The applicant will verify that the customer logistics data management records received meet documented regulations and will coordinate with customers to resolve routine problems regarding data accuracy and data integrity.
- The applicant will transcribe the customer data into the appropriate data systems. These systems will include the US Air Force FMxC2 system, C-17 Standard Flight Data Recorder, and C-17 engine Quick Access Recorder repositories.
- The applicant will ensure data quality, maintain process records and files, keep accountability records and make final disposition of data, records and files.
- The successful applicant will be responsible for responding to customer requests for a variety of pre-programmed and customized C-17 data reports extracted from the data systems, principally from the US Air Force FMxC2 system.
Required Qualifications/Experience:
- US Security Clearance Required - US Citizen
- Candidates should also have previous experience working in an environment where flexibility is essential and prompt responses to ad-hoc customer requests are expected.
Desired Qualifications/Experience:
- Prior experience utilizing the USAF FMxC2 data system is beneficial
- Ideal candidates will have a working knowledge of performing product data management and configuration status accounting tasks in support of Department of Defense contracts as well as knowledge of quality systems (Boeing Quality Management System (BQMS) and Aerospace Standard (AS9100)).
- Applicable and appropriate educational/certification credentials from an accredited institution and/or equivalent experience is desired.
This International Local Hire Employee (ILHE) position is in support of the C-17 Globemaster III Sustainment (G3) contract.
This position offers relocation based on candidate eligibility.
This requisition is for an international, locally hired position in Al-Udeid, Qatar. Benefits and pay are determined at the local level, are not part of Boeing U.S.-based payroll, and will be commensurate with experience and qualifications in accordance with applicable Qatari law. Employment is subject to the candidate's ability to satisfy all Qatari labor and immigration formalities.
Applications for this position will be accepted until Oct. 09, 2025
Relocation
This position offers relocation based on candidate eligibility.
Security Clearance
This position requires an active U.S. Secret Security Clearance (U.S. Citizenship Required). (A U.S. Security Clearance that has been active in the past 24 months is considered active)
Visa Sponsorship
Employer willing to sponsor applicants for employment visa status.
Shift
Not a Shift Worker (Qatar)
Equal Opportunity Employer:
We are an equal opportunity employer. We do not accept unlawful discrimination in our recruitment or employment practices on any grounds, including but not limited to: race, color, ethnicity, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military and veteran status, or other characteristics covered by applicable law.
We have teams in more than 65 countries, and each person plays a role in helping us become one of the world's most innovative, diverse and inclusive companies. We are proud members of the Valuable 500 and welcome applications from candidates with disabilities. Applicants are encouraged to share with our recruitment team any accommodations required during the recruitment process. Accommodations may include but are not limited to: conducting interviews in accessible locations that accommodate mobility needs, encouraging candidates to bring and use any existing assistive technology such as screen readers and offering flexible interview formats such as virtual or phone interviews.
Data Management and Business Intelligence Analyst
Posted today
Job Description
- Data modelling: Develop custom data models and algorithms to apply to data sets
- Data Mining, Cleaning and Munging
- Data Visualization and Reporting (Power BI)
- Data warehousing and structures
- Business Process and Workflow (SharePoint)
- Statistical Analysis and Risk Analysis
- Database Programming (SQL)
- Software Engineering Skills
- Problem-Solving
- Effective Communication
- Bachelor's or Master's degree in Business, Computer Science, Information Science or a related field
- Eight (8) years of experience in a business environment with specific exposure to Business Processes and Data Analysis relating to Project Management required, preferably in the oil and gas industry
- Must have strong experience using a variety of data mining/data analysis methods, using a variety of data tools, building and implementing models, using/creating algorithms and creating/running simulations
- Experience in statistical and data mining techniques (Data Mining, Cleaning and Munging) to mine and analyze data from company databases, ERP and other legacy data sources/systems to drive optimization techniques and business strategies (a brief illustrative sketch follows this list)
- Experience in Data Visualization, dashboards and Reporting using Data Visualization Tools such as Power BI, Tableau, SAS or Python, with experience using web services and cloud tools
- Knowledge of Master data object definitions and models would be an advantage
- Knowledge of Business Process Analysis, Modelling, Optimization and Workflow automation
- Experience in business process analysis, modelling, notation, process improvement methodologies, optimization strategies and statistical process control
- Experience in documentation and diagramming approaches used to describe typical business components, including entity relationship diagrams, process diagrams and workflow diagrams
- Good exposure to Project Management principles and practices
- Thorough understanding of Business Process Management and the ability to plan/conduct studies and document the results
- Expert proficiency in using MS Office suite applications, MS SharePoint and Power BI
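As a brief, hedged illustration of the data cleaning and munging work described above, the sketch below uses pandas; the dataset and column names are invented for the example and are not drawn from the posting.

```python
# Illustrative pandas cleaning/munging sketch; column names are hypothetical.
import pandas as pd

raw = pd.DataFrame(
    {
        "project_id": ["P1", "P1", "P2", "P3", None],
        "cost_usd": ["1000", "1000", "2500", "n/a", "400"],
        "reported_on": ["2024-01-05", "2024-01-05", "2024-02-10", "2024-02-11", "2024-03-01"],
    }
)

clean = (
    raw.drop_duplicates()                       # remove exact duplicate rows
    .dropna(subset=["project_id"])              # drop rows missing a key field
    .assign(
        cost_usd=lambda d: pd.to_numeric(d["cost_usd"], errors="coerce"),  # "n/a" -> NaN
        reported_on=lambda d: pd.to_datetime(d["reported_on"]),
    )
)

# Simple aggregation that could feed a Power BI dashboard extract.
summary = clean.groupby("project_id", as_index=False)["cost_usd"].sum()
print(summary)
```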
V.I.E Construction Data Management Specialist / EasyPlant Operator (M/F)
Posted 1 day ago
Job Description
Please note: As a reminder, the VIE program is strictly reserved for applicants aged between 18 and 28 who are French or European nationals, in accordance with current regulations.
Starting Date: 1 November 2025
About The Role:
Easy Plant (EP) is an in-house construction tool of Technip Energies, developed to support the management of construction and to control site activities in order to reach a smooth, documented and structured plant hand-over to the client. EP has been progressively developed and tested by T.En over the 15 years since its launch, and it is today a proven support for the prefabrication, construction, pre-commissioning and commissioning phases of either modularized or stick-built approaches. EP populates the project WBS and interfaces and exchanges the required data with the main engineering IT tools (SPI, SPMAT, SPEL, Tekla, etc.), planning software (Primavera, MS Project), the 3D construction model (CSim), and construction execution software for prefabrication (SpoolGen).
The EasyPlant Operator plays a critical role in the population and management of department data in EasyPlant for his/her area from inception to completion. The position involves being part of a multicultural team of skilled Construction Data Management specialists, ensuring compliance with safety regulations, coordinating with other departments, and supporting successful construction, followed by commissioning and completion activities. The ideal candidate possesses strong technical expertise, team-player skills, rigour and a commitment to quality.
- Ensure data collection from discipline supervisors, HSES, Planning and Quality departments
- Check data and enter it into the tool
- Prepare reports and presentations
- Follow up on punch list points (opening and closure of punch points)
- Support newcomers in becoming familiar with the tools
About you:
- He/She has a degree in architecture, engineering, or a related field (preferred); a postgraduate qualification is also desirable
- He/She speaks and writes English fluently
- He/She is results-driven, delivering to the best of his/her capabilities despite tight deadlines
- He/She has a good team spirit
- He/She has a high level of attention to detail in data and is able to identify and report issues within the data. Experience and knowledge of Power BI are mandatory
What's next?
At Technip Energies, we prioritize internal applications and provide timely feedback to internal applicants. Our Recruiting Team screens and matches your skills, experience, and potential team fit against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. You can check your application progress directly in PeopleConnect Recruiting.
Once we receive your application, our recruiting team will screen and match your skills, experience, and potential team fit against the role requirements. We ask for your patience as the team works through the volume of applications within a reasonable timeframe. Check your application progress periodically via the personal account in the candidate profile created during your application.
We invite you to get to know more about our company by visiting our website and following us on LinkedIn, Instagram, Facebook, and YouTube for company updates.
Technip Energies attaches great importance to diversity and inclusion, which is why all our offers are open to people with disabilities.
Together, let's be part of the solution!
Same Posting Description for Internal and External Candidates
Data Engineer
Posted today
Job Description
About Commercial Bank
Commercial Bank, founded in 1975 and headquartered in Doha, plays a vital role in Qatar's economic development through offering a range of personal, business, government, international and investment services. At Commercial Bank of Qatar, we believe in empowering our employees, providing them with opportunities for growth and professional development.
By joining us, you'll be part of a workplace culture that fosters innovation, supports work-life balance, and encourages you to reach your full potential.
Join us in shaping the future of banking.
Job Summary
The Data Engineer is a key role in a major initiative to develop an Enterprise Data Fabric, which is part of CBQ's Data Strategy and overall Digital Transformation. As part of the Data Governance and Engineering department, the Data Engineer will implement various data management systems, such as Data Lake, RDM, Data Streaming and Metadata Management, that will form part of the Enterprise Data Fabric. The Data Engineer will also work closely with a number of IT, PMO, Data and Operations teams to understand their needs and ensure that the Enterprise Data Fabric contributes to overall data quality, availability, democratization and culture.
Key Responsibilities
- Participate in Data Management systems implementation projects: Data Lakehouse, Data Streaming, Metadata management, Reference Data Management
- Develop data pipelines to bring new data to Enterprise Data Fabric
- Ensure data pipelines are efficient, scalable, and maintainable
- Comply with data engineering and development best practices (CI/CD, Code Management, Testing, Knowledge Management, Documentation etc.)
- Ensure that all Data Policies are met within Enterprise Data Fabric.
- Ensure that implemented systems correspond with target Data Architecture
- Support Data teams (DQ, Data Governance, BI, Data Science) in achieving their goals
- Maintain an agile delivery process based on one of the following frameworks: Kanban or Scrum
- Ensure that SLAs with data consumers and data sources are maintained
- Implement all necessary monitoring and alerting
- Facilitate collaboration between IT and the wider Data team, and support the Data Science, BI and Data Excellence teams to ensure seamless integration and effective communication
Key Competencies
- Python (Advanced level)
- Airflow or Apache NiFi
- K8s (OpenShift), Docker
- RDBMS: MS SQL Server, PostgreSQL, Oracle
- ETL (at least one of): SSIS, Informatica PowerCenter, IBM Datastage, Pentaho
- SQL – Advanced user (Stored Procedures, Window functions, Temp Tables, Recursive Queries); a brief illustrative sketch follows this list
- Git (GitHub/GitLab)
- Kafka is a plus
- Object Storage (S3, GCS, ABS) is a plus
- Spark is a plus
- Experience with dbt is a plus
- Familiar with Data Warehousing (Star/Snowflake schemas) and Data Lake concepts
- Agile methodologies (Kanban, Scrum)
- Strong problem-solving skills
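As a small, illustrative example of the advanced SQL patterns referenced above (a window function and a recursive CTE), the sketch below uses Python's built-in sqlite3 module; the table and values are invented for the example, and window functions require SQLite 3.25 or newer.

```python
# Illustrative advanced-SQL sketch using the stdlib sqlite3 module (SQLite 3.25+ for window functions).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE txn (account TEXT, txn_date TEXT, amount REAL);
    INSERT INTO txn VALUES
        ('A', '2025-01-01', 100.0),
        ('A', '2025-01-02', -40.0),
        ('B', '2025-01-01', 250.0);
    """
)

# Window function: running balance per account.
running = conn.execute(
    """
    SELECT account, txn_date, amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY txn_date) AS running_balance
    FROM txn
    ORDER BY account, txn_date
    """
).fetchall()

# Recursive CTE: generate a simple date series.
dates = conn.execute(
    """
    WITH RECURSIVE d(day) AS (
        SELECT '2025-01-01'
        UNION ALL
        SELECT date(day, '+1 day') FROM d WHERE day < '2025-01-05'
    )
    SELECT day FROM d
    """
).fetchall()

print(running)
print(dates)
```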
Why Commercial Bank?
- Best Performing Bank in Qatar in The Banker's prestigious Top 1000 World Banks Rankings 2025.
- Best Digital Bank in the Middle East 2024 by World Finance and Best Mobile Banking App in the Middle East 2024 by Global Finance.
- An Innovation-Driven, Digital-First Environment where employees work with the latest tools and technologies to redefine banking
- Opportunities for Global Partnerships & International Exposure, connecting employees with global networks and perspectives.
- A focus on Employee Well-being & Work-Life Balance, ensuring a healthy and supportive environment for all team members
- Competitive Compensation & Benefits that ensure our employees are rewarded for their dedication and performance
- A strong Commitment to Diversity, Equity & Inclusion, fostering a culture that values every individual's unique perspective
At Commercial Bank, we don't just offer careers; we shape futures by pioneering digital transformation in Qatar's banking sector, blending a digital-first approach with innovative solutions to redefine banking.
Disclaimer
We appreciate your interest in joining CBQ. Please note that only selected candidates will be contacted for further steps in the hiring process. This job posting is for informational purposes only, and CBQ reserves the right to modify, withdraw, or close it at any time without notice.
Data Engineer
Posted today
Job Description
About the Data Platform
We are building a robust Data & AI platform to drive smart insights, enable automation, and empower strategic decision-making across various business sectors.
We are seeking a passionate and skilled Data Engineer to join our growing team and help design, develop, and optimize our data infrastructure on Microsoft Azure.
Job Summary
The Data Engineer will play a key role in designing, building, and maintaining scalable data pipelines and solutions using the Azure ecosystem, with a strong focus on Azure Data Factory, Azure Databricks, PySpark, and Delta Lake.
This role involves close collaboration with the Head of Data & AI to implement efficient, secure, and high-performance data workflows that enable advanced analytics and AI-driven insights.
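As a hedged sketch of the kind of bronze-to-silver PySpark and Delta Lake step implied by the Medallion Architecture mentioned above: the paths, column names, and Delta-enabled Spark session are assumptions for illustration, not the employer's actual workflow.

```python
# Illustrative bronze -> silver Medallion step; assumes a Databricks/Delta-enabled Spark session.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: raw ingested files, kept as-is.
bronze = (
    spark.read.option("header", "true")
    .csv("/mnt/raw/transactions/")            # hypothetical landing path
)

# Silver: cleaned, typed, de-duplicated data.
silver = (
    bronze.dropDuplicates(["transaction_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("ingested_at", F.current_timestamp())
    .filter(F.col("transaction_id").isNotNull())
)

(
    silver.write.format("delta")
    .mode("overwrite")
    .save("/mnt/silver/transactions/")        # hypothetical Delta table location
)
```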
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory and Azure Databricks.
- Implement data workflows leveraging PySpark and Delta Lake following Medallion Architecture principles.
- Build scalable and efficient data models and pipelines for both structured and unstructured data.
- Collaborate with Data Analysts, Data Scientists, and Business Stakeholders to deliver reliable data solutions.
- Ensure data quality, validation, and governance across all data pipelines.
- Optimize data performance, cost, and storage using Azure-native tools.
- Support AI/ML model deployment pipelines and integrate them into production workflows (a strong plus).
- Write clean, modular, testable, and well-documented Python code.
- Participate in architectural discussions, code reviews, and Agile ceremonies.
- Continuously identify and implement process improvements in data infrastructure and development workflows.
Key Qualifications
- 3+ years of experience in Data Engineering roles.
- Proven hands-on expertise with:
  - Azure Data Factory (ADF)
  - Azure Databricks
  - Delta Lake / Lakehouse Architecture
  - PySpark and distributed data processing
  - SQL and Python
- Strong understanding of data warehousing, data modeling, and data governance best practices.
- Familiarity with CI/CD pipelines, version control (Git), and DevOps practices.
- Excellent communication, problem-solving, and collaboration skills.
- Eagerness to learn and contribute to a rapidly evolving Data & AI landscape.
Data Engineer
Posted today
Job Description
Job Description – Data Engineer (Arabic Speaker)
Position Overview
We are seeking a skilled Data Engineer (Arabic Speaker) to join our growing team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines, ensuring data integrity, and enabling advanced analytics and reporting. Proficiency in Arabic is required to work effectively with regional stakeholders and clients.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines for structured and unstructured data.
- Build and optimize data warehouses, data lakes, and data models for analytics and reporting.
- Collaborate with business analysts, data scientists, and stakeholders to understand requirements and translate them into technical solutions.
- Ensure data quality, consistency, and security in line with organizational and regulatory standards.
- Work with cloud platforms (Azure, AWS, or GCP) to manage and optimize data infrastructure.
- Monitor, troubleshoot, and improve data pipeline performance and reliability.
- Develop data documentation, dictionaries, and workflows to support end users.
- Support Arabic-language datasets and ensure accurate handling of regional and linguistic requirements.
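As a minimal, illustrative sketch of the kind of Arabic text handling the last responsibility above refers to, the snippet below uses only the Python standard library; the specific normalization rules shown are common conventions for matching and deduplication, not requirements stated in the posting.

```python
# Illustrative Arabic text normalization using the standard library only.
# Removing tatweel and diacritics and unifying alef forms are common conventions,
# not requirements from the posting.
import re
import unicodedata

TATWEEL = "\u0640"                            # Arabic elongation character
DIACRITICS = re.compile(r"[\u064B-\u0652]")   # common harakat range
ALEF_VARIANTS = str.maketrans({"أ": "ا", "إ": "ا", "آ": "ا"})


def normalize_arabic(text: str) -> str:
    """Return a normalized form suitable for joining/deduplicating records."""
    text = unicodedata.normalize("NFKC", text)
    text = text.replace(TATWEEL, "")
    text = DIACRITICS.sub("", text)
    text = text.translate(ALEF_VARIANTS)
    return text.strip()


print(normalize_arabic("مُحَمَّـد"))   # -> "محمد"
```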
Qualifications & Skills
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 2–5+ years of proven experience as a Data Engineer or in a similar role.
- Strong knowledge of SQL, Python, and/or Java/Scala.
- Experience with big data tools (Spark, Hadoop, Kafka, etc.) and ETL frameworks.
- Hands-on experience with cloud data services (Azure Data Factory, AWS Glue, Google BigQuery, etc.).
- Familiarity with data visualization tools (Power BI, Tableau, QlikView, etc.) is a plus.
- Knowledge of data governance, security, and compliance standards.
- Strong problem-solving skills and ability to work in a fast-paced, multicultural environment.
- Fluency in Arabic (spoken and written) and strong proficiency in English.
Preferred Skills
- Experience in machine learning data preparation and advanced analytics.
- Knowledge of regional regulations around data privacy and protection (e.g., GDPR, NCSA Qatar, etc.).
- Background in telecom, government, or financial sectors is a plus.
Soft Skills
- Excellent communication and interpersonal skills to liaise with Arabic-speaking stakeholders.
- Strong analytical mindset with attention to detail.
- Team player with the ability to manage multiple priorities.
Location
- Doha, Qatar
Python Data Engineer
Posted today
Job Description
Apt Resources is hiring a Python Data Engineer for our client in the banking sector. This role focuses on Big Data architectures and data warehousing, with hands-on involvement in tools like Tableau, Teradata, NIFI, and IBM DataStage.
Key Responsibilities:
- Design scalable Big Data and DWH architectures.
- Develop KPIs and dashboards to analyze market trends.
- Work with Tableau for data visualization and reporting.
- Perform data migration using NIFI and ETL with IBM DataStage.
- Collaborate with teams and vendors on data-driven initiatives.
- Maintain data privacy, security, and regulatory compliance.
Requirements:
- 3 to 4 years of experience in data engineering or related roles.
- Bachelor's degree in Engineering or a related field.
- Certifications in Big Data or Data Science are preferred.
- Proficiency in SQL, Python, PySpark, and Shell scripting (Bash).
- Experience with data visualization tools like Tableau.
- Familiarity with ETL tools such as IBM DataStage and data pipelines using NIFI.
- Knowledge of REST, JSON, SOAP, and Web Services APIs (a brief illustrative example follows this list).
- Strong analytical, communication, and project leadership skills.
- Experience in telecom data (2G/3G) is an advantage.
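As a brief, hedged example of the REST/JSON integration skills listed above: the endpoint URL and response fields below are hypothetical placeholders, not a real service.

```python
# Illustrative REST/JSON call; the URL and fields are hypothetical placeholders.
import json
from urllib.request import Request, urlopen

url = "https://api.example.com/v1/accounts"          # hypothetical endpoint
req = Request(url, headers={"Accept": "application/json"})

with urlopen(req, timeout=10) as resp:
    payload = json.load(resp)                        # parse the JSON response body

# Flatten the records into rows ready for an ETL staging table.
rows = [
    (item.get("id"), item.get("balance"), item.get("currency"))
    for item in payload.get("accounts", [])
]
print(rows[:5])
```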
Salary: USD 2,000 - 2,500 per month
Senior Data Engineer- GCP
Posted today
Job Description
Job Title: Senior Data Engineer - GCP
Job Type: Contract
Job Location: Doha, Qatar
Contract Duration: Initial 1 Year and Extendable
Years of Experience: 5+ Years
Required Skills:
- Bachelor's degree in Computer Science or similar field or equivalent work experience.
- 5+ years of experience on Data Warehousing, Data Engineering or Data Integration projects.
- Expert with data warehousing concepts, strategies, and tools.
- Strong SQL background.
- Strong knowledge of relational databases like SQL Server, PostgreSQL, MySQL.
- Strong experience in GCP: Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Functions and GCS (a brief illustrative sketch follows this list).
- Good to have knowledge of SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS).
- Knowledge of AWS and Azure Cloud is a plus.
- Experience with Informatica PowerExchange for mainframe, Salesforce, and other new-age data sources.
- Experience in integration using APIs, XML, JSON, etc.
- In-depth understanding of database management systems, online analytical processing (OLAP), ETL (extract, transform, load) frameworks, data warehousing and data lakes.
- Good understanding of SDLC, Agile and Scrum processes.
- Strong problem-solving, multi-tasking, and organizational skills.
- Highly proficient in working with large volumes of business data and strong understanding of database design and implementation.
- Good written and verbal communication skills.
- Demonstrated experience of leading a team spread across multiple locations.
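As a small, hedged sketch of the BigQuery work referenced in the skills list above, the snippet below uses the google-cloud-bigquery client library; the project, dataset, and query are placeholders, and authentication is assumed to be configured via Application Default Credentials.

```python
# Illustrative BigQuery query via the google-cloud-bigquery client.
# Project/dataset/table names are placeholders; assumes Application Default Credentials.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")    # hypothetical project id

query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `my-gcp-project.sales_dw.fact_orders`         -- hypothetical fact table
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(query).result():               # runs the job and waits for completion
    print(row["order_date"], row["total_amount"])
```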
Role & Responsibilities:
- Work with business users and other stakeholders to understand business processes.
- Ability to design and implement dimension and fact tables
- Identify and implement data transformation/cleansing requirements
- Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
- Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions
- Design, develop and maintain ETL workflows and mappings using the appropriate data load technique
- Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
- Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
- Analyze and resolve problems and provide technical assistance as necessary.
- Partner with the BI team to evaluate, design and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
- Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
- Leverage transactional information and data from ERP, CRM and HRIS applications to model, extract and transform into reporting and analytics.
- Define and document the use of BI through user experiences/use cases and prototypes, and test and deploy BI solutions.
- Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers and quality issues; and continuously validate reports and dashboards and suggest improvements.
- Train business end-users, IT analysts, and developers.