14 Reporting Engineer jobs in Qatar
Data Engineer
Posted today
Job Description
About Commercial Bank
Commercial Bank, founded in 1975 and headquartered in Doha, plays a vital role in Qatar's economic development by offering a range of personal, business, government, international and investment services. At Commercial Bank of Qatar, we believe in empowering our employees and providing them with opportunities for growth and professional development.
By joining us, you'll be part of a workplace culture that fosters innovation, supports work-life balance, and encourages you to reach your full potential.
Join us in shaping the future of banking.
Job Summary
The Data Engineer is a key role in the Enterprise Data Fabric development initiative, a major part of CBQ's Data Strategy and overall Digital Transformation. As part of the Data Governance and Engineering department, the Data Engineer will implement various data management systems, such as the Data Lake, RDM, Data Streaming, and Metadata Management, that make up the Enterprise Data Fabric. The Data Engineer will also work closely with a number of IT, PMO, Data, and Operations teams to understand their needs and ensure that the Enterprise Data Fabric contributes to overall data quality, availability, democratization, and culture.
Key Responsibilities
- Participate in Data Management systems implementation projects: Data Lakehouse, Data Streaming, Metadata management, Reference Data Management
- Develop data pipelines to bring new data to Enterprise Data Fabric
- Ensure data pipelines are efficient, scalable, and maintainable
- Comply with data engineering and development best practices (CI/CD, Code Management, Testing, Knowledge Management, Documentation etc.)
- Ensure that all Data Policies are met within Enterprise Data Fabric.
- Ensure that implemented systems correspond with target Data Architecture
- Support Data teams (DQ, Data Governance, BI, Data Science) in achieving their goals
- Maintain an agile delivery process based on one of the following frameworks: Kanban or Scrum
- Ensure that SLAs with data consumers and data sources are maintained
- Implement all necessary monitoring and alerting
- Facilitate collaboration between IT and the wider Data team, supporting the Data Science, BI, and Data Excellence teams to ensure seamless integration and effective communication.
Key Competencies
- Python (Advanced level)
- Airflow or Apache NiFi
- K8s (OpenShift), Docker
- RDBMS: MS SQL Server, PostgreSQL, Oracle
- ETL (at least one of): SSIS, Informatica PowerCenter, IBM Datastage, Pentaho
- SQL – Advanced user (Stored Procedures, Window functions, Temp Tables, Recursive Queries)
- Git (GitHub/GitLab)
- Kafka is a plus
- Object Storage (S3, GCS, ABS) is a plus
- Spark is a plus
- Experience with dbt is a plus
- Familiar with Data Warehousing (Star/Snowflake schemas) and Data Lakes
- Agile methodologies (Kanban, Scrum)
- Strong problem-solving skills
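The competencies above center on building efficient, testable data pipelines in Python. As a hedged illustration (not part of the posting; all function and field names are invented), a minimal extract-transform-load step with a basic data-quality rule might look like:

```python
# Minimal sketch of a testable data-pipeline step in plain Python.
# All names (extract, transform, load, account_id) are illustrative.

def extract(rows):
    """Simulate pulling raw records from a source system."""
    return [dict(r) for r in rows]

def transform(records):
    """Normalize fields and drop records that fail basic validation."""
    cleaned = []
    for r in records:
        if r.get("account_id") is None:
            continue  # data-quality rule: account_id is mandatory
        r["currency"] = r.get("currency", "QAR").upper()
        cleaned.append(r)
    return cleaned

def load(records, target):
    """Append validated records to the target store (a list here)."""
    target.extend(records)
    return len(records)

raw = [{"account_id": 1, "currency": "qar"}, {"account_id": None}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)                    # 1
print(warehouse[0]["currency"])  # QAR
```

Keeping each stage a pure function like this is what makes the CI/CD and testing practices listed above practical: each stage can be unit-tested in isolation before it is wired into an orchestrator such as Airflow or NiFi.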
Why Commercial Bank?
- Best Performing Bank in Qatar in The Banker's prestigious Top 1000 World Banks Rankings 2025.
- Best Digital Bank in the Middle East 2024 by World Finance and Best Mobile Banking App in the Middle East 2024 by Global Finance.
- An Innovation-Driven, Digital-First Environment where employees work with the latest tools and technologies to redefine banking
- Opportunities for Global Partnerships & International Exposure, connecting employees with global networks and perspectives.
- A focus on Employee Well-being & Work-Life Balance, ensuring a healthy and supportive environment for all team members
- Competitive Compensation & Benefits that ensure our employees are rewarded for their dedication and performance
- A strong Commitment to Diversity, Equity & Inclusion, fostering a culture that values every individual's unique perspective
At Commercial Bank, we don't just offer careers; we shape futures by pioneering digital transformation in Qatar's banking sector, blending a digital-first approach with innovative solutions to redefine banking.
Disclaimer
We appreciate your interest in joining CBQ. Please note that only selected candidates will be contacted for further steps in the hiring process. This job posting is for informational purposes only, and CBQ reserves the right to modify, withdraw, or close it at any time without notice.
Data Engineer
Posted today
Job Description
About the Data Platform
We are building a robust Data & AI platform to drive smart insights, enable automation, and empower strategic decision-making across various business sectors. We are seeking a passionate and skilled Data Engineer to join our growing team and help design, develop, and optimize our data infrastructure on Microsoft Azure.
Job Summary
The Data Engineer will play a key role in designing, building, and maintaining scalable data pipelines and solutions using the Azure ecosystem, with a strong focus on Azure Data Factory, Azure Databricks, PySpark, and Delta Lake. This role involves close collaboration with the Head of Data & AI to implement efficient, secure, and high-performance data workflows that enable advanced analytics and AI-driven insights.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory and Azure Databricks.
- Implement data workflows leveraging PySpark and Delta Lake, following Medallion Architecture principles.
- Build scalable and efficient data models and pipelines for both structured and unstructured data.
- Collaborate with Data Analysts, Data Scientists, and Business Stakeholders to deliver reliable data solutions.
- Ensure data quality, validation, and governance across all data pipelines.
- Optimize data performance, cost, and storage using Azure-native tools.
- Support AI/ML model deployment pipelines and integrate them into production workflows (a strong plus).
- Write clean, modular, testable, and well-documented Python code.
- Participate in architectural discussions, code reviews, and Agile ceremonies.
- Continuously identify and implement process improvements in data infrastructure and development workflows.
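The Medallion Architecture named above layers data into bronze (raw), silver (cleaned and typed), and gold (business aggregates). A rough plain-Python sketch of the idea follows; in production this would be PySpark DataFrames backed by Delta tables, and all record and field names here are invented:

```python
# Sketch of Medallion-style layering in plain Python (illustrative only;
# a real implementation would use PySpark/Delta Lake tables per layer).

bronze = [  # raw ingested events, kept exactly as received
    {"order_id": "A1", "amount": "100.5", "country": "QA"},
    {"order_id": "A2", "amount": "bad", "country": "QA"},
    {"order_id": "A3", "amount": "49.5", "country": "AE"},
]

silver = []  # cleaned, typed records
for row in bronze:
    try:
        silver.append({**row, "amount": float(row["amount"])})
    except ValueError:
        pass  # quarantine unparsable amounts instead of loading them

gold = {}  # business-level aggregate: revenue per country
for row in silver:
    gold[row["country"]] = gold.get(row["country"], 0.0) + row["amount"]

print(gold)  # {'QA': 100.5, 'AE': 49.5}
```

The key design point is that each layer is derived from the one below it, so bad records can be re-processed from bronze without re-ingesting from source.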
Key Qualifications
- 3+ years of experience in Data Engineering roles.
- Proven hands-on expertise with:
  - Azure Data Factory (ADF)
  - Azure Databricks
  - Delta Lake / Lakehouse Architecture
  - PySpark and distributed data processing
  - SQL and Python
- Strong understanding of data warehousing, data modeling, and data governance best practices.
- Familiarity with CI/CD pipelines, version control (Git), and DevOps practices.
- Excellent communication, problem-solving, and collaboration skills.
- Eagerness to learn and contribute to a rapidly evolving Data & AI landscape.
Data Engineer
Posted today
Job Description
Job Description – Data Engineer (Arabic Speaker)
Position Overview
We are seeking a skilled Data Engineer (Arabic Speaker) to join our growing team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines, ensuring data integrity, and enabling advanced analytics and reporting. Proficiency in Arabic is required to work effectively with regional stakeholders and clients.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines for structured and unstructured data.
- Build and optimize data warehouses, data lakes, and data models for analytics and reporting.
- Collaborate with business analysts, data scientists, and stakeholders to understand requirements and translate them into technical solutions.
- Ensure data quality, consistency, and security in line with organizational and regulatory standards.
- Work with cloud platforms (Azure, AWS, or GCP) to manage and optimize data infrastructure.
- Monitor, troubleshoot, and improve data pipeline performance and reliability.
- Develop data documentation, dictionaries, and workflows to support end users.
- Support Arabic-language datasets and ensure accurate handling of regional and linguistic requirements.
Qualifications & Skills
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 2–5+ years of proven experience as a Data Engineer or in a similar role.
- Strong knowledge of SQL, Python, and/or Java/Scala.
- Experience with big data tools (Spark, Hadoop, Kafka, etc.) and ETL frameworks.
- Hands-on experience with cloud data services (Azure Data Factory, AWS Glue, Google BigQuery, etc.).
- Familiarity with data visualization tools (Power BI, Tableau, QlikView, etc.) is a plus.
- Knowledge of data governance, security, and compliance standards.
- Strong problem-solving skills and ability to work in a fast-paced, multicultural environment.
- Fluency in Arabic (spoken and written) and strong proficiency in English.
Preferred Skills
- Experience in machine learning data preparation and advanced analytics.
- Knowledge of regional regulations around data privacy and protection (e.g., GDPR, NCSA Qatar).
- Background in the telecom, government, or financial sectors is a plus.
Soft Skills
- Excellent communication and interpersonal skills to liaise with Arabic-speaking stakeholders.
- Strong analytical mindset with attention to detail.
- Team player with the ability to manage multiple priorities.
Location
- Doha, Qatar
Big Data Engineer
Posted today
Job Description
Job Role: Big Data Engineer – Dremio Administrator
Location: Qatar/offshore
Job Type: Yearly Renewable Contract
Industry: Banking
Our client is looking for a skilled Big Data Engineer – Dremio Administrator to join their dynamic team. If you have strong hands-on experience with Dremio administration and a passion for optimizing big data environments, this role is perfect for you.
Role Overview:
- Administer and optimize the newly implemented Dremio platform
- Develop and deploy Machine Learning modules, scoring cards, and automated delivery platforms for banking use cases
- Build and manage data pipelines using Python & Airflow
- Ensure high-performance SQL query optimization & data integration
- Collaborate with business teams to deliver end-to-end analytics & BI solutions
Must-Have Skills:
- Strong Banking / Financial services domain experience (preferred & highly valued)
- Proven expertise in Python
- Proficiency in MS SQL
- Hands-on experience with Apache Airflow
- Dremio administration (newly implemented – critical skill)
Nice-to-Have:
- Broader experience in Big Data technologies
- Exposure to Cloud platforms (AWS, Azure, GCP)
- Strong background in data pipelines, ETL & BI tools
Python Data Engineer
Posted today
Job Description
Apt Resources is hiring a Python Data Engineer for our client in the banking sector. This role focuses on Big Data architectures and data warehousing, with hands-on involvement in tools like Tableau, Teradata, NiFi, and IBM DataStage.
Key Responsibilities:
- Design scalable Big Data and DWH architectures.
- Develop KPIs and dashboards to analyze market trends.
- Work with Tableau for data visualization and reporting.
- Perform data migration using NiFi and ETL with IBM DataStage.
- Collaborate with teams and vendors on data-driven initiatives.
- Maintain data privacy, security, and regulatory compliance.
Requirements:
- 3 to 4 years of experience in data engineering or related roles.
- Bachelor's degree in Engineering or a related field.
- Certifications in Big Data or Data Science are preferred.
- Proficiency in SQL, Python, PySpark, and Shell scripting (Bash).
- Experience with data visualization tools like Tableau.
- Familiarity with ETL tools such as IBM DataStage and data pipelines using NiFi.
- Knowledge of REST, JSON, SOAP, and Web Services APIs.
- Strong analytical, communication, and project leadership skills.
- Experience in telecom data (2G/3G) is an advantage.
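The REST/JSON skills in the list above typically mean consuming API payloads inside pipeline code. A minimal stdlib-only sketch follows; the payload, endpoint shape, and field names (`subscribers`, `msisdn`, `tech`) are invented for illustration:

```python
import json

# Hypothetical JSON body as it might arrive from a REST endpoint
# of a telecom source system (all fields invented).
payload = (
    '{"subscribers": ['
    '{"msisdn": "974xxxx", "tech": "3G"}, '
    '{"msisdn": "974yyyy", "tech": "2G"}]}'
)

data = json.loads(payload)            # parse the response body
by_tech = {}
for sub in data["subscribers"]:       # group records by network technology
    by_tech.setdefault(sub["tech"], []).append(sub["msisdn"])

print(sorted(by_tech))  # ['2G', '3G']
```

In a real pipeline the parsed records would then feed the NiFi or DataStage flow; the parsing and grouping step itself stays this simple.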
Salary: USD 2,000 - 2,500 per month
Senior Data Engineer- GCP
Posted today
Job Description
Job Title: Senior Data Engineer - GCP
Job Type: Contract
Job Location: Doha, Qatar
Contract Duration: Initial 1 Year and Extendable
Years of Experience: 5+ Years
Required Skills:
- Bachelor's degree in Computer Science or similar field or equivalent work experience.
- 5+ years of experience on Data Warehousing, Data Engineering or Data Integration projects.
- Expert with data warehousing concepts, strategies, and tools.
- Strong SQL background.
- Strong knowledge of relational databases like SQL Server, PostgreSQL, MySQL.
- Strong experience in GCP & Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Function and GCS
- Good to have knowledge on SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).
- Knowledge of AWS and Azure Cloud is a plus.
- Experience in Informatica PowerExchange for Mainframe, Salesforce, and other new-age data sources.
- Experience in integration using APIs, XML, JSONs etc.
- In-depth understanding of database management systems, online analytical processing (OLAP), ETL (extract, transform, load) frameworks, data warehousing, and data lakes
- Good understanding of SDLC, Agile and Scrum processes.
- Strong problem-solving, multi-tasking, and organizational skills.
- Highly proficient in working with large volumes of business data and strong understanding of database design and implementation.
- Good written and verbal communication skills.
- Demonstrated experience of leading a team spread across multiple locations.
Role & Responsibilities:
- Work with business users and other stakeholders to understand business processes.
- Ability to design and implement Dimensional and Fact tables
- Identify and implement data transformation/cleansing requirements
- Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
- Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions
- Design, develop and maintain ETL workflows and mappings using the appropriate data load technique
- Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
- Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
- Analyze and resolve problems and provide technical assistance as necessary.
- Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
- Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
- Leverage transactional information, data from ERP, CRM, HRIS applications to model, extract and transform into reporting & analytics.
- Define and document the use of BI through user experience/use cases, prototypes, test, and deploy BI solutions.
- Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.
- Train business end-users, IT analysts, and developers.
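Designing dimension and fact tables, as described above, usually involves resolving business keys to surrogate keys during the fact load. A simplified plain-Python sketch of that lookup follows; the table and column names (`dim_customer`, `customer_sk`) are illustrative, not from the posting:

```python
# Illustrative surrogate-key resolution for a star-schema fact load.
dim_customer = {}   # business key -> surrogate key
next_sk = 1

def get_surrogate_key(business_key):
    """Return the surrogate key for a customer, creating one if unseen."""
    global next_sk
    if business_key not in dim_customer:
        dim_customer[business_key] = next_sk
        next_sk += 1
    return dim_customer[business_key]

source_rows = [
    {"customer": "C100", "amount": 250.0},
    {"customer": "C200", "amount": 80.0},
    {"customer": "C100", "amount": 40.0},
]

# Fact rows reference the dimension via surrogate keys, not business keys.
fact_sales = [
    {"customer_sk": get_surrogate_key(r["customer"]), "amount": r["amount"]}
    for r in source_rows
]

print(len(dim_customer))             # 2 distinct customers
print(fact_sales[2]["customer_sk"])  # 1 (C100 reuses its surrogate key)
```

Decoupling the fact table from source business keys in this way is what lets the warehouse absorb source-system key changes without rewriting history.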
ML Data Engineer Senior
Posted today
Job Description
Welcome to Your Next Adventure
We are looking for an experienced engineer for the position of dedicated Data Engineer for the ML team. Main responsibilities will include creating and extending existing feature marts and realtime marts, including investigating feature drift and performance optimizations. We also consider it important to have the skills to take responsibility for the full ML-DE interaction, including creating new pipelines in the existing DE infrastructure. The main tools used for feature engineering are dbt, BigQuery, Databricks, and Spark structured streaming, with AWS streaming services and infrastructure as code on the DE side. Our workflow emphasizes adaptability and collaboration, with a focus on driving changes in company-wide metrics - and you'll be able to see the impact of your work on the business and customer experience.
What You'll Get Your Hands On
- Feature engineering - supporting and creating feature marts for the ML team, including tests and validation
- Realtime feature engineering - creating realtime pipelines and extending the realtime feature store with new features
- Integrating new sources into existing data pipelines for ML needs, ensuring data quality and consistency
- Monitoring and automating ML pipeline processes, including data drift, pipeline performance, and alerting
The Magic You Bring
- Experience in several programming languages or Python frameworks
- PySpark knowledge, i.e. Spark structured streaming, data distribution, optimizations
- Strong SQL skills: ability to apply subqueries, window functions, CTEs, and math-specific database functions
- Experience with modern ETL processes: collection, ingestion, storage, cleaning, transformation, and analysis of data
- Software design skills - software patterns, application and data system design
- Knowledge of math statistics, e.g. standard deviation and quartiles, and the ability to apply them for drift or outlier detection
- Infrastructure as code is a plus
- Experience with Databricks is a plus
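The statistics requirement above (standard deviation and quartiles applied to drift or outlier detection) can be illustrated with a stdlib-only sketch; the feature values and the 3-sigma threshold are invented for the example:

```python
import statistics

# Baseline distribution of a model feature vs. a new batch (invented numbers).
baseline = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8, 10.1, 10.4]
new_batch = [13.9, 14.2, 14.0, 13.8]

mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

# Simple drift check: flag the batch if its mean sits more than
# 3 baseline standard deviations away from the baseline mean.
batch_mean = statistics.mean(new_batch)
drifted = abs(batch_mean - mean) > 3 * stdev

# Quartile-based outlier fence (Tukey's rule) built from the baseline.
q1, _, q3 = statistics.quantiles(baseline, n=4)
iqr = q3 - q1
outliers = [x for x in new_batch if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]

print(drifted)        # True
print(len(outliers))  # 4
```

In a feature-mart setting the baseline would come from the training window and the check would run per batch as part of pipeline monitoring, feeding the alerting mentioned in the responsibilities.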
Our Stack
- GCP platform
- BigQuery
- dbt
- Databricks
- Spark structured streaming
- AWS message brokers
- Tableau, Redash
Inside Snoonu's Universe
Snoonu is Qatar's homegrown Super App, reinventing daily life with blazing-fast delivery, ride-hailing, and shopping - all in one place. Powered by tech, driven by a global team, and obsessed with making life easier.
The Dream We're Chasing
To be the first Qatari Super App that propels the region and its community through innovation and technology. We envision a global expansion where what we do surpasses norms and limitations every time.
The Quest We're On
To radically transform how people live by leveraging technology to connect them with endless possibilities.
Our Everyday Superpowers
- Be Customer Obsessed: "Focus on the customer and all else will follow."
- Act with Integrity: "We are honest, ethical, and trustworthy in everything we do."
- Be Curious and Creative: "We constantly innovate and create solutions to bring a lasting positive impact."
- Lead by Example and Take Ownership: "Be the change you want to see and take ownership."
- Work Smart and Deliver Results: "You can do more by doing less, better, and faster."
- It's All About People: "Be a team player; together we are stronger."
Perks & Worklife Magic At Snoonu
- Global Vibes – Collaborate with a worldwide crew.
- Brain Boosters – Learning budgets, access to courses, and tools for your growth.
- Builder's Playground – Own your tasks, own your path. We're big on autonomy.
- Flexible Time Off – We take recharging seriously. Generous leave and wellness policies.
- Agile Everything – Scrum isn't a buzzword here. It's how we roll, from product to ops.
Beyond the Code: Giving Back Matters
We don't just build an app. We're committed to doing business sustainably and giving back to the community that fuels us. From eco-conscious practices to CSR projects, we're always finding ways to do better - and we invite you to be a part of that mission.
Diversity Isn't Just a Buzzword
At Snoonu, fairness and inclusion are the foundation of everything we do. We're proud to be an equal opportunity workplace that welcomes people from every walk of life. Be bold. Be you. Thrive here.
Let's Build the Future Together
Apply now to join a tech team where your code sparks change, your voice is heard, and your growth is guaranteed. Let's make some tech magic together.
Stay in the loop—connect with us on LinkedIn
Senior Denodo Data Engineer
Posted today
Job Description
Key Responsibilities
- Lead the design and architecture of data virtualization solutions using the Denodo Platform.
- Define best practices for data modeling, data governance, metadata management, and security within Denodo.
- Collaborate with business stakeholders, data engineers, and BI teams to understand requirements and design scalable, high-performing data solutions.
- Integrate diverse data sources (cloud, on-prem, relational, APIs, big data platforms, etc.) into a unified data layer.
- Establish data caching, query optimization, and performance tuning strategies for Denodo environments.
- Provide technical leadership on Denodo deployment, configuration, upgrades, and troubleshooting.
- Work closely with enterprise architects to align data virtualization strategy with overall enterprise data architecture.
- Mentor junior team members and provide training/guidance on Denodo best practices.
- Create and maintain technical documentation for architecture, design, and operational procedures.
Qualifications & Skills
- Bachelor's or Master's degree in Computer Science, Information Systems, or related field.
- 8+ years of experience in data architecture, data integration, or BI roles.
- 8+ years of hands-on experience with Denodo Platform (including VDP, Scheduler, Solution Manager, Data Catalog).
- Strong knowledge of SQL, data modeling, data virtualization, and API-based integration.
- Experience with cloud platforms (Azure Data Services (ADF), Synapse, Databricks) and hybrid data architectures.
- Hands-on experience with ETL/ELT tools, data lakes, data warehouses, and BI/reporting tools.
- Expertise in performance tuning, caching strategies, and query optimization.
- Strong understanding of data governance, metadata management, and security practices.
- Excellent problem-solving, communication, and leadership skills.
Data & BI Engineer
Posted today
Job Description
Job Title: Data & BI Engineer
Experience: 4+ years
Location: Qatar Onsite
Job Summary
We're seeking an experienced Data & BI Engineer to design, develop, and maintain robust data pipelines, data models, and analytics solutions. The ideal candidate will have expertise in data engineering, database administration, and business intelligence. You will collaborate with business stakeholders to deliver scalable Power BI dashboards and support AI/ML initiatives.
Key Responsibilities
Data Engineering:
- Design, develop, and maintain ETL/ELT pipelines to integrate data from multiple business systems (SAP, CRM, FMS, HR, etc.) into the data warehouse (Microsoft Fabric/Azure).
- Build and optimize data models (lakehouse, warehouse, semantic models) for analytics and reporting.
- Collaborate with business stakeholders to deliver scalable Power BI dashboards.
- Ensure data quality, validation, and governance across pipelines.
- Automate workflows, monitoring, and alerts for data processes.
- Support AI/ML initiatives by preparing and integrating structured and semi-structured datasets.
Database Administration (SQL DBA Skills)
- Administer Microsoft SQL Server (on-prem & Azure SQL) including installation, configuration, and upgrades.
- Manage database performance, indexing, partitioning, and query optimization.
- Implement and maintain HA/DR strategies (AlwaysOn, replication, failover clustering, backups, restores).
- Enforce security, user access, and compliance (GDPR, role-based access, encryption).
- Monitor database health and capacity, ensuring uptime and reliability.
- Document schemas, procedures, and maintain a data dictionary for critical databases.
Requirements
- 4+ years of experience in data engineering and database administration.
- Strong expertise in Microsoft SQL Server, Azure, and Power BI.
- Proficiency in ETL/ELT tools and data modeling.
- Experience with data governance, security, and compliance.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Nice To Have
- Experience with AI/ML initiatives and data science projects.
- Knowledge of cloud-based data platforms (Azure, AWS, etc.).
- Familiarity with data visualization tools (Power BI, Tableau, etc.).