49 Business Intelligence Engineer jobs in Qatar
Data Engineer
Posted 4 days ago
Job Description
The Data Engineer handles the design, development, and maintenance of data pipelines, ETL processes, and database management to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.
Responsibilities and Duties
- Design, develop, and maintain data pipelines, ETL processes, and database systems to support AI and data science initiatives.
- Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
- Implement data governance, security, and regulatory standards in all data engineering activities.
- Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
- Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
- Stay updated with the latest advancements in data engineering technologies and best practices.
- Provide support and guidance to other team members as needed.
- Prepare and present data engineering reports and documentation to senior management and stakeholders.
- Participate in project planning and contribute to the development of project timelines and deliverables.
- Perform other duties relevant to the job as assigned by the Sr. Data Engineer or senior management.
Requirements
- Bachelor’s degree in Data Engineering, Computer Science, or a related field
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred
- Minimum of 3 years of experience in data engineering or related fields
- Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products
- Strong programming skills in languages such as Python, Java, or SQL
- Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka)
- Excellent problem-solving and analytical skills
- Strong communication and interpersonal skills
- Attention to detail and commitment to quality
- In-depth understanding of data engineering principles, ETL processes, and database management
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
- Knowledge of data governance, security, and regulatory standards
- Ability to manage multiple tasks and prioritize effectively
- Strong attention to detail and commitment to delivering high-quality work
- Ability to work independently and as part of a team
- Programming languages (e.g., Python, Java, SQL)
- Data engineering tools and frameworks (e.g., Apache Spark, Kafka)
- Data management systems (e.g., SQL, NoSQL databases)
- Collaboration and communication tools (e.g., Slack, Microsoft Teams)
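For orientation, a batch pipeline of the kind listed above is commonly written with Apache Spark's Python API. The sketch below is a minimal, hypothetical illustration only; the source path, bucket names, and columns (order_id, order_total) are placeholders, not details from this posting.

```python
# Minimal, hypothetical PySpark batch ETL sketch (paths and columns are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

# Extract: read a raw CSV drop from an upstream system
raw = spark.read.option("header", True).csv("s3a://raw-bucket/orders/2025-01-01/")

# Transform: deduplicate, enforce types, and add an ingestion date
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_total").isNotNull())
       .withColumn("order_total", F.col("order_total").cast("double"))
       .withColumn("ingest_date", F.current_date())
)

# Load: write partitioned Parquet for downstream AI / data science use
clean.write.mode("overwrite").partitionBy("ingest_date").parquet("s3a://curated-bucket/orders/")

spark.stop()
```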
Data Engineer
Posted 9 days ago
Job Description
Role: Data Engineer
Location: Doha, Qatar
Position Type: Full-Time / Onsite / Contract (1 year, renewable)
Language requirement: English is required; Arabic is preferred but not mandatory.
Role Description
The Data Engineer is responsible for building, managing, and optimizing data pipelines and ensuring data flow from source to destination. This role requires expertise in data architecture, data lakehouse, data warehousing, and data integration. You will work closely with data architects, data analysts, and business system owners to deliver high-quality data ingestion solutions that meet business needs.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading (ETL) data from various sources.
- Automate data workflows to ensure data is processed consistently and reliably.
- Design and implement scalable and secure data solutions using Azure analytics services, Azure SQL Database, and/or Databricks. Oracle Analytics Cloud (OAC) experience is desired but not necessary.
- Develop and maintain data storage solutions, including data warehouses, data lakes, and databases, ensuring optimal performance and cost efficiency.
- Implement and maintain data quality checks to ensure data accuracy and consistency.
- Implement security measures to protect sensitive data, including encryption and access control.
- Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
- Monitor and optimize the performance of data pipelines, databases, and queries.
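To make the Azure-specific responsibilities above concrete, a typical Databricks job reads from Azure Data Lake Storage and writes a Delta table. The snippet below is an illustrative sketch only, assuming a Databricks/PySpark environment; the storage account, container, and table names are invented placeholders.

```python
# Illustrative Databricks/PySpark sketch; storage account, container, and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

# On Databricks the `spark` session already exists; this line just makes the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

# Ingest raw JSON events landed in ADLS Gen2
raw = spark.read.format("json").load("abfss://landing@examplestorage.dfs.core.windows.net/events/")

# Basic quality filter and a partition column
curated = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_timestamp"))
)

# Persist as a Delta table (Delta Lake is the default table format on Databricks)
(curated.write.format("delta")
        .mode("append")
        .partitionBy("event_date")
        .saveAsTable("analytics.curated_events"))
```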
Qualifications
Education: Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
Must Have
- 5+ years of experience in data engineering.
- Proven hands-on experience in data pipeline design and development on Azure cloud platform.
- 5+ years of experience developing solutions using Azure Data Factory and Databricks.
- Proficiency in at least one scripting language such as Python, PowerShell, JavaScript, or Scala.
Nice To Have
- Previous experience with Informatica IICS data governance or IDQ is a strong plus.
- MS Power BI Development experience is good to have.
- Experience with Oracle Analytics Cloud is a strong plus.
Data Engineer
Posted 11 days ago
Job Description
We are seeking a skilled Informatica Data Engineer to join our team. The successful candidate will have a minimum of five years of experience in data engineering, with expertise in the Informatica suite, including Informatica Data Governance and Data Quality. The role involves designing, developing, and optimizing data integration workflows, ensuring high-quality data standards, and supporting data governance initiatives.
The Role
You will be responsible for:
- Design, develop, and maintain ETL processes using Informatica PowerCenter and other Informatica tools.
- Implement and manage data quality processes using Informatica Data Quality (IDQ).
- Support data governance initiatives by leveraging Informatica Data Governance solutions.
- Develop data pipelines and workflows to integrate data from multiple sources.
- Optimize performance of data integration processes and troubleshoot issues.
- Collaborate with business analysts and data architects to understand data requirements.
- Ensure compliance with data governance and security policies.
- Perform data profiling, cleansing, and enrichment to improve data accuracy and reliability.
- Document technical solutions and maintain best practices for data engineering.
- Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
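Informatica IDQ profiling and cleansing are configured in Informatica's own tooling rather than in code, but the underlying ideas can be shown in a few lines of plain Python. The sketch below is illustrative only; the file and column names (customer_extract.csv, email) are hypothetical.

```python
# Plain-Python illustration of profiling/cleansing concepts; file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("customer_extract.csv")

# Profiling: per-column null rate, distinct count, and data type
profile = pd.DataFrame({
    "null_rate": df.isna().mean(),
    "distinct_values": df.nunique(),
    "dtype": df.dtypes.astype(str),
})
print(profile)

# Cleansing: standardize a text field and drop exact duplicates
df["email"] = df["email"].str.strip().str.lower()
df = df.drop_duplicates()

# A simple rule-based quality check of the kind a DQ tool would score
invalid_email = ~df["email"].str.contains("@", na=False)
print(f"{invalid_email.sum()} rows failed the email format check")
```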
Ideal Profile
- Minimum of 5 years of experience in data engineering using the Informatica suite.
- Hands-on experience with Informatica Data Management Cloud (IDMC), Informatica Data Quality (IDQ) and Informatica MDM.
- Strong knowledge of ETL development, data integration, and data transformation techniques.
- Experience with SQL and relational databases such as Oracle, SQL Server, or PostgreSQL.
- Familiarity with cloud platforms such as Azure or Oracle Cloud.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
What's on Offer?
- Work within a company with a solid track record of success.
- Work alongside and learn from best-in-class talent.
- Join a well-known brand within IT Services.
Data Engineer
Posted 11 days ago
Job Description
This range is provided by Leidos. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.
Base pay range: $104,650.00/yr - $189,175.00/yr
Leidos National Security Sector (NSS) is seeking a highly experienced and skilled mission-driven Data Engineer to support APOLLO/OPIAS in Tampa, FL. This role provides mission-critical support to Joint and Special Operations Forces (SOF) by designing, building, and maintaining scalable data infrastructure and pipelines that enable the planning, execution, and assessment of influence operations within the Irregular Warfare (IW) and Operations in the Information Environment (OIE). With expertise in cloud platforms, ETL frameworks, and both structured and semi-structured data, the data engineer ensures seamless ingestion, transformation, and normalization of diverse data sources—including social media, OSINT, SIGINT, cyber, and psychological operations platforms. The position will be responsible for providing OIE planners, analysts, and commanders with timely, mission-relevant data access for real-time alerting, influence mapping, and trend analysis. This position is on a future contract pending award announcement.
Possible locations for this position are as follows:
- MacDill (Tampa, FL)
- Al Udeid (Qatar)
- Fort Meade (Maryland)
- Northcom (Colorado Springs, CO)
- Camp Humphreys (Korea)
- Arifjan (Kuwait)
- Joint Base Pearl Harbor-Hickam (Hawaii)
- Fort Eisenhower (Georgia)
- Offutt AFB (Omaha, NE)
- Naval Operating Base Norfolk (Virginia)
- Southcom (Doral, FL)
- JB San Antonio (Texas)
- Stuttgart (Germany)
- Vicenza (Italy)
- Tyndall AFB (Florida)
Key Responsibilities
- Conduct analysis of structured and semi-structured data sets to identify effective integration for mission use.
- Design, build, and maintain the data infrastructure and pipelines that support the planning, execution, and assessment of influence operations.
- Review existing and emerging technical capabilities and offer recommendations on their potential value in enabling OIE planners, analysts, and commanders to access, analyze, and operationalize large-scale datasets—often derived from social media, open-source intelligence (OSINT), cyber, SIGINT, or psychological operations platforms.
Basic Qualifications
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field
- 8+ years of experience in data engineering or ETL pipeline development
- Experience with data ingestion, transformation, and normalization from diverse structured and unstructured sources
- Experience deploying in cloud environments (AWS, Azure, or GCP)
- Proficient in Python and at least one ETL framework (e.g., Airflow, NiFi, Luigi)
- Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with version control tools (e.g., Git) and collaborative DevOps practices
- Ability to work in cross-functional teams alongside analysts, developers, and IO planners
- Strong documentation, communication, and troubleshooting skills
- Active TS/SCI security clearance
Preferred Qualifications
- Master’s degree in a technical discipline
- Experience supporting Information Operations, PSYOP/MISO, or WebOps
- Experience with data lake architecture, graph databases (e.g., Neo4j), or NoSQL stores (e.g., MongoDB)
- Experience building pipelines that support real-time alerting, trend analysis, and influence mapping
- Proficiency with data visualization frameworks (e.g., Kibana, Grafana, Plotly, or D3.js)
- Familiarity with OSINT data platforms (e.g., Babel Street, Echosec, Talkwalker, Pulsar, Meltwater, Maltego)
- Familiarity with containerized environments (Docker, Kubernetes)
- Understanding of foreign language datasets or multilingual processing (NLP/NLU in Arabic, Russian, Chinese, etc.)
- Background in API integration with social media platforms or dark web forums
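For context on the ETL frameworks named in the basic qualifications, an Airflow pipeline is defined as a Python DAG. The sketch below is a minimal, hypothetical example assuming a recent Airflow 2.x install; the task bodies, DAG id, and schedule are placeholders.

```python
# Minimal, hypothetical Airflow 2.x DAG; task logic, ids, and schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from source feeds")


def transform_and_load():
    print("normalize records and load them into the analytic store")


with DAG(
    dag_id="ingest_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)

    extract_task >> load_task
```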
Original Posting
July 25, 2025
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range
$104,650.00 - $189,175.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industries: IT Services and IT Consulting
Principal Data Engineer
Posted 5 days ago
Job Description
The Principal Data Engineer handles the design, development, and maintenance of data pipelines, ETL processes, and database management to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.
Responsibilities and Duties
- Design, develop, and maintain data pipelines, ETL processes, and database systems to support AI and data science initiatives.
- Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
- Implement data governance, security, and regulatory standards in all data engineering activities.
- Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
- Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
- Stay updated with the latest advancements in data engineering technologies and best practices.
- Mentor and provide guidance to junior data engineers and other team members.
- Prepare and present data engineering reports and documentation to senior management and stakeholders.
- Participate in project planning and contribute to the development of project timelines and deliverables.
- Perform other duties relevant to the job as assigned by the Head of Data & AI Engineering or senior management.
Requirements
- Bachelor’s degree in Data Engineering, Computer Science, or a related field.
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred.
- Minimum of 8 years of experience in data engineering or related fields.
- Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products.
- Strong programming skills in languages such as Python and SQL.
- Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka).
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Attention to detail and commitment to quality.
- In-depth understanding of data engineering principles, ETL processes, and database management.
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data governance, security, and regulatory standards.
- Ability to manage multiple tasks and prioritize effectively.
- Ability to work independently and as part of a team.
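As a small illustration of the streaming side of the toolset listed above (Kafka), a Python consumer that reads a topic might look like the sketch below. It assumes the kafka-python package; the broker address, topic, and group id are placeholders.

```python
# Illustrative Kafka consumer using kafka-python; broker, topic, and group id are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events.raw",
    bootstrap_servers=["localhost:9092"],
    group_id="data-eng-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    record = message.value
    # In a real pipeline these records would be validated and written to the lake or warehouse.
    print(record.get("event_type"), record.get("event_id"))
```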
Sr. Data Engineer
Posted 4 days ago
Job Description
The Sr. Data Engineer supports the design, development, and maintenance of data pipelines, ETL processes, and database systems to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.
Responsibilities and Duties
- Support the design, development, and maintenance of data pipelines, ETL processes, and database systems to support AI and data science initiatives.
- Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
- Implement data governance, security, and regulatory standards in all data engineering activities.
- Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
- Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
- Stay updated with the latest advancements in data engineering technologies and best practices.
- Mentor and provide guidance to junior data engineers and other team members.
- Prepare and present data engineering reports and documentation to senior management and stakeholders.
- Participate in project planning and contribute to the development of project timelines and deliverables.
- Perform other duties relevant to the job as assigned by the Principal Data Engineer or senior management.
Requirements
- Bachelor’s degree in Data Engineering, Computer Science, or a related field.
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred.
- Minimum of 5 years of experience in data engineering or related fields.
- Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products.
- Strong programming skills in languages such as Python and SQL.
- Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka).
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Attention to detail and commitment to quality.
- In-depth understanding of data engineering principles, ETL processes, and database management.
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data governance, security, and regulatory standards.
- Ability to manage multiple tasks and prioritize effectively.
- Ability to work independently and as part of a team.