135 Python Engineer jobs in Qatar

Lead Python Software Engineer, Commercial Systems

Canonical

Posted 7 days ago


Job Description


Canonical is a leading provider of open-source software and operating systems for global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1200+ colleagues in more than 80 countries and very few office-based roles. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution.

The company is founder-led, profitable, and growing.

We are hiring a Lead Python Software Engineer who strives for the highest engineering quality, seeks improvements, continuously develops their skills, and applies them at work. This is an exciting opportunity to work with many popular software systems, integration technologies, and open-source solutions.

The Commercial Systems unit comprises seven engineering teams that collaborate closely with other engineering and business teams at Canonical. Services designed, developed, and operated by the Commercial Systems unit are at the heart of Canonical's business, and Python plays an integral role in them. We are looking for Python Software Engineers for the Integrations team.

The Integrations team is responsible for the automation of SaaS user management and the onboarding of new data sources to the data mesh. The team designs, develops, and operates a Python-based solution to automate SaaS seat management and track spend across the application portfolio. Furthermore, the team integrates internal and external data sources into the data mesh using open-source ETL solutions, enabling more data-driven decisions in the organization.
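The seat-management workflow described above can be sketched in plain Python. Everything here (the `Seat` record, the 90-day idle rule, the field names) is invented for illustration; it is not Canonical's actual system.

```python
# Hypothetical sketch of SaaS seat management: flag idle seats for
# reclamation and aggregate monthly spend per application.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Seat:
    app: str            # SaaS application name
    user: str
    monthly_cost: float
    last_active: date

def unused_seats(seats, today, idle_days=90):
    """Seats idle longer than `idle_days` are reclaim candidates."""
    cutoff = today - timedelta(days=idle_days)
    return [s for s in seats if s.last_active < cutoff]

def spend_by_app(seats):
    """Aggregate monthly spend across the application portfolio."""
    totals = {}
    for s in seats:
        totals[s.app] = totals.get(s.app, 0.0) + s.monthly_cost
    return totals

seats = [
    Seat("crm", "alice", 50.0, date(2025, 7, 1)),
    Seat("crm", "bob", 50.0, date(2024, 11, 2)),
    Seat("chat", "alice", 8.0, date(2025, 7, 10)),
]
today = date(2025, 7, 15)
print([s.user for s in unused_seats(seats, today)])  # ['bob']
print(spend_by_app(seats))  # {'crm': 100.0, 'chat': 8.0}
```

A real implementation would pull activity data from each SaaS vendor's API rather than from in-memory records, but the reporting shape is the same.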

Location: This role will be based remotely in the EMEA region.

The role entails

  • Develop engineering solutions leveraging Python
  • Collaborate with colleagues on technical designs and code reviews
  • Deploy and operate services developed by the team
  • Depending on your seniority, coach, mentor, and offer career development feedback
  • Develop and evangelize great engineering and organizational practices

What we are looking for in you

  • Exceptional academic track record from both high school and university
  • Undergraduate degree in a technical subject or a compelling narrative about your alternative chosen path
  • Track record of going above-and-beyond expectations to achieve outstanding results
  • Experience with software development in Python
  • Professional written and spoken English with excellent presentation skills
  • Result-oriented, with a personal drive to meet commitments
  • Ability to travel internationally twice a year, for company events up to two weeks long

Nice-to-have skills

  • Performance engineering and security experience
  • Experience with Airbyte, Ranger, Temporal, or Trino

What we offer colleagues

We consider geographical location, experience, and performance in shaping compensation worldwide. We revisit compensation annually (and more often for graduates and associates) to ensure we recognize outstanding performance. In addition to base pay, we offer a performance-driven annual bonus or commission. We provide all team members with additional benefits, which reflect our values and ideals. We balance our programs to meet local needs and ensure fairness globally.

  • Distributed work environment with twice-yearly team sprints in person
  • Personal learning and development budget of USD 2,000 per year
  • Annual compensation review
  • Recognition rewards
  • Annual holiday leave
  • Maternity and paternity leave
  • Employee Assistance Program
  • Opportunity to travel to new locations to meet colleagues
  • Priority Pass and travel upgrades for long-haul company events

About Canonical

Canonical is a pioneering tech firm at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open source projects and the platform for AI, IoT and the cloud, we are changing the world of software. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence - in order to succeed, we need to be the best at what we do. Most colleagues at Canonical have worked from home since its inception in 2004. Working here is a step into the future, and will challenge you to think differently, work smarter, learn new skills, and raise your game.

Canonical is an equal opportunity employer

We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background creates a better work environment and better products. Whatever your identity, we will give your application fair consideration.

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: Software Development


Data Engineer

Doha, Qatar · Id8media

Posted 4 days ago


Job Description

The Data Engineer handles the design, development, and maintenance of data pipelines, ETL processes, and database management to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.

Responsibilities and Duties
  • Design, develop, and maintain data pipelines, ETL processes, and database systems to support AI and data science initiatives.
  • Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
  • Implement data governance, security, and regulatory standards in all data engineering activities.
  • Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
  • Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
  • Stay updated with the latest advancements in data engineering technologies and best practices.
  • Provide support and guidance to other team members as needed.
  • Prepare and present data engineering reports and documentation to senior management and stakeholders.
  • Participate in project planning and contribute to the development of project timelines and deliverables.
  • Perform other duties relevant to the job as assigned by the Sr. Data Engineer or senior management.
Requirements
  • Bachelor’s degree in Data Engineering, Computer Science, or a related field
  • Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred
  • Minimum of 3 years of experience in data engineering or related fields
  • Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products
  • Strong programming skills in languages such as Python, Java, or SQL
  • Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka)
  • Excellent problem-solving and analytical skills
  • Strong communication and interpersonal skills
  • Attention to detail and commitment to quality
  • In-depth understanding of data engineering principles, ETL processes, and database management
  • Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
  • Knowledge of data governance, security, and regulatory standards
  • Ability to manage multiple tasks and prioritize effectively
  • Strong attention to detail and commitment to delivering high-quality work
  • Ability to work independently and as part of a team
  • Programming languages (e.g., Python, Java, SQL)
  • Data engineering tools and frameworks (e.g., Apache Spark, Kafka)
  • Data management systems (e.g., SQL, NoSQL databases)
  • Collaboration and communication tools (e.g., Slack, Microsoft Teams)
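The extract-transform-load pattern this role centres on can be illustrated with a minimal, self-contained example. The data, schema, and quality rule below are invented; a production pipeline would use tools such as Spark or Kafka, as the listing suggests.

```python
# Minimal ETL sketch: extract from CSV text, transform (typecast and
# drop rows failing a quality check), load into an in-memory SQLite DB.
import csv
import io
import sqlite3

raw = "id,amount\n1, 10.5 \n2,not_a_number\n3,7\n"

def extract(text):
    """Extract: parse CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: typecast fields, dropping rows that fail the check."""
    clean = []
    for r in rows:
        try:
            clean.append((int(r["id"]), float(r["amount"])))
        except ValueError:
            continue  # data-quality rule: non-numeric rows are rejected
    return clean

def load(rows):
    """Load: write clean rows into a SQLite table."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE tx (id INTEGER PRIMARY KEY, amount REAL)")
    con.executemany("INSERT INTO tx VALUES (?, ?)", rows)
    return con

con = load(transform(extract(raw)))
total = con.execute("SELECT SUM(amount) FROM tx").fetchone()[0]
print(total)  # 17.5
```

The same three-stage shape scales up directly: swap the CSV source for a Kafka topic and the SQLite sink for a warehouse table and the structure is unchanged.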

Data Engineer

Doha, Qatar · Arab Solutions

Posted 9 days ago


Job Description

Role: Data Engineer

Location: Doha, Qatar

Position Type: Full-Time / Onsite / 1-year renewable contract

Language requirement: English language is required; Arabic is preferred but not mandatory.

Role Description

The Data Engineer is responsible for building, managing, and optimizing data pipelines and ensuring data flow from source to destination. This role requires expertise in data architecture, data lakehouse, data warehousing, and data integration. You will work closely with data architects, data analysts, and business system owners to deliver high-quality data ingestion solutions that meet business needs.

Key Responsibilities

  1. Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading (ETL) data from various sources.
  2. Automate data workflows to ensure data is processed consistently and reliably.
  3. Design and implement scalable and secure data solutions using Azure analytics services, Azure SQL Database, and/or Databricks. OAC experience is desired but not necessary.
  4. Develop and maintain data storage solutions, including data warehouses, data lakes, and databases, ensuring optimal performance and cost efficiency.
  5. Implement and maintain data quality checks to ensure data accuracy and consistency.
  6. Implement security measures to protect sensitive data, including encryption and access control.
  7. Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
  8. Monitor and optimize the performance of data pipelines, databases, and queries.
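Responsibility 5 above (data quality checks) can be sketched as a small validation layer that runs before records enter the warehouse. The rule names and schema here are hypothetical, not part of the actual Azure/Databricks stack the role uses.

```python
# Illustrative data-quality layer: validate each record against simple
# rules and report violations, so bad rows can be quarantined upstream.
def check_record(rec, required=("order_id", "amount")):
    """Return a list of rule violations for one record."""
    issues = []
    for field in required:
        if rec.get(field) in (None, ""):
            issues.append(f"missing:{field}")
    amt = rec.get("amount")
    if isinstance(amt, (int, float)) and amt < 0:
        issues.append("negative:amount")
    return issues

records = [
    {"order_id": "A1", "amount": 12.0},
    {"order_id": "", "amount": -3.0},
]
report = {r["order_id"] or "<blank>": check_record(r) for r in records}
print(report)
# {'A1': [], '<blank>': ['missing:order_id', 'negative:amount']}
```

In an Azure pipeline the same checks would typically run as a Databricks notebook step, writing violations to a quarantine table rather than printing them.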

Qualifications

Education: Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.

Must Have

  • 5+ years of experience in data engineering.
  • Proven hands-on experience in data pipeline design and development on Azure cloud platform.
  • 5+ years experience developing solutions using Azure Data Factory and Databricks.
  • Proficiency in at least one scripting language such as Python, PowerShell, JavaScript, or Scala.

Nice To Have

  • Previous experience with Informatica IICS data governance or IDQ is a strong plus.
  • MS Power BI Development experience is good to have.
  • Experience with Oracle Analytics Cloud is a strong plus.

Data Engineer

Sygmetiv Business Solutions

Posted 11 days ago


Job Description

We are seeking a skilled Informatica Data Engineer to join our team. The successful candidate will have a minimum of five years of experience in data engineering, with expertise in the Informatica suite, including Informatica Data Governance and Data Quality. The role involves designing, developing, and optimizing data integration workflows, ensuring high-quality data standards, and supporting data governance initiatives.

The Role

You will be responsible for:

  1. Design, develop, and maintain ETL processes using Informatica PowerCenter and other Informatica tools.
  2. Implement and manage data quality processes using Informatica Data Quality (IDQ).
  3. Support data governance initiatives by leveraging Informatica Data Governance solutions.
  4. Develop data pipelines and workflows to integrate data from multiple sources.
  5. Optimize performance of data integration processes and troubleshoot issues.
  6. Collaborate with business analysts and data architects to understand data requirements.
  7. Ensure compliance with data governance and security policies.
  8. Perform data profiling, cleansing, and enrichment to improve data accuracy and reliability.
  9. Document technical solutions and maintain best practices for data engineering.
  10. Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
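Data profiling (responsibility 8 above) means computing per-column statistics such as null rate and distinct count. The pure-Python stand-in below shows the idea; it is not Informatica IDQ, which provides this as a built-in capability.

```python
# Column-profiling sketch: null rate and distinct count per column,
# the kind of summary a profiling tool produces before cleansing.
def profile(rows):
    stats = {}
    n = len(rows)
    cols = rows[0].keys() if rows else []
    for col in cols:
        values = [r.get(col) for r in rows]
        nulls = sum(v is None or v == "" for v in values)
        distinct = len({v for v in values if v not in (None, "")})
        stats[col] = {"null_rate": nulls / n, "distinct": distinct}
    return stats

rows = [
    {"city": "Doha", "phone": "123"},
    {"city": "Doha", "phone": ""},
    {"city": "Dukhan", "phone": "456"},
]
print(profile(rows))
# {'city': {'null_rate': 0.0, 'distinct': 2},
#  'phone': {'null_rate': 0.3333333333333333, 'distinct': 2}}
```

Profiles like this drive the cleansing step: a high null rate or unexpectedly low distinct count on a key column flags it for enrichment or rejection.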
Ideal Profile
  1. Minimum of 5 years of experience in data engineering using Informatica suite.
  2. Hands-on experience with Informatica Data Management Cloud (IDMC), Informatica Data Quality (IDQ) and Informatica MDM.
  3. Strong knowledge of ETL development, data integration, and data transformation techniques.
  4. Experience with SQL and relational databases such as Oracle, SQL Server, or PostgreSQL.
  5. Familiarity with cloud platforms such as Azure or Oracle Cloud.
  6. Strong analytical and problem-solving skills.
  7. Excellent communication and collaboration abilities.
What's on Offer?
  1. Work within a company with a solid track record of success.
  2. Work alongside & learn from best in class talent.
  3. Join a well-known brand within IT Services.

Data Engineer

Doha, Qatar · Leidos

Posted 11 days ago


Job Description


Base pay range

$104,650.00/yr - $189,175.00/yr

Job Description

Leidos National Security Sector (NSS) is seeking a highly experienced and skilled mission-driven Data Engineer to support APOLLO/OPIAS in Tampa, FL. This role provides mission-critical support to Joint and Special Operations Forces (SOF) by designing, building, and maintaining scalable data infrastructure and pipelines that enable the planning, execution, and assessment of influence operations within the Irregular Warfare (IW) and Operations in the Information Environment (OIE). With expertise in cloud platforms, ETL frameworks, and both structured and semi-structured data, the data engineer ensures seamless ingestion, transformation, and normalization of diverse data sources—including social media, OSINT, SIGINT, cyber, and psychological operations platforms. The position will be responsible for providing OIE planners, analysts, and commanders with timely, mission-relevant data access for real-time alerting, influence mapping, and trend analysis. This position is on a future contract pending award announcement.

Possible locations for this position are as follows:

  • MacDill (Tampa, FL)
  • Al Udeid (Qatar)
  • Fort Meade (Maryland)
  • Northcom (Colorado Springs, CO)
  • Camp Humphreys (Korea)
  • Arifjan (Kuwait)
  • Joint Base Pearl Harbor-Hickam (Hawaii)
  • Fort Eisenhower (Georgia)
  • Offutt AFB (Omaha, NE)
  • Naval Operating Base Norfolk (Virginia)
  • Southcom (Doral, FL)
  • JB San Antonio (Texas)
  • Stuttgart (Germany)
  • Vicenza (Italy)
  • Tyndall AFB (Florida)

Key Responsibilities

  • Conduct analysis of structured and semi-structured data sets to identify effective integration approaches for mission use.
  • Design, build, and maintain the data infrastructure and pipelines that support the planning, execution, and assessment of influence operations.
  • Review existing and emerging technical capabilities and offer recommendations on their potential value in enabling OIE planners, analysts, and commanders to access, analyze, and operationalize large-scale datasets, often derived from social media, open-source intelligence (OSINT), cyber, SIGINT, or psychological operations platforms.

Basic Qualifications

  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field
  • 8+ years of experience in data engineering or ETL pipeline development
  • Experience with data ingestion, transformation, and normalization from diverse structured and unstructured sources
  • Experience deploying in cloud environments (AWS, Azure, or GCP)
  • Proficient in Python and at least one ETL framework (e.g., Airflow, NiFi, Luigi)
  • Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL)
  • Familiarity with version control tools (e.g., Git) and collaborative DevOps practices
  • Ability to work in cross-functional teams alongside analysts, developers, and IO planners
  • Strong documentation, communication, and troubleshooting skills
  • Active TS/SCI security clearance
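The qualifications above stress normalizing diverse structured and semi-structured sources into one schema. A minimal sketch of that pattern follows; the source names, formats, and fields are invented for illustration and say nothing about the actual APOLLO/OPIAS data model.

```python
# Normalization sketch: map raw records from heterogeneous sources
# (JSON feed, delimited flat file) into a single common schema.
import json

def normalize(source, payload):
    """Map one raw record from a named source into the common schema."""
    if source == "social":   # semi-structured JSON payload
        d = json.loads(payload)
        return {"source": source, "author": d["user"],
                "text": d["msg"], "ts": d["time"]}
    if source == "csvfeed":  # flat pipe-delimited line
        author, text, ts = payload.split("|")
        return {"source": source, "author": author, "text": text, "ts": ts}
    raise ValueError(f"unknown source: {source}")

records = [
    normalize("social",
              '{"user": "u1", "msg": "hello", "time": "2025-07-25T00:00:00Z"}'),
    normalize("csvfeed", "u2|hi there|2025-07-25T01:00:00Z"),
]
print(sorted(r["author"] for r in records))  # ['u1', 'u2']
```

In an orchestrated pipeline (e.g., Airflow or NiFi, as the listing mentions), each source would get its own ingestion task feeding a shared normalization step like this one.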

Preferred Qualifications

  • Master’s degree in a technical discipline
  • Experience supporting Information Operations, PSYOP/MISO, or WebOps
  • Experience with data lake architecture, graph databases (e.g., Neo4j), or NoSQL stores (e.g., MongoDB)
  • Experience building pipelines that support real-time alerting, trend analysis, and influence mapping
  • Proficiency with data visualization frameworks (e.g., Kibana, Grafana, Plotly, or D3.js)
  • Familiarity with OSINT data platforms (e.g., Babel Street, Echosec, Talkwalker, Pulsar, Meltwater, Maltego)
  • Familiarity with containerized environments (Docker, Kubernetes)
  • Understanding of foreign language datasets or multilingual processing (NLP/NLU in Arabic, Russian, Chinese, etc.)
  • Background in API integration with social media platforms or dark web forums

EIO2024

Original Posting

July 25, 2025

For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.

Pay Range

$104,650.00 - $189,175.00

The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting

Referrals increase your chances of interviewing at Leidos by 2x

Sign in to set job alerts for “Data Engineer” roles. Data Security Engineer (HSM PKI) - Qatar (Onsite) location Software Engineer (Python/Linux/Packaging) Data Governance Analyst (Nationals Only) Python and Kubernetes Software Engineer - Data, AI/ML & Analytics Software Engineer - Python - Container Images Software Engineer - Python - Container Images Software Engineer - Python - Container Images Software Engineer (Infrastructure), International Public Sector Junior Software Engineer - Cross-platform C++ - Multipass Distributed Systems Software Engineer, Python / Go Software Engineer (Product), International Public Sector Software Engineer (Forward Deployed), International Public Sector

We’re unlocking community knowledge in a new way. Experts add insights directly into each article, started with the help of AI.

#J-18808-Ljbffr
This advertiser has chosen not to accept applicants from your region.

Data Engineer

Doha, Doha Leidos

Posted today

Job Viewed

Tap Again To Close

Job Description

Join to apply for the

Data Engineer

role at

Leidos Join to apply for the

Data Engineer

role at

Leidos This range is provided by Leidos. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more. Base pay range

$104,650.00/yr - $89,175.00/yr Description

Job Description

Leidos National Security Sector (NSS) is seeking a highly experienced and skilled mission-driven Data Engineer to support APOLLO/OPIAS in Tampa, FL.

This role provides mission-critical support to Joint and Special Operations Forces (SOF) by designing, building, and maintaining scalable data infrastructure and pipelines that enable the planning, execution, and assessment of influence operations within the Irregular Warfare (IW) and Operations in the Information Environment (OIE). With expertise in cloud platforms, ETL frameworks, and both structured and semi-structured data, the data engineer ensures seamless ingestion, transformation, and normalization of diverse data sources—including social media, OSINT, SIGINT, cyber, and psychological operations platforms. The position will be responsible for providing OIE planners, analysts, and commanders with timely, mission-relevant data access for real-time alerting, influence mapping, and trend analysis.

This position is on a future contract pending award announcement.

Possible locations for this position are as follows:

MacDill (Tampa, FL) Al Udeid (Qatar) Fort Meade (Maryland) Northcom (Colorado Springs, CO) Camp Humphreys (Korea) Arifjan (Kuwait) Joint Base Pearl Harbor-Hickam (Hawaii) Fort Eisenhower (Georgia) Offutt AFB (Omaha, NE) Naval Operating Base Norfolk (Virginia) Southcom (Doral, FL) JB San Antonio (Texas) Stuttgart (Germany) Vicenza (Italy) Tyndall AFB (Florida)

Key Responsibilities

Conduct analysis of structured and semi-structured data sets to identify the effective integration for mission use. Design, build, and maintain the data infrastructure and pipelines that support the planning, execution, and assessment of influence operations. Review existing and emerging technical capabilities and offer recommendations on its potential value to enables OIE planners, analysts, and commanders to access, analyze, and operationalize large-scale datasets—often derived from social media, open-source intelligence (OSINT), cyber, SIGINT, or psychological operations platforms.

Basic Qualifications

Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field 8+ years of experience in data engineering or ETL pipeline development Experience with data ingestion, transformation, and normalization from diverse structured and unstructured sources Experience deploying in cloud environments (AWS, Azure, or GCP) Proficient in Python and at least one ETL framework (e.g., Airflow, NiFi, Luigi) Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL) Familiarity with version control tools (e.g., Git) and collaborative DevOps practices Ability to work in cross-functional teams alongside analysts, developers, and IO planners Strong documentation, communication, and troubleshooting skills Active TS/SCI security clearance

Preferred Qualifications

Master’s degree in a technical discipline Experience supporting Information Operations, PSYOP/MISO, or WebOps Experience with data lake architecture, graph databases (e.g., Neo4j), or NoSQL stores (e.g., MongoDB) Experience building pipelines that support real-time alerting, trend analysis, and influence mapping Proficiency with data visualization frameworks (e.g., Kibana, Grafana, Plotly, or D3.js) Familiarity with OSINT data platforms (e.g., Babel Street, Echosec, Talkwalker, Pulsar, Meltwater, Maltego) Familiarity with containerized environments (Docker, Kubernetes) Understanding of foreign language datasets or multilingual processing (NLP/NLU in Arabic, Russian, Chinese, etc.) Background in API integration with social media platforms or dark web forums

EIO2024

Original Posting

July 25, 2025

For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.

Pay Range

Pay Range $104,650 00 - 189,175.00

The Leidos pay range for this job level is a general guideline onlyand not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law. Seniority level

Seniority level Mid-Senior level Employment type

Employment type Full-time Job function

Job function Information Technology Industries IT Services and IT Consulting Referrals increase your chances of interviewing at Leidos by 2x Sign in to set job alerts for “Data Engineer” roles.

Data Security Engineer (HSM PKI) - Qatar (Onsite) location

Software Engineer (Python/Linux/Packaging)

Data Governance Analyst (Nationals Only)

Python and Kubernetes Software Engineer - Data, AI/ML & Analytics

Software Engineer - Python - Container Images

Software Engineer - Python - Container Images

Software Engineer - Python - Container Images

Software Engineer (Infrastructure), International Public Sector

Junior Software Engineer - Cross-platform C++ - Multipass

Distributed Systems Software Engineer, Python / Go

Software Engineer (Product), International Public Sector

Software Engineer (Forward Deployed), International Public Sector


Data Engineer

Doha, Doha Id8media

Posted 5 days ago


Job Description

The Data Engineer handles the design, development, and maintenance of data pipelines, ETL processes, and database management to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.

Responsibilities and Duties

Design, develop, and maintain data pipelines, ETL processes, and database systems to support AI and data science initiatives.
Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
Implement data governance, security, and regulatory standards in all data engineering activities.
Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
Stay updated with the latest advancements in data engineering technologies and best practices.
Provide support and guidance to other team members as needed.
Prepare and present data engineering reports and documentation to senior management and stakeholders.
Participate in project planning and contribute to the development of project timelines and deliverables.
Perform other duties relevant to the job as assigned by the Sr. Data Engineer or senior management.

Requirements

Bachelor’s degree in Data Engineering, Computer Science, or a related field
Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred
Minimum of 3 years of experience in data engineering or related fields
Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products
Strong programming skills in languages such as Python, Java, or SQL
Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka)
Excellent problem-solving and analytical skills
Strong communication and interpersonal skills
Attention to detail and commitment to quality
In-depth understanding of data engineering principles, ETL processes, and database management
Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
Knowledge of data governance, security, and regulatory standards
Ability to manage multiple tasks and prioritize effectively
Strong attention to detail and commitment to delivering high-quality work
Ability to work independently and as part of a team
Programming languages (e.g., Python, Java, SQL)
Data engineering tools and frameworks (e.g., Apache Spark, Kafka)
Data management systems (e.g., SQL, NoSQL databases)
Collaboration and communication tools (e.g., Slack, Microsoft Teams)


Data Engineer

Doha, Doha Arab Solutions

Posted 9 days ago


Job Description

Role: Data Engineer
Location: Doha, Qatar
Position Type: Full-Time / Onsite / Contract, 1 year renewable
Language requirement: English is required; Arabic is preferred but not mandatory.

Role Description

The Data Engineer is responsible for building, managing, and optimizing data pipelines and ensuring data flow from source to destination. This role requires expertise in data architecture, data lakehouse, data warehousing, and data integration. You will work closely with data architects, data analysts, and business system owners to deliver high-quality data ingestion solutions that meet business needs.

Key Responsibilities

Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading (ETL) data from various sources.
Automate data workflows to ensure data is processed consistently and reliably.
Design and implement scalable and secure data solutions using Azure analytics services, Azure SQL Database, and/or Databricks. OAC experience is desired but not necessary.
Develop and maintain data storage solutions, including data warehouses, data lakes, and databases, ensuring optimal performance and cost efficiency.
Implement and maintain data quality checks to ensure data accuracy and consistency.
Implement security measures to protect sensitive data, including encryption and access control.
Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
Monitor and optimize the performance of data pipelines, databases, and queries.

Qualifications

Education:

Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.

Must Have

5+ years of experience in data engineering.
Proven hands-on experience in data pipeline design and development on the Azure cloud platform.
5+ years of experience developing solutions using Azure Data Factory and Databricks.
Proficiency in at least one scripting language such as Python, PowerShell, JavaScript, or Scala.

Nice To Have

Previous experience with Informatica IICS data governance or IDQ is a strong plus.
MS Power BI development experience is good to have.
Experience with Oracle Analytics Cloud is a strong plus.


Data Engineer

Doha, Doha Sygmetiv Business Solutions

Posted 10 days ago


Job Description

We are seeking a skilled Informatica Data Engineer to join our team. The successful candidate will have a minimum of five years of experience in data engineering, with expertise in the Informatica suite, including Informatica Data Governance and Data Quality. The role involves designing, developing, and optimizing data integration workflows, ensuring high-quality data standards, and supporting data governance initiatives.

The Role

You will be responsible for:

Design, develop, and maintain ETL processes using Informatica PowerCenter and other Informatica tools.
Implement and manage data quality processes using Informatica Data Quality (IDQ).
Support data governance initiatives by leveraging Informatica Data Governance solutions.
Develop data pipelines and workflows to integrate data from multiple sources.
Optimize the performance of data integration processes and troubleshoot issues.
Collaborate with business analysts and data architects to understand data requirements.
Ensure compliance with data governance and security policies.
Perform data profiling, cleansing, and enrichment to improve data accuracy and reliability.
Document technical solutions and maintain best practices for data engineering.
Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.

Ideal Profile

Minimum of 5 years of experience in data engineering using the Informatica suite.
Hands-on experience with Informatica Data Management Cloud (IDMC), Informatica Data Quality (IDQ), and Informatica MDM.
Strong knowledge of ETL development, data integration, and data transformation techniques.
Experience with SQL and relational databases such as Oracle, SQL Server, or PostgreSQL.
Familiarity with cloud platforms such as Azure or Oracle Cloud.
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.

What's on Offer?

Work within a company with a solid track record of success.
Work alongside and learn from best-in-class talent.
Join a well-known brand within IT Services.

 
