135 Python Engineer jobs in Qatar
Lead Python Software Engineer, Commercial Systems
Posted 7 days ago
Job Description
Canonical is a leading provider of open-source software and operating systems for global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1200+ colleagues in more than 80 countries and very few office-based roles. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution.
The company is founder led, profitable and growing.
We are hiring a Lead Python Software Engineer who strives for the highest engineering quality, seeks improvements, continuously develops their skills, and applies them at work. This is an opportunity to work with many popular software systems, integration technologies, and exciting open-source solutions.
The Commercial Systems unit comprises seven engineering teams that collaborate closely with other engineering and business teams at Canonical. The services designed, developed, and operated by the Commercial Systems unit are at the heart of Canonical's business, and Python plays an integral role in them. We are looking for Python Software Engineers for the Integrations team.
The Integrations team is responsible for automating SaaS user management and onboarding new data sources to the data mesh. The team designs, develops, and operates a Python-based solution to automate SaaS seat management and track spend across the application portfolio. Furthermore, the team integrates internal and external data sources into the data mesh using open-source ETL solutions, enabling more data-driven decisions across the organization.
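As a rough illustration of the kind of seat-management automation described above, here is a minimal Python sketch. The admin endpoint, response fields, and inactivity threshold are hypothetical placeholders, not Canonical's actual tooling or any specific vendor's API.

```python
"""Minimal sketch of SaaS seat-management automation.

The endpoint URL, auth scheme, and response fields are hypothetical
placeholders; a real integration would target the vendor's admin or SCIM API.
"""
from datetime import datetime, timedelta, timezone

import requests

SAAS_ADMIN_API = "https://saas.example.invalid/api/v1/seats"  # placeholder URL
INACTIVITY_THRESHOLD = timedelta(days=90)


def find_reclaimable_seats(api_token: str) -> list[dict]:
    """Return seats whose user has been inactive longer than the threshold."""
    response = requests.get(
        SAAS_ADMIN_API,
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=30,
    )
    response.raise_for_status()
    now = datetime.now(timezone.utc)
    reclaimable = []
    for seat in response.json()["seats"]:  # assumed response shape
        # "last_active_at" is assumed to be an ISO-8601 timestamp with offset.
        last_active = datetime.fromisoformat(seat["last_active_at"])
        if now - last_active > INACTIVITY_THRESHOLD:
            reclaimable.append(seat)
    return reclaimable


if __name__ == "__main__":
    for seat in find_reclaimable_seats(api_token="REDACTED"):
        print(f"{seat['user_email']} inactive since {seat['last_active_at']}")
```

The same authenticated pull-filter-report shape would apply to the spend-tracking side of the portfolio.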
Location: This role will be based remotely in the EMEA region.
The role entails
- Develop engineering solutions leveraging Python
- Collaborate with colleagues on technical designs and code reviews
- Deploy and operate services developed by the team
- Depending on your seniority, coach, mentor, and offer career development feedback
- Develop and evangelize great engineering and organizational practices
What we are looking for in you
- Exceptional academic track record from both high school and university
- Undergraduate degree in a technical subject or a compelling narrative about your alternative chosen path
- Track record of going above-and-beyond expectations to achieve outstanding results
- Experience with software development in Python
- Professional written and spoken English with excellent presentation skills
- Result-oriented, with a personal drive to meet commitments
- Ability to travel internationally twice a year, for company events up to two weeks long
Nice-to-have skills
- Performance engineering and security experience
- Experience with Airbyte, Ranger, Temporal, or Trino
What we offer colleagues
We consider geographical location, experience, and performance in shaping compensation worldwide. We revisit compensation annually (and more often for graduates and associates) to ensure we recognize outstanding performance. In addition to base pay, we offer a performance-driven annual bonus or commission. We provide all team members with additional benefits, which reflect our values and ideals. We balance our programs to meet local needs and ensure fairness globally.
- Distributed work environment with twice-yearly team sprints in person
- Personal learning and development budget of USD 2,000 per year
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Maternity and paternity leave
- Employee Assistance Program
- Opportunity to travel to new locations to meet colleagues
- Priority Pass, and travel upgrades for long haul company events
About Canonical
Canonical is a pioneering tech firm at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open source projects and the platform for AI, IoT and the cloud, we are changing the world of software. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence - in order to succeed, we need to be the best at what we do. Most colleagues at Canonical have worked from home since its inception in 2004. Working here is a step into the future, and will challenge you to think differently, work smarter, learn new skills, and raise your game.
Canonical is an equal opportunity employer
We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background creates a better work environment and better products. Whatever your identity, we will give your application fair consideration.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industries: Software Development
Data Engineer
Posted 4 days ago
Job Description
The Data Engineer handles the design, development, and maintenance of data pipelines, ETL processes, and database management to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.
Responsibilities and Duties
- Design, develop, and maintain data pipelines, ETL processes, and database systems to support AI and data science initiatives.
- Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
- Implement data governance, security, and regulatory standards in all data engineering activities.
- Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
- Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
- Stay updated with the latest advancements in data engineering technologies and best practices.
- Provide support and guidance to other team members as needed.
- Prepare and present data engineering reports and documentation to senior management and stakeholders.
- Participate in project planning and contribute to the development of project timelines and deliverables.
- Perform other duties relevant to the job as assigned by the Sr. Data Engineer or senior management.
Requirements
- Bachelor’s degree in Data Engineering, Computer Science, or a related field
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred
- Minimum of 3 years of experience in data engineering or related fields
- Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products
- Strong programming skills in languages such as Python, Java, or SQL
- Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka)
- Excellent problem-solving and analytical skills
- Strong communication and interpersonal skills
- Attention to detail and commitment to quality
- In-depth understanding of data engineering principles, ETL processes, and database management
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
- Knowledge of data governance, security, and regulatory standards
- Ability to manage multiple tasks and prioritize effectively
- Strong attention to detail and commitment to delivering high-quality work
- Ability to work independently and as part of a team
- Programming languages (e.g., Python, Java, SQL)
- Data engineering tools and frameworks (e.g., Apache Spark, Kafka; see the minimal PySpark sketch after this list)
- Data management systems (e.g., SQL, NoSQL databases)
- Collaboration and communication tools (e.g., Slack, Microsoft Teams)
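The stack above centers on Python, Spark, and Kafka. As a generic illustration of the batch ETL work these requirements describe, here is a minimal PySpark sketch; the storage paths, schema, and column names are assumptions, not this employer's actual pipeline.

```python
"""Minimal PySpark batch-ETL sketch: raw CSV -> cleaned, partitioned Parquet.

Paths, column names, and types are illustrative assumptions only.
"""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw events (path and header layout are assumptions).
raw = spark.read.option("header", True).csv("s3a://example-bucket/raw/events/")

# Transform: drop malformed rows, normalise types, deduplicate on the key.
cleaned = (
    raw.dropna(subset=["event_id", "event_time"])
    .withColumn("event_time", F.to_timestamp("event_time"))
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("event_date", F.to_date("event_time"))
    .dropDuplicates(["event_id"])
)

# Load: write partitioned Parquet for downstream analytics and AI workloads.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-bucket/curated/events/"
)

spark.stop()
```

A streaming variant would swap `spark.read` for `spark.readStream` with the Kafka source, but the extract-transform-load shape stays the same.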
Data Engineer
Posted 9 days ago
Job Description
Role: Data Engineer
Location: Doha, Qatar
Position Type: Full-Time / Onsite / Contract (1 year, renewable)
Language requirement: English language is required; Arabic is preferred but not mandatory.
Role Description
The Data Engineer is responsible for building, managing, and optimizing data pipelines and ensuring data flow from source to destination. This role requires expertise in data architecture, data lakehouse, data warehousing, and data integration. You will work closely with data architects, data analysts, and business system owners to deliver high-quality data ingestion solutions that meet business needs.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading (ETL) data from various sources.
- Automate data workflows to ensure data is processed consistently and reliably.
- Design and implement scalable and secure data solutions using Azure analytics services, Azure SQL Database, and/or Databricks (see the minimal PySpark sketch after this list). OAC experience is desired but not necessary.
- Develop and maintain data storage solutions, including data warehouses, data lakes, and databases, ensuring optimal performance and cost efficiency.
- Implement and maintain data quality checks to ensure data accuracy and consistency.
- Implement security measures to protect sensitive data, including encryption and access control.
- Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
- Monitor and optimize the performance of data pipelines, databases, and queries.
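Azure Data Factory orchestration is configured in ADF itself, but the Databricks side of such a pipeline is typically a PySpark job. Below is a minimal, assumption-laden sketch of one incremental-load step using Delta Lake; the storage path, table name, and join key are hypothetical.

```python
"""Sketch of a Databricks-style incremental load into a Delta table.

The lake path, table name, and key column are assumptions for illustration.
Outside Databricks this also requires the delta-spark package.
"""
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# New batch landed by an upstream ADF copy activity (path is an assumption).
updates = spark.read.parquet(
    "abfss://landing@examplelake.dfs.core.windows.net/customers/"
)

# Upsert into the curated Delta table, keyed on customer_id.
target = DeltaTable.forName(spark, "curated.customers")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```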
Qualifications
Education: Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
Must Have
- 5+ years of experience in data engineering.
- Proven hands-on experience in data pipeline design and development on Azure cloud platform.
- 5+ years of experience developing solutions using Azure Data Factory and Databricks.
- Proficiency in at least one scripting language such as Python, PowerShell, JavaScript, or Scala.
Nice To Have
- Previous experience with Informatica IICS data governance or IDQ is a strong plus.
- MS Power BI Development experience is good to have.
- Experience with Oracle Analytics Cloud is a strong plus.
Data Engineer
Posted 11 days ago
Job Description
We are seeking a skilled Informatica Data Engineer to join our team. The successful candidate will have a minimum of five years of experience in data engineering, with expertise in the Informatica suite, including Informatica Data Governance and Data Quality. The role involves designing, developing, and optimizing data integration workflows, ensuring high-quality data standards, and supporting data governance initiatives.
The Role
You will be responsible for:
- Design, develop, and maintain ETL processes using Informatica PowerCenter and other Informatica tools.
- Implement and manage data quality processes using Informatica Data Quality (IDQ).
- Support data governance initiatives by leveraging Informatica Data Governance solutions.
- Develop data pipelines and workflows to integrate data from multiple sources.
- Optimize performance of data integration processes and troubleshoot issues.
- Collaborate with business analysts and data architects to understand data requirements.
- Ensure compliance with data governance and security policies.
- Perform data profiling, cleansing, and enrichment to improve data accuracy and reliability (illustrated in the sketch after this list).
- Document technical solutions and maintain best practices for data engineering.
- Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
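Informatica IDQ rules are built in Informatica's own tooling rather than in hand-written code, but the profiling and cleansing logic they encode can be illustrated with a small pandas sketch. The input file, column names, and validation rule below are hypothetical.

```python
"""Illustrative data profiling and cleansing in pandas.

This stands in for the kind of rules typically configured in Informatica IDQ;
the input file, column names, and email rule are hypothetical.
"""
import pandas as pd

df = pd.read_csv("customers.csv")  # assumed input extract

# Profiling: null counts, distinct counts, and duplicate keys.
profile = pd.DataFrame({"null_count": df.isna().sum(), "distinct_count": df.nunique()})
print(profile)
print("duplicate customer_id rows:", df.duplicated(subset=["customer_id"]).sum())

# Cleansing: trim and standardise, drop duplicate keys, flag invalid emails.
df["email"] = df["email"].str.strip().str.lower()
df = df.drop_duplicates(subset=["customer_id"])
df["email_valid"] = df["email"].str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)
```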
Ideal Profile
- Minimum of 5 years of experience in data engineering using the Informatica suite.
- Hands-on experience with Informatica Data Management Cloud (IDMC), Informatica Data Quality (IDQ) and Informatica MDM.
- Strong knowledge of ETL development, data integration, and data transformation techniques.
- Experience with SQL and relational databases such as Oracle, SQL Server, or PostgreSQL.
- Familiarity with cloud platforms such as Azure or Oracle Cloud.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
What's on Offer?
- Work within a company with a solid track record of success.
- Work alongside & learn from best-in-class talent.
- Join a well-known brand within IT Services.
Data Engineer
Posted 11 days ago
Job Description
Base pay range: $104,650.00/yr - $189,175.00/yr
Leidos National Security Sector (NSS) is seeking a highly experienced and skilled mission-driven Data Engineer to support APOLLO/OPIAS in Tampa, FL. This role provides mission-critical support to Joint and Special Operations Forces (SOF) by designing, building, and maintaining scalable data infrastructure and pipelines that enable the planning, execution, and assessment of influence operations within the Irregular Warfare (IW) and Operations in the Information Environment (OIE). With expertise in cloud platforms, ETL frameworks, and both structured and semi-structured data, the data engineer ensures seamless ingestion, transformation, and normalization of diverse data sources—including social media, OSINT, SIGINT, cyber, and psychological operations platforms. The position will be responsible for providing OIE planners, analysts, and commanders with timely, mission-relevant data access for real-time alerting, influence mapping, and trend analysis. This position is on a future contract pending award announcement.
Possible locations for this position are as follows:
- MacDill (Tampa, FL)
- Al Udeid (Qatar)
- Fort Meade (Maryland)
- Northcom (Colorado Springs, CO)
- Camp Humphreys (Korea)
- Arifjan (Kuwait)
- Joint Base Pearl Harbor-Hickam (Hawaii)
- Fort Eisenhower (Georgia)
- Offutt AFB (Omaha, NE)
- Naval Operating Base Norfolk (Virginia)
- Southcom (Doral, FL)
- JB San Antonio (Texas)
- Stuttgart (Germany)
- Vicenza (Italy)
- Tyndall AFB (Florida)
Key Responsibilities
- Conduct analysis of structured and semi-structured data sets to identify how they can be effectively integrated for mission use.
- Design, build, and maintain the data infrastructure and pipelines that support the planning, execution, and assessment of influence operations (a minimal pipeline sketch follows this list).
- Review existing and emerging technical capabilities and offer recommendations on their potential value in enabling OIE planners, analysts, and commanders to access, analyze, and operationalize large-scale datasets, often derived from social media, open-source intelligence (OSINT), cyber, SIGINT, or psychological operations platforms.
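The qualifications below call for Python plus an ETL framework such as Airflow, NiFi, or Luigi. As a rough illustration of the pipeline responsibilities above, here is a minimal Airflow 2.x DAG; the DAG id, schedule, and task bodies are placeholders, not the actual APOLLO/OPIAS pipeline.

```python
"""Minimal Airflow 2.x DAG sketch: ingest -> normalise -> load.

DAG id, schedule, and task bodies are placeholders for illustration only.
"""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest(**_):
    # Placeholder: pull raw records from an upstream source (e.g. an OSINT feed API).
    print("ingesting raw records")


def normalise(**_):
    # Placeholder: flatten and normalise semi-structured records.
    print("normalising records")


def load(**_):
    # Placeholder: write curated records to PostgreSQL for analysts.
    print("loading curated records")


with DAG(
    dag_id="example_influence_data_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    normalise_task = PythonOperator(task_id="normalise", python_callable=normalise)
    load_task = PythonOperator(task_id="load", python_callable=load)

    ingest_task >> normalise_task >> load_task
```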
Basic Qualifications
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field
- 8+ years of experience in data engineering or ETL pipeline development
- Experience with data ingestion, transformation, and normalization from diverse structured and unstructured sources
- Experience deploying in cloud environments (AWS, Azure, or GCP)
- Proficient in Python and at least one ETL framework (e.g., Airflow, NiFi, Luigi)
- Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with version control tools (e.g., Git) and collaborative DevOps practices
- Ability to work in cross-functional teams alongside analysts, developers, and IO planners
- Strong documentation, communication, and troubleshooting skills
- Active TS/SCI security clearance
Preferred Qualifications
- Master’s degree in a technical discipline
- Experience supporting Information Operations, PSYOP/MISO, or WebOps
- Experience with data lake architecture, graph databases (e.g., Neo4j), or NoSQL stores (e.g., MongoDB)
- Experience building pipelines that support real-time alerting, trend analysis, and influence mapping
- Proficiency with data visualization frameworks (e.g., Kibana, Grafana, Plotly, or D3.js)
- Familiarity with OSINT data platforms (e.g., Babel Street, Echosec, Talkwalker, Pulsar, Meltwater, Maltego)
- Familiarity with containerized environments (Docker, Kubernetes)
- Understanding of foreign language datasets or multilingual processing (NLP/NLU in Arabic, Russian, Chinese, etc.)
- Background in API integration with social media platforms or dark web forums
Original Posting
July 25, 2025
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range
$104,650.00 - $189,175.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industries: IT Services and IT Consulting