41 Data Engineering jobs in Qatar
Engineering Manager - Data Platform
Posted 7 days ago
Job Description
Canonical is building a comprehensive suite of multi-cloud and on-premise data solutions for the enterprise. We want to make it easy to operate any database on any cloud, or on premise. The data platform team covers the full range of data stores and data technologies, spanning from big data, NoSQL, cache-layer capabilities, and analytics; all the way to structured SQL engines like Postgres and MySQL. We aim to deliver fault-tolerant mission-critical distributed systems, and the world's best data platform.
We are looking for technical Engineering Managers to lead teams focused on Big Data and MySQL databases. We write code in Python and encode modern operational practices for data applications at scale on Kubernetes and cloud machines.
Location: This role can be filled in any European, Middle Eastern, African, or American region or time zone.
What your day will look like
- You will lead a team building scalable data solutions for Kubernetes and cloud machines
- You will hire, coach, mentor, provide feedback, and lead your team by example
- You will demonstrate sound engineering skill by directly contributing code when needed
- Effectively set and manage expectations with other engineering teams, senior management, and external stakeholders
- Advocate modern, agile software development practices
- Develop and evangelize great engineering and organizational practices
- Ensure that your team delivers excellent products that users love by maintaining a culture of quality and engineering excellence
- Grow a healthy, collaborative engineering culture aligned with the company's values.
- Be an active part of the leadership team and collaborate with other leaders in the organization
- Work from home with global travel twice yearly for internal events of one or two weeks' duration
What we are looking for in you
- A software engineering background, preferably with Python and Golang experience
- Experience running in production and at scale, preferably Big Data or MySQL
- Excellent judgement about people - their motivations, abilities, developmental needs, and prospects for success
- Proven ability to build high-quality, open-source software
- A proven ability to drive good engineering practices around performance and quality
- An open-minded attitude to new technologies and the drive to push the boundaries of what is possible
- The ambition to build products that improve how people operate software and infrastructure everywhere
- Love developing and growing people and have a track record of doing it
- Knowledgeable and passionate about software development
Additional Skills That You Might Also Bring
- Specialist knowledge in one or more of Spark, Superset, MySQL, or similar
- Prior experience working with open source and a will to build products with the community
What we offer you
Your base pay will depend on various factors including your geographical location, level of experience, knowledge and skills. In addition to the benefits above, certain roles are also eligible for additional benefits and rewards including annual bonuses and sales incentives based on revenue or utilisation. Our compensation philosophy is to ensure equity right across our global workforce.
In addition to a competitive base pay, we provide all team members with additional benefits, which reflect our values and ideals. Please note that additional benefits may apply depending on the work location and, for more information on these, please ask your Talent Partner.
- Fully remote working environment - we've been working remotely since 2004!
- Personal learning and development budget of 2,000 USD per annum
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Parental Leave
- Employee Assistance Programme
- Opportunity to travel to new locations to meet colleagues at 'sprints'
- Priority Pass for travel and travel upgrades for long haul company events
About Canonical
Canonical is a pioneering tech firm that is at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open source projects and the platform for AI, IoT and the cloud, we are changing the world on a daily basis. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence - in order to succeed, we need to be the best at what we do.
Canonical has been a remote-first company since its inception in 2004. Work at Canonical is a step into the future, and will challenge you to think differently, work smarter, learn new skills, and raise your game. Canonical provides a unique window into the world of 21st-century digital business.
Canonical is an equal opportunity employer
We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background create a better work environment and better products. Whatever your identity we will give your application fair consideration.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Engineering and Information Technology
- Industries: Software Development
Data Engineer
Posted 3 days ago
Job Description
The Data Engineer handles the design, development, and maintenance of data pipelines, ETL processes, and database management to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.
Responsibilities and Duties
- Design, develop, and maintain data pipelines, ETL processes, and database systems to support AI and data science initiatives.
- Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
- Implement data governance, security, and regulatory standards in all data engineering activities.
- Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
- Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
- Stay updated with the latest advancements in data engineering technologies and best practices.
- Provide support and guidance to other team members as needed.
- Prepare and present data engineering reports and documentation to senior management and stakeholders.
- Participate in project planning and contribute to the development of project timelines and deliverables.
- Perform other duties relevant to the job as assigned by the Sr. Data Engineer or senior management.
Requirements
- Bachelor’s degree in Data Engineering, Computer Science, or a related field
- Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred
- Minimum of 3 years of experience in data engineering or related fields
- Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products
- Strong programming skills in languages such as Python, Java, or SQL
- Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka)
- Excellent problem-solving and analytical skills
- Strong communication and interpersonal skills
- Attention to detail and commitment to quality
- In-depth understanding of data engineering principles, ETL processes, and database management
- Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
- Knowledge of data governance, security, and regulatory standards
- Ability to manage multiple tasks and prioritize effectively
- Strong attention to detail and commitment to delivering high-quality work
- Ability to work independently and as part of a team
- Programming languages (e.g., Python, Java, SQL)
- Data engineering tools and frameworks (e.g., Apache Spark, Kafka); an illustrative pipeline sketch follows this list
- Data management systems (e.g., SQL, NoSQL databases)
- Collaboration and communication tools (e.g., Slack, Microsoft Teams)
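As an aside to the requirements above, here is a minimal, hypothetical PySpark sketch of the kind of extract-transform-load pipeline this role describes. It is illustrative only: the storage paths and column names (event_id, event_ts) are invented placeholders, not part of the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical locations; replace with real storage paths.
SOURCE_PATH = "s3://example-bucket/raw/events/"
TARGET_PATH = "s3://example-bucket/curated/events/"

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw JSON events from the landing zone.
raw = spark.read.json(SOURCE_PATH)

# Transform: deduplicate, enforce types, and drop records failing a basic quality check.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_id").isNotNull())
)

# Load: write partitioned Parquet for downstream analytics and AI workloads.
clean.write.mode("overwrite").partitionBy("event_date").parquet(TARGET_PATH)
```

In practice a job like this would sit behind an orchestrator and carry explicit data-quality and governance checks, in line with the responsibilities listed above.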
Data Engineer
Posted 9 days ago
Job Description
Role: Data Engineer
Location: Doha, Qatar
Position Type: Full-Time / Onsite / Contract 1 year renewable
Language requirement: English language is required; Arabic is preferred but not mandatory.
Role Description
The Data Engineer is responsible for building, managing, and optimizing data pipelines and ensuring data flow from source to destination. This role requires expertise in data architecture, data lakehouse, data warehousing, and data integration. You will work closely with data architects, data analysts, and business system owners to deliver high-quality data ingestion solutions that meet business needs.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading (ETL) data from various sources.
- Automate data workflows to ensure data is processed consistently and reliably.
- Design and implement scalable and secure data solutions using Azure analytics services, Azure SQL Database, and/or Databricks (an illustrative sketch follows this list). OAC experience is desired but not necessary.
- Develop and maintain data storage solutions, including data warehouses, data lakes, and databases, ensuring optimal performance and cost efficiency.
- Implement and maintain data quality checks to ensure data accuracy and consistency.
- Implement security measures to protect sensitive data, including encryption and access control.
- Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
- Monitor and optimize the performance of data pipelines, databases, and queries.
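To make the Azure-oriented responsibilities above concrete, here is a short, illustrative PySpark snippet of the kind that might run in a Databricks notebook, for example triggered from an Azure Data Factory pipeline. Every name in it (the storage account, container paths, and columns) is a hypothetical placeholder, not something specified by the posting.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks this returns the notebook's existing session.
spark = SparkSession.builder.getOrCreate()

# Hypothetical ADLS Gen2 locations; the storage account and containers are placeholders.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/sales_daily/"

# Extract: read raw CSV drops from the landing container.
orders = spark.read.option("header", "true").csv(raw_path)

# Transform: basic quality check plus a daily aggregate for reporting.
daily = (
    orders.filter(F.col("order_id").isNotNull())
          .withColumn("order_date", F.to_date("order_timestamp"))
          .groupBy("order_date", "region")
          .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
)

# Load: write to Delta so the curated layer stays cheap to query and easy to version.
daily.write.format("delta").mode("overwrite").save(curated_path)
```

An orchestration layer such as Azure Data Factory would typically just schedule this notebook or an equivalent Databricks job.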
Qualifications
Education: Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
Must Have
- 5+ years of experience in data engineering.
- Proven hands-on experience in data pipeline design and development on Azure cloud platform.
- 5+ years' experience developing solutions using Azure Data Factory and Databricks.
- Proficiency in at least one scripting language such as Python, PowerShell, JavaScript, or Scala.
Nice To Have
- Previous experience with Informatica IICS data governance or IDQ is a strong plus.
- MS Power BI Development experience is good to have.
- Experience with Oracle Analytics Cloud is a strong plus.
Data Engineer
Posted 10 days ago
Job Description
We are seeking a skilled Informatica Data Engineer to join our team. The successful candidate will have a minimum of five years of experience in data engineering, with expertise in the Informatica suite, including Informatica Data Governance and Data Quality. The role involves designing, developing, and optimizing data integration workflows, ensuring high-quality data standards, and supporting data governance initiatives.
The Role
You will be responsible for:
- Design, develop, and maintain ETL processes using Informatica PowerCenter and other Informatica tools.
- Implement and manage data quality processes using Informatica Data Quality (IDQ).
- Support data governance initiatives by leveraging Informatica Data Governance solutions.
- Develop data pipelines and workflows to integrate data from multiple sources.
- Optimize performance of data integration processes and troubleshoot issues.
- Collaborate with business analysts and data architects to understand data requirements.
- Ensure compliance with data governance and security policies.
- Perform data profiling, cleansing, and enrichment to improve data accuracy and reliability.
- Document technical solutions and maintain best practices for data engineering.
- Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
Ideal Profile
- Minimum of 5 years of experience in data engineering using the Informatica suite.
- Hands-on experience with Informatica Data Management Cloud (IDMC), Informatica Data Quality (IDQ) and Informatica MDM.
- Strong knowledge of ETL development, data integration, and data transformation techniques.
- Experience with SQL and relational databases such as Oracle, SQL Server, or PostgreSQL.
- Familiarity with cloud platforms such as Azure or Oracle Cloud.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
What's on Offer?
- Work within a company with a solid track record of success.
- Work alongside & learn from best-in-class talent.
- Join a well-known brand within IT Services.
Data Engineer
Posted 10 days ago
Job Description
This range is provided by Leidos. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.
Base pay range: $104,650.00/yr - $189,175.00/yr
Leidos National Security Sector (NSS) is seeking a highly experienced and skilled mission-driven Data Engineer to support APOLLO/OPIAS in Tampa, FL. This role provides mission-critical support to Joint and Special Operations Forces (SOF) by designing, building, and maintaining scalable data infrastructure and pipelines that enable the planning, execution, and assessment of influence operations within the Irregular Warfare (IW) and Operations in the Information Environment (OIE). With expertise in cloud platforms, ETL frameworks, and both structured and semi-structured data, the data engineer ensures seamless ingestion, transformation, and normalization of diverse data sources—including social media, OSINT, SIGINT, cyber, and psychological operations platforms. The position will be responsible for providing OIE planners, analysts, and commanders with timely, mission-relevant data access for real-time alerting, influence mapping, and trend analysis. This position is on a future contract pending award announcement.
Possible locations for this position are as follows:
- MacDill (Tampa, FL)
- Al Udeid (Qatar)
- Fort Meade (Maryland)
- Northcom (Colorado Springs, CO)
- Camp Humphreys (Korea)
- Arifjan (Kuwait)
- Joint Base Pearl Harbor-Hickam (Hawaii)
- Fort Eisenhower (Georgia)
- Offutt AFB (Omaha, NE)
- Naval Operating Base Norfolk (Virginia)
- Southcom (Doral, FL)
- JB San Antonio (Texas)
- Stuttgart (Germany)
- Vicenza (Italy)
- Tyndall AFB (Florida)
Key Responsibilities
- Conduct analysis of structured and semi-structured data sets to identify effective integration approaches for mission use.
- Design, build, and maintain the data infrastructure and pipelines that support the planning, execution, and assessment of influence operations.
- Review existing and emerging technical capabilities and offer recommendations on their potential value in enabling OIE planners, analysts, and commanders to access, analyze, and operationalize large-scale datasets, often derived from social media, open-source intelligence (OSINT), cyber, SIGINT, or psychological operations platforms.
Basic Qualifications
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field
- 8+ years of experience in data engineering or ETL pipeline development
- Experience with data ingestion, transformation, and normalization from diverse structured and unstructured sources
- Experience deploying in cloud environments (AWS, Azure, or GCP)
- Proficient in Python and at least one ETL framework (e.g., Airflow, NiFi, Luigi); an illustrative DAG sketch follows this list
- Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL)
- Familiarity with version control tools (e.g., Git) and collaborative DevOps practices
- Ability to work in cross-functional teams alongside analysts, developers, and IO planners
- Strong documentation, communication, and troubleshooting skills
- Active TS/SCI security clearance
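Since the basic qualifications name Airflow as an example ETL framework, here is a minimal, illustrative Airflow 2.x DAG skeleton for an extract-transform-load flow. The DAG id, task names, and functions are hypothetical; they are not taken from the posting or any Leidos system.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull raw records from a hypothetical source (API, file drop, message queue).
    return ["raw record"]


def transform():
    # Normalize and enrich records into the shape downstream analysis expects.
    pass


def load():
    # Persist the normalized records to a hypothetical relational store.
    pass


with DAG(
    dag_id="example_ingest_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@hourly",  # Airflow 2.4+ argument; older 2.x releases use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

A production pipeline of the kind described here would typically add retries, alerting, and connections managed through Airflow's secrets backend rather than hard-coded values.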
Preferred Qualifications
- Master’s degree in a technical discipline
- Experience supporting Information Operations, PSYOP/MISO, or WebOps
- Experience with data lake architecture, graph databases (e.g., Neo4j), or NoSQL stores (e.g., MongoDB)
- Experience building pipelines that support real-time alerting, trend analysis, and influence mapping
- Proficiency with data visualization frameworks (e.g., Kibana, Grafana, Plotly, or D3.js)
- Familiarity with OSINT data platforms (e.g., Babel Street, Echosec, Talkwalker, Pulsar, Meltwater, Maltego)
- Familiarity with containerized environments (Docker, Kubernetes)
- Understanding of foreign language datasets or multilingual processing (NLP/NLU in Arabic, Russian, Chinese, etc.)
- Background in API integration with social media platforms or dark web forums
Original Posting
July 25, 2025
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.
Pay Range
$104,650.00 - $189,175.00
The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.
- Seniority level: Mid-Senior level
- Employment type: Full-time
- Job function: Information Technology
- Industries: IT Services and IT Consulting