43 Big Data Hadoop jobs in Qatar

Big Data Engineer

Doha, Doha Arizoglobal

Posted 11 days ago

Job Description

Experience: 4-6 years

Job description:

We seek a talented Data Engineer with AI & ML knowledge to join our team. As a Data Engineer or MLOps Engineer, your primary responsibility will be to develop and integrate ML solutions focused on technology improvements. Specifically, you will work on projects that leverage AI/ML for data management efficiencies and query optimizations.

Responsibilities:

  1. Collaborate with cross-functional teams such as Data Scientists, Product Partners, and Partner Team Developers to identify Big Data and query (Spark, Hive SQL, BigQuery, SQL) tuning opportunities that can be addressed with machine learning and generative AI (see the sketch after this list).
  2. Write clean, high-performance, high-quality, maintainable code.
  3. Create backend applications using Python, Docker, Google Cloud & in-house ML frameworks to orchestrate end-to-end applications.
  4. Design and develop Big Data Engineering Solutions & generative AI Applications ensuring scalability, efficiency, and maintainability of such solutions.
  5. Implement prompt engineering techniques to fine-tune and enhance LLMs for better performance and application-specific needs.
  6. Stay abreast of the latest advancements in the field of Generative AI Application Development and actively contribute to the research and development of new Generative AI Applications.
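
For a sense of what the query-tuning work above can look like in practice, here is a minimal sketch using the BigQuery Python client's dry-run mode, which reports how much data a query would scan without executing it. The project, dataset, table, and column names are hypothetical, and credentials are assumed to be configured in the environment.

```python
from google.cloud import bigquery

# Assumes credentials are available (e.g., via GOOGLE_APPLICATION_CREDENTIALS).
client = bigquery.Client()

# dry_run=True asks BigQuery to plan the query and report its cost without running it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

query = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.sales.orders`        -- hypothetical table
    WHERE order_date >= '2024-01-01'      -- filter on the partitioning column to prune data
    GROUP BY customer_id
"""

job = client.query(query, job_config=job_config)
print(f"Estimated bytes processed: {job.total_bytes_processed:,}")
```

Comparing the dry-run estimate before and after a change such as a partition filter or clustering is a simple way to quantify a tuning improvement.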

Requirements:

  1. Proven experience working as a Big Data & MLOps Engineer, with a focus on Python, Google Cloud, Spark, Spark SQL, BigQuery, and Generative AI Applications.
  2. Deep understanding and experience in tuning Dataproc, BigQuery, and Spark Applications.
  3. Solid knowledge of software engineering best practices, including version control systems (e.g., Git), code reviews, and testing methodologies.


Data Engineer

Doha, Doha Id8media

Posted 4 days ago

Job Description

The Data Engineer handles the design, development, and maintenance of data pipelines, ETL processes, and database management to support AI and data science initiatives. This role involves ensuring data quality, scalability, and performance across all data engineering activities.

Responsibilities and Duties
  • Design, develop, and maintain data pipelines, ETL processes, and database systems to support AI and data science initiatives.
  • Collaborate with data scientists, AI/ML engineers, and other stakeholders to understand data requirements and ensure data availability and quality.
  • Implement data governance, security, and regulatory standards in all data engineering activities.
  • Optimize data pipelines and processes for scalability, performance, and cost-efficiency.
  • Monitor and ensure the performance and reliability of data systems, identifying and resolving issues as needed.
  • Stay updated with the latest advancements in data engineering technologies and best practices.
  • Provide support and guidance to other team members as needed.
  • Prepare and present data engineering reports and documentation to senior management and stakeholders.
  • Participate in project planning and contribute to the development of project timelines and deliverables.
  • Perform other duties relevant to the job as assigned by the Sr. Data Engineer or senior management.
Requirements
  • Bachelor’s degree in Data Engineering, Computer Science, or a related field
  • Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Big Data – Specialty) are preferred
  • Minimum of 3 years of experience in data engineering or related fields
  • Experience in designing and implementing data pipelines, ETL processes, and database systems for AI or technology-focused products
  • Strong programming skills in languages such as Python, Java, or SQL
  • Proficiency in data engineering tools and frameworks (e.g., Apache Spark, Kafka); see the sketch after this list
  • Excellent problem-solving and analytical skills
  • Strong communication and interpersonal skills
  • Attention to detail and commitment to quality
  • In-depth understanding of data engineering principles, ETL processes, and database management
  • Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services
  • Knowledge of data governance, security, and regulatory standards
  • Ability to manage multiple tasks and prioritize effectively
  • Strong attention to detail and commitment to delivering high-quality work
  • Ability to work independently and as part of a team
  • Programming languages (e.g., Python, Java, SQL)
  • Data engineering tools and frameworks (e.g., Apache Spark, Kafka)
  • Data management systems (e.g., SQL, NoSQL databases)
  • Collaboration and communication tools (e.g., Slack, Microsoft Teams)
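
To make the Spark and Kafka items above concrete, the following is a minimal, hypothetical sketch of a streaming ingestion job: Spark Structured Streaming reads JSON events from a Kafka topic and writes them to Parquet. The broker address, topic, schema, and paths are placeholders, and the spark-sql-kafka connector package must be available to the Spark session.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

# Requires the spark-sql-kafka-0-10 connector to be available to the session.
spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Hypothetical event schema.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                      # placeholder topic
       .load())

# Kafka delivers bytes; cast the value and parse the JSON payload.
events = (raw
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

# Continuous Parquet sink with checkpointing for fault-tolerant, resumable output.
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/lake/events")              # placeholder sink path
         .option("checkpointLocation", "/data/checkpoints/events")
         .start())

query.awaitTermination()
```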

Data Engineer

Doha, Doha Arab Solutions

Posted 9 days ago

Job Description


Role: Data Engineer

Location: Doha, Qatar

Position Type: Full-Time / Onsite / Contract (1 year, renewable)

Language requirement: English language is required; Arabic is preferred but not mandatory.

Role Description

The Data Engineer is responsible for building, managing, and optimizing data pipelines and ensuring data flow from source to destination. This role requires expertise in data architecture, data lakehouse, data warehousing, and data integration. You will work closely with data architects, data analysts, and business system owners to deliver high-quality data ingestion solutions that meet business needs.

Key Responsibilities

  1. Design, develop, and maintain scalable data pipelines for extracting, transforming, and loading (ETL) data from various sources.
  2. Automate data workflows to ensure data is processed consistently and reliably.
  3. Design and implement scalable and secure data solutions using Azure analytics services, Azure SQL Database, and/or Databricks. OAC experience is desired but not necessary.
  4. Develop and maintain data storage solutions, including data warehouses, data lakes, and databases, ensuring optimal performance and cost efficiency.
  5. Implement and maintain data quality checks to ensure data accuracy and consistency.
  6. Implement security measures to protect sensitive data, including encryption and access control.
  7. Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
  8. Monitor and optimize the performance of data pipelines, databases, and queries.

Qualifications

Education: Bachelor’s degree in Computer Science, Information Technology, Data Engineering, or a related field.

Must Have

  • 5+ years of experience in data engineering.
  • Proven hands-on experience in data pipeline design and development on Azure cloud platform.
  • 5+ years of experience developing solutions using Azure Data Factory and Databricks (see the sketch after this list).
  • Proficiency in at least one scripting language such as Python, PowerShell, JavaScript, or Scala.
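
As a rough illustration of the Azure Data Factory plus Databricks stack named above, the sketch below shows the kind of PySpark job an ADF pipeline might trigger on a Databricks cluster: read raw Parquet from a data lake path, apply simple data-quality rules, and write a Delta table. The storage account, containers, and columns are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, current_timestamp

# On Databricks a SparkSession named `spark` already exists; this keeps the sketch standalone.
spark = SparkSession.builder.appName("adf-triggered-etl-sketch").getOrCreate()

# Hypothetical ADLS Gen2 locations (raw and curated zones).
source_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"
target_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_clean/"

orders = spark.read.parquet(source_path)

cleaned = (orders
           .dropDuplicates(["order_id"])                   # basic data-quality rule
           .filter(col("amount") > 0)                      # drop obviously invalid rows
           .withColumn("ingested_at", current_timestamp()))

# Delta Lake is the default table format on Databricks; overwrite keeps the sketch simple.
cleaned.write.format("delta").mode("overwrite").save(target_path)
```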

Nice To Have

  • Previous experience with Informatica IICS data governance or IDQ is a strong plus.
  • MS Power BI Development experience is good to have.
  • Experience with Oracle Analytics Cloud is a strong plus.

Data Engineer

Doha, Doha Sygmetiv Business Solutions

Posted 11 days ago

Job Description

We are seeking a skilled Informatica Data Engineer to join our team. The successful candidate will have a minimum of five years of experience in data engineering, with expertise in the Informatica suite, including Informatica Data Governance and Data Quality. The role involves designing, developing, and optimizing data integration workflows, ensuring high-quality data standards, and supporting data governance initiatives.

The Role

You will:

  1. Design, develop, and maintain ETL processes using Informatica PowerCenter and other Informatica tools.
  2. Implement and manage data quality processes using Informatica Data Quality (IDQ).
  3. Support data governance initiatives by leveraging Informatica Data Governance solutions.
  4. Develop data pipelines and workflows to integrate data from multiple sources.
  5. Optimize performance of data integration processes and troubleshoot issues.
  6. Collaborate with business analysts and data architects to understand data requirements.
  7. Ensure compliance with data governance and security policies.
  8. Perform data profiling, cleansing, and enrichment to improve data accuracy and reliability (see the sketch after this list).
  9. Document technical solutions and maintain best practices for data engineering.
  10. Identify, troubleshoot, and resolve issues related to data pipelines, storage, and processing.
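
Informatica IDQ performs profiling through its own tooling, but the underlying idea can be shown with plain SQL. The sketch below uses Python's standard-library sqlite3 purely so it runs anywhere, and computes the kind of null-rate and duplicate-key metrics a profiling pass reports; the table and data are made up for the example.

```python
import sqlite3

# In-memory database with a tiny synthetic customer table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT, country TEXT);
    INSERT INTO customers VALUES
        (1, 'a@example.com', 'QA'),
        (2, NULL,            'QA'),
        (3, 'c@example.com', NULL),
        (3, 'c@example.com', NULL);   -- duplicate id
""")

# Null rate per column: a basic completeness metric.
total, null_email, null_country = conn.execute("""
    SELECT COUNT(*),
           SUM(CASE WHEN email   IS NULL THEN 1 ELSE 0 END),
           SUM(CASE WHEN country IS NULL THEN 1 ELSE 0 END)
    FROM customers
""").fetchone()
print(f"email null rate:   {null_email / total:.0%}")
print(f"country null rate: {null_country / total:.0%}")

# Duplicate keys: a basic uniqueness metric.
dupes = conn.execute(
    "SELECT id, COUNT(*) FROM customers GROUP BY id HAVING COUNT(*) > 1"
).fetchall()
print(f"duplicate ids: {dupes}")
```
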
Ideal Profile
  1. Minimum of 5 years of experience in data engineering using Informatica suite.
  2. Hands-on experience with Informatica Data Management Cloud (IDMC), Informatica Data Quality (IDQ) and Informatica MDM.
  3. Strong knowledge of ETL development, data integration, and data transformation techniques.
  4. Experience with SQL and relational databases such as Oracle, SQL Server, or PostgreSQL.
  5. Familiarity with cloud platforms such as Azure or Oracle Cloud.
  6. Strong analytical and problem-solving skills.
  7. Excellent communication and collaboration abilities.
What's on Offer?
  1. Work within a company with a solid track record of success.
  2. Work alongside & learn from best in class talent.
  3. Join a well-known brand within IT Services.

Data Engineer

Doha, Doha Leidos

Posted 11 days ago

Job Description


This range is provided by Leidos. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.

Base pay range

$104,650.00/yr - $189,175.00/yr


Leidos National Security Sector (NSS) is seeking a highly experienced and skilled mission-driven Data Engineer to support APOLLO/OPIAS in Tampa, FL. This role provides mission-critical support to Joint and Special Operations Forces (SOF) by designing, building, and maintaining scalable data infrastructure and pipelines that enable the planning, execution, and assessment of influence operations within the Irregular Warfare (IW) and Operations in the Information Environment (OIE). With expertise in cloud platforms, ETL frameworks, and both structured and semi-structured data, the data engineer ensures seamless ingestion, transformation, and normalization of diverse data sources—including social media, OSINT, SIGINT, cyber, and psychological operations platforms. The position will be responsible for providing OIE planners, analysts, and commanders with timely, mission-relevant data access for real-time alerting, influence mapping, and trend analysis. This position is on a future contract pending award announcement.

Possible locations for this position are as follows:

  • MacDill (Tampa, FL)
  • Al Udeid (Qatar)
  • Fort Meade (Maryland)
  • Northcom (Colorado Springs, CO)
  • Camp Humphreys (Korea)
  • Arifjan (Kuwait)
  • Joint Base Pearl Harbor-Hickam (Hawaii)
  • Fort Eisenhower (Georgia)
  • Offutt AFB (Omaha, NE)
  • Naval Operating Base Norfolk (Virginia)
  • Southcom (Doral, FL)
  • JB San Antonio (Texas)
  • Stuttgart (Germany)
  • Vicenza (Italy)
  • Tyndall AFB (Florida)

Key Responsibilities

  • Conduct analysis of structured and semi-structured data sets to identify effective integration approaches for mission use.
  • Design, build, and maintain the data infrastructure and pipelines that support the planning, execution, and assessment of influence operations.
  • Review existing and emerging technical capabilities and offer recommendations on their potential value in enabling OIE planners, analysts, and commanders to access, analyze, and operationalize large-scale datasets, often derived from social media, open-source intelligence (OSINT), cyber, SIGINT, or psychological operations platforms.

Basic Qualifications

  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field
  • 8+ years of experience in data engineering or ETL pipeline development
  • Experience with data ingestion, transformation, and normalization from diverse structured and unstructured sources
  • Experience deploying in cloud environments (AWS, Azure, or GCP)
  • Proficient in Python and at least one ETL framework (e.g., Airflow, NiFi, Luigi); see the sketch after this list
  • Strong knowledge of SQL and relational databases (e.g., PostgreSQL, MySQL)
  • Familiarity with version control tools (e.g., Git) and collaborative DevOps practices
  • Ability to work in cross-functional teams alongside analysts, developers, and IO planners
  • Strong documentation, communication, and troubleshooting skills
  • Active TS/SCI security clearance
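
To ground the ETL-framework requirement above, here is a minimal, hypothetical Airflow 2.x DAG with the classic extract/transform/load shape; the task logic is stubbed and every identifier is illustrative rather than taken from the posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Stub: a real task would pull from an API, database, or message feed.
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]


def transform(**context):
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "value": row["value"] * 2} for row in rows]


def load(**context):
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Would load {len(rows)} rows into the target store")


with DAG(
    dag_id="etl_pipeline_sketch",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",          # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```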

Preferred Qualifications

  • Master’s degree in a technical discipline
  • Experience supporting Information Operations, PSYOP/MISO, or WebOps
  • Experience with data lake architecture, graph databases (e.g., Neo4j), or NoSQL stores (e.g., MongoDB)
  • Experience building pipelines that support real-time alerting, trend analysis, and influence mapping
  • Proficiency with data visualization frameworks (e.g., Kibana, Grafana, Plotly, or D3.js)
  • Familiarity with OSINT data platforms (e.g., Babel Street, Echosec, Talkwalker, Pulsar, Meltwater, Maltego)
  • Familiarity with containerized environments (Docker, Kubernetes)
  • Understanding of foreign language datasets or multilingual processing (NLP/NLU in Arabic, Russian, Chinese, etc.)
  • Background in API integration with social media platforms or dark web forums

EIO2024

Original Posting

July 25, 2025

For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days with an anticipated close date of no earlier than 3 days after the original posting date as listed above.

Pay Range

$104,650.00 - $189,175.00

The Leidos pay range for this job level is a general guideline only and not a guarantee of compensation or salary. Additional factors considered in extending an offer include (but are not limited to) responsibilities of the job, education, experience, knowledge, skills, and abilities, as well as internal equity, alignment with market data, applicable bargaining agreement (if any), or other law.

Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology
Industries: IT Services and IT Consulting

