12 Data Scientist jobs in Qatar
Data Scientist
Posted today
Job Description
We are currently looking for a Data Scientist for our Qatar operations, with the following terms & conditions.
Core Responsibilities:
Design, build, and optimize predictive models and machine learning algorithms using structured and semi-structured data.
Perform data pre-processing, feature engineering, and model selection independently.
Build and maintain automated model pipelines including training, validation, scoring and monitoring.
Implement model drift detection, retraining logic, and performance diagnostics.
Conduct code-based model explainability (e.g., SHAP, LIME) and support documentation for governance review (a minimal sketch follows below).
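As a hedged illustration of the explainability bullet above, here is a minimal sketch using SHAP's TreeExplainer on a scikit-learn gradient-boosting model; the synthetic data, model choice, and feature names are assumptions for the example, not part of the role.

import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy binary-classification data standing in for real structured data.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# TreeExplainer attributes each prediction to per-feature SHAP contributions.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global importance: mean absolute SHAP value per feature, useful in governance documentation.
importance = np.abs(shap_values).mean(axis=0)
for i, value in enumerate(importance):
    print(f"feature_{i}: {value:.4f}")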
Expertise Required
Advanced proficiency in Python (Pandas, NumPy, Scikit-learn, XGBoost, LightGBM)
Strong command of SQL and handling large datasets (via warehouse or lake)
Experience deploying models using MLflow, Airflow, Docker, or similar tools
Familiarity with model performance metrics (ROC AUC, F1, lift/gain, etc.); a short metrics sketch follows this list
Hands-on in training and evaluating models for binary classification, multi-class, regression, or time series
Exposure to deep learning (PyTorch or TensorFlow) for advanced use cases
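To make the metrics bullet above concrete, here is a brief, hedged scikit-learn sketch computing ROC AUC and F1; the labels and probabilities are illustrative placeholders.

import numpy as np
from sklearn.metrics import classification_report, f1_score, roc_auc_score

# Illustrative ground truth and model outputs for a binary classifier.
y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.4, 0.8, 0.65, 0.9, 0.3, 0.55, 0.2])  # predicted probabilities
y_pred = (y_prob >= 0.5).astype(int)                            # thresholded labels

print("ROC AUC:", roc_auc_score(y_true, y_prob))  # threshold-free ranking quality
print("F1:", f1_score(y_true, y_pred))            # balance of precision and recall
print(classification_report(y_true, y_pred))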
Joining time frame: 2 weeks (maximum 1 month)
Data Scientist
Posted today
Job Description
Nair Systems is currently looking for a Data Scientist for our Qatar operations, with the following terms & conditions.
Core Responsibilities:
Design, build, and optimize predictive models and machine learning algorithms using structured and semi-structured data.
Perform data pre-processing, feature engineering, and model selection independently.
Build and maintain automated model pipelines including training, validation, scoring and monitoring.
Implement model drift detection, retraining logic, and performance diagnostics (a simple drift-check sketch follows this list).
Conduct code-based model explainability (e.g., SHAP, LIME) and support documentation for governance review.
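As a rough illustration of the drift-detection bullet above, here is a hedged, NumPy-only sketch of a Population Stability Index (PSI) check between a training-time feature distribution and a recent scoring window; the bin count and the common 0.2 alert threshold are conventions assumed for the example.

import numpy as np

def population_stability_index(expected, actual, bins=10):
    # Bin edges come from the baseline sample; equal-width bins keep the sketch short.
    edges = np.histogram_bin_edges(expected, bins=bins)
    expected_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    actual_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # A small floor avoids division by zero and log(0).
    expected_pct = np.clip(expected_pct, 1e-6, None)
    actual_pct = np.clip(actual_pct, 1e-6, None)
    return float(np.sum((actual_pct - expected_pct) * np.log(actual_pct / expected_pct)))

baseline = np.random.normal(0.0, 1.0, 10_000)  # feature values at training time (illustrative)
recent = np.random.normal(0.3, 1.1, 2_000)     # feature values in the latest scoring window
psi = population_stability_index(baseline, recent)
print(f"PSI = {psi:.3f} -> {'investigate drift' if psi > 0.2 else 'stable'}")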
Expertise Required
Advanced proficiency in Python (Pandas, NumPy, Scikit-learn, XGBoost, LightGBM)
Strong command of SQL and handling large datasets (via warehouse or lake)
Experience deploying models using MLflow, Airflow, Docker, or similar tools
Familiarity with model performance metrics (ROC AUC, F1, lift/gain, etc.)
Hands-on in training and evaluating models for binary classification, multi-class, regression, or time series
Exposure to deep learning (PyTorch or TensorFlow) for advanced use cases
Working knowledge of embeddings, vector stores, or text-based models (a toy embedding-search sketch follows this list)
Git-based versioning and reproducible ML workflow setup
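The embeddings/vector-store bullet above can be illustrated with a small, hedged NumPy sketch of nearest-neighbour search over dense vectors; the random vectors stand in for real text embeddings, and a production system would typically use a dedicated vector store.

import numpy as np

rng = np.random.default_rng(0)

# Placeholder document embeddings (in practice produced by a text-embedding model).
doc_vectors = rng.normal(size=(1_000, 384))
doc_vectors /= np.linalg.norm(doc_vectors, axis=1, keepdims=True)

# Placeholder query embedding.
query = rng.normal(size=384)
query /= np.linalg.norm(query)

# On unit vectors, cosine similarity reduces to a dot product.
scores = doc_vectors @ query
top_k = np.argsort(scores)[::-1][:5]
print("Top-5 document indices:", top_k)
print("Scores:", np.round(scores[top_k], 3))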
Joining time frame: 2 weeks (maximum 1 month)
Should you be interested in this opportunity, please send your latest resume in MS Word format at the earliest
Data Scientist
Posted today
Job Description
About us:
Artefact is a global services company that sits at the intersection of consulting, data science, AI technologies and marketing. Our 1700+ people break Business and Tech silos and transform organizations into consumer-centric leaders using digital, data and AI.
Key Responsibilities:
Data Quality Analysis & Remediation: Collect, extract, and clean data from various sources to prepare comprehensive datasets for analysis. This includes handling missing data, removing inconsistencies, and ensuring the data is structured correctly for further analysis (a short cleaning sketch follows this section).
Analyze data using appropriate statistical techniques and tools to uncover trends, patterns, and anomalies in the information. This could involve anything from simple descriptive analysis to more complex exploratory data analysis.
Validate and cross-check analysis results to ensure accuracy. This involves verifying calculations, comparing findings against external or historical data for consistency, and investigating any irregularities in the results. Data Analysts are expected to be vigilant about data quality during analysis, catching any issues that might have passed initial data processing.
Create clear, concise reports and visualizations to communicate findings. While the focus is on analysis rather than just creating dashboards, analysts will use tools like Power BI to illustrate key insights. They should present data in an understandable way for internal stakeholders, writing summaries that interpret the numbers and highlight important conclusions.
Work closely with statisticians and other team members to interpret results and refine analysis approaches. This collaborative approach ensures that the analysis aligns with methodological standards and project goals. Analysts may adjust their methods based on feedback, contribute to discussions on what the data means, and help integrate their findings into broader census reports.
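As a hedged illustration of the data-quality work described above, the sketch below uses pandas to handle missing values, drop duplicates, and enforce simple consistency rules; the column names and rules are invented for the example.

import pandas as pd

# Illustrative raw extract; in practice this would come from a database or file export.
raw = pd.DataFrame({
    "household_id": [101, 101, 102, 103, None],
    "region": ["North", "North", "south", "East", "East"],
    "household_size": [4, 4, -1, 3, 5],
})

clean = (
    raw
    .dropna(subset=["household_id"])               # records without an ID are unusable
    .drop_duplicates()                             # remove exact duplicate rows
    .assign(
        region=lambda d: d["region"].str.title(),  # normalise inconsistent casing
        household_size=lambda d: d["household_size"].where(d["household_size"] > 0),
    )
)

# A simple validation summary before the data moves on to analysis.
print(clean)
print("Rows dropped:", len(raw) - len(clean))
print("Remaining invalid household sizes:", int(clean["household_size"].isna().sum()))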
Key Qualifications:
Bachelor's degree in a quantitative field such as Statistics, Data Science, Economics, Mathematics, or similar. This educational background provides a solid foundation in data analysis techniques and statistical reasoning.
Strong proficiency in data analysis tools and languages, particularly SQL for database querying, Excel for data manipulation, and programming in Python or R for more advanced analysis and automation. Experience with statistical libraries or packages (e.g., pandas, R's tidyverse) is highly valuable.
Demonstrated ability to perform in-depth data analysis and not just generate visuals. We need analysts who can write complex queries, perform calculations, and apply statistical tests if necessary – beyond assembling dashboard components. An analytical mindset and attention to detail are crucial for interpreting data correctly and spotting outliers or errors (a short outlier-and-test sketch follows this section).
Familiarity with business intelligence and visualization tools, especially Power BI, to aid in presenting analysis outcomes. While creating polished dashboards is not the primary focus, the analyst should be capable of using Power BI (and similar tools like Tableau if needed) to turn data into interpretable charts and graphs for reporting purposes. They should also understand data visualization best practices to communicate information effectively.
Excellent communication skills, both written and verbal, with the ability to explain data findings in layman's terms. A collaborative attitude is important, as Data Analysts will frequently interact with other teams (IT, field operations, subject-matter experts) to gather context and ensure that analyses meet the needs of the census project. They should be proactive in sharing insights and contributing to data-driven decisions within the organization.
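The qualification above about writing queries, applying statistical tests, and spotting outliers could look roughly like the hedged pandas/SciPy sketch below; the data, the IQR outlier rule, and the two-sample t-test are illustrative assumptions.

import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "region": np.repeat(["North", "South"], 200),
    "income": np.concatenate([rng.normal(50_000, 8_000, 200), rng.normal(47_000, 9_000, 200)]),
})
df.loc[5, "income"] = 400_000  # an implausible record, added for demonstration

# Flag outliers with a simple interquartile-range rule.
q1, q3 = df["income"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["income"] < q1 - 1.5 * iqr) | (df["income"] > q3 + 1.5 * iqr)]
print("Potential outliers:\n", outliers)

# Compare the two regions with a two-sample t-test after excluding flagged rows.
trimmed = df.drop(outliers.index)
north = trimmed.loc[trimmed["region"] == "North", "income"]
south = trimmed.loc[trimmed["region"] == "South", "income"]
t_stat, p_value = stats.ttest_ind(north, south, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")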
Data Scientist (Bangalore)
Posted today
Job Description
Data Scientist :
Company Overview:
At Codvo, software and people transformations go hand-in-hand. We are a global, empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day. We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
Required Skills (Technical):-
- Advanced knowledge of statistical techniques, NLP, machine learning algorithms, and deep learning frameworks like TensorFlow, Theano, Keras, and PyTorch.
- Proficiency with modern statistical modelling (regression, boosting trees, random forests, etc.), machine learning (text mining, neural network, NLP, etc.), optimization (linear optimization, nonlinear optimization, stochastic optimization, etc.) methodologies.
- Build complex predictive models using ML and DL techniques with production quality code and jointly own complex data science workflows with the Data Engineering team.
- Familiar with modern data analytics architecture and data engineering technologies (SQL and No-SQL databases).
- Knowledge of REST APIs and Web Services
- Experience with Python, R, sh/bash
Required Skills (Non-Technical):-
- Fluent in English communication (spoken and written)
- Should be a team player
- Should have a learning aptitude
- Detail-oriented and analytical.
- Extremely organized with strong time-management skills
- Problem solving and critical thinking required.
Note: C3 (version 8) certification is mandatory.
Experience Level : 5+Years
Work Location : Qatar
Senior Data Scientist
Posted today
Job Description
Company Description
HyperThink is a prominent IT and business services company with a comprehensive range of offerings focused on integrating Technology and Operations. Serving various industries including Oil and Gas, Manufacturing, Financial Services, Real Estate, Education, and the Public Sector, we offer IT consulting, systems integration, e-learning, and business process outsourcing services. Our global delivery model ensures customized, high-quality solutions that provide significant returns on investment for our clients. We are dedicated to helping our customers achieve leadership positions in their respective markets through a relentless focus on customer satisfaction.
Required Skills:
● Knowledge of statistical modelling and popular machine learning models.
● Good analytical skills, with expertise in analytical tool-kits such as Logistic Regression, Cluster Analysis, Factor Analysis, Multivariate Regression, statistical modelling, and predictive analysis.
● Hands-on with Python/R programming and knowledge of machine learning tools like Scikit-Learn, Pandas, TensorFlow, PyTorch, Keras, OpenCV, XGBoost, etc.
● Computer Vision: image preprocessing, image matching/comparison using SIFT/SURF/ORB, object detection (Faster R-CNN, YOLO, SSD, etc.), image similarity (Siamese networks), image classification.
● NLP: vector space modeling, LSTMs, sequence modeling, attention modeling, BERT, Transformers; using these techniques to perform document classification, semantic similarity, NER, and sentiment analysis.
● Time series analysis using ARIMA, SARIMA, LSTMs, etc. (a brief ARIMA sketch follows this list).
● Feature engineering, feature selection/feature importance, dimensionality reduction (PCA, etc.), hyperparameter tuning, and ensembling techniques like bagging, boosting, and stacking.
● Should possess knowledge of multiple cloud platforms (GCP preferable, Azure, or AWS) and the machine learning services offered by them.
● Critical eye for the quality of data and strong desire to get it right.
● A pleasantly forceful personality and charismatic communication style.
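To ground the time-series bullet above, here is a small, hedged statsmodels ARIMA sketch on synthetic monthly data; the series, the (1, 1, 1) order, and the forecast horizon are placeholders rather than anything specified by the role.

import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly series with trend and noise, standing in for a real business metric.
rng = np.random.default_rng(42)
index = pd.date_range("2020-01-01", periods=48, freq="MS")
series = pd.Series(100 + 0.8 * np.arange(48) + rng.normal(0, 3, 48), index=index)

# A simple ARIMA(1, 1, 1); in practice the order comes from ACF/PACF inspection or an
# information-criterion search, and SARIMA handles seasonal patterns.
model = ARIMA(series, order=(1, 1, 1))
result = model.fit()
print(result.params.round(3))

# Forecast the next 6 months.
print(result.forecast(steps=6).round(2))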
Senior Data Scientist
Posted today
Job Description
Designation:
Senior Data Scientist
Experience level:
8 to 15 years
Skills - Required:
- Knowledge of statistical modelling and popular machine learning models.
- Good analytical skills, with expertise in analytical tool-kits such as Logistic Regression, Cluster Analysis, Factor Analysis, Multivariate Regression, Statistical modelling, predictive analysis.
- Hands-on with Python/R programming and knowledge of machine learning tools like Scikit-Learn, Pandas, TensorFlow, PyTorch, Keras, OpenCV, XGBoost, etc.
- Computer Vision: Image pre-processing, image matching/comparison using SIFT/SURF/ORB, Object detection (Faster RCNN, YOLO, SSD, etc.), Image similarity (Siamese network), Image classification.
- NLP: Vector Space modelling in NLP, LSTMS, Sequence modelling, Attention modelling, BERT, Transformers. Using the abovementioned techniques performing Document classification, Semantic similarity, NER, Sentiment Analysis.
- Time Series Analysis using ARIMA, SARIMA, LSTMs, etc.
- Feature engineering, feature selection/feature importance, dimensionality reduction (PCA, etc.), hyperparameter tuning, and ensembling techniques like bagging, boosting, and stacking.
- Should possess knowledge of multiple cloud platforms (GCP-Preferable, Azure or AWS) and machine learning services offered by them.
- Critical eye for the quality of data and strong desire to get it right.
- A pleasantly forceful personality and charismatic communication style.
Skills - Nice to Have:
- Experience in building end-to-end machine learning pipelines, deployment using any MLOps framework (Kubeflow Pipelines, Vertex AI), model performance monitoring, and CI/CD (a minimal experiment-tracking sketch follows this list).
- Experience in GCP-based ML solutions and services like Contact Center AI, Vertex AI, Document AI, Speech-to-Text, Text-to-Speech, Natural Language, Dialogflow, AutoML, Vision, Video Intelligence, Base OCR, Form Parser, Invoice Parser, etc.
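As a hedged illustration of the MLOps nice-to-have above, here is a minimal experiment-tracking sketch using MLflow with a scikit-learn model; MLflow is only one of several frameworks such a pipeline could use, and the run name, parameters, and metric are assumptions for the example.

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf_baseline"):
    params = {"n_estimators": 200, "max_depth": 6}
    model = RandomForestClassifier(**params, random_state=0).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    mlflow.log_params(params)                               # hyperparameters for this run
    mlflow.log_metric("roc_auc", auc)                       # evaluation metric
    mlflow.sklearn.log_model(model, artifact_path="model")  # serialised model artifact

print("Logged run with ROC AUC:", round(auc, 3))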
Your Job will require:
- Ability to multitask and work on multiple engagements related to different domains.
- Be able to pick up newer technologies in a short span of time
- Work in a highly collaborative environment by interacting with the stakeholders and various IT teams within the company to facilitate design and development of ML/AI solutions.
- Be responsible for the successful delivery of all allocated projects with respect to schedule, quality and customer satisfaction.
- Follow Agile standards and methodologies in all phases of the project.
- Ensure excellence in delivery to customers
- Self-driven individuals with a passion for research and development and the zest for solving what matters.
What's in it for you?
- The experience of working in a category-defining, high-growth start-up in the transformational AI, Decision Science, and Big Data domain.
- The opportunity to get onboarded on a phenomenal growth journey and help customers take the next big leap in digital transformation.
- The opportunity to work with a diverse, lively and proactive group of techies who are constantly raising the bar on the art of translating mounds of data into tangible business value for clients.
- An ergonomic and beautiful working space in the heart of Mumbai open for you 24 hours
Data Scientist – C3i Certified
Posted today
Job Description
Job Title:
Data Scientist – C3i Certified (Version 8)
Location:
Doha, Qatar
Job Type:
Full-Time
Experience Level:
4+ years
About the Role:
We are seeking a highly motivated and analytically strong Data Scientist with a C3i Certification (Version 8) to join our team on a project-based engagement. This role involves applying data science techniques to solve real-world business challenges, collaborating with cross-functional teams, and contributing to key analytics initiatives.
This is a hybrid role, with 20 days remote and 85 days on-site expected during the engagement.
Engagement Details:
- Expected Duration: late September start to February
- Work Model: hybrid (20 days remote, 85 days on-site in Doha, Qatar)
- Extension Possibility: the engagement may be extended based on business needs and performance
Key Responsibilities:
- Apply statistical and machine learning techniques to analyze and interpret complex data sets
- Build predictive models and deploy them into production environments
- Collaborate with data engineers, business analysts, and domain experts to define and solve problems
- Present findings and insights to stakeholders in a clear and compelling manner
- Continuously improve model performance and ensure data quality and integrity
Required Qualifications:
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field
- 4+ years of experience as a Data Scientist or in a similar analytical role
- Proficiency in Python and associated libraries (Pandas, NumPy, Scikit-learn, etc.)
- Strong experience with SQL and working with large datasets
- Solid understanding of machine learning, data wrangling, and model evaluation techniques
Preferred Qualifications:
- C3i Certification – Version 8 (must have or nearing completion)
- Familiarity with cloud platforms (AWS, Azure, or GCP)
- Experience with MLOps tools and deployment practices
- Knowledge of data visualization tools like Tableau, Power BI, or Plotly
- Excellent problem-solving and communication skills
Microsoft Power BI and Data Scientist Engineer
Posted today
Job Description
ECCO Gulf Majorel Qatar is seeking a highly skilled Microsoft Power BI and Data Scientist Engineer to join our dynamic team in Qatar. As a key player in our Information Technology and Services sector, you will be responsible for leveraging data to drive strategic decisions and improve business outcomes. Your role will involve designing, developing, and deploying business intelligence solutions using Microsoft Power BI, as well as conducting complex data analysis to support our research initiatives.
- Experience in building, managing, and optimizing data pipelines, data lakes, and data warehouses on Azure.
- Develop and maintain Power BI dashboards and reports to provide actionable insights.
- Integrate Azure Data Factory with external data sources such as SQL databases, NoSQL databases, REST APIs, flat files, and cloud-based storage (Azure Blob, Data Lake).
- Analyze large datasets to identify trends, patterns, and insights.
- Ensure data quality by implementing data validation, transformation rules, and ensuring seamless error handling in pipelines.
- Automate data movement and orchestration between different storage systems, databases, and analytics platforms using ADF pipelines and Data Flows.
- Work with Azure Data Factory Mapping Data Flows to perform complex transformations and aggregations.
- Build and manage data lakes and data warehouses on Azure Synapse Analytics, ensuring seamless integration with Azure SQL Data Warehouse, Azure Data Lake, and other Azure services.
- Implement data transformation processes to support large-scale data analytics, including batch processing and real-time analytics.
- Develop data models for OLAP (Online Analytical Processing) and OLTP (Online Transaction Processing) to support reporting and data exploration.
- Work with Azure Synapse Studio to design serverless SQL pools, Spark Pools, and data integration pipelines.
- Optimize performance of data workloads in Azure Synapse Analytics, ensuring scalability and efficiency for reporting and analysis.
- Implement Microsoft Fabric for end-to-end data integration across Azure Data Services, ensuring seamless collaboration between data engineering, data science, and business intelligence teams.
- Ensure compliance with data governance frameworks (GDPR, CCPA) while handling sensitive data in cloud environments.
- Implement role-based access control (RBAC) and data encryption techniques to safeguard data throughout the pipeline.
- Leverage Microsoft Fabric's Lakehouses, Data Warehouses, and Data Pipelines for advanced analytics and AI-driven insights.
- Build data-driven applications and business intelligence solutions within Microsoft Fabric to empower business users with self-service data tools.
- Design cloud-based architectures for scalable, cost-effective, and reliable data processing systems using Azure services.
- Implement best practices for cloud security, data governance, and data privacy using Azure tools like Azure Security Center and Azure Policy.
- Develop and maintain CI/CD pipelines for automating the deployment of data pipeline changes using tools like Azure DevOps, GitHub Actions, and Azure Pipelines.
- Use Azure Logic Apps and Power Automate to automate routine tasks and improve operational efficiency.
Required Profile
We are looking for an experienced professional with a strong background in data science and business intelligence. The ideal candidate will possess the following qualifications and skills:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 6+ years of hands-on experience in data engineering, building and managing data pipelines, data integration, and cloud data architecture.
- Proven experience with Microsoft Power BI and data visualization tools.
- Expertise in Azure Data Factory (ADF) for building scalable ETL pipelines.
- Strong experience with Azure Synapse Analytics and designing data models for data lakes, data warehouses, and analytics.
- Experience using Microsoft Fabric for collaborative data workflows, integrating data sources, and building data pipelines.
- Proficiency in SQL, T-SQL, and Azure SQL Database for querying and managing data.
- Knowledge of data storage options, including Azure Blob Storage, Data Lake, and Azure SQL Database.
- Experience with big data technologies (Hadoop, Spark) and integration within Azure Synapse or Azure Databricks.
- Familiarity with DevOps practices, CI/CD pipelines, and automation for data workflows and deployments.
- Strong understanding of data governance, data quality, and cloud security in the Azure ecosystem.
- Familiarity with Power BI, Azure Machine Learning, or other BI/AI tools is a plus.
Certifications
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Azure Synapse Analytics
Developer - Machine Learning
Posted today
Job Description
Job Summary
You will be responsible for end-to-end data science cycles, encompassing designing, training, implementing, evaluating, and monitoring machine learning models, and will also design and implement highly scalable tools and algorithms based on state-of-the-art Machine Learning and Deep Learning methodologies.
You will work across the ML stack, from researching models, working with large datasets, training, and tuning existing models to creating new models, deploying them at scale, analyzing results, and presenting findings to stakeholders across tech and business domains.
Job Objectives
- Develop and implement advanced predictive models to optimize customer experiences and other business outcomes.
- Analyze large and complex datasets to extract actionable insights and drive business decisions.
- Interpret results and provide actionable insights to guide real-time decision-making within the business context.
- Collaborate with cross-functional teams to ensure proper deployment and integration of ML models for new releases.
Job Responsibilities
Predictive Modeling and Deployment
- Develop and implement advanced predictive models to forecast key business metrics such as sales, customer churn, or product demand (a small churn-pipeline sketch follows this sub-section).
- Utilize predictive modeling to optimize customer experiences and other business outcomes.
- Execute machine learning models, algorithms, and statistical techniques to analyze historical data and ensure scalability and efficiency.
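As a hedged sketch of the churn-forecasting responsibility above, the example below assembles a small scikit-learn pipeline with preprocessing and logistic regression; the feature names and records are invented for illustration.

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Invented customer snapshot; a real churn model would use historical behaviour data.
df = pd.DataFrame({
    "tenure_months": [1, 24, 6, 36, 3, 48, 12, 2, 60, 9] * 20,
    "monthly_spend": [80, 45, 70, 30, 95, 25, 55, 88, 20, 65] * 20,
    "plan": ["basic", "premium", "basic", "premium", "basic",
             "premium", "basic", "basic", "premium", "basic"] * 20,
    "churned": [1, 0, 1, 0, 1, 0, 0, 1, 0, 1] * 20,
})

X, y = df.drop(columns="churned"), df["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["tenure_months", "monthly_spend"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["plan"]),
])
pipeline = Pipeline([("prep", preprocess), ("clf", LogisticRegression(max_iter=1000))])
pipeline.fit(X_train, y_train)

print("Hold-out ROC AUC:", round(roc_auc_score(y_test, pipeline.predict_proba(X_test)[:, 1]), 3))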
Data Preparation and Analysis
- Develop and use advanced software programs, algorithms, and query techniques to cleanse, integrate, and evaluate datasets for model inputs.
- Analyze large and complex datasets to extract actionable insights and identify trends and patterns that can drive business decisions.
- Identify manual human processes, understand user behaviors, and analyze use cases that can be augmented or automated.
Model Deployment and Interpretation
- Deploy models into production environments and monitor their performance over time.
- Apply statistical, mathematical, and predictive modeling techniques to build, maintain, and improve real-time decision systems.
- Interpret results, develop insights within the business context, and provide guidance on risks and limitations
Development & Documentation
- Write the code as per agreed software design rules to keep it aligned with the rest of the code base.
- Code the final implementation that the generated code is referring to.
- Follow company software data protection and security guidelines in developing software.
- Accurately estimate the time needed to complete an assigned task.
- Identify possible causes of issues or problems.
- Think through and recommend solutions when raising issues around code, requirements, etc.
- Write technical design documentation that fully defines all application code.
- Maintain detailed knowledge of iHorizons products and services.
- Understand the business impact for labs outcomes.
Collaboration & Team Guidance
- Stay updated on the latest research, learn new applications, tools, and technologies in the fields of data science and machine learning through intensive and focused effort.
- Collaborate with technical and non-technical business partners to develop analytical dashboards describing ML algorithm findings to stakeholders.
- Collaborate with other teams to perform code reviews and oversee proper deployment for new releases.
- Actively mentor and support mid-level and junior developers in their professional growth.
- Provide guidance on best practices in machine learning, code reviews, and project design.
- Foster an inclusive and collaborative environment that encourages continuous learning and development within the team.
- Oversee interactions with vendors and third-party service providers, including collaborating on the design and implementation of technical architectures, and acting as a point of contact for resolving technical issues. Maintain clear communication with internal stakeholders regarding vendor-related activities, updates, and issues to facilitate smooth collaboration and decision-making processes.
Job Requirements
Educational Qualification
- Bachelor's degree in data science, statistics, or computer science is a MUST.
- Google Cloud - Professional Machine Learning Engineer Certificate
- Google AgentSpace implementation experience
Previous Work Experience
- 3-4+ Years of experience in data science or machine learning.
- Must have strong experience in at least one of the following areas:
- Vision models
- NLP models (Experience in Arabic NLP is a huge plus)
Skills And Abilities
- Proficient in Python, TensorFlow, Keras, and PyTorch (a tiny Keras example follows this list).
- Good experience in:
- SQL and non-relational databases.
- Data analytics reports generation.
- ML model development and deployment.
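As a hedged illustration of the deep-learning proficiency listed above, here is a tiny Keras sketch for binary classification; the synthetic data, layer sizes, and training settings are assumptions for the example.

import numpy as np
from tensorflow import keras

# Synthetic tabular data standing in for a real problem.
rng = np.random.default_rng(0)
X = rng.random((200, 8)).astype("float32")
y = (X[:, 0] + X[:, 1] > 1.0).astype("float32")

model = keras.Sequential([
    keras.layers.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=[keras.metrics.AUC()])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

loss, auc = model.evaluate(X, y, verbose=0)
print("Training-set AUC:", round(float(auc), 3))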
Machine Learning Engineer
Posted today
Job Description
Apt Resources is seeking an experienced Machine Learning Engineer for a client in Abu Dhabi's Government & Public Sector. In this role, you will design and deploy cutting-edge AI/ML solutions using Large Language Models (LLMs) like GPT, Llama, and BERT to drive innovation in public services.
This is an exciting opportunity to work on high-impact projects involving Retrieval-Augmented Generation (RAG), fine-tuning, and prompt engineering, ensuring secure, scalable, and compliant AI systems for government applications.
Key Responsibilities:
- Develop and optimize AI/ML pipelines for LLMs, focusing on RAG architectures, fine-tuning, and prompt engineering tailored for public sector needs.
- Implement scalable solutions using Python, LangChain, HuggingFace, PyTorch/TensorFlow, and cloud-based ML services (Azure ML preferred).
- Integrate vector/graph databases (Weaviate, Neo4j) into production systems to enhance data retrieval and analysis.
- Deploy and monitor models in production, ensuring adherence to government security and compliance standards.
- Collaborate with cross-functional teams to align AI solutions with public sector objectives (e.g., citizen services, data governance, operational efficiency).
Requirements:
- 6-14 years of hands-on experience in AI/ML, with a strong focus on LLMs and GenAI.
- Expertise in LLM architectures (Transformers), prompt engineering, and RAG implementations (a toy retrieval sketch appears at the end of this posting).
- Proficiency in Python and ML frameworks (LangChain, LlamaIndex, HuggingFace, Scikit-learn).
- Experience with cloud platforms (Azure ML, AWS, or GCP) and MLOps tools (MLflow, model monitoring).
- Familiarity with vector databases, ETL pipelines, and unstructured data handling.
- Knowledge of government IT standards or secure deployments is a plus.
To be discussed
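As a rough, framework-agnostic illustration of the RAG responsibilities in this posting, the sketch below retrieves the most relevant snippets by cosine similarity and assembles them into a prompt; the embed function is a hypothetical stand-in for a real embedding model, and a production system would use the vector databases and LLM frameworks named above.

import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical stand-in for a real text-embedding model (e.g. a HuggingFace encoder).
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vector = rng.normal(size=128)
    return vector / np.linalg.norm(vector)

documents = [
    "Citizens can renew permits online through the services portal.",
    "Data retention policies require records to be kept for seven years.",
    "Office hours are Sunday to Thursday, 8am to 3pm.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list:
    scores = doc_vectors @ embed(query)  # cosine similarity on unit vectors
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

question = "How long must records be retained?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be passed to an LLM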