
Building Tomorrow's Workforce

360DigiTMG is a global leader in education and career enablement, connecting talent with opportunity. We have trained over 50,000 professionals worldwide and prepared them for leadership roles in the digital economy.

Our AiSPRY-powered network and partnerships with top-tier companies ensure that our alumni and job seekers have access to job opportunities in cutting-edge technologies like Data Science, Data Analytics, AI, Machine Learning, and Data Engineering.

We pride ourselves on fostering innovation, offering real-world project experience, and empowering individuals to thrive in their careers. Whether you're an experienced professional or a fresh graduate, we help you bridge the gap to your dream job.

Why Choose Us?

Exclusive Access to Opportunities
Job roles through our vast network of industry collaborators and stakeholders.

Alumni-Driven Ecosystem
Be part of a thriving community excelling in global corporations like Google, EY, and PwC.

Real-World Skills
Our candidates bring hands-on experience through projects across 76+ domains.

Tailored Hiring Solutions
Flexible roles, ranging from internships to project-based hiring.

Latest Job Vacancies You Can Apply For

  • Technical Requirements: AI, Big Data, NLP, Machine Learning, Python, Microsoft LUIS

    Responsibilities: Develop and implement AI models and solutions. Analyze large datasets and surface insights for data-driven decision-making.
    Monitor and evaluate the performance of AI models and continuously improve them.
    Document processes, methodologies, and results for stakeholders.

    Experience: 3 to 4 years

  • Job Description: 
    We are seeking a skilled and detail-oriented professional to oversee and manage the organization's business data and analytics functions. The role involves designing and maintaining robust data pipelines, infrastructure, and models to process and analyze critical data, including website traffic, leads, sales, and P&L metrics. 
    The ideal candidate will be responsible for ensuring data quality, developing scalable ETL processes, and building dashboards to enable actionable insights. This position requires close collaboration with cross-functional teams to align technical solutions with business objectives and support data-driven decision-making across the organization. 


    Education / Job Experience Required: 
    Bachelor's or master's degree in computer science, data science, or a related field 
    Location: Bangalore


    Day-to-Day Work: 

    • Responsible for managing all aspects of business data including website traffic, leads, sales & final P&L. 
    • Design and develop robust, scalable, and efficient data pipelines and ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes. 
    • Build and maintain data infrastructure, including data warehouses and data processing frameworks. 
    • Ensure data quality and integrity by implementing appropriate data validation, monitoring, and error handling mechanisms. 
    • Develop and maintain data models and schemas to support data analytics and reporting needs. 
    • Identify and address data-related issues and bottlenecks, ensuring the smooth operation of data pipelines and systems. 
    • Construct and maintain dashboards that allow users to understand performance and generate insights. 
    • Collaborate closely with cross-functional teams, including engineering, product management, and business stakeholders, to ensure alignment between business objectives and technical solutions. 

    Must-Have: 

    • Must have experience with Google Cloud Platform (BigQuery, Dataproc, Cloud Functions). 
    • Solid understanding of SQL and NoSQL databases such as PostgreSQL and MongoDB. 
    • Strong proficiency in Python with experience in data manipulation and scripting. 
    • Hands-on experience with ETL tools and data integration using third-party APIs. 
    • Proven experience with data modelling and report automation. 
    • Must have experience in designing, building, and maintaining databases and data warehouses. 
    • Experience with Google Apps Script and Google AppSheet is a plus. 

    Good to Have: 

    • Any prior experience with Power BI. 
    • Understanding of real time data streaming technologies. 
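As a flavour of the data-quality work this role involves, here is a minimal Python sketch of a validation-and-quarantine step a pipeline might run before loading. The schema and field names (`lead_id`, `email`, `sales`) are hypothetical, chosen only to illustrate the pattern:

```python
def validate_lead(record):
    """Return a list of validation errors for one lead record (illustrative rules)."""
    errors = []
    if not record.get("lead_id"):
        errors.append("missing lead_id")
    if record.get("sales", 0) < 0:
        errors.append("negative sales value")
    if "@" not in record.get("email", ""):
        errors.append("malformed email")
    return errors

def partition_records(records):
    """Split records into clean rows and quarantined rows with their reasons."""
    clean, quarantined = [], []
    for rec in records:
        errs = validate_lead(rec)
        if errs:
            quarantined.append({"record": rec, "errors": errs})
        else:
            clean.append(rec)
    return clean, quarantined
```

In a real pipeline the quarantined rows would feed a monitoring or error-handling channel rather than silently dropping out.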
  • Job Description:

    Technical knowledge: AWS, Python, SQL, S3, EC2, Glue, Athena, Lambda, DynamoDB, Redshift, Step Functions, CloudFormation, CI/CD pipelines, GitHub, EMR, RDS, AWS Lake Formation, GitLab, Jenkins, and AWS CodePipeline.
    Role Summary: As a Senior Data Engineer with over 5 years of expertise in Python, PySpark, and SQL, you will design, develop, and optimize complex data pipelines, support data modeling, and contribute to the architecture behind big data processing, analytics, and cutting-edge cloud solutions that drive business growth. You will lead the design and implementation of scalable, high-performance data solutions on AWS and mentor junior team members. This role demands a deep understanding of AWS services, big data tools, and complex architectures to support large-scale data processing and advanced analytics.

     

    Key Responsibilities:

    • Design and develop robust, scalable data pipelines using AWS services, Python, PySpark, and SQL that integrate seamlessly with the broader data and product ecosystem.
    • Lead the migration of legacy data warehouses and data marts to AWS cloud-based data lake and data warehouse solutions.
    • Optimize data processing and storage for performance and cost.
    • Implement data security and compliance best practices, in collaboration with the IT security team.
    • Build flexible and scalable systems to handle the growing demands of real-time analytics and big data processing.
    • Work closely with data scientists and analysts to support their data needs and assist in building complex queries and data analysis pipelines.
    • Collaborate with cross-functional teams to understand their data needs and translate them into technical requirements.
    • Continuously evaluate new technologies and AWS services to enhance data capabilities and performance.
    • Create and maintain comprehensive documentation of data pipelines, architectures, and workflows.
    • Participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications.
    • Present findings to executive leadership and recommend data-driven strategies for business growth.
    • Communicate effectively with different levels of management to gather use cases/requirements and provide designs that cater to those stakeholders.
    • Handle clients in multiple industries at the same time, balancing their unique needs.
    • Provide mentoring and guidance to junior data engineers and team members.

    Requirements:

    • 5+ years of experience in a data engineering role with a strong focus on AWS, Python, PySpark, Hive, and SQL.
    • Proven experience in designing and delivering large-scale data warehousing and data processing solutions.
    • Lead the design and implementation of complex, scalable data pipelines using AWS services such as S3, EC2, EMR, RDS, Redshift, Glue, Lambda, Athena, and AWS Lake Formation.
    • Bachelor's or Master’s degree in Computer Science, Engineering, or a related technical field.
    • Deep knowledge of big data technologies and ETL tools, such as Apache Spark, PySpark, Hadoop, Kafka, and Spark Streaming.
    • Implement data architecture patterns, including event-driven pipelines, Lambda architectures, and data lakes.
    • Experience with cloud platforms such as AWS, Azure, and GCP.
    • Incorporate modern tools like Databricks, Airflow, and Terraform for orchestration and infrastructure as code.
    • Implement continuous integration and delivery pipelines using GitLab, Jenkins, and AWS CodePipeline.
    • Ensure data security, governance, and compliance by leveraging tools such as IAM, KMS, and AWS CloudTrail.
    • Mentor junior engineers, fostering a culture of continuous learning and improvement.
    • Excellent problem-solving and analytical skills, with a strategic mindset.
    • Strong communication and leadership skills, with the ability to influence stakeholders at all levels.
    • Ability to work independently as well as part of a team in a fast-paced environment.
    • Advanced data visualization skills and the ability to present complex data in a clear and concise manner.
    • Excellent communication skills, both written and verbal, to collaborate effectively across teams and levels.

    Preferred Skills:

    • Experience with Databricks, Snowflake, and machine learning pipelines.
    • Exposure to real-time data streaming technologies and architectures.
    • Familiarity with containerization and serverless computing (Docker, Kubernetes, AWS Lambda).
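The event-driven pipelines mentioned in this listing often start from an S3 trigger. Below is a minimal, locally runnable sketch of the routing logic inside such a Lambda-style handler; the bucket layout and key format (`raw/<dataset>/<date>.csv`) are purely illustrative, and the actual Glue/Redshift load call is deliberately omitted:

```python
def route_s3_event(event):
    """Map each S3 record in a (Lambda-style) event to a target warehouse partition."""
    routes = []
    for rec in event.get("Records", []):
        key = rec["s3"]["object"]["key"]          # e.g. "raw/sales/2024-06-01.csv"
        _, dataset, filename = key.split("/", 2)  # assumes the raw/<dataset>/<file> layout
        partition = filename.rsplit(".", 1)[0]
        routes.append({
            "dataset": dataset,
            "partition": partition,
            "target": f"warehouse/{dataset}/dt={partition}",
        })
    return routes
```

Keeping the routing pure like this makes the handler easy to unit-test without any AWS credentials.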
  • Job Description:

    Technical knowledge: AWS, Python, SQL, S3, EC2, Glue, Athena, Lambda, DynamoDB, Redshift, Step Functions, CloudFormation, CI/CD pipelines, GitHub, EMR, RDS, AWS Lake Formation, GitLab, Jenkins, and AWS CodePipeline.
    Role Summary: As a Senior Data Engineer with over 2 years of expertise in Python, PySpark, and SQL, you will design, develop, and optimize complex data pipelines, support data modeling, and contribute to the architecture behind big data processing, analytics, and cutting-edge cloud solutions that drive business growth. You will lead the design and implementation of scalable, high-performance data solutions on AWS and mentor junior team members. This role demands a deep understanding of AWS services, big data tools, and complex architectures to support large-scale data processing and advanced analytics.

     

    Key Responsibilities:

    • Design and develop robust, scalable data pipelines using AWS services, Python, PySpark, and SQL that integrate seamlessly with the broader data and product ecosystem.
    • Lead the migration of legacy data warehouses and data marts to AWS cloud-based data lake and data warehouse solutions.
    • Optimize data processing and storage for performance and cost.
    • Implement data security and compliance best practices, in collaboration with the IT security team.
    • Build flexible and scalable systems to handle the growing demands of real-time analytics and big data processing.
    • Work closely with data scientists and analysts to support their data needs and assist in building complex queries and data analysis pipelines.
    • Collaborate with cross-functional teams to understand their data needs and translate them into technical requirements.
    • Continuously evaluate new technologies and AWS services to enhance data capabilities and performance.
    • Create and maintain comprehensive documentation of data pipelines, architectures, and workflows.
    • Participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications.
    • Present findings to executive leadership and recommend data-driven strategies for business growth.
    • Communicate effectively with different levels of management to gather use cases/requirements and provide designs that cater to those stakeholders.
    • Handle clients in multiple industries at the same time, balancing their unique needs.
    • Provide mentoring and guidance to junior data engineers and team members.

    Requirements:

    • 2+ years of experience in a data engineering role with a strong focus on AWS, Python, PySpark, Hive, and SQL.
    • Proven experience in designing and delivering large-scale data warehousing and data processing solutions.
    • Lead the design and implementation of complex, scalable data pipelines using AWS services such as S3, EC2, EMR, RDS, Redshift, Glue, Lambda, Athena, and AWS Lake Formation.
    • Bachelor's or Master’s degree in Computer Science, Engineering, or a related technical field.
    • Deep knowledge of big data technologies and ETL tools, such as Apache Spark, PySpark, Hadoop, Kafka, and Spark Streaming.
    • Implement data architecture patterns, including event-driven pipelines, Lambda architectures, and data lakes.
    • Incorporate modern tools like Databricks, Airflow, and Terraform for orchestration and infrastructure as code.
    • Implement CI/CD using GitLab, Jenkins, and AWS CodePipeline.
    • Ensure data security, governance, and compliance by leveraging tools such as IAM, KMS, and AWS CloudTrail.
    • Mentor junior engineers, fostering a culture of continuous learning and improvement.
    • Excellent problem-solving and analytical skills, with a strategic mindset.
    • Strong communication and leadership skills, with the ability to influence stakeholders at all levels.
    • Ability to work independently as well as part of a team in a fast-paced environment.
    • Advanced data visualization skills and the ability to present complex data in a clear and concise manner.
    • Excellent communication skills, both written and verbal, to collaborate effectively across teams and levels.

    Preferred Skills:

    • Experience with Databricks, Snowflake, and machine learning pipelines.
    • Exposure to real-time data streaming technologies and architectures.
    • Familiarity with containerization and serverless computing (Docker, Kubernetes, AWS Lambda).
       
  • Job Description:

    We are seeking a talented Python and SQL Developer with 2 - 3 years of experience to join our team. As a Python and SQL Developer, you will be responsible for designing, developing, and maintaining data-driven applications and solutions. You will work closely with cross-functional teams to deliver high-quality software products that meet clients' needs.

     

    Responsibilities:

    • Design, develop, and maintain Python-based applications that interact with SQL databases.
    • Write efficient and optimized SQL queries, stored procedures, and functions to retrieve, manipulate, and analyze data.
    • Design and implement scalable solutions leveraging cloud technologies.
    • Perform data validation, cleansing, and transformation to ensure data integrity and accuracy.
    • Apply an understanding of Python's threading limitations and multi-process architecture.
    • Work with version control systems such as Git for code management and collaboration.
    • Conduct code reviews, testing, debugging, and troubleshooting to ensure the reliability and performance of applications.
    • Develop data visualization tools and reports to present insights and findings to stakeholders.
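To illustrate the core of the role, Python code interacting with a SQL database, here is a minimal self-contained sketch using the stdlib `sqlite3` module as a stand-in for PostgreSQL or SQL Server. The `orders` schema and the data are invented for demonstration:

```python
import sqlite3

# In-memory database standing in for a production relational database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("south", 120.0), ("north", 80.0), ("south", 40.0)],
)

# An aggregate query with a parameterized filter (never format values into SQL strings).
cur = conn.execute(
    "SELECT region, SUM(amount) FROM orders WHERE amount > ? GROUP BY region ORDER BY region",
    (50,),
)
rows = cur.fetchall()  # [('north', 80.0), ('south', 120.0)]
```

Swapping `sqlite3` for a driver like `psycopg2` keeps the same pattern of parameterized queries and cursors.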

    Requirements:

    • Bachelor's degree in Computer Science, Engineering, or a related field.
    • Hands-on experience in Python development and SQL database management.
    • Proficiency in writing complex SQL queries, including joins, subqueries, and aggregations.
    • Strong understanding of Python programming concepts, data structures, and algorithms.
    • Experience with Python frameworks such as Django or Flask.
    • Familiarity with database systems such as PostgreSQL, MySQL, or SQL Server.
    • Knowledge of data visualization tools and libraries such as Matplotlib, Plotly etc.
    • Excellent problem-solving, analytical, and communication skills.
    • Ability to work effectively in a collaborative team environment.
    • Collaboration with data engineers and analysts to design and implement data models and ETL processes is a plus.
    • Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) is a plus.
  • Job Description:

    As a Senior Data Scientist, you will lead data-driven projects and play a key role in shaping our data science strategy. You will be responsible for developing advanced algorithms, managing large-scale datasets, and mentoring junior team members.

    Responsibilities

    • Design and implement complex statistical models and machine learning algorithms.
    • Lead data science projects from conception to deployment, ensuring high-quality deliverables.
    • Guide junior data scientists and provide technical mentorship.
    • Collaborate with business stakeholders to define project objectives and deliver actionable insights.
    • Explore and implement new technologies and best practices in data science.
    • Present findings to executive leadership and recommend data-driven strategies for business growth.
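As a toy illustration of one ensemble method named in this listing, hard majority voting can be sketched in a few lines of Python. The "models" below are trivial threshold rules rather than trained classifiers, purely to show the voting mechanics:

```python
from collections import Counter

def majority_vote(models, x):
    """Return the most common prediction among the ensemble members."""
    votes = Counter(model(x) for model in models)
    return votes.most_common(1)[0][0]

# Three illustrative "weak learners": simple threshold rules.
models = [
    lambda x: "high" if x > 5 else "low",
    lambda x: "high" if x > 3 else "low",
    lambda x: "high" if x > 8 else "low",
]
```

Real ensembles (random forests, gradient boosting) combine far stronger learners, but the aggregation principle is the same.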

    Requirements

    • Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
    • 5+ years of experience in data science, with a proven track record of leading successful projects.
    • Deep understanding of advanced machine learning techniques, including deep learning and ensemble methods.
    • Strong experience with projects involving LLMs and GenAI models.
    • Strong proficiency in programming languages such as Python or R.
    • Experience with big data technologies (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, GCP).
    • Advanced data visualization skills and the ability to present complex data in a clear and concise manner.
    • Excellent problem-solving and analytical skills, with a strategic mindset.
    • Strong communication and leadership skills, with the ability to influence stakeholders at all levels.

    Business Needs

    • Ability to meet with different levels of management, communicate at the appropriate level to gather use cases and requirements, and provide designs that cater to those stakeholders.
    • Ability to quickly learn the basics of a client's industry and converse with stakeholders in that industry.
    • Ability to handle clients in multiple industries at the same time.
    • Ability to turn the dashboards created into stories that can be presented to clients' top management.
  • Cloud Computing Engineer (AWS, Azure, GCP)

    Job Description: 
    We are looking for a skilled Cloud Computing Engineer with expertise in AWS, Azure, and Google Cloud Platform (GCP) to join our dynamic team. The ideal candidate will be responsible for designing, implementing, and managing cloud infrastructure solutions across multiple cloud platforms. This position requires hands-on experience in cloud architecture, security, automation, and optimizing cloud resources to meet business requirements.

    Key Responsibilities: 
    • Design and deploy scalable, secure, and cost-effective cloud infrastructure solutions on AWS, Azure, and GCP. 
    • Manage and maintain cloud environments, ensuring availability, performance, and security. 
    • Implement Infrastructure as Code (IaC) using tools such as Terraform, CloudFormation, or Azure Resource Manager (ARM) templates. 
    • Configure, deploy, and manage cloud-based applications, services, and databases. 
    • Automate cloud infrastructure provisioning, monitoring, and management to enhance efficiency and minimize downtime. 
    • Optimize cloud infrastructure for performance, cost, and scalability. 
    • Troubleshoot and resolve cloud-related infrastructure issues, providing ongoing support and maintenance. 
    • Work closely with DevOps, development, and security teams to integrate cloud infrastructure with software solutions and CI/CD pipelines. 
    • Implement cloud security best practices, including identity and access management (IAM), network security, and data protection. 
    • Monitor and manage cloud resource utilization, optimize cost-efficiency, and recommend improvements. 
    • Conduct regular cloud performance reviews and report on cloud usage, trends, and cost analysis.
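The cost-optimization reviews described above usually start by flagging under-utilized resources. Here is a hedged sketch of that step in Python; in practice the metrics would come from CloudWatch, Azure Monitor, or Cloud Monitoring, but here they are supplied as plain dicts so the logic runs anywhere:

```python
def rightsizing_candidates(metrics, cpu_threshold=20.0):
    """Return instance IDs whose mean CPU utilization sits below the threshold.

    `metrics` maps instance ID -> list of CPU-percentage samples; the
    threshold (20%) is an arbitrary example value, not a recommendation.
    """
    flagged = []
    for instance_id, samples in metrics.items():
        if samples and sum(samples) / len(samples) < cpu_threshold:
            flagged.append(instance_id)
    return sorted(flagged)
```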

    Key Qualifications: 
    • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field. 
    • Proven experience working with cloud platforms, specifically AWS, Azure, and GCP. 
    • Strong knowledge of cloud infrastructure and services, including computing, storage, networking, and databases. 
    • Proficiency in Infrastructure as Code (IaC) using tools such as Terraform, CloudFormation, or Azure Resource Manager (ARM) templates. 
    • Experience with cloud security, IAM, encryption, and data protection protocols. 
    • Expertise in setting up and managing CI/CD pipelines using cloud-native tools or third-party tools (Jenkins, GitLab CI, etc.). 
    • Familiarity with containerization and orchestration tools such as Docker and Kubernetes. 
    • Solid experience with cloud monitoring and management tools (AWS CloudWatch, Azure Monitor, Google Stackdriver). 
    • Experience in managing cloud cost optimization and using tools to manage and forecast cloud expenses. 
    • Excellent problem-solving skills and the ability to troubleshoot complex cloud-related issues.

  • Job Description: 

    We are looking for a highly skilled and motivated AI/ML Engineer to join our team. The ideal candidate will have expertise in developing machine learning models, algorithms, and AI systems. You will be responsible for researching, designing, and implementing AI-driven solutions that solve complex business problems and enhance our products and services.

    Key Responsibilities: 

    • Design, implement, and optimize machine learning models and AI algorithms for real-world applications. 
    • Develop and deploy AI models and systems that enhance business operations and customer experiences. 
    • Work closely with data scientists and engineers to collect, process, and analyze data for training AI models. 
    • Collaborate with cross-functional teams to define and implement AI-driven solutions.
    • Optimize and tune AI models to improve accuracy, efficiency, and performance. 
    • Conduct experiments, validate results, and provide insights from model performance. 
    • Stay updated on AI and ML advancements and integrate new techniques into existing models. 
    • Write production-ready code for AI models, ensuring maintainability and scalability. 
    • Implement machine learning pipelines, including data preprocessing, model training, evaluation, and deployment. 
    • Troubleshoot issues related to AI models and improve the model lifecycle.
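The pipeline stages listed above (preprocessing, training, evaluation) can be sketched end to end in pure Python. This toy example uses min-max scaling plus a nearest-centroid classifier so it runs without TensorFlow or PyTorch; it is a teaching sketch, not a production model:

```python
def scale(rows):
    """Min-max scale each feature column to [0, 1] (the preprocessing step)."""
    cols = list(zip(*rows))
    lo, hi = [min(c) for c in cols], [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(row, lo, hi)] for row in rows]

def train(X, y):
    """Compute one centroid per class (the training step)."""
    centroids = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = [sum(c) / len(pts) for c in zip(*pts)]
    return centroids

def predict(centroids, x):
    """Assign x to the class with the nearest centroid (squared Euclidean distance)."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lab: dist(centroids[lab], x))
```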

    Key Qualifications: 

    • Bachelor's or Master’s degree in Computer Science, Data Science, Engineering, or a related field. 
    • Proven experience as an AI/ML Engineer or in a similar role.
    • Strong proficiency in programming languages such as Python, Java, or C++. 
    • Expertise in machine learning frameworks and libraries like TensorFlow, Keras, PyTorch, and Scikit-learn. 
    • Experience in data manipulation and analysis using libraries like Pandas, NumPy, and Matplotlib. 
    • Solid understanding of machine learning algorithms such as regression, clustering, neural networks, and deep learning. 
    • Strong problem-solving, analytical, and debugging skills. 
    • Experience in deploying machine learning models in production environments. 
    • Familiarity with cloud platforms (AWS, Google Cloud, Azure) and using cloud-based ML tools.

  • Job Description: 
    We are seeking an experienced Data Scientist & Big Data Analyst to join our team and help drive data-driven decision-making by analysing large datasets and developing predictive models. As a Data Scientist and Big Data Analyst, you will work with cutting-edge technologies and methodologies to extract actionable insights, build machine learning models, and help solve complex business problems using big data technologies like Hadoop, Spark, and cloud-based platforms.

    Key Responsibilities: 
    • Analyze large and complex datasets to identify trends, patterns, and business insights. 
    • Develop machine learning models for classification, regression, and clustering to solve real-world business problems. 
    • Leverage Big Data technologies (Hadoop, Spark, etc.) to process and analyze massive datasets in distributed computing environments. 
    • Work with data engineering teams to design and build data pipelines and data warehousing solutions. 
    • Clean, preprocess, and validate large datasets to ensure high-quality and accurate analysis. 
    • Build and deploy predictive models and algorithms using machine learning frameworks like TensorFlow, Scikit-learn, or PyTorch. 
    • Create data visualizations and dashboards to communicate findings to business stakeholders. 
    • Implement data-driven strategies to optimize business processes, marketing campaigns, customer engagement, and more. 
    • Use statistical analysis, hypothesis testing, and A/B testing to generate insights and support business decisions. 
    • Design and execute experiments and simulations to validate model predictions.
    • Stay up to date with the latest developments in data science, machine learning, and big data analytics to ensure the organization is leveraging the best technologies and methodologies.
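The A/B testing work mentioned above typically reduces to comparing two conversion rates. Here is a stdlib-only sketch of a two-proportion z-test; the traffic and conversion numbers in the test are invented for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic for H0: the two conversion rates are equal.

    conv_* are conversion counts, n_* are sample sizes. A |z| above ~1.96
    corresponds to significance at the 5% level (two-sided).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```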

    Key Qualifications: 
    • Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, Engineering, or a related field. 
    • Proven experience as a Data Scientist or Big Data Analyst, with hands-on experience in analyzing large datasets and building machine learning models. 
    • Expertise in Big Data technologies such as Hadoop, Spark, Hive, and MapReduce. 
    • Proficiency in programming languages such as Python, R, Scala, or Java. 
    • Strong understanding of statistical analysis, data mining, and machine learning algorithms. 
    • Experience with data visualization tools like Tableau, Power BI, or programming libraries like Matplotlib, Seaborn, or ggplot2. 
    • Hands-on experience with SQL and NoSQL databases such as MongoDB, Cassandra, or HBase. 
    • Familiarity with cloud platforms like AWS, Azure, or Google Cloud and their data processing services. 
    • Experience with data wrangling, data cleaning, and feature engineering.
    • Strong problem-solving, analytical, and critical thinking skills. 
    • Ability to communicate complex technical concepts to non-technical stakeholders.

  • Job Description: 
    We are seeking a talented Power BI Developer to join our team and help create interactive, insightful, and visually appealing reports and dashboards using Power BI. In this role, you will work closely with business stakeholders to understand their data needs and deliver effective data visualisations that drive strategic decisions. The ideal candidate will have hands-on experience with Power BI, data modelling, data transformation, and dashboard/report development.

    Key Responsibilities: 

    • Design, develop, and maintain Power BI reports and dashboards, ensuring they meet business requirements and deliver actionable insights. 
    • Work with business users to gather requirements, translate them into reporting solutions, and ensure data accuracy. 
    • Create data models, develop and optimise complex DAX (Data Analysis Expressions) queries, and integrate multiple data sources. 
    • Perform data cleaning, data transformation, and data validation using Power Query and Power BI Desktop. 
    • Design and implement efficient data visualisation techniques, including interactive charts, graphs, and KPI dashboards. 
    • Develop and automate reporting processes to streamline data analysis and reporting workflows. 
    • Collaborate with the IT and Data Engineering teams to integrate Power BI with various data sources like SQL databases, cloud data sources (Azure, AWS), and third-party applications. 
    • Conduct performance tuning and optimisation of Power BI reports to ensure optimal performance for large datasets. 
    • Create and maintain SQL queries and stored procedures for extracting data from different data sources.
    • Implement role-based security features in Power BI to control data access and ensure compliance. 
    • Stay updated with the latest Power BI features and best practices, and proactively apply them to enhance reporting capabilities. 
    • Assist in training users on how to use reports and dashboards effectively. 
    • Provide ongoing support, troubleshooting, and enhancement of existing reports and dashboards. 
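Not DAX, but the same idea expressed in Python for the sake of a runnable sketch: a filtered SUM measure over a fact table joined through a dimension, which is the core pattern behind a star-schema KPI. The tables and values below are invented:

```python
# Tiny star schema: a fact table keyed by product_id, plus a product dimension.
sales_fact = [  # (product_id, amount)
    (1, 200.0), (2, 50.0), (1, 100.0), (3, 75.0),
]
product_dim = {1: "Hardware", 2: "Software", 3: "Hardware"}  # product_id -> category

def total_sales(category):
    """Python analogue of a filtered SUM measure: sum the fact table
    for rows whose dimension attribute matches the filter context."""
    return sum(amount for pid, amount in sales_fact
               if product_dim[pid] == category)
```

In DAX the equivalent would be a `CALCULATE(SUM(...), ...)` measure evaluated in the report's filter context.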

    Key Qualifications: 

    • Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field. 
    • Proven experience as a Power BI Developer or similar role, with hands-on experience in report/dashboard development. 
    • Expertise in Power BI Desktop, Power BI Service, Power Query, and Power BI Report Server. 
    • Strong proficiency in Data Analysis Expressions (DAX) and experience with creating complex calculations and metrics. 
    • Solid experience in SQL for querying, filtering, and transforming data from relational databases (e.g., SQL Server, MySQL). 
    • Experience with integrating data from a variety of sources, including Excel, SharePoint, Azure, and other cloud-based data platforms. 
    • Knowledge of data modelling techniques, including star and snowflake schemas, and best practices for organising and structuring data. 
    • Understanding of data visualisation principles and best practices for creating intuitive and user-friendly reports. 
    • Familiarity with advanced data analytics techniques such as predictive modelling, machine learning integration, or statistical analysis (optional). 
    • Experience in creating and implementing row-level security (RLS) in Power BI reports. 
    • Strong problem-solving skills and attention to detail.

  • Developer - Python

    Job Description:
    The primary responsibilities of this role include:
    • Building and maintaining pipelines for model development, testing, deployment, and monitoring.
    • Automating repetitive tasks such as model re-training, hyperparameter tuning, and data validation.
    • Developing CI/CD pipelines for seamless code migration.
    • Collaborating with cross-functional teams to ensure proper integration of models into production systems.

    Key Skills Required:
    • 3+ years of experience in developing and deploying ML models in production.
    • Strong programming skills in Python (with familiarity in Bash/Shell scripting).
    • Hands-on experience with tools like Docker, Kubernetes, MLflow, or Airflow.
    • Knowledge of cloud services such as AWS SageMaker or equivalent.
    • Familiarity with DevOps principles and tools like Jenkins, Git, or Terraform.
    • Understanding of versioning systems for data, models, and code.
    • Solid understanding of MLflow, ML services, model monitoring, and enabling logging services for performance tracking.
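One of the automation tasks above, hyperparameter tuning, can be sketched as an exhaustive grid search. The `score_fn` here is a stand-in for a real cross-validation run; parameter names are illustrative:

```python
import itertools

def grid_search(param_grid, score_fn):
    """Return (best_params, best_score) over the Cartesian product of the grid.

    `param_grid` maps parameter name -> list of candidate values;
    `score_fn` takes a params dict and returns a score (higher is better).
    """
    keys = sorted(param_grid)
    best_params, best_score = None, float("-inf")
    for values in itertools.product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```

Tools like MLflow or Airflow would wrap this loop with experiment tracking and scheduling rather than replace its logic.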

    Education Qualification: BE/BTECH/MCA
    Skill to Evaluate: Technical | App Development | Python
    Experience: 1 to 4 Years
    Role: Developer
    Skills: Python

  • Grant Thornton, Surana Group, Vinayaka Steels, CMR Group - AI/ML Engineer

    Job Description:
    We are seeking an experienced AI/ML Engineer II to join our Furnace Operations Efficiency Program. In this role, you will develop cutting-edge AI and machine learning solutions to optimize furnace operations, improve energy efficiency, minimize waste, and enhance the overall quality of production. You will work directly with the operations team to create data-driven insights that drive smarter, more efficient manufacturing processes.

    Your Mission:

    • Furnace Process Optimization: Develop and deploy machine learning models to optimize the entire furnace operation cycle, from pre-treatment to melting and refining, improving both yield and energy efficiency.
    • Predictive Maintenance: Implement predictive maintenance models using sensor data to anticipate failures, reduce downtime, and extend the lifespan of furnace equipment.
    • Dross Reduction: Leverage AI to monitor and analyze furnace performance, identifying factors that contribute to dross formation and recommending operational adjustments.
    • Energy Efficiency: Design models to predict and optimize energy usage, ensuring the most efficient operation of furnaces with minimal waste and cost.
    • Real-Time Monitoring: Develop systems to monitor furnace performance in real time, using advanced sensors and IoT technologies, to instantly adjust operations for optimal results.
    • Data Integration: Integrate real-time and historical data from various sources, including sensors, control systems, and operational logs, to create comprehensive models for better decision-making.
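    The predictive-maintenance and real-time-monitoring bullets above can be sketched with a minimal trailing-window z-score detector over furnace sensor readings; the window size, threshold, and temperature values below are illustrative assumptions, not parameters from this role:

```python
from statistics import mean, stdev

def rolling_anomalies(readings, window=10, threshold=3.0):
    """Flag indices whose reading deviates more than `threshold`
    standard deviations from the trailing window's mean."""
    flagged = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Stable furnace temperatures with one spike at index 15.
temps = [1200.0, 1201.5, 1199.8, 1200.2, 1201.0, 1199.5, 1200.7, 1200.1,
         1199.9, 1200.4, 1200.6, 1199.7, 1200.3, 1200.8, 1199.6, 1350.0]
print(rolling_anomalies(temps))  # -> [15]
```

    In practice this logic would run over streaming IoT data, and robust statistics (median/MAD) are a common hardening against noisy sensors.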

    What We’re Looking For:

    • Experience Level: 4-6 years of experience in AI/ML engineering, with a focus on industrial operations or energy efficiency. Experience in metal, glass, or manufacturing industries is a plus.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Science, Mechanical Engineering, Electrical Engineering, or a related field.
    • Technical Expertise:
      • Strong proficiency in machine learning algorithms (e.g., Random Forests, Neural Networks, XGBoost) and optimization techniques.
      • Familiarity with time-series forecasting, anomaly detection, and predictive modeling.
      • Experience in working with IoT data from industrial sensors (e.g., temperature, pressure, gas emissions).
      • Proficiency in programming languages such as Python, R, and SQL.
      • Experience with big data platforms (e.g., Spark, Hadoop) and cloud environments (AWS, GCP, Azure).
    • Domain Knowledge: Understanding of industrial operations, furnace systems, and energy optimization techniques is highly desirable.
    • Problem-Solving Skills: Ability to identify key inefficiencies in furnace operations and apply AI solutions to solve complex, real-world challenges.

    What You’ll Achieve:

    • Improve Operational Efficiency: Your models will directly enhance furnace performance, reducing energy consumption, dross formation, and operational costs.
    • Maximize Equipment Lifespan: Prevent costly equipment failures through AI-driven predictive maintenance, improving the overall uptime and productivity.
    • Drive Sustainability: Contribute to sustainability goals by optimizing furnace operations to minimize waste, energy use, and emissions.

    Why Join Us?
    You will be part of a team dedicated to revolutionizing industrial furnace operations with the power of AI. This is an exciting opportunity to work on real-world challenges and directly impact the efficiency and sustainability of manufacturing processes.

    Apply Now to be at the forefront of AI-powered furnace optimization!

  • Data Science

    Job Description :
    Are you passionate about improving healthcare systems through intelligent solutions?
    Join us as an AI/ML Engineer I, where you’ll design AI-driven tools to optimize inventory management and human resource allocation in hospital operations.

    Your Mission:

    • Inventory Management Optimization: Build predictive models to anticipate supply shortages, reduce excess inventory, and optimize stock levels for critical medical supplies.
    • HR Resource Allocation: Develop scheduling algorithms to ensure the optimal deployment of medical staff, balancing workloads and maximizing patient care.
    • Real-Time Decision Support: Design systems to monitor hospital operations in real time, providing actionable insights for resource management.
    • Automation and Workflow Enhancement: Contribute to AI-powered automation systems that reduce inefficiencies and improve hospital logistics.
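    The inventory-optimization bullet above rests on a classic building block: a reorder point combining expected lead-time demand with safety stock. A minimal sketch, assuming normally distributed daily demand (the z value and demand figures are illustrative):

```python
def reorder_point(daily_demand_mean, daily_demand_std, lead_time_days, z=1.65):
    """Reorder point = expected demand over the lead time plus safety stock.
    z=1.65 targets roughly a 95% service level under a normal demand model."""
    expected = daily_demand_mean * lead_time_days
    safety = z * daily_demand_std * lead_time_days ** 0.5
    return expected + safety

# Example: IV kits averaging 40/day (std 8), 5-day supplier lead time.
rp = reorder_point(40, 8, 5)
print(round(rp, 1))  # -> 229.5
```

    A predictive model would replace the fixed mean and standard deviation with per-item demand forecasts.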

    What We’re Looking For:

    • Experience Level: 2–4 years of experience in AI/ML engineering or data analysis, preferably with exposure to healthcare or operations domains.
    • Educational Background: Bachelor’s degree in Computer Science, Data Science, Operations Research, or a related field.
    • Technical Expertise:
      • Proficiency in machine learning frameworks such as Scikit-learn, XGBoost, or TensorFlow.
      • Strong understanding of optimization techniques (e.g., linear programming, genetic algorithms).
      • Experience with tools for inventory forecasting and workforce management analytics.
      • Familiarity with healthcare operations or ERP systems is a plus.
    • Programming Skills: Proficient in Python, R, and SQL, with strong data visualization skills using tools like Tableau or Power BI.
    • Analytical Mindset: Ability to identify inefficiencies and design AI-driven solutions to solve operational challenges.

    What You’ll Achieve:

    • Ensure the uninterrupted availability of critical supplies by optimizing hospital inventory systems.
    • Enhance workforce productivity and staff satisfaction through intelligent scheduling solutions.
    • Contribute to improving patient outcomes by streamlining hospital operations.

    Be part of the team that modernizes healthcare operations with AI and data-driven innovation.

  • Data Science

    Job Description :
    Are you ready to drive efficiency and sustainability with AI?
    We are looking for an experienced AI/ML Engineer II to develop transformative solutions that enhance operational efficiency and optimize solar power systems for maximum performance.

    Your Mission:

    • Operational Workflow Optimization: Create ML models to streamline manufacturing processes, reduce resource waste, and enhance production output.
    • Solar Power Analytics: Develop predictive and prescriptive systems for solar power generation, yield forecasting, and energy distribution optimization.
    • IoT-Driven Efficiency: Design real-time analytics pipelines using IoT sensors for operational monitoring and predictive maintenance.
    • Advanced Algorithms: Utilize optimization techniques such as linear programming, dynamic scheduling, and reinforcement learning to improve industrial workflows.
    • Scalable Deployment: Implement and deploy solutions on edge devices and cloud platforms to ensure scalability across large operations.
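    The workflow-optimization bullets above can be illustrated with the simplest form of dispatch: greedily allocating a forecast solar budget to loads in priority order. A hedged sketch only; the load names, demands, and priorities are made up for illustration:

```python
def dispatch(available_kwh, loads):
    """loads: list of (name, demand_kwh, priority); a lower priority
    number is served first. Returns per-load allocation and the
    energy left over after all loads are (partially) served."""
    allocation, remaining = {}, available_kwh
    for name, demand, _ in sorted(loads, key=lambda load: load[2]):
        served = min(demand, remaining)
        allocation[name] = served
        remaining -= served
    return allocation, remaining

loads = [("line_a", 50, 1), ("line_b", 40, 2), ("hvac", 30, 3)]
alloc, spare = dispatch(100, loads)
print(alloc, spare)  # line_a and line_b fully served, hvac curtailed to 10
```

    Production systems would replace this greedy rule with linear programming or reinforcement learning, as the listing suggests, but the greedy pass is a useful baseline.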

    What We’re Looking For:

    • Experience Level: 1-4 years of experience in AI/ML engineering, with a focus on operational efficiency or energy optimization.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Data Science, or a related field.
    • Technical Expertise:
      • Advanced proficiency in machine learning frameworks such as TensorFlow, PyTorch, and Scikit-learn.
      • Strong understanding of optimization algorithms, time-series analysis, and predictive modeling.
      • Experience with IoT platforms and data integration (e.g., MQTT, OPC-UA).
      • Knowledge of renewable energy systems, particularly solar power generation and grid optimization.
    • Programming Skills: Expertise in Python, SQL, and experience with distributed computing frameworks like Apache Spark.
    • Cloud Experience: Proficiency in deploying solutions on AWS, GCP, or Azure for scalable and resilient applications.

    What You’ll Achieve:

    • Enable industries to reduce resource consumption and improve efficiency by up to 25%.
    • Develop AI-driven systems that optimize solar energy production and minimize downtime.
    • Drive sustainability initiatives, blending technology and environmental consciousness.

    Join us to redefine efficiency and sustainability with AI-powered solutions.

  • Dr.Reddys, LaurusLabs, CMRGroup - AI/ML Engineer I

    Data Science

    Job Description :
    Are you ready to safeguard lives with cutting-edge AI solutions?
    We are seeking an experienced AI/ML Engineer II to design intelligent systems that enhance workplace safety and precision object detection across industries. This role requires technical expertise, a proactive mindset, and a passion for creating impactful solutions.

    Your Mission:

    • Build Advanced Object Detection Models: Develop and deploy high-performance models using frameworks like YOLO, RetinaNet, and Faster R-CNN.
    • Real-Time Hazard Detection: Create systems capable of detecting workplace hazards, safety violations, and potential risks in real time.
    • Optimize Industrial Processes: Leverage computer vision to identify anomalies, defects, or inefficiencies in industrial workflows.
    • Scalable Integration: Deploy AI solutions on edge devices and cloud platforms, ensuring scalability and seamless integration into existing systems.
    • Safety Protocol Alignment: Collaborate with safety officers, engineers, and IT teams to align solutions with industry regulations and workplace safety standards.
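    A core building block behind the object-detection frameworks listed above (YOLO, RetinaNet, Faster R-CNN) is Intersection-over-Union (IoU), used to match predicted boxes against ground truth. A minimal sketch for axis-aligned boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-Union for axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two 10x10 boxes overlapping in a 5x5 region: IoU = 25 / 175.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```

    The same computation underlies non-maximum suppression and detection-quality metrics such as mAP.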

    What We’re Looking For:

    • Experience Level: 2+ years of hands-on experience in developing and deploying AI/ML models, with a focus on computer vision and object detection.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, or related fields.
    • Technical Expertise:
      • Proficiency in deep learning frameworks such as TensorFlow, PyTorch, and ONNX.
      • Strong knowledge of image processing and video analytics using OpenCV and related libraries.
      • Experience in deploying models on edge devices (e.g., NVIDIA Jetson, Coral TPU) for low-latency applications.
    • Programming Skills: Advanced expertise in Python and C++ is required. Familiarity with CUDA for GPU optimization is highly desirable.
    • Industrial Knowledge: Familiarity with workplace safety standards (e.g., OSHA) and protocols is a significant advantage.
    • Cloud & IoT: Hands-on experience in integrating AI models with cloud platforms (AWS, Azure, GCP) and IoT ecosystems.

    What You’ll Achieve:

    • Reduce workplace incidents by deploying AI systems that detect hazards proactively.
    • Enhance industrial efficiency through automated defect detection and anomaly monitoring.
    • Design scalable solutions that can adapt to diverse industrial environments, from factories to construction sites.

    Why Join Us?
    This is your opportunity to lead innovation in safety technology. Your work will directly protect lives and set new benchmarks for AI-driven workplace safety. Be the force behind safer, smarter workplaces.

  • Vinayaka Steels, Surana Group, Grant Thornton - AI/ML Engineer II

    Data Science

    Job Description:
    Can you harness the power of AI to predict and prevent breakdowns before they disrupt operations?
    We are seeking an innovative AI/ML Engineer II to develop predictive maintenance solutions that redefine reliability, efficiency, and asset performance across industries.

    Your Mission:

    • Architect Predictive Models: Develop state-of-the-art ML models for failure prediction, leveraging techniques like time-series analysis, anomaly detection, and Bayesian optimization.
    • Seamless IoT Integration: Design solutions that integrate with IoT ecosystems, collecting and analyzing sensor data in real time to anticipate maintenance needs.
    • Optimize Maintenance Strategies: Build intelligent systems to minimize downtime, reduce costs, and extend equipment lifecycles through data-driven scheduling.
    • Scalable AI Pipelines: Design and deploy robust pipelines capable of handling large-scale, high-velocity data streams, ensuring reliable and scalable solutions.
    • Insights and Automation: Transform raw data into actionable insights using advanced analytics and automation tools to empower decision-makers.
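    One simple way to frame the failure-prediction mission above is remaining-useful-life (RUL) estimation: fit a trend to a degradation signal and extrapolate to a failure threshold. A least-squares sketch under the (strong) assumption of linear degradation; the signal and threshold are illustrative:

```python
def estimate_rul(signal, failure_threshold):
    """Fit a least-squares line to a degradation signal and extrapolate
    how many future samples remain until it crosses `failure_threshold`."""
    n = len(signal)
    x_mean = (n - 1) / 2
    y_mean = sum(signal) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(signal))
    den = sum((x - x_mean) ** 2 for x in range(n))
    slope = num / den
    intercept = y_mean - slope * x_mean
    if slope <= 0:
        return None  # no upward degradation trend to extrapolate
    crossing = (failure_threshold - intercept) / slope
    return max(0.0, crossing - (n - 1))

# Vibration amplitude rising ~0.5 per reading toward a threshold of 10.
signal = [2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0]
print(estimate_rul(signal, 10.0))  # -> 10.0 more readings
```

    Real deployments would replace the linear fit with the time-series and anomaly-detection techniques the listing names, but the extrapolation framing is the same.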

    What We’re Looking For:

    • Expertise in Advanced ML: Strong experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn, and a deep understanding of unsupervised learning, clustering, and time-series forecasting techniques.
    • Data Engineering Skills: Proficiency in working with real-time data processing tools like Apache Kafka, Spark, or Flink, and familiarity with databases like Snowflake, DynamoDB, or PostgreSQL.
    • IoT and Edge Computing: Hands-on experience with IoT platforms and edge computing frameworks, ensuring low-latency, real-time predictions in distributed environments.
    • Cloud Computing: Expertise in cloud ecosystems such as AWS (IoT Core, SageMaker), GCP, or Azure, with a focus on scaling and deploying AI models.
    • Strong Programming Skills: Advanced proficiency in Python, with additional skills in C++, or R being a plus.
    • Domain Knowledge: Familiarity with maintenance workflows, CMMS systems, and industrial protocols (e.g., MQTT, OPC-UA) is highly desirable.

    What You’ll Achieve:

    • Build next-generation maintenance systems that reduce unplanned downtime by up to 30%.
    • Enable clients to transition from reactive to predictive maintenance models, ensuring greater efficiency and reliability.
    • Develop solutions that seamlessly scale across industrial assets, from factories to energy grids.

    Join us in shaping the future of industrial resilience and operational excellence through AI.

  • Data Science

    Job Description:
    We are seeking an experienced AI/ML Engineer II to lead the development of advanced forecasting and predictive analytics solutions for the retail and e-commerce sectors. This role will focus on leveraging machine learning, deep learning, and data-driven insights to optimize demand forecasting, inventory management, and customer behavior predictions to enhance business decisions.

    Your Mission:

    • Demand Forecasting Models: Develop and implement state-of-the-art forecasting models using time-series analysis, machine learning, and deep learning techniques (e.g., ARIMA, Prophet, LSTM, XGBoost) to predict product demand, sales trends, and seasonality patterns.
    • Customer Behavior Prediction: Use advanced predictive analytics to forecast customer behavior, buying patterns, and lifetime value, improving personalized marketing, promotions, and inventory strategies.
    • Inventory Optimization: Design predictive models to optimize stock levels, reduce overstock or stockouts, and streamline supply chain processes.
    • Real-time Analytics Integration: Build and deploy real-time data pipelines that integrate with e-commerce platforms and inventory management systems for dynamic decision-making and forecasting.
    • Data-Driven Insights: Leverage big data platforms (e.g., Hadoop, Spark) and cloud environments (AWS, GCP, Azure) to mine insights, predict trends, and support strategic decision-making.
    • Cross-Functional Collaboration: Work with cross-functional teams (product managers, marketing, supply chain) to align forecasting models with business goals and operational needs.
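    Before reaching for the ARIMA, Prophet, or LSTM models named above, demand-forecasting work usually starts from a seasonal-naive baseline that the heavier models must beat. A minimal sketch (the sales figures are illustrative):

```python
def seasonal_naive(history, season_length, horizon):
    """Forecast each future step with the value from one season earlier;
    the standard baseline against which learned forecasters are judged."""
    forecast = []
    for h in range(horizon):
        forecast.append(history[len(history) - season_length + (h % season_length)])
    return forecast

# Weekly seasonality (season_length=7): forecast the next 3 days.
sales = [120, 90, 95, 110, 130, 180, 200,   # week 1
         118, 92, 97, 108, 134, 176, 205]   # week 2
print(seasonal_naive(sales, 7, 3))  # -> [118, 92, 97]
```

    Reporting forecast error relative to this baseline (e.g., MASE) makes model comparisons meaningful across products with different scales.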

    What We’re Looking For:

    • Experience Level: 2-4 years of hands-on experience in AI/ML engineering, with a focus on forecasting, time-series analysis, or predictive analytics in retail or e-commerce.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Mathematics, or a related field. Advanced degrees (Ph.D.) are a plus.
    • Technical Expertise:
      • Proficiency in machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
      • Strong background in time-series forecasting and predictive modeling techniques.
      • Hands-on experience with data engineering, including working with big data tools (Spark, Hadoop) and cloud-based platforms (AWS, GCP, Azure).
      • Knowledge of advanced algorithms like Random Forest, XGBoost, and Recurrent Neural Networks (RNNs).
      • Expertise in Python, SQL, and data visualization tools (Power BI, Tableau).
    • Domain Knowledge: Understanding of retail and e-commerce business models, especially in inventory management, demand forecasting, and customer behavior prediction.
    • Problem-Solving Skills: Strong ability to identify key business problems, formulate data-driven solutions, and provide actionable insights.

    What You’ll Achieve:

    • Impact Business Decisions: Your models will drive smarter inventory management, marketing strategies, and pricing optimization, increasing sales and profitability.
    • Streamline Operations: Reduce waste and inefficiencies by optimizing stock levels and improving demand accuracy.
    • Enhance Customer Experience: Predict customer needs and behavior to create highly personalized shopping experiences.

    Why Join Us?
    You will be at the forefront of transforming how retail and e-commerce businesses leverage AI to enhance forecasting, operations, and customer engagement. Join a dynamic team of innovators and problem-solvers working to shape the future of the industry.

    Apply Now and help us revolutionize retail and e-commerce with the power of predictive analytics and AI!

  • Data Science

    Job Description :

    We are looking for a talented and driven AI/ML Engineer II to join our team and lead the development of integrated digital solutions across healthcare, finance, and technology sectors. This role will involve creating sophisticated, AI-driven systems that optimize operations, improve customer experiences, and unlock valuable insights for businesses in these critical industries.

    Your Mission:

    • Cross-Industry AI Solutions: Design and implement advanced machine learning models tailored to address unique challenges in healthcare, finance, and technology, with a focus on data integration, process automation, and predictive insights.
    • Healthcare Innovation: Develop AI solutions for improving patient outcomes, predictive healthcare diagnostics, and healthcare operations optimization (e.g., resource management, scheduling, and medical supply chain).
    • Finance Optimization: Create models to optimize financial processes, including fraud detection, credit scoring, risk assessment, algorithmic trading, and customer segmentation.
    • Tech-Driven Efficiency: Develop AI-driven tools for technology companies to streamline operations, optimize user engagement, and enhance product innovation.
    • Data Integration: Work with diverse data sources, including structured, semi-structured, and unstructured data, to build integrated solutions that provide actionable insights across different industries.
    • Real-Time Analytics: Implement real-time analytics platforms that process large-scale datasets, providing immediate insights for decision-making.
    • Collaboration & Deployment: Collaborate with cross-functional teams, ensuring the alignment of AI models with business objectives and seamless deployment into operational systems.
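    The credit-scoring and fraud-detection bullets above ultimately reduce to scoring a feature vector into a probability. A minimal logistic-scoring sketch; the features, weights, and bias here are illustrative stand-ins, not a trained model:

```python
import math

def credit_score(features, weights, bias):
    """Logistic scoring: map a weighted feature sum to a 0-1
    probability via the sigmoid function."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Features: utilization ratio, late payments, years of history.
p = credit_score([0.9, 3, 1], [2.0, 0.8, -0.3], -2.5)
print(round(p, 3))  # -> 0.802
```

    In production, the weights would come from a model trained on labeled outcomes, with calibration and fairness checks layered on top.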

    What We’re Looking For:

    • Experience Level: 1-4 years of experience in AI/ML engineering, with a strong background in solving problems for healthcare, finance, or technology sectors.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Mathematics, or a related field.
    • Technical Expertise:
      • Advanced proficiency in machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
      • Knowledge of predictive modeling, time-series analysis, and classification techniques.
      • Experience in working with big data tools (e.g., Hadoop, Spark) and cloud platforms (AWS, GCP, Azure).
      • Expertise in programming languages such as Python, R, and SQL.
      • Experience in deploying machine learning models at scale in production environments.
    • Domain Knowledge: Experience or strong understanding of healthcare systems, financial services, or technology operations is highly desirable.
    • Problem-Solving Skills: Ability to design AI solutions that meet diverse industry needs, providing practical and effective applications.

    What You’ll Achieve:

    • Transform Industries: Your work will directly improve operations in healthcare, finance, and technology, creating measurable value through AI-driven solutions.
    • Innovate for the Future: Develop cutting-edge solutions that address real-world challenges and drive innovation within these industries.
    • Create Seamless Integrations: Help companies break down data silos, creating seamless systems that offer a holistic view of operations and customer experiences.

    Why Join Us?
    This is your chance to contribute to impactful, cross-industry AI solutions that shape the future of healthcare, finance, and technology. Be part of a team that is driving change, empowering industries with innovative digital solutions that create long-lasting benefits.

    Apply Now and become a key player in our mission to build the next generation of integrated AI solutions for diverse industries!

    Take the First Step to Your Dream Job

    Find Your Fit

    Explore job roles posted by our industry partners, alumni network, and collaborators.

    Submit Your Profile

    Apply with your updated resume and showcase your unique strengths, or email us at hr@360digitmg.com.

    Connect

    Let our career advisors and placement team guide you through the process.

    Excel

    Leverage your skills and training to make a mark in your new role.

    Leadership Talks

    Join our CEO and various industry experts, including successful alumni specializing in AI, Industry 4.0, and other cutting-edge fields, for 10-15 minute recorded sessions on Zoom.

    Watch Now to Learn and Grow
    These talks aim to:
    • Inspire and motivate professionals in their careers
    • Raise awareness about industry trends and innovations
    • Encourage reskilling and upskilling initiatives
    • Support corporate communication strategies

    Resources to Help You Succeed

    Empower your journey to success by utilizing our tools and guides to sharpen your skills and stand out in your career.

    Data Skills Assessment
    Measure your analytics proficiency with on-demand exams. Receive instant feedback and a personalized report to identify strengths and areas for improvement.
    Resume Optimization Hub
    Make your resume industry-ready. Use our AI-powered Resume Optimization Hub to match your skills with job descriptions and get actionable feedback to improve.
    Interview Readiness Program
    Prepare for your big day with mock interviews conducted by industry leaders, receive detailed feedback, and job-specific guidance to boost your confidence.
    Career Transition Guide
    Transform your career path with expert guidance. Access tailored recommendations, real-world project opportunities, and proven success strategies to navigate your career change.
    Make an Enquiry