Building Tomorrow's Workforce

360DigiTMG is a global leader in education and career enablement, connecting talent with opportunity. We have trained over 50,000 professionals worldwide and prepared them for leadership roles in the digital economy.

Our AiSPRY-powered network and partnerships with top-tier companies ensure that our alumni and job seekers have access to job opportunities in cutting-edge technologies like Data Science, Data Analytics, AI, Machine Learning, and Data Engineering.

We pride ourselves on fostering innovation, offering real-world project experience, and empowering individuals to thrive in their careers. Whether you're an experienced professional or a fresh graduate, we help you bridge the gap to your dream job.

Why Choose Us?

Exclusive Access to Opportunities
Tap into job roles through our vast network of industry collaborators and stakeholders.

Alumni-Driven Ecosystem
Be part of a thriving community excelling in global corporations like Google, EY, and PwC.

Real-World Skills
Our candidates bring hands-on experience through projects across 76+ domains.

Tailored Hiring Solutions
Flexible roles, ranging from internships to project-based hiring.

Latest Job Vacancies You Can Apply

  • Job Description:

    We are looking for an enthusiastic "AI/ML Developer (Fresher)" with a strong foundation in Python, Machine Learning, and Deep Learning. As part of our dynamic team, you will work on cutting-edge AI projects, developing intelligent solutions that transform industries and enhance user experiences.

     

    Responsibilities:

    As an AI/ML Developer (Fresher), you will be part of a team focused on building AI and ML solutions for real-world problems.
    • Model Development: Assist in developing machine learning and deep learning models for various use cases.
    • Data Processing: Help with preprocessing and cleaning large datasets.
    • Algorithm Implementation: Implement AI algorithms for supervised and unsupervised learning.
    • Collaboration: Work closely with senior developers and data scientists to enhance AI/ML models.
    • Testing and Debugging: Assist in model testing and evaluating performance metrics.
    • Documentation: Maintain clear documentation for models and algorithms.
    • Learning and Growth: Continuously update skills in the rapidly evolving AI/ML space.

     

    Requirements:

    • Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field.
    • Basic knowledge or understanding of machine learning algorithms, deep learning, and frameworks like TensorFlow, Keras, or PyTorch.
    • Proficiency in Python and basic libraries such as Pandas, NumPy, Matplotlib.
    • Understanding of data preprocessing, feature engineering, and model evaluation techniques.
    • Passion for solving complex problems using AI/ML techniques.
    • Strong analytical, mathematical, and problem-solving skills.
    • Ability to work in a collaborative team environment.
    • Strong communication skills to interact with cross-functional teams.
    • Eagerness to learn and grow in the field of Artificial Intelligence and Machine Learning.
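
    For illustration only, a minimal sketch of the preprocessing, training, and evaluation workflow the requirements above describe, assuming a scikit-learn stack; the CSV file and its "label" column are hypothetical placeholders.

      # Hypothetical baseline workflow: load data, preprocess, train, evaluate.
      import pandas as pd
      from sklearn.model_selection import train_test_split
      from sklearn.preprocessing import StandardScaler
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import accuracy_score, f1_score

      # Hypothetical dataset with a binary "label" column.
      df = pd.read_csv("data.csv")
      X, y = df.drop(columns=["label"]), df["label"]

      # Basic preprocessing: hold out a test split and scale the features.
      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.2, random_state=42)
      scaler = StandardScaler()
      X_train = scaler.fit_transform(X_train)
      X_test = scaler.transform(X_test)

      # Fit a simple baseline model and report standard evaluation metrics.
      model = LogisticRegression(max_iter=1000)
      model.fit(X_train, y_train)
      pred = model.predict(X_test)
      print(f"accuracy={accuracy_score(y_test, pred):.3f}, f1={f1_score(y_test, pred):.3f}")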

     

    Job Details & Compensation:

    • Department: Development
    • Package: 2.5 to 5 Lakhs CTC (based on performance)
    • Job Type: Full-time, Work from Home
    • Working Days: 5 days (Monday to Friday)
    • Internship Duration: 100 Working Days (Internship followed by Full-time Job based on performance)

     

    Selection Process:

    1. Personal Interview by HR.
    2. Technical Interview by Development Lead.
    3. Online Interview via Google Meet.
    4. Coding Challenge (Live or Take-home).
    5. Onboarding & Induction Process.

     

    Benefits:
    1. Direct Job Offer after successful completion of training.
    2. Work from Home.
    3. Competitive salary with performance incentives.
    4. Opportunity for growth and mentorship in a collaborative team environment.
    5. Training and hands-on experience with real-time AI/ML projects.
    6. Rewards & recognitions based on performance.

     

    Terms and Conditions:
    Project Continuation: Based on successful completion of sprints and evaluations during training.

     

    Candidate Requirements:

    • 100% attendance during training and project phases.
    • Successful completion of tasks and project goals.
    • Job offer will be based on performance and management discretion.
  • Job Description:
    We're seeking enthusiastic Fresher graduates for the "Data Analyst" role. Prospective candidates should demonstrate strong analytical and computer skills to effectively manage and analyze data for loading into the platform, along with strong communication skills.

     

    Responsibilities:

    • Ability to create courses and load syllabi and timetables into the platform.
    • Perform health checks on platform data and functionalities.
    • Collaborate effectively with cross-functional development teams.
    • Communicate with clients and gather requirements.
    • Ability to give orientations to students and college stakeholders.
    • Understand the product thoroughly to provide insights and improvements.

     

    Requirements:  

    • Proficiency in Microsoft Excel with strong analytical and problem-solving skills.
    • Excellent communication skills, both written and verbal.
    • Client management and requirement gathering skills.
    • Adaptability and willingness to learn new technologies.
    • Attention to detail and organizational skills.

     

    Additional Information:

    • This role involves the use of built-in tools and Advanced Excel for data analysis and loading.
    • No use of coding languages like Python or SQL, or visualization tools such as Power BI and Tableau.

     

    About Company:

    Reference Globe was established with the aim of providing software services in the educational domain across India. With over 10 years of experience, we have been working with 1000+ engineering colleges.

     

    Education / Job Experience Required: 

    • B.Tech/M.Tech (minimum 60% score)
    • Fresher

     

    Job Type: Full-time

    Salary: 1.8 LPA to 2.4 LPA (based on skills)

  • Job Description:
    We're seeking enthusiastic Fresher graduates for the "Business Analyst" role. Prospective candidates should have a strong command of communication and negotiation skills to engage stakeholders in the educational domain.

     

    Responsibilities:

    • Familiarize yourself with all services offered by our company.
    • Demonstrate the product by mapping it to relevant use cases.
    • Answer phone inquiries, direct calls, and provide basic company information.
    • Procure new clients through direct contact and word-of-mouth.
    • Negotiate with clients to secure the most attractive prices.
    • Conduct promotional activities and devise marketing strategies.
    • Collaborate with the development team to ensure project deliverables meet client expectations.

     

    Requirements:  

    • Should possess market and domain knowledge.
    • Proactive in communicating with the clients.
    • Ability to generate revenue by identifying pain points and suggesting suitable services.
    • Good written and verbal communication, as well as presentation skills.
    • Resourceful, with outstanding research skills.

     

    About Company:

    Reference Globe was established with the aim of providing software services in the educational domain across India. With over 10 years of experience, we have been working with 1000+ engineering colleges.

     

    Education / Job Experience Required: 

    • MBA/B.Tech (minimum 60% score)
    • Fresher / Experienced

     

    Job Type: Full-time / Day shift

    Salary: 2.4 LPA to 3.6 LPA (based on skills)

  • Job Description:
    We are looking for an AWS Data Analytics Specialist for a 100% hands-on role working with our team at a large American utility provider. The ideal candidate has at least 3 years of hands-on development experience with the AWS services and programming languages detailed below.

     

    Additional Details

    • Programming skills: Proficiency in languages such as PySpark, Python, and SQL, as well as Bash and UNIX scripting for rapid data ingestion (a brief sketch follows this list).
    • Data Modeling skills: Ability to design and implement data models that can be used to store and query large volumes of enterprise data across multiple systems.
    • Analysis skills: Ability to use statistical and machine learning techniques to analyze large amounts of data, including experience with Spark and Hadoop.
    • Cloud Computing skills: Big Data Developers will ideally possess certified credentials on cloud computing platforms such as Amazon Web Services (AWS) and Microsoft Azure, including experience with services such as Glue and Lambda.
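
    As a rough sketch of the PySpark-based ingestion work described above (an illustration, not a prescribed solution): read raw files from S3, apply a light transformation, and write partitioned Parquet. The bucket and paths are hypothetical.

      # Hypothetical ingestion job: S3 CSV in, partitioned Parquet out.
      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("ingest-example").getOrCreate()

      # Read raw CSV data from a (hypothetical) S3 bucket.
      raw = spark.read.option("header", True).csv("s3://example-bucket/raw/")

      # Light cleanup plus an ingestion-date column for partitioning.
      clean = raw.dropna().withColumn("ingest_date", F.current_date())

      # Write curated data back to S3 as partitioned Parquet.
      clean.write.mode("overwrite").partitionBy("ingest_date") \
           .parquet("s3://example-bucket/curated/")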

     

    Notice Period:

    • Immediate to 30 days. Candidates with a career gap of around 6 months to 1 year can be considered; technically strong profiles are needed.

     

    Good to Have:

    • Hive, Python 

     

    Must have skills:  

    • AWS (S3, EMR, Athena, Glue, Lambda, Cloud Formation, Redshift), PySpark, Unix, streaming (Kafka)

     

    Soft Skills:  

    • Strong Communication skills, ability to lead offshore development teams. 

     

    Certification:

    • Preferred - AWS Data Analytics Specialty Certification 
    • At least - AWS Certified Developer 

     

    Experience: 4 to 9 years

  • Roles & Responsibilities:

    • Assist in data collection, cleaning, and preprocessing tasks.
    • Conduct exploratory data analysis (EDA) and create meaningful visualizations.
    • Build and implement machine learning models using various algorithms (supervised/unsupervised learning).
    • Participate in AI-driven projects, including deep learning and natural language processing (NLP).
    • Collaborate with team members to deliver high-quality insights and solutions.
    • Document and present findings to stakeholders clearly and effectively.
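
    A small, hypothetical sketch of the EDA step listed above, using Pandas and Matplotlib; the dataset file is a placeholder.

      # Quick exploratory pass: summary statistics and per-column histograms.
      import pandas as pd
      import matplotlib.pyplot as plt

      df = pd.read_csv("dataset.csv")   # hypothetical dataset
      print(df.describe())              # summary statistics for numeric columns
      df.hist(figsize=(8, 6))           # distribution of each numeric column
      plt.tight_layout()
      plt.show()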

     

    Skills & Qualifications:

    • Completed a data science course or equivalent certification.
    • Strong understanding of data science concepts such as data processing, statistical analysis, and exploratory data analysis.
    • Experience with machine learning algorithms (regression, classification, clustering) and model evaluation techniques.
    • Proficiency in programming languages such as Python, R, and/or SQL.
    • Familiarity with tools like Pandas, NumPy, Matplotlib, Seaborn, and machine learning libraries (TensorFlow, Scikit-learn, etc.).
    • Good understanding of AI concepts (e.g., deep learning, neural networks).
    • Strong problem-solving skills, analytical thinking, and attention to detail.
    • Ability to work collaboratively in a fast-paced team environment.

     

    Eligibility Criteria:

    • Fresh graduates with a degree in Data Science, Computer Science, Mathematics, or a related field.
    • Available for a full-time internship for 3 months.
    • Knowledge of AI and Data Science concepts is essential.
       
  • Technical Requirements:  AI, Big Data, NLP, Machine Learning, Python, Microsoft LUIS

     

    Responsibilities: Develop and implement AI models and solutions; analyze large datasets to deliver insights for data-driven decision-making.
    Monitor and evaluate the performance of AI models and continuously improve them.
    Document processes, methodologies, and results for stakeholders.

     

    Experience: 3 to 4 years

  • Job Description: 
    We are seeking a skilled and detail-oriented professional to oversee and manage the organization's business data and analytics functions. The role involves designing and maintaining robust data pipelines, infrastructure, and models to process and analyze critical data, including website traffic, leads, sales, and P&L metrics. 
    The ideal candidate will be responsible for ensuring data quality, developing scalable ETL processes, and building dashboards to enable actionable insights. This position requires close collaboration with cross-functional teams to align technical solutions with business objectives and support data-driven decision-making across the organization. 


    Education / Job Experience Required: 
    • Bachelor's or Master's degree in Computer Science, Data Science, or a related field
    Location: Bangalore


    Day-to-Day Work: 

    • Responsible for managing all aspects of business data including website traffic, leads, sales & final P&L. 
    • Design and develop robust, scalable, and efficient data pipelines and ETL processes to extract, transform, and load data from various sources into data warehouses or data lakes. 
    • Build and maintain data infrastructure, including data warehouses and data processing frameworks. 
    • Ensure data quality and integrity by implementing appropriate data validation, monitoring, and error handling mechanisms. 
    • Develop and maintain data models and schemas to support data analytics and reporting needs. 
    • Identify and address data-related issues and bottlenecks, ensuring the smooth operation of data pipelines and systems. 
    • Construct and maintain dashboards that allow users to understand performance and generate insights.
    • Collaborate closely with cross-functional teams, including engineering, product management, and business stakeholders, to ensure alignment between business objectives and technical solutions. 
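
    A minimal, hypothetical sketch of the extract-transform-load pattern described above; a production pipeline here would target BigQuery, whereas this illustration uses a local SQLite database and an invented leads file.

      # Tiny ETL: extract a CSV, clean it, load it into a warehouse table.
      import sqlite3
      import pandas as pd

      # Extract: hypothetical CSV export of website leads.
      leads = pd.read_csv("leads.csv", parse_dates=["created_at"])

      # Transform: basic validation, de-duplication, and a derived month column.
      leads = leads.dropna(subset=["email"]).drop_duplicates(subset=["email"])
      leads["month"] = leads["created_at"].dt.to_period("M").astype(str)

      # Load: write the cleaned table (SQLite stands in for the warehouse).
      with sqlite3.connect("warehouse.db") as conn:
          leads.to_sql("leads_clean", conn, if_exists="replace", index=False)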

    Must-Have: 

    • Must have experience with Google Cloud Platform (BigQuery, Dataproc, Cloud Functions).
    • Solid understanding of SQL and NoSQL databases like PostgreSQL & MongoDB.
    • Strong proficiency in Python with experience in data manipulation and scripting.
    • Hands-on experience with ETL tools and data integration using third-party APIs.
    • Proven experience with data modelling & report automation.
    • Must have experience in designing, building, and maintaining databases & data warehouses.
    • Any further experience with Google Apps Script and Google AppSheet is an advantage.

    Good to Have: 

    • Any prior experience with Power BI. 
    • Understanding of real time data streaming technologies. 
  • Job Description:

    Technical knowledge: AWS, Python, SQL, S3, EC2, Glue, Athena, Lambda, DynamoDB, Redshift, Step Functions, CloudFormation, CI/CD pipelines, GitHub, EMR, RDS, AWS Lake Formation, GitLab, Jenkins, and AWS CodePipeline.
    Role Summary: As a Senior Data Engineer with over 5 years of expertise in Python, PySpark, and SQL, you will design, develop, and optimize complex data pipelines, support data modeling, and contribute to the architecture that supports big data processing, analytics, and cutting-edge cloud solutions that drive business growth. You will lead the design and implementation of scalable, high-performance data solutions on AWS and mentor junior team members. This role demands a deep understanding of AWS services, big data tools, and complex architectures to support large-scale data processing and advanced analytics.

     

    Key Responsibilities:

    • Design and develop robust, scalable data pipelines using AWS services, Python, PySpark, and SQL that integrate seamlessly with the broader data and product ecosystem.
    • Lead the migration of legacy data warehouses and data marts to AWS cloud-based data lake and data warehouse solutions.
    • Optimize data processing and storage for performance and cost.
    • Implement data security and compliance best practices, in collaboration with the IT security team.
    • Build flexible and scalable systems to handle the growing demands of real-time analytics and big data processing.
    • Work closely with data scientists and analysts to support their data needs and assist in building complex queries and data analysis pipelines.
    • Collaborate with cross-functional teams to understand their data needs and translate them into technical requirements.
    • Continuously evaluate new technologies and AWS services to enhance data capabilities and performance.
    • Create and maintain comprehensive documentation of data pipelines, architectures, and workflows.
    • Participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications.
    • Present findings to executive leadership and recommend data-driven strategies for business growth.
    • Communicate effectively with different levels of management to gather use cases/requirements and provide designs that cater to those stakeholders.
    • Handle clients in multiple industries at the same time, balancing their unique needs.
    • Provide mentoring and guidance to junior data engineers and team members.

    Requirements:

    • 5+ years of experience in a data engineering role with a strong focus on AWS, Python, PySpark, Hive, and SQL.
    • Proven experience in designing and delivering large-scale data warehousing and data processing solutions.
    • Lead the design and implementation of complex, scalable data pipelines using AWS services such as S3, EC2, EMR, RDS, Redshift, Glue, Lambda, Athena, and AWS Lake Formation.
    • Bachelor's or Master’s degree in Computer Science, Engineering, or a related technical field.
    • Deep knowledge of big data technologies and ETL tools, such as Apache Spark, PySpark, Hadoop, Kafka, and Spark Streaming.
    • Implement data architecture patterns, including event-driven pipelines, Lambda architectures, and data lakes.
    • Experience with cloud platforms such as AWS, Azure, and GCP.
    • Incorporate modern tools like Databricks, Airflow, and Terraform for orchestration and infrastructure as code.
    • Implement continuous integration and delivery pipelines using GitLab, Jenkins, and AWS CodePipeline.
    • Ensure data security, governance, and compliance by leveraging tools such as IAM, KMS, and AWS CloudTrail.
    • Mentor junior engineers, fostering a culture of continuous learning and improvement.
    • Excellent problem-solving and analytical skills, with a strategic mindset.
    • Strong communication and leadership skills, with the ability to influence stakeholders at all levels.
    • Ability to work independently as well as part of a team in a fast-paced environment.
    • Advanced data visualization skills and the ability to present complex data in a clear and concise manner.
    • Excellent communication skills, both written and verbal, to collaborate effectively across teams and levels.
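
    For orientation, a hedged sketch of the kind of orchestration the requirements above mention, assuming Airflow 2.4+; the DAG name is hypothetical and the task bodies are stubs.

      # A minimal three-step ETL DAG: extract -> transform -> load.
      from datetime import datetime
      from airflow import DAG
      from airflow.operators.python import PythonOperator

      def extract():
          print("pull data from source systems")

      def transform():
          print("clean and model the data")

      def load():
          print("write to the warehouse")

      with DAG(dag_id="example_etl", start_date=datetime(2024, 1, 1),
               schedule="@daily", catchup=False) as dag:
          t1 = PythonOperator(task_id="extract", python_callable=extract)
          t2 = PythonOperator(task_id="transform", python_callable=transform)
          t3 = PythonOperator(task_id="load", python_callable=load)
          t1 >> t2 >> t3   # run the steps strictly in order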

    Preferred Skills:

    • Experience with Databricks, Snowflake, and machine learning pipelines.
    • Exposure to real-time data streaming technologies and architectures.
    • Familiarity with containerization and serverless computing (Docker, Kubernetes, AWS Lambda).
  • Job Description:

    Technical knowledge: AWS, Python, SQL, S3, EC2, Glue, Athena, Lambda, DynamoDB, Redshift, Step Functions, CloudFormation, CI/CD pipelines, GitHub, EMR, RDS, AWS Lake Formation, GitLab, Jenkins, and AWS CodePipeline.
    Role Summary: As a Senior Data Engineer with over 2 years of expertise in Python, PySpark, and SQL, you will design, develop, and optimize complex data pipelines, support data modeling, and contribute to the architecture that supports big data processing, analytics, and cutting-edge cloud solutions that drive business growth. You will lead the design and implementation of scalable, high-performance data solutions on AWS and mentor junior team members. This role demands a deep understanding of AWS services, big data tools, and complex architectures to support large-scale data processing and advanced analytics.

     

    Key Responsibilities:

    • Design and develop robust, scalable data pipelines using AWS services, Python, PySpark, and SQL that integrate seamlessly with the broader data and product ecosystem.
    • Lead the migration of legacy data warehouses and data marts to AWS cloud-based data lake and data warehouse solutions.
    • Optimize data processing and storage for performance and cost.
    • Implement data security and compliance best practices, in collaboration with the IT security team.
    • Build flexible and scalable systems to handle the growing demands of real-time analytics and big data processing.
    • Work closely with data scientists and analysts to support their data needs and assist in building complex queries and data analysis pipelines.
    • Collaborate with cross-functional teams to understand their data needs and translate them into technical requirements.
    • Continuously evaluate new technologies and AWS services to enhance data capabilities and performance.
    • Create and maintain comprehensive documentation of data pipelines, architectures, and workflows.
    • Participate in code reviews and ensure that all solutions are aligned to pre-defined architectural specifications.
    • Present findings to executive leadership and recommend data-driven strategies for business growth.
    • Communicate effectively with different levels of management to gather use cases/requirements and provide designs that cater to those stakeholders.
    • Handle clients in multiple industries at the same time, balancing their unique needs.
    • Provide mentoring and guidance to junior data engineers and team members.

    Requirements:

    • 2+ years of experience in a data engineering role with a strong focus on AWS, Python, PySpark, Hive, and SQL.
    • Proven experience in designing and delivering large-scale data warehousing and data processing solutions.
    • Lead the design and implementation of complex, scalable data pipelines using AWS services such as S3, EC2, EMR, RDS, Redshift, Glue, Lambda, Athena, and AWS Lake Formation.
    • Bachelor's or Master’s degree in Computer Science, Engineering, or a related technical field.
    • Deep knowledge of big data technologies and ETL tools, such as Apache Spark, PySpark, Hadoop, Kafka, and Spark Streaming.
    • Implement data architecture patterns, including event-driven pipelines, Lambda architectures, and data lakes.
    • Incorporate modern tools like Databricks, Airflow, and Terraform for orchestration and infrastructure as code.
    • Implement CI/CD using GitLab, Jenkins, and AWS CodePipeline.
    • Ensure data security, governance, and compliance by leveraging tools such as IAM, KMS, and AWS CloudTrail.
    • Mentor junior engineers, fostering a culture of continuous learning and improvement.
    • Excellent problem-solving and analytical skills, with a strategic mindset.
    • Strong communication and leadership skills, with the ability to influence stakeholders at all levels.
    • Ability to work independently as well as part of a team in a fast-paced environment.
    • Advanced data visualization skills and the ability to present complex data in a clear and concise manner.
    • Excellent communication skills, both written and verbal, to collaborate effectively across teams and levels.

    Preferred Skills:

    • Experience with Databricks, Snowflake, and machine learning pipelines.
    • Exposure to real-time data streaming technologies and architectures.
    • Familiarity with containerization and serverless computing (Docker, Kubernetes, AWS Lambda).
       
  • Job Description:

    We are seeking a talented Python and SQL Developer with 2-3 years of experience to join our team. As a Python and SQL Developer, you will be responsible for designing, developing, and maintaining data-driven applications and solutions. You will work closely with cross-functional teams to deliver high-quality software products that meet clients' needs.

     

    Responsibilities:

    • Design, develop, and maintain Python-based applications that interact with SQL databases.
    • Write efficient and optimized SQL queries, stored procedures, and functions to retrieve, manipulate, and analyze data.
    • Design and implement scalable solutions leveraging cloud technologies.
    • Perform data validation, cleansing, and transformation to ensure data integrity and accuracy.
    • Understand the threading limitations of Python and multi-process architectures.
    • Work with version control systems such as Git for code management and collaboration.
    • Conduct code reviews, testing, debugging, and troubleshooting to ensure the reliability and performance of applications.
    • Develop data visualization tools and reports to present insights and findings to stakeholders.
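
    A minimal illustration of Python working against a SQL database, as the responsibilities above describe; it uses the standard-library sqlite3 driver and a toy schema so the example stays self-contained.

      # Create a toy table, insert rows, and run an aggregate query.
      import sqlite3

      with sqlite3.connect(":memory:") as conn:
          conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
          conn.executemany("INSERT INTO orders (amount) VALUES (?)",
                           [(19.99,), (5.00,), (42.50,)])
          # Aggregate in SQL rather than looping over rows in Python.
          total, n = conn.execute("SELECT SUM(amount), COUNT(*) FROM orders").fetchone()
          print(f"{n} orders totalling {total:.2f}")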

    Requirements:

    • Bachelor's degree in Computer Science, Engineering, or a related field.
    • Hands-on experience in Python development and SQL database management.
    • Proficiency in writing complex SQL queries, including joins, subqueries, and aggregations.
    • Strong understanding of Python programming concepts, data structures, and algorithms.
    • Experience with Python frameworks such as Django and Flask.
    • Familiarity with database systems such as PostgreSQL, MySQL, or SQL Server.
    • Knowledge of data visualization tools and libraries such as Matplotlib and Plotly.
    • Excellent problem-solving, analytical, and communication skills.
    • Ability to work effectively in a collaborative team environment.
    • Collaboration with data engineers and analysts to design and implement data models and ETL processes is a plus.
    • Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) is a plus.
  • Job Description:

    As a Senior Data Scientist, you will lead data-driven projects and play a key role in shaping our data science strategy. You will be responsible for developing advanced algorithms, managing large-scale datasets, and mentoring junior team members.

    Responsibilities

    • Design and implement complex statistical models and machine learning algorithms.
    • Lead data science projects from conception to deployment, ensuring high-quality deliverables.
    • Guide junior data scientists and provide technical mentorship.
    • Collaborate with business stakeholders to define project objectives and deliver actionable insights.
    • Explore and implement new technologies and best practices in data science.
    • Present findings to executive leadership and recommend data-driven strategies for business growth.

    Requirements

    • Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
    • 5+ years of experience in data science, with a proven track record of leading successful projects.
    • Deep understanding of advanced machine learning techniques, including deep learning and ensemble methods.
    • Good experience with projects involving LLMs and GenAI models.
    • Strong proficiency in programming languages such as Python or R.
    • Experience with big data technologies (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, GCP).
    • Advanced data visualization skills and the ability to present complex data in a clear and concise manner.
    • Excellent problem-solving and analytical skills, with a strategic mindset.
    • Strong communication and leadership skills, with the ability to influence stakeholders at all levels.

    Business Needs

    • Ability to meet with different levels of management, communicate at the appropriate level to gather use cases and requirements, and provide designs that cater to those stakeholders.
    • Ability to quickly learn the basics of the client's industry and converse with stakeholders in that industry.
    • Ability to handle clients in multiple industries at the same time.
    • Ability to build the dashboards created into stories that can be presented to the client's top management.
  • Cloud Computing Engineer (AWS, Azure, GCP)

    Job Description: 
    We are looking for a skilled Cloud Computing Engineer with expertise in AWS, Azure, and Google Cloud Platform (GCP) to join our dynamic team. The ideal candidate will be responsible for designing, implementing, and managing cloud infrastructure solutions across multiple cloud platforms. This position requires hands-on experience in cloud architecture, security, automation, and optimizing cloud resources to meet business requirements.

    Key Responsibilities: 
    • Design and deploy scalable, secure, and cost-effective cloud infrastructure solutions on AWS, Azure, and GCP.
    • Manage and maintain cloud environments, ensuring availability, performance, and security.
    • Implement Infrastructure as Code (IaC) using tools such as Terraform, CloudFormation, or Azure Resource Manager (ARM) templates.
    • Configure, deploy, and manage cloud-based applications, services, and databases.
    • Automate cloud infrastructure provisioning, monitoring, and management to enhance efficiency and minimize downtime.
    • Optimize cloud infrastructure for performance, cost, and scalability.
    • Troubleshoot and resolve cloud-related infrastructure issues, providing ongoing support and maintenance.
    • Work closely with DevOps, development, and security teams to integrate cloud infrastructure with software solutions and CI/CD pipelines.
    • Implement cloud security best practices, including identity and access management (IAM), network security, and data protection.
    • Monitor and manage cloud resource utilization, optimize cost-efficiency, and recommend improvements.
    • Conduct regular cloud performance reviews and report on cloud usage, trends, and cost analysis.

    Key Qualifications: 
    • Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
    • Proven experience working with cloud platforms, specifically AWS, Azure, and GCP.
    • Strong knowledge of cloud infrastructure and services, including computing, storage, networking, and databases.
    • Proficiency in Infrastructure as Code (IaC) using tools such as Terraform, CloudFormation, or Azure Resource Manager (ARM) templates.
    • Experience with cloud security, IAM, encryption, and data protection protocols.
    • Expertise in setting up and managing CI/CD pipelines using cloud-native tools or third-party tools (Jenkins, GitLab CI, etc.).
    • Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
    • Solid experience with cloud monitoring and management tools (AWS CloudWatch, Azure Monitor, Google Stackdriver).
    • Experience in managing cloud cost optimization and using tools to manage and forecast cloud expenses.
    • Excellent problem-solving skills and the ability to troubleshoot complex cloud-related issues.

  • Job Description: 

    We are looking for a highly skilled and motivated AI/ML Engineer to join our team. The ideal candidate will have expertise in developing machine learning models, algorithms, and AI systems. You will be responsible for researching, designing, and implementing AI-driven solutions that solve complex business problems and enhance our products and services.

    Key Responsibilities: 

    • Design, implement, and optimize machine learning models and AI algorithms for real-world applications.
    • Develop and deploy AI models and systems that enhance business operations and customer experiences.
    • Work closely with data scientists and engineers to collect, process, and analyze data for training AI models.
    • Collaborate with cross-functional teams to define and implement AI-driven solutions.
    • Optimize and tune AI models to improve accuracy, efficiency, and performance.
    • Conduct experiments, validate results, and provide insights from model performance.
    • Stay updated on AI and ML advancements and integrate new techniques into existing models.
    • Write production-ready code for AI models, ensuring maintainability and scalability.
    • Implement machine learning pipelines, including data preprocessing, model training, evaluation, and deployment.
    • Troubleshoot issues related to AI models and improve the model lifecycle.

    Key Qualifications: 

    • Bachelor's or Master’s degree in Computer Science, Data Science, Engineering, or a related field.
    • Proven experience as an AI/ML Engineer or in a similar role.
    • Strong proficiency in programming languages such as Python, Java, or C++.
    • Expertise in machine learning frameworks and libraries like TensorFlow, Keras, PyTorch, and Scikit-learn.
    • Experience in data manipulation and analysis using libraries like Pandas, NumPy, and Matplotlib.
    • Solid understanding of machine learning algorithms such as regression, clustering, neural networks, and deep learning.
    • Strong problem-solving, analytical, and debugging skills.
    • Experience in deploying machine learning models in production environments.
    • Familiarity with cloud platforms (AWS, Google Cloud, Azure) and using cloud-based ML tools.

  • Job Description: 
    We are seeking an experienced Data Scientist & Big Data Analyst to join our team and help drive data-driven decision-making by analyzing large datasets and developing predictive models. As a Data Scientist and Big Data Analyst, you will work with cutting-edge technologies and methodologies to extract actionable insights, build machine learning models, and help solve complex business problems using big data technologies like Hadoop, Spark, and cloud-based platforms.

    Key Responsibilities: 
    • Analyze large and complex datasets to identify trends, patterns, and business insights.
    • Develop machine learning models for classification, regression, and clustering to solve real-world business problems.
    • Leverage Big Data technologies (Hadoop, Spark, etc.) to process and analyze massive datasets in distributed computing environments.
    • Work with data engineering teams to design and build data pipelines and data warehousing solutions.
    • Clean, preprocess, and validate large datasets to ensure high-quality and accurate analysis.
    • Build and deploy predictive models and algorithms using machine learning frameworks like TensorFlow, Scikit-learn, or PyTorch.
    • Create data visualizations and dashboards to communicate findings to business stakeholders.
    • Implement data-driven strategies to optimize business processes, marketing campaigns, customer engagement, and more.
    • Use statistical analysis, hypothesis testing, and A/B testing to generate insights and support business decisions (see the sketch after this list).
    • Design and execute experiments and simulations to validate model predictions.
    • Stay up to date with the latest developments in data science, machine learning, and big data analytics to ensure the organization is leveraging the best technologies and methodologies.
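
    As a small illustration of the A/B-testing step mentioned above (simulated data, invented numbers): a Welch two-sample t-test comparing a control metric against a variant metric with SciPy.

      # Compare two groups' metrics; a small p-value suggests a real difference.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      control = rng.normal(loc=0.10, scale=0.03, size=500)  # simulated baseline metric
      variant = rng.normal(loc=0.11, scale=0.03, size=500)  # simulated treatment metric

      t_stat, p_value = stats.ttest_ind(variant, control, equal_var=False)
      print(f"t={t_stat:.2f}, p={p_value:.4f}")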

    Key Qualifications: 
    • Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, Engineering, or a related field.
    • Proven experience as a Data Scientist or Big Data Analyst, with hands-on experience in analyzing large datasets and building machine learning models.
    • Expertise in Big Data technologies such as Hadoop, Spark, Hive, and MapReduce.
    • Proficiency in programming languages such as Python, R, Scala, or Java.
    • Strong understanding of statistical analysis, data mining, and machine learning algorithms.
    • Experience with data visualization tools like Tableau and Power BI, or programming libraries like Matplotlib, Seaborn, or ggplot2.
    • Hands-on experience with SQL and NoSQL databases such as MongoDB, Cassandra, or HBase.
    • Familiarity with cloud platforms like AWS, Azure, or Google Cloud and their data processing services.
    • Experience with data wrangling, data cleaning, and feature engineering.
    • Strong problem-solving, analytical, and critical thinking skills.
    • Ability to communicate complex technical concepts to non-technical stakeholders.

  • Job Description: 
    We are seeking a talented Power BI Developer to join our team and help create interactive, insightful, and visually appealing reports and dashboards using Power BI. In this role, you will work closely with business stakeholders to understand their data needs and deliver effective data visualisations that drive strategic decisions. The ideal candidate will have hands-on experience with Power BI, data modelling, data transformation, and dashboard/report development.

    Key Responsibilities: 

    • Design, develop, and maintain Power BI reports and dashboards, ensuring they meet business requirements and deliver actionable insights.
    • Work with business users to gather requirements, translate them into reporting solutions, and ensure data accuracy.
    • Create data models, develop and optimise complex DAX (Data Analysis Expressions) queries, and integrate multiple data sources.
    • Perform data cleaning, data transformation, and data validation using Power Query and Power BI Desktop.
    • Design and implement efficient data visualisation techniques, including interactive charts, graphs, and KPI dashboards.
    • Develop and automate reporting processes to streamline data analysis and reporting workflows.
    • Collaborate with the IT and Data Engineering teams to integrate Power BI with various data sources like SQL databases, cloud data sources (Azure, AWS), and third-party applications.
    • Conduct performance tuning and optimisation of Power BI reports to ensure optimal performance for large datasets.
    • Create and maintain SQL queries and stored procedures for extracting data from different data sources.
    • Implement role-based security features in Power BI to control data access and ensure compliance.
    • Stay updated with the latest Power BI features and best practices, and proactively apply them to enhance reporting capabilities.
    • Assist in training users on how to use reports and dashboards effectively.
    • Provide ongoing support, troubleshooting, and enhancement of existing reports and dashboards.

    Key Qualifications: 

    • Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field.
    • Proven experience as a Power BI Developer or similar role, with hands-on experience in report/dashboard development.
    • Expertise in Power BI Desktop, Power BI Service, Power Query, and Power BI Report Server.
    • Strong proficiency in Data Analysis Expressions (DAX) and experience with creating complex calculations and metrics.
    • Solid experience in SQL for querying, filtering, and transforming data from relational databases (e.g., SQL Server, MySQL).
    • Experience with integrating data from a variety of sources, including Excel, SharePoint, Azure, and other cloud-based data platforms.
    • Knowledge of data modelling techniques, including star and snowflake schemas, and best practices for organising and structuring data.
    • Understanding of data visualisation principles and best practices for creating intuitive and user-friendly reports.
    • Familiarity with advanced data analytics techniques such as predictive modelling, machine learning integration, or statistical analysis (optional).
    • Experience in creating and implementing row-level security (RLS) in Power BI reports.
    • Strong problem-solving skills and attention to detail.

  • Developer - Python

    Job Description:
    The primary responsibilities of this role include:
    • Building and maintaining pipelines for model development, testing, deployment, and monitoring.
    • Automating repetitive tasks such as model re-training, hyperparameter tuning, and data validation.
    • Developing CI/CD pipelines for seamless code migration.
    • Collaborating with cross-functional teams to ensure proper integration of models into production systems.
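
    A brief, hypothetical sketch of experiment tracking with MLflow, one of the tools named in the skills below; the parameter and metric values are placeholders and the training step itself is elided.

      # Log a parameter and a metric for one training run.
      import mlflow

      with mlflow.start_run(run_name="example"):
          mlflow.log_param("learning_rate", 0.01)
          # ... model training would happen here ...
          mlflow.log_metric("val_accuracy", 0.93)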

    Key Skills Required:
    • 3+ years of experience in developing and deploying ML models in production.
    • Strong programming skills in Python (with familiarity in Bash/Shell scripting).
    • Hands-on experience with tools like Docker, Kubernetes, MLflow, or Airflow.
    • Knowledge of cloud services such as AWS SageMaker or equivalent.
    • Familiarity with DevOps principles and tools like Jenkins, Git, or Terraform.
    • Understanding of versioning systems for data, models, and code.
    • Solid understanding of MLflow, ML services, model monitoring, and enabling logging services for performance tracking.

    Education Qualification: BE/B.Tech/MCA
    Skill to Evaluate: Technical | App Development | Python
    Experience: 1 to 4 Years
    Role: Developer
    Skills: Python

  • Grant Thornton, Surana Group, Vinayaka Steels, CMR Group - AI/ML Engineer

    Job Description:
    We are seeking an experienced AI/ML Engineer II to join our Furnace Operations Efficiency Program. In this role, you will develop cutting-edge AI and machine learning solutions to optimize furnace operations, improve energy efficiency, minimize waste, and enhance the overall quality of production. You will work directly with the operations team to create data-driven insights that drive smarter, more efficient manufacturing processes.

    Your Mission:

    • Furnace Process Optimization: Develop and deploy machine learning models to optimize the entire furnace operation cycle, from pre-treatment to melting and refining, improving both yield and energy efficiency.
    • Predictive Maintenance: Implement predictive maintenance models using sensor data to anticipate failures, reduce downtime, and extend the lifespan of furnace equipment.
    • Dross Reduction: Leverage AI to monitor and analyze furnace performance, identifying factors that contribute to dross formation and recommending operational adjustments.
    • Energy Efficiency: Design models to predict and optimize energy usage, ensuring the most efficient operation of furnaces with minimal waste and cost.
    • Real-Time Monitoring: Develop systems to monitor furnace performance in real time, using advanced sensors and IoT technologies, to instantly adjust operations for optimal results.
    • Data Integration: Integrate real-time and historical data from various sources, including sensors, control systems, and operational logs, to create comprehensive models for better decision-making.
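
    Purely as an illustration of one predictive-maintenance building block mentioned above: anomaly detection over simulated furnace sensor readings using scikit-learn's IsolationForest. All sensor values here are invented.

      # Fit an anomaly detector on "normal" readings, then score a new one.
      import numpy as np
      from sklearn.ensemble import IsolationForest

      rng = np.random.default_rng(1)
      # Simulated normal operation: temperature (°C) and pressure (bar).
      normal = rng.normal(loc=[900.0, 1.2], scale=[15.0, 0.05], size=(1000, 2))
      model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

      reading = np.array([[975.0, 1.45]])  # a suspicious new reading
      print(model.predict(reading))        # -1 flags an anomaly, 1 means normal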

    What We’re Looking For:

    • Experience Level: 4-6 years of experience in AI/ML engineering, with a focus on industrial operations or energy efficiency. Experience in metal, glass, or manufacturing industries is a plus.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Science, Mechanical Engineering, Electrical Engineering, or a related field.
    • Technical Expertise:
      • Strong proficiency in machine learning algorithms (e.g., Random Forests, Neural Networks, XGBoost) and optimization techniques.
      • Familiarity with time-series forecasting, anomaly detection, and predictive modeling.
      • Experience in working with IoT data from industrial sensors (e.g., temperature, pressure, gas emissions).
      • Proficiency in programming languages such as Python, R, and SQL.
      • Experience with big data platforms (e.g., Spark, Hadoop) and cloud environments (AWS, GCP, Azure).
    • Domain Knowledge: Understanding of industrial operations, furnace systems, and energy optimization techniques is highly desirable.
    • Problem-Solving Skills: Ability to identify key inefficiencies in furnace operations and apply AI solutions to solve complex, real-world challenges.

    What You’ll Achieve:

    • Improve Operational Efficiency: Your models will directly enhance furnace performance, reducing energy consumption, dross formation, and operational costs.
    • Maximize Equipment Lifespan: Prevent costly equipment failures through AI-driven predictive maintenance, improving the overall uptime and productivity.
    • Drive Sustainability: Contribute to sustainability goals by optimizing furnace operations to minimize waste, energy use, and emissions.

    Why Join Us?
    You will be part of a team dedicated to revolutionizing industrial furnace operations with the power of AI. This is an exciting opportunity to work on real-world challenges and directly impact the efficiency and sustainability of manufacturing processes.

    Apply Now to be at the forefront of AI-powered furnace optimization!

  • Job Description:
    Are you passionate about improving healthcare systems through intelligent solutions?
    Join us as an AI/ML Engineer I, where you’ll design AI-driven tools to optimize inventory management and human resource allocation in hospital operations.

    Your Mission:

    • Inventory Management Optimization: Build predictive models to anticipate supply shortages, reduce excess inventory, and optimize stock levels for critical medical supplies.
    • HR Resource Allocation: Develop scheduling algorithms to ensure the optimal deployment of medical staff, balancing workloads and maximizing patient care.
    • Real-Time Decision Support: Design systems to monitor hospital operations in real-time, providing actionable insights for resource management.
    • Automation and Workflow Enhancement: Contribute to AI-powered automation systems that reduce inefficiencies and improve hospital logistics.
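
    As a toy illustration of the optimization core behind staff allocation: a tiny linear program, solved with SciPy's linprog, that minimizes staffing cost subject to minimum-coverage constraints. All numbers are made up.

      # Minimize cost of day-shift (x0) and night-shift (x1) nurses.
      from scipy.optimize import linprog

      cost = [300, 360]            # hypothetical cost per nurse per shift
      A_ub = [[-1, 0], [0, -1]]    # encode x0 >= 10 and x1 >= 6 as <= constraints
      b_ub = [-10, -6]

      res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub,
                    bounds=[(0, None), (0, None)], method="highs")
      print(res.x, res.fun)        # optimal staffing levels and total cost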

    What We’re Looking For:

    • Experience Level: 2–4 years of experience in AI/ML engineering or data analysis, preferably with exposure to healthcare or operations domains.
    • Educational Background: Bachelor’s degree in Computer Science, Data Science, Operations Research, or a related field.
    • Technical Expertise:
      • Proficiency in machine learning frameworks such as Scikit-learn, XGBoost, or TensorFlow.
      • Strong understanding of optimization techniques (e.g., linear programming, genetic algorithms).
      • Experience with tools for inventory forecasting and workforce management analytics.
      • Familiarity with healthcare operations or ERP systems is a plus.
    • Programming Skills: Proficient in Python, R, and SQL, with strong data visualization skills using tools like Tableau or Power BI.
    • Analytical Mindset: Ability to identify inefficiencies and design AI-driven solutions to solve operational challenges.

    What You’ll Achieve:

    • Ensure the uninterrupted availability of critical supplies by optimizing hospital inventory systems.
    • Enhance workforce productivity and staff satisfaction through intelligent scheduling solutions.
    • Contribute to improving patient outcomes by streamlining hospital operations.

    Be part of the team that modernizes healthcare operations with AI and data-driven innovation.

  • Job Description:
    Are you ready to drive efficiency and sustainability with AI?
    We are looking for an experienced AI/ML Engineer II to develop transformative solutions that enhance operational efficiency and optimize solar power systems for maximum performance.

    Your Mission:

    • Operational Workflow Optimization: Create ML models to streamline manufacturing processes, reduce resource waste, and enhance production output.
    • Solar Power Analytics: Develop predictive and prescriptive systems for solar power generation, yield forecasting, and energy distribution optimization.
    • IoT-Driven Efficiency: Design real-time analytics pipelines using IoT sensors for operational monitoring and predictive maintenance.
    • Advanced Algorithms: Utilize optimization techniques such as linear programming, dynamic scheduling, and reinforcement learning to improve industrial workflows.
    • Scalable Deployment: Implement and deploy solutions on edge devices and cloud platforms to ensure scalability across large operations.

    What We’re Looking For:

    • Experience Level: 1-4 years of experience in AI/ML engineering, with a focus on operational efficiency or energy optimization.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Data Science, or a related field.
    • Technical Expertise:
      • Advanced proficiency in machine learning frameworks such as TensorFlow, PyTorch, and Scikit-learn.
      • Strong understanding of optimization algorithms, time-series analysis, and predictive modeling.
      • Experience with IoT platforms and data integration (e.g., MQTT, OPC-UA).
      • Knowledge of renewable energy systems, particularly solar power generation and grid optimization.
    • Programming Skills: Expertise in Python, SQL, and experience with distributed computing frameworks like Apache Spark.
    • Cloud Experience: Proficiency in deploying solutions on AWS, GCP, or Azure for scalable and resilient applications.

    What You’ll Achieve:

    • Enable industries to reduce resource consumption and improve efficiency by up to 25%.
    • Develop AI-driven systems that optimize solar energy production and minimize downtime.
    • Drive sustainability initiatives, blending technology and environmental consciousness.

    Join us to redefine efficiency and sustainability with AI-powered solutions.

  • Dr.Reddys, LaurusLabs, CMRGroup - AI/ML Engineer I

    Job Description:
    Are you ready to safeguard lives with cutting-edge AI solutions?
    We are seeking an experienced AI/ML Engineer II to design intelligent systems that enhance workplace safety and precision object detection across industries. This role requires technical expertise, a proactive mindset, and a passion for creating impactful solutions.

    Your Mission:

    • Build Advanced Object Detection Models: Develop and deploy high-performance models using frameworks like YOLO, RetinaNet, and Faster R-CNN.
    • Real-Time Hazard Detection: Create systems capable of detecting workplace hazards, safety violations, and potential risks in real time.
    • Optimize Industrial Processes: Leverage computer vision to identify anomalies, defects, or inefficiencies in industrial workflows.
    • Scalable Integration: Deploy AI solutions on edge devices and cloud platforms, ensuring scalability and seamless integration into existing systems.
    • Safety Protocol Alignment: Collaborate with safety officers, engineers, and IT teams to align solutions with industry regulations and workplace safety standards.
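
    A hedged sketch of running a pretrained detector on a single frame, assuming the ultralytics package for the YOLO family named above; the weights file and image path are placeholders.

      # Run a small pretrained YOLO model on one image and print detections.
      from ultralytics import YOLO

      model = YOLO("yolov8n.pt")            # small pretrained COCO model
      results = model("factory_frame.jpg")  # hypothetical frame from a camera
      for box in results[0].boxes:
          print(int(box.cls), float(box.conf))  # class id and confidence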

    What We’re Looking For:

    • Experience Level: 2+ years of hands-on experience in developing and deploying AI/ML models, with a focus on computer vision and object detection.
    • Educational Background: Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, or related fields.
    • Technical Expertise:
      • Proficiency in deep learning frameworks such as TensorFlow, PyTorch, and ONNX.
      • Strong knowledge of image processing and video analytics using OpenCV and related libraries.
      • Experience in deploying models on edge devices (e.g., NVIDIA Jetson, Coral TPU) for low-latency applications.
    • Programming Skills: Advanced expertise in Python and C++ is required. Familiarity with CUDA for GPU optimization is highly desirable.
    • Industrial Knowledge: Familiarity with workplace safety standards (e.g., OSHA) and protocols is a significant advantage.
    • Cloud & IoT: Hands-on experience in integrating AI models with cloud platforms (AWS, Azure, GCP) and IoT ecosystems.

    What You’ll Achieve:

    • Reduce workplace incidents by deploying AI systems that detect hazards proactively.
    • Enhance industrial efficiency through automated defect detection and anomaly monitoring.
    • Design scalable solutions that can adapt to diverse industrial environments, from factories to construction sites.

    Why Join Us?
    This is your opportunity to lead innovation in safety technology. Your work will directly protect lives and set new benchmarks for AI-driven workplace safety. Be the force behind safer, smarter workplaces.

    Take the First Step to Your Dream Job

    Find Your Fit
    Explore job roles posted by our industry partners, alumni network, and collaborators.

    Submit Your Profile
    Apply with your updated resume and showcase your unique strengths, or email us at hr@360digitmg.com.

    Connect
    Let our career advisors and placement team guide you through the process.

    Excel
    Leverage your skills and training to make a mark in your new role.

    Leadership Talks

    Join our CEO and various industry experts, including successful alumni specializing in AI, Industry 4.0, and other cutting-edge fields, for 10-15 minute recorded sessions on Zoom.

    Watch Now to Learn and Grow

    These talks aim to:
    • Inspire and motivate professionals in their careers
    • Raise awareness about industry trends and innovations
    • Encourage reskilling and upskilling initiatives
    • Support corporate communication strategies

    Resources to Help You Succeed

    Empower your journey to success by utilizing our tools and guides to sharpen your skills and stand out in your career.

    Data Skills Assessment
    Measure your analytics proficiency with on-demand exams. Receive instant feedback and a personalized report to identify strengths and areas for improvement.
    Resume Optimization Hub
    Make your resume industry-ready. Use our AI-powered Resume Optimization Hub to match your skills with job descriptions and get actionable feedback to improve.
    Interview Readiness Program
    Prepare for your big day with mock interviews conducted by industry leaders, and receive detailed feedback and job-specific guidance to boost your confidence.
    Career Transition Guide
    Transform your career path with expert guidance. Access tailored recommendations, real-world project opportunities, and proven success strategies to navigate your career change.