Sense is using automation and AI to speed up the recruitment process, while also delivering what it calls a ‘hyper-personalized’ candidate experience.
What is MLOps?
MLOps itself is not the operations function (sysadmin, database administration and so on) powered by Machine Learning; it is the operations ‘feed’, so to speak, that makes good ML what it is. It therefore includes all the data wrangling, ingestion, management and manipulation strategies that go into ML. It also includes functions to oversee the health of what we can call the ‘data pipeline’ (i.e. all the Extract, Transform & Load (ETL) tasks needed for the pipeline to function) – and wider elements to oversee feature selection, training, testing, deployment and monitoring.
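The stages listed above can be sketched as plain functions. This is a minimal, illustrative toy in Python – the data, threshold ‘model’ and function names are all assumptions, not any vendor’s actual pipeline – but it shows the shape of what MLOps oversees: extract and transform (ETL), training, and monitoring.

```python
# Minimal sketch of the pipeline stages MLOps oversees, using a
# hard-coded toy dataset; names and logic are illustrative only.

def extract():
    # ETL step 1: pull raw records (hard-coded here for illustration)
    return [{"age": 34, "clicks": 12, "label": 1},
            {"age": 51, "clicks": 2, "label": 0},
            {"age": 29, "clicks": 9, "label": 1}]

def transform(rows):
    # ETL step 2: wrangle raw records into (features, label) pairs
    return [([r["age"], r["clicks"]], r["label"]) for r in rows]

def train(dataset):
    # Training: a trivial threshold "model" on the clicks feature
    threshold = sum(f[1] for f, _ in dataset) / len(dataset)
    return lambda features: 1 if features[1] >= threshold else 0

def monitor(model, dataset):
    # Monitoring: track accuracy so drift can trigger retraining
    correct = sum(model(f) == y for f, y in dataset)
    return correct / len(dataset)

data = transform(extract())   # the load step is folded into transform here
model = train(data)
print(monitor(model, data))   # prints 1.0 on this toy data
```

In a production setting each of these functions would be a separately scheduled, monitored job; the point of MLOps tooling is to manage exactly that orchestration.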
MLOps could theoretically be used to make the Ops element of DevOps more heavily ML-enriched, but we can’t make ML-powered DevOps smart unless we look after the Ops that goes into ML, hence MLOps exists.
Making sense of Sense
Sense provides the Sense AI Chatbot, an automated recruiting assistant that can engage with candidates 24/7, responding to their queries in real-time when human recruiters are offline.
It engages with candidates across SMS, mobile and web to match them to jobs, schedule interviews and handle what the company calls ‘intelligent communications’, which in this case extends to FAQs. The chatbot pairs conversational AI with automated communication and engagement workflows so organizations can engage with candidates at scale.
Sense has a team of data scientists and ML engineers with deep expertise in conversational AI – both voice and text. The work here included engineering a complex Natural Language Processing (NLP) serving pipeline, with custom model ensembles, to track question-to-question context and enable sentiment tracking.
Alex Rosen, co-founder and head of product at Sense, has called this work a ‘complex MLOps project’ that needed a powerful mix of high-performance real-time serving graphs, Nvidia accelerated computing and the ability to scale up and down based on load.
A real-time NLP pipeline
Sense worked with Iguazio to build what it calls a ‘robust, automated, real-time NLP pipeline’. The solution uses the connectivity of Iguazio’s MLOps platform and built-in feature store with the Snowflake Data Cloud to speed up feature engineering; deep integration with Amazon Web Services (AWS) to manage cloud consumption costs; and smart scheduling capabilities for Nvidia accelerated computing to manage GPU usage in an efficient and scalable way.
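The feature-store pattern mentioned above is worth unpacking: features are engineered offline in batch, then looked up by key at serving time rather than recomputed per request. The sketch below shows that pattern with a plain Python dictionary standing in for the store – the candidate IDs, feature names and scoring formula are hypothetical, and this is not Iguazio’s or Snowflake’s actual API.

```python
# Hedged sketch of the feature-store serving pattern: precomputed
# features are fetched by key on the real-time path. The store and
# features here are stand-ins, not a real vendor API.

FEATURE_STORE = {
    # candidate_id -> features engineered offline in batch
    "cand-001": {"reply_rate": 0.8, "avg_response_minutes": 4.0},
    "cand-002": {"reply_rate": 0.2, "avg_response_minutes": 55.0},
}

def serve(candidate_id):
    # Real-time path: O(1) lookup, then lightweight scoring
    feats = FEATURE_STORE.get(candidate_id)
    if feats is None:
        return {"candidate": candidate_id, "score": None}  # cold start
    # Hypothetical engagement score: responsive candidates rank higher
    score = feats["reply_rate"] / (1 + feats["avg_response_minutes"] / 60)
    return {"candidate": candidate_id, "score": round(score, 3)}

print(serve("cand-001"))  # {'candidate': 'cand-001', 'score': 0.75}
```

Splitting the heavy feature engineering from the lookup is what keeps the real-time path fast, and it is why feature-store connectivity to a data warehouse matters for a latency-sensitive chatbot.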
Sense selected Iguazio through AWS Marketplace, while the Iguazio platform is deployed in AWS GovCloud, which gives government customers a place to architect and secure cloud solutions that comply with legislative rules.
An essential part of the ‘people-facing’ end of modern ERP stacks, ML and MLOps sit in the same orbit as NLP for human language interpretation – and so, logically, this type of technology will now feature more prominently in deployed ERP projects.