Edge AI Company Imagimob Introduces Ready Models—the Fastest Way to Take Machine Learning to Production

Complete, reliable, and ready to deploy. IMAGIMOB Ready Models let companies add Edge AI features to their smart devices with a fraction of the resources required for custom development.

Image Credit: Imagimob

In line with its mission to offer the best and fastest ways to take smart devices to market, Imagimob is now launching IMAGIMOB Ready Models, complete machine learning (ML) solutions that are guaranteed to be robust, high-performing, and production-ready for edge devices. Ready Models can be quickly deployed onto existing microcontroller (MCU) hardware, such as PSoC™ 6, without the cost, time, or expertise required for custom development.
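To give a feel for what deploying a pre-trained model onto existing MCU hardware looks like in practice, the sketch below shows firmware that streams microphone frames into a classifier and reacts to its output. It is a minimal illustration only: the model interface (model_init / model_enqueue / model_dequeue) and the mic_read() driver call are hypothetical placeholders, not Imagimob's actual Ready Model API, and they are stubbed so the example compiles and runs on a host machine.

```c
/*
 * Minimal sketch of wiring a pre-trained edge model into MCU firmware.
 * The model interface and the microphone driver below are hypothetical
 * placeholders (stubbed for host-side illustration), not the actual
 * Ready Model integration API.
 */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define NUM_CLASSES   2      /* e.g. "baby cry" vs. "background"        */
#define FRAME_SAMPLES 512    /* audio samples fed to the model per call */

/* --- Hypothetical generated-model interface (stubbed) ----------------- */
static int frames_seen;

static void model_init(void) { frames_seen = 0; }

static void model_enqueue(const int16_t *samples, int count)
{
    (void)samples; (void)count;
    frames_seen++;                      /* a real model would buffer audio */
}

/* Returns 0 when a new prediction is available. */
static int model_dequeue(float *class_scores)
{
    if (frames_seen < 4)                /* pretend 4 frames = one window   */
        return -1;
    frames_seen = 0;
    class_scores[0] = 0.9f;             /* dummy "event" score             */
    class_scores[1] = 0.1f;
    return 0;
}

/* --- Placeholder microphone driver ------------------------------------ */
static int mic_read(int16_t *buf, int count)
{
    memset(buf, 0, (size_t)count * sizeof *buf);  /* silence, for the demo */
    return count;
}

int main(void)
{
    int16_t frame[FRAME_SAMPLES];
    float scores[NUM_CLASSES];

    model_init();

    /* On a real device this loop runs forever; bounded here for the demo. */
    for (int i = 0; i < 16; i++) {
        if (mic_read(frame, FRAME_SAMPLES) == FRAME_SAMPLES)
            model_enqueue(frame, FRAME_SAMPLES);

        if (model_dequeue(scores) == 0 && scores[0] > 0.8f)
            printf("Event detected (confidence %.2f)\n", scores[0]);
    }
    return 0;
}
```

The point of the sketch is the integration pattern rather than the API names: the firmware only feeds raw sensor frames in and reads class scores out, with no training or data-pipeline code on the device.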

“If you look at the Edge AI space right now, you can probably count on one hand how many companies provide off-the-shelf models for any one solution,” says Sam Al-Attiyah, Head of Customer Success at Imagimob. “Our Ready Models are built upon eight years of expertise and thoroughly tested out in the field in different environments, so they are validated in terms of performance. And the fact that we are running them on small edge devices is really unique.”

To ensure robustness, Imagimob draws up a comprehensive list of scenarios each model can encounter and tests against them. The models are also tested in different scenarios around the world to confirm they work without bias toward specific geographies or ethnicities. Finally, field testing on device lets the company measure and document realistic model performance on the expected hardware setup. The result is a model that does exactly what it is meant to do when running in the customer's products.

Imagimob is launching four audio-based Ready Models: Baby Cry for baby monitors, Siren Detection for pedestrians, and Coughing Detection and Snoring Detection, both aimed at wearable devices in the medtech and health sectors. Additional models are under development for audio, radar, IMU, and capacitive sensing.

An Easier Starting Point for New ML Journeys

Until now, the barriers have been too high for many companies seeking to upgrade their products with smart AI features. Developing a custom ML model requires not only the right software engineering and AI expertise but also a substantial investment of time and resources in an extensive process that spans collecting, validating, and labeling data; training models on that data; deploying them on device; and testing them in diverse environments to ensure performance expectations are met.

In contrast, IMAGIMOB Ready Models require little or no engineering or AI expertise to implement. And with all of the development and testing work already done, they essentially offer a shortcut to market.

“This is a much easier way for companies to begin their ML journeys—they don’t have to make such a big investment to start using it on their edge devices,” says Anders Hardebring, CEO at Imagimob. “Depending on the skillset and expertise in a company, developing a custom model for production typically takes six months to a year. With our Ready Models, they can have new Edge AI features up and running essentially overnight.”

IMAGIMOB Ready Models are now available for purchase. 

