Crosser unveils a “Bring your own AI” strategy to enhance Edge Intelligence in Industrial IoT
The announcement of an open machine learning (ML) strategy gives customers full freedom to deploy their favourite machine learning framework at the edge.
Stockholm/Sundsvall, 14 March 2019 - Crosser, a leading provider of Intelligent Edge Analytics software for Industrial IoT, today announced new capabilities on its platform giving customers a unique combination of flexibility and usability when turning to machine learning for enhanced intelligence.
“Machine Learning has a natural place in Edge Streaming Analytics for many use-cases,” says Martin Thunman, CEO and co-founder of Crosser.
“We don’t believe there is one ML framework that is better than the others; it depends on the situation, the use-case and the specific customer skill-sets. We also believe this is a fast-evolving technology where new frameworks will gain popularity in the future. I’m excited to announce the ‘Bring your own AI’ strategy that gives customers full freedom to deploy their favourite ML framework on our platform.”
The open strategy comes with a set of new capabilities in the Crosser Platform:
Drag-and-drop and open ML support - a unique combination
By combining the strength of the Crosser Flow Studio - the cloud-based drag-and-drop design tool - with the flexibility of the Crosser Edge Node, which can host any 3rd party ML framework, the pace of innovation can be significantly accelerated. It allows automation experts, IT teams, data scientists and product specialists to easily collaborate on one platform they can all master.
The Crosser real-time engine and the Python ML framework are deployed in a single Docker container, making the deployment seamless for the end customer.
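Conceptually, hosting “any 3rd party ML framework” behind a streaming engine amounts to agreeing on one uniform prediction interface. The sketch below illustrates that idea in plain Python; the class and method names are hypothetical illustrations, not Crosser’s actual API.

```python
from typing import Any, Callable, Dict


class EdgeModelHost:
    """Illustrative sketch of a framework-agnostic model host.

    Any Python ML framework can be plugged in by registering a plain
    callable that maps an input record to a prediction; the streaming
    engine only ever depends on this one uniform interface.
    """

    def __init__(self) -> None:
        self._models: Dict[str, Callable[[Any], Any]] = {}

    def register(self, name: str, predict: Callable[[Any], Any]) -> None:
        # `predict` could wrap e.g. a scikit-learn or TensorFlow model
        self._models[name] = predict

    def infer(self, name: str, record: Any) -> Any:
        return self._models[name](record)


# Register a trivial threshold "model" as a stand-in for a real framework
host = EdgeModelHost()
host.register("anomaly", lambda x: x > 10)
print(host.infer("anomaly", 12))  # True
```

Swapping frameworks then means re-registering a different callable, with no change to the engine side.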
Easing the Orchestration and deployment of ML frameworks and models
Different use-cases might need different ML frameworks, models and algorithms, and this can quickly turn into a deployment nightmare. In Crosser Edge Director - the cloud-based orchestration and management tool - a new concept called Resources has been introduced.
It is a central library of models, algorithms, frameworks and other resources that can be reused across multiple use-cases or deployment nodes. It brings structure, organization and overview, and makes bulk deployments or version updates a simple task.
Python frameworks and more
The starting point has been to add support for Python frameworks, but the architecture allows any 3rd party framework and runtime to be easily added; support for more frameworks will be announced soon.
Examples of Python frameworks supported:
Fitting streaming data into a ML model developed from file data
ML models are often developed with data from a file, which creates challenges when the data points arrive in a serial streaming format from multiple data sources. A new ML Data Join module takes care of this by aligning all data in time and replicating data when sensors deliver at different rates, then presenting the streaming data to the model in the same form the model was developed with.
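The alignment step described above can be sketched in a few lines of Python: slower sensors have their last known value replicated (forward-filled) onto a common time grid, so every output row looks like a complete row from a training file. The function name and data layout are illustrative assumptions, not the actual ML Data Join module.

```python
from bisect import bisect_right


def align_streams(streams, timestamps):
    """Align multiple sensor streams onto a common time grid.

    streams: dict mapping sensor name -> list of (timestamp, value)
             pairs, each sorted by timestamp.
    timestamps: the output time grid (e.g. the fastest sensor's clock).
    Slower sensors are forward-filled: the latest sample at or before
    each grid point is replicated so every output row is complete.
    """
    aligned = []
    for t in timestamps:
        row = {"time": t}
        for name, samples in streams.items():
            # index of the latest sample at or before time t
            i = bisect_right([ts for ts, _ in samples], t) - 1
            row[name] = samples[i][1] if i >= 0 else None
        aligned.append(row)
    return aligned


# Two sensors sampled at different rates
temperature = [(0, 20.0), (1, 20.5), (2, 21.0), (3, 21.2)]
vibration = [(0, 0.11), (2, 0.13)]  # slower sensor

rows = align_streams({"temp": temperature, "vib": vibration}, [0, 1, 2, 3])
# vibration is forward-filled at t=1 and t=3
```

Each resulting row can then be fed to the model exactly as a row of file-based training data would be.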
For additional information about the strategy, please read our article:
For more information or to schedule a product demonstration, please contact us