
Wallaroo.AI Launches Unique Workload Orchestration Capabilities

Allows Enterprises to Simplify, Automate Recurring ML Workloads with Familiar Tools, Helping Remove Infrastructure Bottlenecks and Deliver Results Faster

Wallaroo.AI, the leader in scaling production machine learning (ML) on-premises, in the cloud, and at the edge, today announced early access to ML Workload Orchestration features in its unified production ML platform. This unique capability automates the scheduling and execution of combined data and ML inferencing workflows across the production process, enabling AI teams to scale their ML workflows by 5-10x while also freeing up 40% of their weekly time, based on customer data.

Data scientists, data engineers, and ML engineers no longer need to waste time setting up the basic elements of data and ML pipelines with unwieldy tools. Removing these unnecessary, time-consuming steps accelerates the feedback loop from model deployment to business value, so organizations can troubleshoot and tune models and respond more quickly to unsatisfactory model performance and market changes.

With these ML Workload Orchestration features, enterprises can now also be data-source agnostic, ensure business continuity with portable ML pipelines that move from development through to production, and scale ML use cases.

“Going from ML prototype to production is a huge challenge and even when enterprises succeed with ad hoc approaches, most don’t have the efficiency, flexibility, or repeatability in their processes that they need to scale their ML,” said Vid Jain, founder and chief executive officer of Wallaroo.AI. “This is usually because they’ve had to create ML production workflows from scratch or integrate discrete tooling across the various stages of the production ML process. Because Wallaroo.AI is a single unified platform, we’re able to provide an easy-to-use fully integrated experience – minimizing operational overhead and bottlenecks so you can scale quickly and efficiently.”

Workload Orchestration Details

Enterprise AI teams using the Wallaroo.AI platform with the new Workload Orchestration features can now upload their models, define their ML workload steps, and set up a schedule with just a few lines of code. Behind the scenes, this new technology orchestrates scheduling and infrastructure utilization, data gathering, and inferencing while ensuring resilience. Teams can then monitor workloads and review results as needed.
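As an illustration of that "few lines of code" flow, a scheduled workload might be set up roughly as sketched below. The method names (upload_orchestration, run_scheduled), the workload package path, and the cron schedule are assumptions made for this sketch, not verbatim Wallaroo SDK documentation.

    # A minimal sketch, assuming the Wallaroo SDK exposes orchestration upload
    # and scheduling calls along these lines; names and arguments are illustrative.
    import wallaroo

    wl = wallaroo.Client()  # connect to the Wallaroo.AI platform

    # Package the workload (data gathering + inference steps) and register it.
    orchestration = wl.upload_orchestration(path="./forecast_workload.zip")

    # Schedule it to run at the top of every hour; the platform handles the
    # underlying infrastructure, retries, and result storage.
    task = orchestration.run_scheduled(
        name="hourly-forecast",
        schedule="0 * * * *",              # cron-style schedule
        json_args={"source": "sales_db"},  # parameters passed to the workload
    )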

Integrations – The Wallaroo.AI platform now includes support for data connections to datastores on the three major clouds (Google Cloud, Amazon Web Services, and Microsoft Azure), as well as Wallaroo SDK and Wallaroo API support. Enterprises can now ingest data from predefined data sources to run inferences in the Wallaroo.AI platform, chain pipelines, and send inference results to predefined destinations to analyze model insights and assess business outcomes.
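A hypothetical end-to-end flow for these integrations is sketched below; the pipeline name, file paths, and the use of local Parquet files as stand-ins for predefined source and destination connections are assumptions for illustration, not the documented Wallaroo API.

    # A minimal sketch of source -> inference -> destination, assuming an
    # existing pipeline named "fraud-detector"; local Parquet files stand in
    # for the predefined cloud datastore connections described above.
    import pandas as pd
    import wallaroo

    wl = wallaroo.Client()

    batch = pd.read_parquet("daily_transactions.parquet")  # stand-in for the predefined source

    pipeline = wl.pipelines_by_name("fraud-detector")[0]   # assumed existing pipeline
    pipeline.deploy()
    results = pipeline.infer(batch)                        # run inference in the platform
    pipeline.undeploy()

    results.to_parquet("fraud_scores.parquet")             # stand-in for the predefined destination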

Security – The new data connections use platform-level authentication management. They are configured in Wallaroo workspaces, which is also where ML Workload Orchestration management lives.
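For illustration, configuring such a connection inside a workspace might look roughly like the sketch below; create_connection() and its arguments are assumptions made for this sketch, and in practice credentials are managed by the platform rather than typed into scripts.

    # A minimal sketch, assuming workspace-scoped connection management roughly
    # like this; the connection type and details below are illustrative only.
    import wallaroo

    wl = wallaroo.Client()
    workspace = wl.create_workspace("forecast-workspace")
    wl.set_current_workspace(workspace)

    # Register a connection to a cloud datastore; the platform stores and
    # manages the credentials so individual workloads never handle them directly.
    connection = wl.create_connection(
        name="sales-bigquery-source",
        connection_type="BIGQUERY",
        details={"project_id": "my-project", "dataset": "sales"},
    )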

Automation and Scripting – Users can now execute interactive (real-time) and scheduled (batch) workloads across the ML production process (deployment, serving, observing, optimization). The Wallaroo.AI platform also now includes support for custom/arbitrary Python scripts and for chained ML models and pipelines.
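A sketch of the chaining and on-demand execution described above follows; the model names, file formats, and the upload and run_once() calls are assumptions for illustration rather than confirmed Wallaroo SDK details.

    # A minimal sketch: chain a custom Python preprocessing step with an ONNX
    # model in one pipeline, then run a packaged workload on demand. Names,
    # paths, and the upload/run calls are assumptions for illustration.
    import wallaroo

    wl = wallaroo.Client()

    # Two previously prepared artifacts: a custom Python step and the main model.
    preprocess = wl.upload_model("preprocess-step", "./preprocess.zip")    # assumed custom Python step
    forecaster = wl.upload_model("demand-forecaster", "./forecaster.onnx")

    # Chain them in a single pipeline: the first step's output feeds the second.
    pipeline = wl.build_pipeline("demand-forecast-chain")
    pipeline.add_model_step(preprocess)
    pipeline.add_model_step(forecaster)
    pipeline.deploy()

    # Interactive (real-time) execution of a packaged workload, alongside the
    # scheduled runs shown earlier.
    orchestration = wl.upload_orchestration(path="./adhoc_scoring.zip")
    task = orchestration.run_once(name="adhoc-scoring", json_args={"window": "1h"})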

