Vast.ai, the leading cloud GPU platform for AI and ML, today announced the launch of Virtual Machine (VM) support. The move underscores Vast.ai’s ongoing commitment to expanding access to high-performance computing resources and empowering developers, researchers, and enterprises to innovate faster.
Since 2018, Vast.ai’s Docker-based platform has helped customers quickly experiment with cutting-edge machine learning models, deep learning frameworks, and high-performance simulations at industry-leading prices. By adding VM support, Vast.ai removes the need to rely solely on custom Docker images, streamlining workflows and improving time-to-market.
Key Benefits of Virtual Machine Support Include:
- Enhanced Flexibility: Users can run low-level debugging and performance-profiling utilities without the constraints of a container environment.
- Kernel-Space Customizations: Developers gain granular control over their systems, enabling custom driver installations, kernel tweaks, and integration of specialized frameworks.
- Faster Iteration: Instead of pre-building and pushing custom Docker images, users can run Docker directly inside the VM instance (see the sketch after this list).
- Improved Security: Virtual machines provide stronger isolation for sensitive workloads.
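To illustrate the faster-iteration point above, the snippet below is a minimal sketch of driving Docker from inside a VM instance rather than baking and pushing a custom image ahead of time. It assumes the Docker Engine is already running in the VM and uses the general-purpose Docker SDK for Python with a stock PyTorch image as placeholders; none of these choices are Vast.ai requirements.

```python
# Minimal sketch: launching a GPU container from inside a VM instance.
# Assumes Docker Engine is running in the VM and the Docker SDK for
# Python is installed ("pip install docker"); the image and command are
# illustrative placeholders, not Vast.ai-specific requirements.
import docker

client = docker.from_env()

# Request all available GPUs for the container, equivalent to
# `docker run --gpus all ...` on the command line.
gpu_request = docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])

# Pull a stock PyTorch image and check that CUDA is visible inside it.
output = client.containers.run(
    image="pytorch/pytorch:latest",
    command='python -c "import torch; print(torch.cuda.is_available())"',
    device_requests=[gpu_request],
    remove=True,  # clean up the container after it exits
)
print(output.decode().strip())  # expected: "True" on a GPU instance
```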
“We’ve always believed that democratizing GPU access means offering maximum flexibility,” said Travis Cannell, COO of Vast.ai. “Virtual Machine support was the natural next step, giving users more control and ease of use while enabling researchers to build and iterate.”
Vast.ai’s new VM feature builds on the company’s established track record of cost-effective GPU availability, now offering the freedom to run Docker inside a VM or adapt environments to meet the evolving needs of AI and HPC applications.
To launch a VM today, visit https://cloud.vast.ai.