Also partners with Treblle to equip developers with an AI-powered assistant for rapid API integration and consumption.
Traefik Labs, creator of the world’s leading cloud-native application proxy, today unveiled the Traefik AI Gateway, a centralized, cloud-native egress gateway that manages and secures how internal applications connect to external AI services such as Large Language Models (LLMs). By transforming any AI endpoint into a fully managed API, Traefik AI Gateway empowers enterprises to integrate AI seamlessly into their operations. Additionally, Traefik Labs has integrated Treblle’s AI API Assistant into the Traefik Hub Developer Portal, enhancing the developer and API consumer experience.
Traefik AI Gateway: Transforming AI Endpoints into Managed APIs
Now available within Traefik Hub API Gateway and API Management products, the Traefik AI Gateway simplifies how enterprises interact with AI services. By providing a unified AI API, it streamlines integration with multiple AI models, enforces robust security policies, centralizes credential management, and delivers comprehensive observability across AI operations.
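For illustration, the hedged sketch below shows how an internal service might consume an AI endpoint once it is exposed as a managed API behind such a gateway. The gateway hostname, route path, request and response shapes, and token variable are placeholders assumed for this example only; they are not taken from Traefik Hub documentation.

```python
# Minimal sketch of an internal application calling an AI endpoint exposed
# through a gateway-managed API. The host, path, and payload shape below are
# hypothetical placeholders, not the documented Traefik Hub schema.
import os

import requests

# Hypothetical gateway route fronting one or more LLM providers.
GATEWAY_URL = "https://ai-gateway.internal.example.com/llm/chat"


def ask_llm(prompt: str) -> str:
    # The application authenticates against the gateway only; provider API
    # keys stay centralized at the gateway and never reach this client.
    response = requests.post(
        GATEWAY_URL,
        headers={"Authorization": f"Bearer {os.environ['GATEWAY_TOKEN']}"},
        json={"messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    response.raise_for_status()
    # Assumes an OpenAI-style response body for illustration.
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask_llm("Summarize last week's deployment incidents."))
```

Because the client only ever holds a gateway token and targets a gateway route, swapping the upstream provider is, in principle, a gateway-side configuration change rather than an application change.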
“AI is rapidly becoming the cornerstone of digital transformation initiatives,” said Sudeep Goswami, CEO of Traefik Labs. “With Traefik AI Gateway, we’re delivering essential infrastructure that simplifies AI adoption and enhances control in order for organizations to innovate faster and maintain a competitive edge.”
Traefik AI Gateway provides:
- Simplified Multi-LLM Integration and Provider Flexibility
Seamlessly connect to multiple leading LLM providers (including Anthropic, Azure OpenAI, AWS Bedrock, Cohere, Mistral, Ollama, and OpenAI) through a single API. Eliminate the need for multiple client SDKs and complex integrations. Switch between AI providers without modifying client applications, avoiding vendor lock-in and increasing agility.
- Centralized Security, Credential Management, and Compliance
Securely manage all LLM API keys in one location, shielding sensitive credentials and enforcing consistent authentication, authorization, and rate limiting across all AI traffic. Utilize local, self-hosted LLMs like Ollama and Mistral to keep sensitive data in-house, ensuring compliance with organizational policies and industry regulations through centralized governance.
- Comprehensive Observability and Insights
Gain full visibility into LLM usage with industry-leading OpenTelemetry support to optimize AI workflows and resource utilization. Integrated monitoring provides real-time data to enhance performance, enabling proactive management of AI operations.
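As a hedged illustration of the observability point above, a consuming service could propagate W3C trace context on its calls so that application spans can be correlated with the OpenTelemetry data the gateway emits. The sketch below uses the opentelemetry-python SDK; the route URL and payload shape are assumptions carried over from the earlier example, not Traefik-documented values.

```python
# Minimal sketch: emit a client-side span and propagate trace context on a
# request to a hypothetical gateway route, so application traces can be
# correlated with OpenTelemetry data emitted by the gateway itself.
import requests
from opentelemetry import trace
from opentelemetry.propagate import inject
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Console exporter keeps the sketch self-contained; swap in an OTLP exporter
# to ship spans to a real backend.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("ai-client")

GATEWAY_URL = "https://ai-gateway.internal.example.com/llm/chat"  # hypothetical route


def traced_completion(prompt: str) -> dict:
    with tracer.start_as_current_span("llm.request") as span:
        span.set_attribute("llm.route", GATEWAY_URL)
        headers: dict[str, str] = {}
        inject(headers)  # adds traceparent/tracestate headers to the outgoing call
        resp = requests.post(
            GATEWAY_URL,
            headers=headers,
            json={"messages": [{"role": "user", "content": prompt}]},
            timeout=30,
        )
        span.set_attribute("http.status_code", resp.status_code)
        resp.raise_for_status()
        return resp.json()
```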
For more information, visit the Traefik AI Gateway solution page and request a free trial.
Integrating Treblle’s AI API Assistant: Enhancing the Developer Experience with AI-Powered Integration
Traefik Labs enriches the developer experience by integrating Treblle’s AI API Assistant into the Traefik Hub Developer Portal. By leveraging AI to comprehend and continually learn from API documentation, the assistant enables developers to integrate APIs more effectively and efficiently.
With this powerful integration, developers and API consumers can now:
- Explore APIs using Natural Language
Use natural language queries to navigate and understand API documentation, making APIs more accessible and user-friendly.
- Automatically Generate Code and SDKs
Instantly generate integration code and SDKs in any programming language based on API documentation, drastically reducing development time.
- Accelerate Time to First Integration (TTFI)
Streamline the integration process, enabling developers to deploy APIs faster than ever before.
“We are thrilled to join forces with Traefik Labs to integrate Treblle’s AI API Assistant into the Traefik Hub Developer Portal,” said Vedran Cindrić, CEO of Treblle. “This partnership not only strengthens our collaboration but also significantly enhances the developer portal experience. By making APIs more accessible and integration faster, we’re allowing developers to focus on building amazing products. Together, we’re setting a new benchmark for developer productivity in the AI era.”