AITech Interview with Greg Tull, Marketing Director at Classy Llama

How Retrieval-Augmented Generation is transforming ecommerce workflows, customer engagement, and business performance.

Greg, what pivotal moments or insights shaped your journey into leading innovation at the intersection of marketing and technology?

It’s not fair to the true innovation leaders at Classy Llama to say I have led it. I’m a part of it as an early adopter, and I work with those team members to provide feedback and help with the development process, but more technical minds than mine are the true innovation leaders here. Jonathan Hodges (VP of Technology) is the lead innovator on AI in our org.

My journey began by joining Classy Llama. It’s an org filled with extremely intelligent, wildly tech-curious people who ask a persistent question: how can technology solve this arduous, repeatable, or annoying task for our clients? That begins internally: how can tech do that for us and our team? We eat our own dog food.

That led us to explore data, analysis, coding, and automation in many ways on the leading edge of AI going mainstream. You could see the shift as AI-as-a-service came online in 2020-22 and, most importantly, when Retrieval-Augmented Generation arrived in 2023, which was tremendous in helping solve the hallucination problem. Now you had projects where AI could automate manual processes and improve customer interactions in measurable ways. The question that solidified for me was: “How can AI be the connective tissue in an org that notably reduces manual handoffs and human intervention?” In other words, if I have to download a .csv over here, do a bunch of manual work to understand and clean it up, and then upload it over there to a different program, can AI take that away? Or if I have to skim scores of pages of technical documentation trying to understand a project as a non-technical person, can AI research, translate, and summarize for me? Granted, these aren’t very advanced use cases, but they opened my mind to what was possible. The leaders in the org have been working on developments in coding and information retrieval that have delivered up to 14x efficiency gains. That’s not hype; it’s real numbers.

In ecommerce UX, a few cases stand out. Zalando’s customer-facing RAG assistant retrieves product data and generates tailored suggestions or outfit ideas; it drove a 23% increase in clicks on recommended products, a 41% rise in items added to wishlists (a strong intent signal), and a 20% incremental revenue increase in a recent quarter. CarMax generated 5,000 review summaries from 100,000 reviews in a few months, a task that would have taken humans 11 years to do manually; the summaries had an 80% first-pass approval rate from editors, and the result was improved search engine rankings. Alibaba’s AliMe RAG chatbot now handles over 2 million customer service sessions per day, which has raised customer satisfaction 25% and resulted in $150 million in customer service cost savings. Humann (a nutrition retailer) saw 65% of customer chats resolved by its RAG assistant without human interaction within a few months, with six out of ten questions fully handled by the AI agent, saving over 1,000 hours of agent call time.

All of this has also shown me that AI, and RAG specifically in our context, is not a way forward, but the way forward.

How does Retrieval-Augmented Generation change the conversation from hype to impact?

RAG fundamentally shifts AI from being a clever guesser to a smart researcher. Instead of relying on static, pre-trained knowledge, it pulls in live, contextual data from your own sources—product catalogs, policies, reviews, and more. That means the answers aren’t just plausible; they’re grounded in your actual business logic. So rather than producing hype-heavy demos, RAG allows us to deploy solutions that immediately improve how customers find products, how support teams resolve issues, and how teams operate internally. It’s AI with guardrails—and that’s where real impact begins.
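
As a simplified illustration of that retrieve-then-generate pattern, the sketch below ranks a tiny knowledge base against a question and builds a grounded prompt. It is not Classy Llama’s pipeline; the sample documents, the naive term-overlap scoring, and the prompt format are assumptions standing in for a real vector index and model call.

```python
# A minimal retrieve-then-generate sketch. The documents, the term-overlap scoring,
# and the prompt format are illustrative assumptions, not a production pipeline.

def retrieve(query, documents, top_k=2):
    """Rank documents by naive term overlap with the query and keep the best matches."""
    query_terms = set(query.lower().split())
    scored = [(len(query_terms & set(doc["text"].lower().split())), doc) for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def build_prompt(query, retrieved):
    """Ground the model in retrieved business content instead of its pre-trained memory."""
    context = "\n".join(f"[{doc['source']}] {doc['text']}" for doc in retrieved)
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"

knowledge_base = [
    {"source": "returns-policy", "text": "Returns are accepted within 30 days with a receipt."},
    {"source": "pdp-1042", "text": "The Trailhead boot is waterproof and rated for snow."},
]

question = "What is your returns window?"
prompt = build_prompt(question, retrieve(question, knowledge_base))
print(prompt)  # In production, this grounded prompt is what you would send to your LLM of choice.
```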

How does RAG address the limitations of traditional LLMs in ecommerce?

Traditional LLMs struggle with three things: outdated data, hallucinations, and lack of relevance to specific domains. RAG solves each:

  • Static Data: RAG updates in real-time by retrieving fresh content from product feeds or documentation.
  • Hallucinations: By tying every output to a source, RAG can cite where its answers came from, reducing made-up responses.
  • Relevance: It operates on your knowledge base, making outputs far more tailored than a general-purpose chatbot.

For ecommerce, this is a game-changer. You don’t want a model making up return policies or recommending out-of-stock products. RAG keeps the AI grounded.
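
To show what “grounded” means in practice, here is a hedged sketch of an answer object that always carries its sources and escalates to a human when retrieval comes back empty, rather than guessing at a return policy or a stock level. The retrieval output, field names, and the stubbed model call are illustrative assumptions.

```python
# Illustrative "grounded or escalate" sketch: every answer carries the sources it came from,
# and if nothing relevant was retrieved the assistant hands off instead of guessing.
# The field names and the stand-in model call are assumptions, not a specific product's API.

def answer_from_context(question, retrieved):
    """Stand-in for an LLM call that is prompted with only the retrieved context."""
    return f"According to {retrieved[0]['source']}: {retrieved[0]['text']}"

def grounded_answer(question, retrieved):
    if not retrieved:  # nothing in the knowledge base covers this, so don't make it up
        return {"answer": None, "sources": [], "action": "escalate_to_agent"}
    return {
        "answer": answer_from_context(question, retrieved),
        "sources": [doc["source"] for doc in retrieved],  # citations a customer or QA team can check
        "action": "respond",
    }

docs = [{"source": "returns-policy", "text": "Returns are accepted within 30 days with a receipt."}]
print(grounded_answer("How long do I have to return an order?", docs))
print(grounded_answer("Do you price-match competitors?", []))  # no source found, so it escalates
```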

Where does RAG fit most naturally in an ecommerce company’s tech stack?

RAG fits best wherever there’s a need for contextual answers. Low-hanging fruit includes:

  • Site Search: Instead of keyword matches, users get smart responses pulled from PDPs, blogs, FAQs, and reviews.
  • Customer Support: Automate responses by feeding support content and product manuals into the RAG pipeline.
  • Internal Tools: Help teams query product specs, pricing tiers, or marketing guidelines without digging through folders.

Because RAG doesn’t require a total overhaul—just access to existing data—it’s deployable alongside your CMS, PIM, or helpdesk tools with minimal disruption.
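
One way to picture that “no overhaul” point: records you already maintain in a PIM, CMS, or helpdesk can be mapped onto a single document shape that any retriever can index. The sketch below is illustrative only; the source systems and field names are assumptions, not a specific platform’s schema.

```python
# Hedged sketch of indexing existing data: heterogeneous records (PIM products, CMS pages,
# helpdesk FAQs) are normalized into one common schema a retriever can work with.
# All field names and sample records below are made up for the example.

def normalize(record, source_system):
    """Map a record from any source system onto a single schema: id, source, text."""
    if source_system == "pim":
        text = f"{record['name']}. {record['description']} In stock: {record['in_stock']}."
    elif source_system == "cms":
        text = f"{record['title']}. {record['body']}"
    else:  # helpdesk
        text = f"Q: {record['question']} A: {record['answer']}"
    return {"id": f"{source_system}:{record['id']}", "source": source_system, "text": text}

corpus = [
    normalize({"id": 7, "name": "Trailhead Boot", "description": "Waterproof hiking boot.",
               "in_stock": True}, "pim"),
    normalize({"id": "faq-3", "question": "How do I start a return?",
               "answer": "Use the self-service returns portal."}, "helpdesk"),
]
print(corpus)  # ready to be embedded and indexed by whatever search stack you already run
```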

How do you ensure RAG implementations drive real business results?

We start with use cases, not capabilities. It’s easy to be wowed by what’s possible, but we ask: Where is the friction? Then we pilot tightly scoped projects with measurable KPIs—like reducing support ticket volume or increasing conversions through better search results. We validate results early, integrate feedback loops, and scale up what works. It’s not about novelty—it’s about nudging the needle on real metrics. If a RAG tool doesn’t move revenue, cost, or experience, it doesn’t ship.

What are some common misconceptions ecommerce leaders have about tools like RAG?

The biggest misconception is that AI needs massive data lakes or that it’s only for tech giants. RAG flips that. You can see value by indexing a few PDFs or connecting it to your website. Another myth: AI will replace people. In truth, it amplifies people—especially support reps, merchandisers, and marketers. Lastly, leaders often assume RAG is plug-and-play. While integration is getting easier, you still need the right context, governance, and training data to get optimal results.

How do use cases like Amazon’s Rufus or Zalando’s assistant differ, and what can mid-sized retailers learn from them?

Amazon and Zalando leverage RAG at scale, but the core principles still apply. Rufus turns product discovery into a conversation, reducing the cognitive load for shoppers. Zalando’s style assistant blends retrieval with personalization to suggest outfits that match user behavior and inventory. Mid-sized retailers can emulate these with:

  • Conversational search that adapts to shopper intent.
  • Curated product bundles using customer input and catalog data.
  • AI-based filtering for large inventories.

The tech is accessible—the key is aligning it with brand voice and customer journey.

How is RAG improving customer touchpoints like product discovery, support, and personalization?

  • Product Discovery: Instead of guessing keywords, customers can ask natural questions like “What’s a durable hiking boot for snow?” and get precise recommendations based on materials, reviews, and availability.
  • Support: RAG answers pre-sale and post-sale questions instantly, freeing up agents for complex issues.
  • Personalization: By pulling contextually relevant insights from customer profiles or past purchases, RAG tailors suggestions that actually match intent.

Unlike old-school automation, which felt scripted, RAG makes digital experiences feel like dialogue.
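
As a toy illustration of the product discovery point above, the snippet below filters retrieved candidates on the attributes a shopper actually asked about, such as weather rating, review score, and availability. The catalog entries and field names are made up for the example.

```python
# Purely illustrative: after retrieval, candidate products are filtered on the attributes
# named in the shopper's question. The catalog and field names are assumptions.

products = [
    {"name": "Trailhead Boot", "rated_for": ["snow", "rain"], "in_stock": True, "avg_review": 4.6},
    {"name": "Coast Sandal", "rated_for": ["summer"], "in_stock": True, "avg_review": 4.2},
]

def discover(catalog, rated_for=None, min_review=0.0):
    """Return in-stock products that match the shopper's stated intent."""
    return [
        p["name"] for p in catalog
        if p["in_stock"]
        and (rated_for is None or rated_for in p["rated_for"])
        and p["avg_review"] >= min_review
    ]

print(discover(products, rated_for="snow", min_review=4.5))  # -> ['Trailhead Boot']
```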

How does RAG impact internal operations?

RAG boosts internal efficiency by:

  • Enabling sales teams to quickly surface answers about pricing, specs, or competitive advantages.
  • Helping merchandisers analyze catalog gaps or bundle products based on trends.
  • Assisting content teams in generating SEO-friendly, data-informed product copy.

Any team that deals with documents, FAQs, or data silos can reduce context-switching and repetitive lookups by embedding RAG into their workflows.

What advice would you give to brands looking to future-proof their AI strategy without chasing trends?

Anchor your AI strategy in customer value and internal friction. Don’t start with what’s shiny—start with what’s hard or inefficient. Choose tools that are modular, auditable, and aligned with your business model. Invest in AI literacy across departments so that tech adoption isn’t bottlenecked by a few specialists. Finally, build with agility: pilot small, measure fast, and scale what drives ROI.

A quote or advice from the author

Culture is ridiculously hard to make, maintain, and change. Fight for it every day.

Greg Tull

Marketing Director at Classy Llama

Greg Tull, Marketing Director at Classy Llama, unpacks how Retrieval-Augmented Generation is redefining ecommerce—from smarter search to scalable efficiency, turning AI from an abstract concept into a measurable business impact.
