How we turned 50MB of documentation, SDK code, and samples into a custom GPT that solves real production problems — covering the technical decisions from Go and ripgrep to prompt-injection techniques that make AI agents behave better.
"I crashed ChatGPT's entire website by putting too much information in a system prompt," Teemu admits. "Not once, but multiple times during testing." That's how he learned the hard way that giving AIs access to documentation isn't as simple as dumping everything into context and hoping for the best.
The result of those crashes — and months of iteration — is the Metaplay Docs custom ChatGPT. A chatbot that gives instant, conversational access to all of Metaplay's documentation, SDK source code, and samples. Here's how it came together.
The Documentation Problem That Forced Our Hand
At Metaplay, we work hand-in-hand with our customers to shape what we work on next. When we sat down to map out the next iteration of our documentation — covering all the new features added to the Metaplay SDK over the past year — a troubling pattern emerged.
"We were about to make our docs significantly harder to navigate, not easier," Teemu explains. The challenge wasn't the content itself — it was the paths through it. What pages should you read, in what order? What sample projects are relevant? What search terms will actually find what you need?
For a game to make money, it needs effective monetisation. For that, it needs IAPs. To understand IAPs, you need IAP validation, payment settlement, currency conversions — and that's before getting to analytics. None of it is complicated individually, but connecting the dots across 50MB of material was becoming a maze: the value was real, yet the path to it was increasingly unclear.
Why 2025 Was Finally The Year AI Could Help
Something shifted in 2025 that made this problem suddenly solvable. Reasoning models became mainstream and affordable. This seemingly simple technique — letting AI models "think" before responding — made them dramatically better at using tools like web search and file reading.
"No complex semantic search indexes or vector databases needed," Teemu notes. "Late 2025, it became clear we could finally make all our material available to AIs without completely overwhelming them."
The first iteration uses GPT-5.2's reasoning combined with tools to access all Metaplay documentation. Teemu chose ChatGPT because it's familiar to everyone and accessible on any device, for free. The Metaplay Docs GPT now provides instant access to over 16x more information than just browsing the docs website.
During private testing, a game tech lead solved a real production need with it. They needed a quick internal tool for managing authentication methods across a list of players. The bot explained how to use the Metaplay CLI with the LiveOps dashboard admin API. "I didn't even get to see the result," Teemu says, "because they just reported it worked and moved on to the next task."
Building for Speed: Why Go and Ripgrep Beat Node by 100x
The project started in Node and TypeScript. But Teemu quickly realized that searching 50MB of filesystem content inside a Node process was painfully slow. That approach works for hundreds or thousands of files, but Metaplay's documentation, SDK code, and samples add up to far more than that.
"Even in early testing, I saw an easy 100x difference in speed switching to Go with native tools," he explains. This meant building a custom hardened Docker image and hosting it in Metaplay's own Kubernetes cluster instead of leveraging edge services like Vercel. With Go, the container size dropped to about a fifth of what Node required, and complex requests return with ripgrep's power in milliseconds.
The team built both a web API and a Model Context Protocol (MCP) server. Custom ChatGPTs only talk to HTTP endpoints, while tools like Claude Code and Cursor work with MCP. The end result is the same: the agent gets tools that let it search for the right information when it needs it.
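The HTTP side that a custom ChatGPT action calls can stay equally small. Here's a hedged sketch of what such an endpoint might look like, reusing the searchDocs helper from the sketch above in the same package; the /search route, query parameter, and port are assumptions for illustration.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Hypothetical route shape: the custom GPT's action schema points it
	// at GET /search?q=<query>, and the response body is plain text.
	http.HandleFunc("/search", func(w http.ResponseWriter, r *http.Request) {
		query := r.URL.Query().Get("q")
		if query == "" {
			http.Error(w, "missing q parameter", http.StatusBadRequest)
			return
		}
		results, err := searchDocs(query, "/data/docs", 5)
		if err != nil {
			http.Error(w, "search failed", http.StatusInternalServerError)
			return
		}
		w.Header().Set("Content-Type", "text/plain; charset=utf-8")
		fmt.Fprint(w, results)
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```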
The Hard-Earned Lesson: Making AI Agents Behave
During internal testing, the team observed AIs sometimes doing very ambitious searches — like searching for every C# file in the entire repository. It didn't lead to anything good. Multiple layers of protection against bad behavior were needed.
"Here's an interesting tidbit," Teemu shares. "I changed from JSON responses to plain text because that's more token-efficient. LLMs don't care about formatting. And in plain text I can inject messages like 'response truncated, use a more specific search' to essentially prompt-inject the calling AI to behave better."
This significantly improved edge-case behavior. AIs really don't like empty search results because they look like errors, so instead of an empty body the system returns a message like 'no search results — use a different query' to keep the model from inventing answers. The biggest challenge remains that AI systems are very hard to test: it took days of different people prompting internally to surface undocumented edge cases.
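A minimal sketch of that response-shaping layer, assuming a simple line cap and guidance messages along the lines quoted above:

```go
package main

import "strings"

const maxResponseLines = 40 // illustrative cap, not the production value

// formatForAgent turns raw search output into the plain-text payload the
// calling AI sees, injecting guidance when results are empty or truncated
// so the agent corrects its own behavior instead of guessing.
func formatForAgent(rawResults string) string {
	if strings.TrimSpace(rawResults) == "" {
		return "no search results, use a different query"
	}
	lines := strings.Split(rawResults, "\n")
	if len(lines) > maxResponseLines {
		lines = lines[:maxResponseLines]
		lines = append(lines, "[response truncated, use a more specific search]")
	}
	return strings.Join(lines, "\n")
}
```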
Beyond behavior, the team also discovered that existing HTML documentation was unnecessarily complex for LLMs. "They don't need fancy UI, generated section navigation, or accessibility features," Teemu notes. "We got better results from raw markdown files." The team built a custom processing pipeline that renders documentation into token-efficient files on disk, with high-level index-of-indexes that the AI navigates as needed.
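The index-of-indexes idea can be illustrated with a short Go sketch that walks the rendered markdown tree and writes one compact line per document; the file layout and naming here are assumptions.

```go
package main

import (
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
	"strings"
)

// buildIndex walks the rendered markdown tree and writes a compact
// top-level index the agent can read first, so it only opens the
// files it actually needs.
func buildIndex(docsRoot, indexPath string) error {
	var b strings.Builder
	b.WriteString("# Documentation index\n\n")
	err := filepath.WalkDir(docsRoot, func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			return err
		}
		if d.IsDir() || !strings.HasSuffix(path, ".md") {
			return nil
		}
		rel, _ := filepath.Rel(docsRoot, path)
		// One short line per document: relative path plus its first heading.
		b.WriteString(fmt.Sprintf("- %s: %s\n", rel, firstHeading(path)))
		return nil
	})
	if err != nil {
		return err
	}
	return os.WriteFile(indexPath, []byte(b.String()), 0o644)
}

// firstHeading returns the first "# " line of a markdown file,
// or the filename if none is found.
func firstHeading(path string) string {
	data, err := os.ReadFile(path)
	if err != nil {
		return filepath.Base(path)
	}
	for _, line := range strings.Split(string(data), "\n") {
		if strings.HasPrefix(line, "# ") {
			return strings.TrimPrefix(line, "# ")
		}
	}
	return filepath.Base(path)
}
```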
A two-tier caching strategy — a server-side LRU cache plus HTTP headers with ETags — keeps things fast. The cache durations (a 5-minute max-age and a 1-hour stale-while-revalidate window) came from observing real query patterns. Documentation doesn't change hourly, so aggressive caching doesn't hurt anyone.
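The header half of that strategy is straightforward to sketch with Go's standard library; the content-hash ETag is an assumption, and the server-side LRU layer is elided.

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"net/http"
)

// writeCached sends a documentation payload with the caching policy above:
// clients revalidate cheaply via ETag, and intermediaries keep responses
// for 5 minutes with a 1-hour stale-while-revalidate window.
func writeCached(w http.ResponseWriter, r *http.Request, body []byte) {
	etag := fmt.Sprintf(`"%x"`, sha256.Sum256(body))
	w.Header().Set("ETag", etag)
	w.Header().Set("Cache-Control", "public, max-age=300, stale-while-revalidate=3600")

	// If the client already has this exact content, skip the body entirely.
	if r.Header.Get("If-None-Match") == etag {
		w.WriteHeader(http.StatusNotModified)
		return
	}
	w.Header().Set("Content-Type", "text/plain; charset=utf-8")
	w.Write(body)
}
```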
Final Thoughts: This Is Just The Starting Point
The LLM-docs project is live, but it's only the beginning. The team is already experimenting with supercharging coding agents like Claude Code and OpenCode to be Metaplay experts — combining your existing game code with deep Metaplay knowledge.
"Our conviction is growing that what makes the dream work is extremely high quality SDK code and extremely high quality documentation," Teemu reflects. "AIs won't fundamentally replace the need for great engineering and technical writing. Instead, they'll massively scale out the impact we can have with our work."
The technical documentation now includes guides on leveraging the underlying MCP server in your own setups. Reach out if you'd like to be a private beta tester. Or try the Metaplay Docs custom ChatGPT yourself — you can even turn on voice mode and talk to Metaplay on your phone.
Got thoughts or questions? The Metaplay team would love to hear from you. Reach out with your feedback or suggest a topic for a future Tech Talk.

