I’ve been transitioning to Go after years in other ecosystems, and kept running into the same problem:
I could write correct Go code, but not idiomatic Go.
Most material focuses on syntax or algorithms. In practice, what caused friction were production mismatches: context cancellation and goroutine leaks, errgroup vs WaitGroup tradeoffs, HTTP client hygiene, error wrapping semantics, allocation control, embed/io/fs for dev–prod parity, etc.
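To make the first of those concrete, here is a minimal sketch of the kind of mismatch I mean (my own illustration, not code from any kata): a producer goroutine that ignores cancellation and leaks as soon as its consumer walks away, next to the context-aware version.

```go
package main

import (
	"context"
	"fmt"
	"runtime"
	"time"
)

// leaky ignores its context: once the consumer stops receiving,
// the goroutine blocks on the send forever and is leaked.
func leaky(ctx context.Context) <-chan int {
	out := make(chan int)
	go func() {
		defer close(out)
		for i := 0; ; i++ {
			out <- i
		}
	}()
	return out
}

// wellBehaved also selects on ctx.Done(), so cancelling the context
// lets the goroutine return and the channel be closed.
func wellBehaved(ctx context.Context) <-chan int {
	out := make(chan int)
	go func() {
		defer close(out)
		for i := 0; ; i++ {
			select {
			case out <- i:
			case <-ctx.Done():
				return
			}
		}
	}()
	return out
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 50*time.Millisecond)
	defer cancel()

	// Consume a few values, then walk away, the way a request handler
	// that errors out halfway through would.
	ch := wellBehaved(ctx) // swap in leaky(ctx) to see the producer get stuck
	for i := 0; i < 3; i++ {
		fmt.Println(<-ch)
	}

	time.Sleep(100 * time.Millisecond)
	// Prints 1 with wellBehaved (producer exited on cancellation), 2 with leaky.
	fmt.Println("goroutines:", runtime.NumGoroutine())
}
```

A kata around this states the constraint as pass/fail (e.g. "no goroutine may outlive the caller's context") and leaves the fix to the reader.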
I started collecting small, constraint-driven katas that isolate one such mismatch at a time. Each kata defines explicit pass/fail idiomatic constraints, rather than providing solutions. The goal is deliberate practice, not “best practices” or tutorials.
This repo is curated by someone transitioning to Go, for others doing the same. It’s not meant to be authoritative. If you’re experienced with Go and spot incorrect, unsafe, or misleading constraints, issues and PRs with rationale and references are explicitly encouraged.
I’m especially interested in feedback from people using Go in production on where these constraints are wrong, incomplete, or missing important edge cases.
The instructions mention "Reflect: Compare your solution with the provided 'Reference Implementation' (if available)", but not a single line of code is present.
Is this an artifact of it all being AI-generated, or is it a work in progress?
If the idea is to have devs implement each kata, wouldn't it be more effective to provide not only automated tests but also code to be used as a basis for each challenge?
For example, if a kata is about supporting a dev build tag to serve assets from the filesystem, why not include a simple web server that already embeds the contents?
This would let aspiring gophers go straight to the high-value modification of a project rather than effectively spend most of their time writing scaffolding and tests; a rough sketch of what I mean is below.
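Something along these lines, assuming an assets/ directory and a dev build tag (both are just my assumptions, not anything the repo defines):

```go
// main.go
package main

import (
	"log"
	"net/http"
)

func main() {
	// assetsFS is supplied by whichever of the two files below gets built.
	http.Handle("/", http.FileServer(http.FS(assetsFS())))
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

```go
// assets_embed.go: default build, assets are compiled into the binary.
//go:build !dev

package main

import (
	"embed"
	"io/fs"
)

//go:embed assets
var embedded embed.FS

func assetsFS() fs.FS {
	sub, err := fs.Sub(embedded, "assets")
	if err != nil {
		panic(err)
	}
	return sub
}
```

```go
// assets_dev.go: built with `go run -tags dev .`, assets are read live from disk.
//go:build dev

package main

import (
	"io/fs"
	"os"
)

func assetsFS() fs.FS {
	return os.DirFS("assets")
}
```

With that handed out as a starting point, the kata could ask only for the dev variant (or for a test that both variants serve the same files), which is the dev–prod parity point anyway.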
This honestly just looks like a bunch of ChatGPT output. There’s almost no code (I checked maybe half a dozen topics). Not sure how useful this is for anyone besides the author. Why would I look at this instead of asking an LLM?
Unless writing that scaffolding and the tests yourself is explicitly the intent, in which case that's fair.
Including these katas in AGENTS.md is extremely useful.