Contributing Guidelines
Guidelines and best practices for contributing challenges to Kubeasy.
Thank you for contributing to Kubeasy! This guide explains how to submit high-quality challenges that provide a great learning experience.
Before you start
Check existing challenges
Browse the challenges repository to:
- Avoid duplicating existing challenges
- Understand the current challenge structure
- Get inspiration from well-designed challenges
Propose your idea
Before investing time in building a challenge, open a GitHub Issue to discuss your idea:
- Go to kubeasy-dev/challenges/issues
- Create a new issue with the "Challenge Proposal" template
- Describe:
- What Kubernetes concept it teaches
- The broken scenario
- Why it's valuable to learn
- Estimated difficulty and time
This helps avoid duplicate work and ensures your challenge aligns with Kubeasy's goals.
Contribution process
1. Fork and clone
```bash
# Fork the repository on GitHub, then:
git clone https://github.com/<your-username>/challenges.git
cd challenges
```

2. Create a branch
```bash
git checkout -b challenge/<challenge-slug>
```

Use descriptive branch names:

- challenge/rbac-service-account
- challenge/network-policy-egress
- challenge/pvc-storage-class
3. Build the challenge
Follow the structure described in Challenge Structure and Creating Challenges.
Your challenge folder should contain:
```
<challenge-slug>/
├── challenge.yaml        # Metadata, description, AND objectives
├── manifests/            # Initial broken state
│   ├── deployment.yaml
│   └── ...
├── policies/             # Kyverno policies (prevent bypasses)
│   └── protect.yaml
└── image/                # Optional: custom Docker images
    └── Dockerfile
```

4. Test thoroughly
See Testing Challenges for comprehensive testing strategies.
Minimum testing requirements:
- Challenge works on a fresh Kind cluster
- Problem is reproducible
- Solution passes all objectives
- Objective feedback is clear
- Estimated time is accurate
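Before spinning up a cluster, a quick pre-flight script can confirm the folder layout matches the structure above. This is a minimal sketch, not part of the Kubeasy tooling: the `check_challenge` helper and its messages are illustrative.

```shell
#!/usr/bin/env bash
# Hypothetical pre-flight check: confirm a challenge folder has the
# required layout before testing it on a fresh Kind cluster.
check_challenge() {
  local dir="$1" missing=0
  [ -f "$dir/challenge.yaml" ] || { echo "missing: $dir/challenge.yaml"; missing=1; }
  [ -d "$dir/manifests" ]      || { echo "missing: $dir/manifests/"; missing=1; }
  [ -d "$dir/policies" ]       || { echo "missing: $dir/policies/"; missing=1; }
  return "$missing"
}

# Example: validate the directory given as the first argument (default: cwd)
if check_challenge "${1:-.}"; then
  echo "layout OK"
else
  echo "layout incomplete"
fi
```

Running it from the root of your challenge folder lists any missing required files before you invest time in cluster testing.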
5. Commit your changes
Use clear, descriptive commit messages:
```bash
git add .
git commit -m "feat: add memory-pressure challenge"
```

Follow conventional commit style:

- feat: add challenge for network policies
- fix: correct validation timeout in RBAC challenge
- docs: improve description for storage class challenge
6. Push and create a pull request
```bash
git push origin challenge/<challenge-slug>
```

Then open a pull request on GitHub.
Pull request requirements
PR title format
```
feat: add <challenge-slug> challenge
```

Examples:

- feat: add network-policy-debugging challenge
- feat: add rbac-service-account challenge
PR description template
```markdown
## Challenge: <Challenge Title>

### What does this challenge teach?

Briefly explain the Kubernetes concept and why it matters.

### Difficulty and estimated time

- **Difficulty**: easy | medium | hard
- **Estimated time**: X minutes

### Theme

- resources-scaling | networking | rbac-security | volumes-secrets | monitoring-debugging

### Testing

- [x] Tested on fresh Kind cluster
- [x] Verified broken state is reproducible
- [x] Solution passes all objectives
- [x] Objective titles don't reveal the solution
- [x] Kyverno policies prevent bypasses
- [x] Estimated time is accurate
```

Challenge quality standards
Design principles
1. Single focused concept
Good:
"This challenge teaches how to configure resource requests and limits"
Bad:
"This challenge teaches resource limits, RBAC, network policies, and storage"
Guideline: One clear concept per challenge.
2. Realistic scenarios
Good:
"A deployment fails because the ServiceAccount lacks permissions to read ConfigMaps"
Bad:
"A deployment has exactly 3 typos that you must find"
Guideline: Mirror real production problems, not artificial puzzles.
3. Appropriate difficulty
Match difficulty to complexity:
- easy: Single issue, clear error messages, common scenarios
- medium: Multiple related issues, requires debugging
- hard: Complex interactions, subtle issues, production-like
4. Mystery preserving
Descriptions should show symptoms, not causes. Objectives should check outcomes, not implementations.
Code quality
Manifests
```yaml
# Good: Clear, minimal, well-commented
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      serviceAccountName: web-app-sa  # Lacks permissions
      containers:
        - name: nginx
          image: nginx:1.21
```

Objectives
```yaml
# Good: Generic titles, checks outcomes
objectives:
  - key: pod-running
    title: "Pod Ready"
    description: "Web app pod must be running"
    order: 1
    type: condition
    spec:
      target:
        kind: Pod
        labelSelector:
          app: web-app
      checks:
        - type: Ready
          status: "True"

  # Bad: Reveals solution, vague key
  - key: check1
    title: "Memory Set to 256Mi"
    description: "Check failed"
```

Bypass protection
Every challenge should include Kyverno policies to prevent obvious bypasses:
```yaml
# policies/protect.yaml
apiVersion: kyverno.io/v1
kind: ClusterPolicy
metadata:
  name: protect-<challenge-slug>
spec:
  validationFailureAction: Enforce
  rules:
    - name: preserve-image
      match:
        resources:
          kinds: ["Deployment"]
          names: ["web-app"]
          namespaces: ["challenge-*"]
      validate:
        message: "Cannot change the application image"
        pattern:
          spec:
            template:
              spec:
                containers:
                  - name: nginx
                    image: "nginx:1.21"
```

Review process
What reviewers look for
- **Correctness**
  - Does the challenge deploy with a reproducible broken state?
  - Do all objectives pass after applying the fix?
  - Do Kyverno policies prevent bypasses?
- **Educational value**
  - Does it teach something useful?
  - Is it appropriate for the stated difficulty?
  - Are descriptions mystery-preserving?
- **Code quality**
  - Are manifests minimal and clear?
  - Are objective titles generic (don't reveal solutions)?
  - Do objectives validate outcomes, not implementations?
- **Testing**
  - Has it been tested on a fresh cluster?
  - Is the estimated time accurate?
  - Are multiple valid solutions accepted?
Addressing feedback
When reviewers request changes:
- Make the requested modifications
- Test again to ensure everything still works
- Push the changes to your branch
- Respond to review comments
```bash
git add .
git commit -m "fix: address review feedback"
git push origin challenge/<challenge-slug>
```

The PR will update automatically.
Coding standards
File naming
- Use lowercase with hyphens
- Be descriptive but concise
```
manifests/deployment.yaml
policies/protect.yaml
image/Dockerfile
```

YAML formatting
- Use 2 spaces for indentation
- Add comments to explain non-obvious configuration (like the intentional bug)
- Group related resources in the same file, separated by `---` if needed
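For example, a ServiceAccount and the Deployment that uses it can live in one file. The resource names below are illustrative, echoing the earlier examples:

```yaml
# manifests/deployment.yaml — two related resources, one file
apiVersion: v1
kind: ServiceAccount
metadata:
  name: web-app-sa
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      serviceAccountName: web-app-sa
      containers:
        - name: nginx
          image: nginx:1.21
```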
Resource naming
Use clear, consistent names:
```yaml
# Good
metadata:
  name: web-app
  labels:
    app: web-app
    component: frontend

# Bad
metadata:
  name: x
  labels:
    app: app1
```

Common pitfalls
1. Challenge is too complex
Problem: Trying to teach too many concepts at once.
Fix: Split into multiple challenges, each focusing on one concept.
2. Objectives reveal the solution
Problem: Objective titles or descriptions tell the user what to fix.
Fix: Use generic titles like "Stable Operation" instead of "Memory Limit Increased to 256Mi".
3. Missing bypass protection
Problem: Users can replace the broken app with a simpler working one.
Fix: Add Kyverno policies to protect container images and critical configuration.
4. Estimated time is inaccurate
Problem: Challenge takes much longer than stated.
Fix: Test with someone unfamiliar with the problem and adjust the estimate.
5. Solution has multiple valid approaches
Problem: Users solve it differently than expected.
Fix: This is good! Make sure your objectives validate outcomes (pod is healthy) rather than implementations (memory is 256Mi), so any valid fix passes.
Best practices
Design
- Start simple -- complexity can be added later
- Focus on one learning objective
- Make the broken state obvious but not trivial
- Use realistic, production-like scenarios
Implementation
- Use stable, common images (nginx, python, busybox, etc.)
- Avoid external dependencies when possible
- Keep resource usage minimal
- Test on a clean cluster
Documentation
- Describe symptoms in the challenge description, not the root cause
- State goals in the objective, not the method to achieve them
- Never include solutions anywhere in the challenge files
After your PR is merged
Your challenge will be automatically:
- Built as an OCI artifact
- Published to `ghcr.io/kubeasy-dev/challenges/<slug>:latest`
- Made available to users via `kubeasy challenge start <slug>`
If your challenge includes an image/ directory, the custom Docker image will also be built and published to ghcr.io/kubeasy-dev/<slug>:latest.
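If you do ship a custom image, a minimal `image/Dockerfile` could look like the following sketch. The base image and `app.py` script are illustrative, not a Kubeasy requirement:

```dockerfile
# image/Dockerfile — hypothetical minimal custom image
FROM python:3.12-slim
COPY app.py /app/app.py
# Replace with whatever workload your challenge needs
CMD ["python", "/app/app.py"]
```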
Getting help
If you need help contributing:
- GitHub Discussions: kubeasy-dev/challenges/discussions
- GitHub Issues: kubeasy-dev/challenges/issues
License
By contributing, you agree that your contributions will be licensed under the same license as the project.