Contributing Guide
Welcome! This guide will get you up and running with the ml-container-creator codebase in minutes.
Quick Setup
Prerequisites
- Node.js 24.11.1+
- Git
Get Started (5 minutes)
# 1. Clone and install
git clone https://github.com/awslabs/ml-container-creator
cd ml-container-creator
npm install
npm link
# 2. Run tests to verify setup
npm test
# 3. Try the generator
yo @aws/ml-container-creator
That's it! You're ready to contribute.
Project Structure
Understanding the codebase structure:
ml-container-creator/
├── generators/app/              # Main generator code
│   ├── index.js                 # Generator entry point (orchestration)
│   ├── lib/                     # Modular components
│   │   ├── prompts.js           # Prompt definitions
│   │   ├── prompt-runner.js     # Prompt orchestration
│   │   └── template-manager.js  # Template logic
│   └── templates/               # EJS templates for generated projects
│       ├── code/                # Model serving code
│       ├── do/                  # do-framework scripts (build, push, deploy, etc.)
│       ├── deploy/              # Legacy wrapper scripts (deprecated)
│       ├── sample_model/        # Sample training code
│       └── test/                # Test templates
├── test/                        # Generator tests
│   ├── generator.test.js        # Integration tests
│   └── template-manager.test.js # Unit tests
└── docs/                        # Documentation
Understanding the Code
How the Generator Works
The generator follows a simple flow:
1. Prompting Phase (prompt-runner.js)
   Collects user configuration through interactive prompts
   - Single deployment configuration prompt (framework-server combination)
   - Derives framework and modelServer from the selection
2. Validation Phase (template-manager.js)
   Validates that the configuration is supported
3. Writing Phase (index.js)
   Generates the complete project structure
   - All template files are generated (no conditional exclusion)
   - do-framework scripts handle runtime branching
   - Centralized configuration in do/config
Key Architecture Principles
Simplified Generator: The generator has been simplified by moving conditional logic from template generation to runtime script execution. This means:
- No template exclusion patterns: all template files are generated for every project
- Runtime branching: the do/ scripts contain conditional logic based on do/config variables
- Single deployment config prompt: framework and model server are selected together (e.g., sklearn-flask, transformers-vllm)
- Centralized configuration: all settings live in do/config for easy customization
do-framework Integration: All generated projects follow the do-framework conventions:
- Standardized scripts in the do/ directory: build, push, deploy, run, test, clean
- Configuration in the do/config file
- Consistent interface across all projects
- Framework-specific logic handled at runtime, not at generation time
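Because the deployment config is a single framework-server string, recovering the two parts is just a split on the first hyphen. Here is a minimal shell sketch of the same derivation that prompt-runner.js performs in JavaScript (variable names illustrative):

```shell
# Split the combined deployment config into its two parts
# (mirrors the framework/modelServer derivation in prompt-runner.js)
deployment_config="sklearn-flask"
framework="${deployment_config%%-*}"    # text before the first hyphen -> sklearn
model_server="${deployment_config#*-}"  # text after the first hyphen  -> flask
echo "framework=$framework model_server=$model_server"
```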
Key Files to Know
generators/app/index.js - Main generator class
- Orchestrates the generation process
- Delegates to specialized modules
- Sets executable permissions on do/ scripts
- ~50 lines (was 300+ before refactoring!)
generators/app/lib/prompts.js - Prompt definitions
- All user prompts organized by phase
- Single deployment configuration prompt (flattened framework + model server)
- Easy to add new prompts
- Clear separation of concerns
generators/app/lib/template-manager.js - Template logic
- Validates user configuration
- No longer handles template exclusion (simplified!)
- Centralizes validation logic
generators/app/lib/prompt-runner.js - Prompt orchestration
- Runs prompts in organized phases
- Derives framework and modelServer from deploymentConfig
- Provides user feedback
- Combines answers from all phases
generators/app/templates/do/ - do-framework scripts
- Standardized container lifecycle scripts
- Contains runtime conditional logic
- Framework-specific branching handled here, not in generator
Common Tasks
Adding a New Deployment Configuration
The generator now uses a single flattened deployment configuration prompt instead of separate framework and model server prompts.
- Add the new configuration to generators/app/lib/prompts.js:
// In the deployment configuration choices
{
name: 'My Framework with My Server',
value: 'myframework-myserver',
short: 'myframework-myserver'
}
- Update the validation in template-manager.js:
// In validate() method
const validConfigs = [
'sklearn-flask', 'sklearn-fastapi',
// ... existing configs
'myframework-myserver' // Add new config
];
- Add framework-specific logic to the do/ script templates:
# In generators/app/templates/do/build
case "$DEPLOYMENT_CONFIG" in
myframework-myserver)
# Framework-specific build logic
;;
esac
- Add a test in test/generator.test.js:
it('handles new deployment config correctly', async () => {
await helpers.run(path.join(__dirname, '../generators/app'))
.withPrompts({
deploymentConfig: 'myframework-myserver',
/* ... */
});
assert.file(['do/build', 'do/push', 'do/deploy']);
});
Adding a New Prompt
- Add the prompt definition to generators/app/lib/prompts.js:
// In the appropriate phase array
{
type: 'list',
name: 'myNewOption',
message: 'Choose your option?',
choices: ['option1', 'option2'],
default: 'option1'
}
- Add it to the do/config template if it's a configuration variable:
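For example, the config template line might look like this (variable name hypothetical, matching the prompt's name field):

```shell
# In generators/app/templates/do/config
MY_NEW_OPTION="<%= myNewOption %>"
```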
- Use in do/ scripts if needed:
# In generators/app/templates/do/build (or other scripts)
if [ "$MY_NEW_OPTION" = "option1" ]; then
# Option-specific logic
fi
- Add a test in test/generator.test.js:
it('handles new option correctly', async () => {
await helpers.run(path.join(__dirname, '../generators/app'))
.withPrompts({ myNewOption: 'option1', /* ... */ });
assert.fileContent('do/config', 'MY_NEW_OPTION="option1"');
});
Adding a New Template
- Create the template file in generators/app/templates/:
- Use EJS syntax for variables:
#!/bin/bash
set -e
# Source configuration
source "$(dirname "$0")/config"
PROJECT_NAME="<%= projectName %>"
REGION="<%= awsRegion %>"
# Add your script logic here
- Make sure it is executable if it's a script (the generator's writing phase handles this for do/ scripts):
// In generators/app/index.js writing() method
// Executable permissions are set automatically for do/* scripts
- Add conditional logic based on configuration (if needed):
# In the script itself, not in the generator
case "$DEPLOYMENT_CONFIG" in
sklearn-*)
# sklearn-specific logic
;;
transformers-*)
# transformers-specific logic
;;
esac
Important: The generator now generates all template files unconditionally. Conditional logic should be in the runtime scripts (do/ directory), not in the generator's template exclusion logic.
Running Tests
# Run all tests
npm test
# Run specific test file
npx mocha test/generator.test.js
# Run with verbose output
npx mocha test/generator.test.js --reporter spec
Testing Your Changes Locally
# 1. Make your changes
# 2. Re-link the generator
npm link
# 3. Test in a temporary directory
cd /tmp
yo @aws/ml-container-creator
# 4. Verify generated project structure
cd your-generated-project
ls -la do/ # Check do-framework scripts are present
cat do/config # Verify configuration
# 5. Test the do-framework scripts
./do/build # Should build successfully
docker images | grep your-project # Verify image was created
# 6. Test local deployment
./do/run &
./do/test # Should pass health and inference tests
Debugging Tips
Enable Debug Output
# See detailed Yeoman output
DEBUG=yeoman:* yo @aws/ml-container-creator
# See generator-specific output
DEBUG=@aws/generator-ml-container-creator:* yo @aws/ml-container-creator
Common Issues
"Generator not found"
- Re-run npm link from the repository root
"Tests failing after changes"
"do/ scripts not executable"
- Check that writing() phase sets permissions: chmod +x
- Verify with: ls -la do/ in generated project
"Configuration variable not set in do/config"
- Check EJS template syntax in generators/app/templates/do/config
- Verify variable is in this.answers object
- Test with: cat do/config in generated project
Code Style
We follow these conventions:
// ✅ Good
const answers = await this.prompt([...]);
const { framework, modelServer } = this.answers;
// ❌ Avoid
var x = 5;
let y = this.answers.framework;
Key points:
- Use const by default, let when needed, never var
- Use arrow functions for callbacks
- Use template literals for strings
- Add JSDoc comments for public methods
- Keep functions small and focused
do-framework Conventions
All generated projects follow do-framework conventions:
Script Structure
#!/bin/bash
set -e # Exit on error
set -u # Exit on undefined variable
set -o pipefail # Exit on pipe failure
# Source configuration
source "$(dirname "$0")/config"
# Validate prerequisites
check_docker_installed
# Main logic with conditional branching
case "$DEPLOYMENT_CONFIG" in
sklearn-*)
# sklearn-specific logic
;;
transformers-*)
# transformers-specific logic
;;
esac
# Success output
echo "✅ Operation completed successfully"
Configuration Management
- All configuration in the do/config file
- Use environment variable overrides
- Document all variables with comments
- Support both direct values and env var references
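The last two conventions combine naturally in the config file itself. A hypothetical do/config fragment (names illustrative):

```shell
# Hypothetical do/config entries showing both supported forms
PROJECT_NAME="my-model"                  # direct value
AWS_REGION="${AWS_REGION:-us-west-2}"    # env var reference with a default
echo "Project: $PROJECT_NAME, Region: $AWS_REGION"
```

Running the script with AWS_REGION exported overrides the default without editing the file.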
Output Formatting
Use consistent emoji prefixes:
- 🚀 Starting an operation
- ✅ Success
- ❌ Error
- ⚠️ Warning
- 🔍 Checking/validating
- 📦 Deployment/packaging
Error Handling
# Check prerequisites
if ! command -v docker &> /dev/null; then
echo "❌ Docker is not installed"
echo " Install from: https://docs.docker.com/get-docker/"
exit 2
fi
# Validate configuration
if [ -z "$PROJECT_NAME" ]; then
echo "❌ PROJECT_NAME not set in do/config"
exit 3
fi
Testing Guidelines
What to Test
- ✅ File generation for different configurations
- ✅ Validation of user inputs
- ✅ Generated do/config contents for each configuration
- ✅ Edge cases and error conditions
Test Structure
describe('feature name', () => {
beforeEach(async () => {
// Setup test environment
await helpers.run(path.join(__dirname, '../generators/app'))
.withPrompts({ /* test configuration */ });
});
it('should do something specific', () => {
// Assert expected behavior
assert.file(['expected-file.txt']);
});
});
Making Your First Contribution
Good First Issues
Look for issues labeled:
- good first issue - Perfect for newcomers
- help wanted - Community contributions welcome
- documentation - Improve docs
Contribution Workflow
- Find an issue or create one describing your change
- Fork the repository and create a branch
- Make your changes following code style
- Add tests for your changes
- Run tests to ensure everything works
- Submit a PR with clear description
PR Checklist
Before submitting:
- [ ] Tests pass (npm test)
- [ ] Code follows style guidelines
- [ ] Documentation updated if needed
- [ ] Commit messages are clear
- [ ] PR description explains the change
Additional Resources
- Adding Features Guide - Detailed guide for new features
- Coding Standards - Complete style guide
- Template System - How templates work
- Architecture - Complete architecture guide with visual overview
Getting Help
Stuck? We're here to help:
- Check existing documentation
- Search issues
- Ask in discussions
- Tag maintainers in your PR
You're Ready!
You now know enough to start contributing. Don't worry about making mistakes - that's how we all learn. The maintainers are friendly and will help guide you through your first contribution.
Happy coding!