.. _stdlib_ai_generation:

==================
AI Generation Tool
==================

With Eden, you can generate assets for your game using AI directly from the editor. This is useful for prototyping or for quickly populating your game with content.

Currently, the following types of asset generation are supported:

- Image generation
- Image editing
- 3D model generation

Depending on the pipeline, you can generate assets from:

- Text descriptions
- Source images
- A combination of both (e.g. editing an image with a text instruction)

.. note::

   AI generation tools are currently in beta. If no generation jobs have been run recently, the GPU may need a moment to warm up before your first job starts.

.. _stdlib_ai_generation_links:

**How to Open**

You can access the generation window from several locations:

- From the menu: "Window" -> "AI Generation Tool".
- In the context menu of the file explorer: Right Mouse Button -> Create.
- By clicking the ``+`` button in the top-left corner of the image selection window while selecting an image.

.. _stdlib_ai_generation_params:

**Generation Parameters**

The left side of the generator window is used to fill in the parameters for a new generation request. To generate an asset:

1. Choose a pipeline from the **Pipeline** dropdown menu.
2. Fill in the necessary parameters.
3. Click the **Generate** button.

The list of available parameters depends on the pipeline. Some parameters are optional and hidden under the **Advanced Inputs** section; modifying them allows you to fine-tune the generation results. Refer to the parameter descriptions and experiment with the outcomes.

**Generation History**

The right side of the generator window displays the generation history. Click **Save** on a completed generation to save it in the currently selected folder.

For each history entry, the following buttons are available:

- **Cancel** - Cancel the generation. This is only possible while the generation has not yet started and is still in the "Queued" status.
- **Remove** - Delete the entry. You can delete the entire entry or just the results, leaving the parameters for future reuse.
- **Apply Parameters** - Copy all parameters from this generation into the form for a new generation. Be sure to change these parameters to avoid "Duplicate Request" errors.
- **Show Parameters** - View the full set of parameters that were used for a completed generation.

.. _stdlib_ai_generation_workflows:

=========
Workflows
=========

.. _stdlib_ai_generation_image_quality:

Image - Quality
===============

Generate an image from a text description. Write what you want to see in the **prompt** field and click **Generate**. This mode takes a bit longer but produces detailed, high-fidelity results, making it best suited for final concept art, textures, and hero images.

.. _stdlib_ai_generation_image_fast:

Image - Fast
============

Works the same way as Image - Quality (describe the image you want and hit **Generate**) but returns results faster. Use this when you are exploring ideas and want to try many variations quickly. Once you find a direction you like, switch to Quality for a polished version.

.. _stdlib_ai_generation_image_edit:

Image - Edit
============

Modify an existing image using a text instruction. Provide a **source image** and describe the change you want, for example "change the background to a snowy mountain" or "make the armor gold instead of silver". The tool applies your edit while keeping everything else intact.

You can use any previously generated image as the source, making it easy to iterate: generate an image first, then refine it with edits.

.. _stdlib_ai_generation_3d_shape:

3D
==

Turn a reference image into an untextured 3D mesh. Provide a single **image** of the object you want, and the tool will generate clean geometry without any materials. This is useful when you want a starting shape for sculpting or plan to apply your own textures.

For best results, use a reference image with a single clear subject, a simple background, and no cropped edges.

.. _stdlib_ai_generation_3d_textured:

3D - Textured Shape
===================

The full image-to-3D workflow. Provide a reference **image** and get back a complete 3D model with baked PBR materials (color, roughness, and metallic maps), ready to drop into a scene.

The same image recommendations apply as for the 3D workflow above: one clear subject, a simple background, nothing cropped.

.. _stdlib_ai_generation_tips:

==================
Tips and Workflows
==================

**Chaining Pipelines**

The pipelines are designed to work together. A typical workflow:

1. Start with **Image - Fast** to quickly explore a concept across several prompts.
2. Switch to **Image - Quality** for a polished version, or use **Image - Edit** to refine an existing result.
3. Feed the final image into **3D** or **3D - Textured Shape** to produce a 3D asset.

This gives you a text-to-3D pipeline entirely within the editor.

**Seed and Reproducibility**

The ``seed`` parameter controls the random state of each generation. By default, it is set to a random value each time. Set a fixed seed value to reproduce the same output, then tweak other parameters (prompt, guidance, steps) to iterate predictably.

**Guidance Strength**

Guidance strength controls how closely the output follows your input.

- Higher values (8–10+) make the model adhere more strictly to the reference image or prompt. Use this when you want the output to closely match what you provided.
- Lower values (3–5) give the model more creative freedom. Use this when you want variation or when the reference is only a rough starting point.

The default of 7.5 is a balanced starting point for most use cases.

**Sampling Steps**

Sampling steps control how many refinement passes the model makes.

- More steps generally produce cleaner, more detailed results, but take longer.
- Fewer steps are faster but may produce rougher output.

The default of 12 works well in most cases. Increase to 16–20 for complex shapes with fine detail. Going beyond 20 typically shows diminishing returns.

**Decimation Target**

The decimation target controls the polygon count of exported meshes. Choose one of three presets:

- **Low** - Lightweight mesh. Good for background props or mobile targets.
- **Medium** - Balanced polygon count. Works well for most assets.
- **High** - Preserves more geometric detail. Use for hero assets or close-up objects.

**Texture Size**

Available in the 3D - Textured Shape pipeline. Controls the resolution of the baked PBR textures.

- 1024: Faster generation, smaller file size. Good for distant or small objects.
- 2048: Default. Good balance of quality and performance.
- 4096: Maximum detail. Use for close-up hero assets.
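To get an intuition for why texture size matters, it can help to estimate the raw memory a set of baked maps occupies. The sketch below is a back-of-the-envelope calculation, not part of the tool: it assumes three maps per set (color, roughness, metallic, as described above) stored uncompressed at 8-bit RGBA (4 bytes per pixel); the actual formats Eden exports may differ.

.. code-block:: python

   # Rough estimate of uncompressed memory for one set of baked PBR maps.
   # Assumptions (for illustration only): 3 maps per set, 8-bit RGBA storage.
   BYTES_PER_PIXEL = 4   # 8-bit RGBA
   MAPS_PER_SET = 3      # color, roughness, metallic

   def pbr_set_megabytes(size: int) -> float:
       """Uncompressed size in MiB of one PBR map set at a square resolution."""
       return size * size * BYTES_PER_PIXEL * MAPS_PER_SET / (1024 ** 2)

   for size in (1024, 2048, 4096):
       # 1024 -> 12 MiB, 2048 -> 48 MiB, 4096 -> 192 MiB
       print(f"{size}: {pbr_set_megabytes(size):.0f} MiB")

Note that each doubling of resolution quadruples the memory footprint, which is why 4096 is best reserved for close-up hero assets.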