Wan Start + End Frame Workflow

[Workflow diagram]

Description

Transform static images into vivid video animations with the WAN 2.1 I2V Start/End Frame Workflow, which gives you precise control over both the starting and ending frames. This method provides frame-anchored interpolation, making it ideal for storytelling, cinematic transitions, and animated scenes that preserve style, structure, and character integrity.


🎯 Features

  • Start & End Frame Anchoring – Define the beginning and end visuals to guide the motion transformation.
  • WAN 2.1 Integration – Leverages the strength of the WAN FUN model series for high-quality interpolation.
  • Style-Coherent Transitions – Ensures visual and stylistic consistency between frames.
  • Frame Count Control – Adjust the number of interpolated frames for pacing and duration flexibility.
  • Built for ComfyUI – Entirely within ComfyUI with labeled, modular, and reusable nodes.

💡 Use Cases

  • Image-to-Video Animation – Breathe life into static art with fluid motion between keyframes.
  • Cinematic Scene Previews – Build animated cutscenes from concept art.
  • Character Posing – Animate transitions between expressions or actions.
  • AI Storytelling – Chain scenes using end-to-start frame logic to create coherent narrative videos.

⚙️ How It Works

  1. Prepare Input Frames

    • Select your start frame image and end frame image.
    • Use Get_img_start and Get_img_end nodes to feed them into the workflow.
  2. Preprocess & Encode Style

    • Optionally style both images using CLIP and img_restyle.
    • Resize if needed to match WAN input specs (typically 512–768px square).
  3. Load WAN & Support Models

    • Include WAN FUN 1.3B, wan_2.1_vae.safetensors, and the appropriate CLIP and CLIP Vision loaders.
  4. Interpolation via WAN

    • Set the num_frames (e.g., 20–60) to control transition length.
    • Use the WAN model's interpolation pipeline to create smooth in-between frames.
  5. Postprocess with RIFE (optional)

    • Enhance fluidity with RIFE VFI interpolation if needed.
  6. Output Video

    • Compile the generated frames into a video with VHS_VideoCombine.
    • Adjust the FPS, filename prefix, and compression settings to suit your use case.
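The frame count chosen in step 4 and the FPS set in step 6 together determine clip length, and an optional RIFE pass multiplies the frame total. A minimal sketch of that arithmetic (plain Python; the function name and signature are illustrative, not part of the workflow):

```python
def clip_timing(num_frames: int, fps: float, rife_multiplier: int = 1):
    """Return (total_frames, duration_seconds) for a clip.

    num_frames      -- frames generated by the WAN sampler (e.g., 20-60)
    fps             -- frame rate set in VHS_VideoCombine
    rife_multiplier -- frame multiplication from an optional RIFE VFI pass
    """
    total_frames = num_frames * rife_multiplier
    return total_frames, total_frames / fps

# 40 generated frames rendered at 16 fps -> a 2.5 s clip
print(clip_timing(40, 16))

# Doubling frames with RIFE while doubling fps to 32 keeps the same duration,
# just with smoother motion
print(clip_timing(40, 32, rife_multiplier=2))
```

This is why a RIFE pass is usually paired with a matching FPS increase in the output node: more frames at the same FPS would lengthen the clip instead of smoothing it.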

Credits: The-ArtOfficial

Models

| File | Destination |
| --- | --- |
| Wan2_1-I2V-14B-480P_fp8_e4m3fn.safetensors | /ComfyUI/models/checkpoints |
| wrapper_Wan2_1_VAE_bf16.safetensors | /ComfyUI/models/vae/wan |
| wrapper_umt5-xxl-enc-bf16.safetensors | /ComfyUI/models/text_encoders |
| wrapper_open-clip-xlm-roberta-large-vit-huge-14_fp16.safetensors | /ComfyUI/models/clip_vision/wan |
| Wan2_1-I2V-14B-480P_fp8_e4m3fn.safetensors | /ComfyUI/models/diffusion_models |
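A missing or misplaced model file is the most common reason this workflow fails to load. As a quick sanity check, here is a small sketch (assuming a local ComfyUI install; the function and constant names are illustrative) that verifies the files from the table above sit in their expected folders:

```python
from pathlib import Path

# Destinations taken from the models table above, relative to the ComfyUI root.
# The I2V checkpoint is listed once here; the table also places a copy under
# models/checkpoints.
MODEL_PATHS = {
    "Wan2_1-I2V-14B-480P_fp8_e4m3fn.safetensors": "models/diffusion_models",
    "wrapper_Wan2_1_VAE_bf16.safetensors": "models/vae/wan",
    "wrapper_umt5-xxl-enc-bf16.safetensors": "models/text_encoders",
    "wrapper_open-clip-xlm-roberta-large-vit-huge-14_fp16.safetensors":
        "models/clip_vision/wan",
}

def missing_models(comfy_root: str) -> list[str]:
    """Return the model files from MODEL_PATHS not found under comfy_root."""
    root = Path(comfy_root)
    return [name for name, subdir in MODEL_PATHS.items()
            if not (root / subdir / name).is_file()]

# Example: missing_models("/path/to/ComfyUI") -> [] when everything is in place
```

Run it against your ComfyUI directory before opening the workflow; an empty list means all four files are where the loader nodes expect them.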

Nodes

  • WanVideoVAELoader
  • WanVideoTorchCompileSettings
  • LoadWanVideoClipTextEncoder
  • INTConstant
  • WanVideoDecode
  • ImageResizeKJ
  • VHS_VideoCombine
  • WanVideoClipVisionEncode
  • WanVideoSLG
  • WanVideoImageToVideoEncode
  • WanVideoTextEncode
  • WanVideoBlockSwap
  • WanVideoTeaCache
  • WanVideoSampler
  • LoadWanVideoT5TextEncoder
  • LoadImage
  • StringConstantMultiline
  • WanVideoModelLoader
  • Note