- Create a virtual environment (to avoid polluting the system-level Python environment)
- Clone the ComfyUI code repository
- Install dependencies
- Start ComfyUI
Python 3.13 is very well supported. If you have trouble with some custom node dependencies on 3.13, you can try 3.12. Python 3.14 will work if you comment out the kornia dependency in the requirements.txt file (this breaks the Canny node), but it is not recommended.
Create a virtual environment
An independent virtual environment is necessary because ComfyUI's dependencies may conflict with other dependencies on your system; it also avoids polluting the system-level Python environment.
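As a minimal sketch, assuming a suitable Python version is available as python and that you name the environment venv (both are just examples):

```bash
# Create a virtual environment named "venv"
python -m venv venv

# Activate it on Windows (cmd):
venv\Scripts\activate

# Activate it on Linux/macOS:
source venv/bin/activate
```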
Clone the ComfyUI code repository
You need to ensure that Git is installed on your system. First, open the terminal (command line), then clone the code repository; the steps below are the same on Windows, Linux, and macOS.
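For example, cloning over HTTPS from the official repository:

```bash
git clone https://github.com/comfyanonymous/ComfyUI.git
cd ComfyUI
```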
If you have not installed Microsoft Visual C++ Redistributable, please install it here.
Install GPU and ComfyUI dependencies
1. Install GPU dependencies
Nvidia
Nvidia users should install stable PyTorch using the command below.
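A sketch of the stable install; the CUDA wheel index changes over time (cu129 is shown here as an assumption), so check the ComfyUI README for the currently recommended index URL:

```bash
pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu129
```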
Alternatively, you can install the nightly version of PyTorch instead, which might have performance improvements:
Install Nightly version (might be more risky)
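A sketch of the nightly install, again assuming a cu129 wheel index; substitute whatever CUDA version the ComfyUI README currently lists:

```bash
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu129
```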
Troubleshooting
If you get the “Torch not compiled with CUDA enabled” error, uninstall torch with the command below, then install it again with the command above.
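For example:

```bash
pip uninstall torch
```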
AMD (Linux)
AMD users can install ROCm and PyTorch with pip if you don't have them already installed. This is the command to install the stable version:
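A sketch assuming the rocm6.4 wheel index; the ROCm version in the official command changes over time, so check the ComfyUI README for the current one:

```bash
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm6.4
```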
Alternatively, you can install the nightly version of PyTorch. This is the command to install the nightly with ROCm 7.0, which might have some performance improvements:
Install Nightly version (might be more risky)
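A sketch of the ROCm 7.0 nightly install, assuming PyTorch publishes a nightly/rocm7.0 wheel index:

```bash
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm7.0
```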
AMD Experimental (Windows and Linux)
Experimental support for Windows and Linux, RDNA 3, 3.5, and 4 only. These builds have less hardware support than the builds above, but they work on Windows. You also need to install the PyTorch version specific to your hardware:
- RDNA 3 (RX 7000 series)
- RDNA 3.5 (Strix Halo/Ryzen AI Max+ 365)
- RDNA 4 (RX 9000 series)
Intel
Intel Arc GPU users can install native PyTorch with torch.xpu support using pip. More information can be found here.
Option 1: Install PyTorch xpu
To install PyTorch xpu, use the following command:
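A sketch, assuming the xpu wheel index published by PyTorch:

```bash
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/xpu
```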
Option 2: Intel Extension for PyTorch (IPEX)
Alternatively, Intel GPUs supported by Intel Extension for PyTorch (IPEX) can leverage IPEX for improved performance. Visit the IPEX Installation page for more information.
Install Nightly version (might be more risky)
This is the command to install the PyTorch xpu nightly, which might have some performance improvements:
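A sketch of the xpu nightly install, assuming the nightly/xpu wheel index:

```bash
pip install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/xpu
```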
Mac ARM Silicon
You can install ComfyUI on Apple silicon Macs (M1 or M2) with any recent macOS version. Install the PyTorch nightly build; for instructions, read the Accelerated PyTorch training on Mac Apple Developer guide (make sure to install the latest PyTorch nightly).
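A sketch of a PyTorch nightly install on Apple silicon (MPS support ships in the default wheels); the Apple Developer guide above is the authoritative source:

```bash
pip install --pre torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/nightly/cpu
```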
Remember to add your models, VAE, LoRAs etc. to the corresponding Comfy folders (models/checkpoints, models/vae, etc.).
Ascend NPUs
For models compatible with Ascend Extension for PyTorch (torch_npu). To get started, ensure your environment meets the prerequisites outlined on the installation page. Step-by-step guide:
- Begin by installing the recommended or newer kernel version for Linux as specified in the Installation page of torch-npu, if necessary.
- Proceed with the installation of Ascend Basekit, which includes the driver, firmware, and CANN, following the instructions provided for your specific platform.
- Next, install the necessary packages for torch-npu by adhering to the platform-specific instructions on the Installation page.
- Finally, follow the ComfyUI manual installation guide for Linux. Once all components are installed, you can run ComfyUI as described earlier.
Cambricon MLUs
For models compatible with Cambricon Extension for PyTorch (torch_mlu). Step-by-step guide:
- Install the Cambricon CNToolkit by adhering to the platform-specific instructions on the Installation page
- Next, install PyTorch (torch_mlu) following the instructions on the Installation page
- Launch ComfyUI by running
python main.py
Iluvatar Corex
For models compatible with Iluvatar Extension for PyTorch. Step-by-step guide:
- Install the Iluvatar Corex Toolkit by adhering to the platform-specific instructions on the Installation page
- Launch ComfyUI by running
python main.py
2. Install ComfyUI dependencies
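With the virtual environment activated, install ComfyUI's Python dependencies from its requirements file, for example:

```bash
cd ComfyUI
pip install -r requirements.txt
```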
3. Start ComfyUI
Start the application:
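From the ComfyUI directory, with the virtual environment activated:

```bash
python main.py
```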
How to update ComfyUI
1. Pull the latest code
Use the command line to enter the installation path of ComfyUI, then pull the latest code.
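For example (the installation path is whatever directory you cloned ComfyUI into):

```bash
cd <path-to-ComfyUI>
git pull
```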
2. Install the dependencies
Use the command line to enter the installation path of ComfyUI, then install the dependencies.
You need to ensure that the current Python environment is the ComfyUI virtual environment; otherwise the dependencies will be installed into the system-level Python environment, polluting it.
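A sketch, assuming the virtual environment was created as venv inside the ComfyUI directory (adjust the activation path to wherever you created yours):

```bash
# Activate the ComfyUI virtual environment first
# Windows (cmd):
venv\Scripts\activate
# Linux/macOS:
source venv/bin/activate

# Then install the updated dependencies
pip install -r requirements.txt
```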
Adding Extra Model Paths
If you want to manage your model files outside of ComfyUI/models, you may have the following reasons:
- You have multiple ComfyUI instances and want them to share model files to save disk space
- You have different types of GUI programs (such as WebUI) and want them to use the same model files
- Model files cannot be recognized or found
extra_model_paths.yaml configuration file
- Portable/Manual Install: you can find an example file named extra_model_paths.yaml.example in the root directory of ComfyUI. Copy and rename it to extra_model_paths.yaml for use, and keep it in ComfyUI's root directory at ComfyUI/extra_model_paths.yaml.
- ComfyUI Desktop: a path configuration file is already generated during installation; open that config file and add your entries to it (see the note below).
You can also find the config example file here.
Example Structure
Suppose you want to add model paths stored elsewhere on your device. You can edit the extra_model_paths.yaml file to let ComfyUI recognize those model paths (see the example after the note below).
For the desktop version, please add the configuration to the existing configuration file rather than overwriting the path configuration generated during installation. Please back up the corresponding file before modifying it, so that you can restore it if you make a mistake.
For example, if your Stable Diffusion WebUI is installed at D:\stable-diffusion-webui\, you can modify the corresponding configuration to the following:
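A sketch based on the bundled extra_model_paths.yaml.example; the a111 section name and the sub-folder names below mirror that example file, so adjust them to your actual WebUI layout:

```yaml
a111:
    base_path: D:\stable-diffusion-webui\

    checkpoints: models/Stable-diffusion
    vae: models/VAE
    loras: |
        models/Lora
        models/LyCORIS
    upscale_models: |
        models/ESRGAN
        models/RealESRGAN
        models/SwinIR
    embeddings: embeddings
    controlnet: models/ControlNet
```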
Add Extra Custom Nodes Path
Besides adding external models, you can also add custom node paths that are not in ComfyUI's default path. Please note that this will not change the default installation path of custom nodes; it only adds an extra search path when starting ComfyUI. You still need to install the custom nodes' dependencies in the corresponding environment to keep the running environment complete.
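A minimal, hypothetical sketch of such an entry in extra_model_paths.yaml; the section name my_workspace and the path are placeholders, and the custom_nodes key follows the pattern used in the bundled example file:

```yaml
my_workspace:
    custom_nodes: D:/my-comfy-stuff/custom_nodes
```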