
KI the Game - Local ComfyUI Generation

This topic in the "Hobbys & Interessen" forum was started by dervali, 11 February 2025.

  1. Okay, I've now downloaded the "full Dev" ten o_O times, and every time I get the same error message: :cry:
    Code:
    # ComfyUI Error Report
    ## Error Details
    - **Node ID:** 4
    - **Node Type:** CheckpointLoaderSimple
    - **Exception Type:** RuntimeError
    - **Exception Message:** ERROR: Could not detect model type of: C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\models\checkpoints\5_flux_dev.safetensors
    ## Stack Trace
    ```
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 327, in execute
        output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 202, in get_output_data
        return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 174, in _map_node_over_list
        process_inputs(input_dict, i)
    
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 163, in process_inputs
        results.append(getattr(obj, func)(**inputs))
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\nodes.py", line 570, in load_checkpoint
        out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 860, in load_checkpoint_guess_config
        raise RuntimeError("ERROR: Could not detect model type of: {}".format(ckpt_path))
    
    ```
    ## System Information
    - **ComfyUI Version:** 0.3.14
    - **Arguments:** ComfyUI\main.py --windows-standalone-build
    - **OS:** nt
    - **Python Version:** 3.12.8 (tags/v3.12.8:2dc476b, Dec  3 2024, 19:30:04) [MSC v.1942 64 bit (AMD64)]
    - **Embedded Python:** true
    - **PyTorch Version:** 2.6.0+cu126
    ## Devices
    
    - **Name:** cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
      - **Type:** cuda
      - **VRAM Total:** 25756696576
      - **VRAM Free:** 24110956544
      - **Torch VRAM Total:** 0
      - **Torch VRAM Free:** 0
    
    ## Logs
    ```
    2025-02-17T18:28:43.182850 - [START] Security scan2025-02-17T18:28:43.182850 -
    2025-02-17T18:28:43.643537 - [DONE] Security scan2025-02-17T18:28:43.643537 -
    2025-02-17T18:28:43.702545 - ## ComfyUI-Manager: installing dependencies done.2025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.702545 - ** ComfyUI startup time:2025-02-17T18:28:43.702545 -  2025-02-17T18:28:43.702545 - 2025-02-17 18:28:43.7022025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.702545 - ** Platform:2025-02-17T18:28:43.702545 -  2025-02-17T18:28:43.702545 - Windows2025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.702545 - ** Python version:2025-02-17T18:28:43.702545 -  2025-02-17T18:28:43.702545 - 3.12.8 (tags/v3.12.8:2dc476b, Dec  3 2024, 19:30:04) [MSC v.1942 64 bit (AMD64)]2025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.702545 - ** Python executable:2025-02-17T18:28:43.702545 -  2025-02-17T18:28:43.702545 - C:\Users\Admin\Desktop\ComfyUI_windows_portable\python_embeded\python.exe2025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.702545 - ** ComfyUI Path:2025-02-17T18:28:43.702545 -  2025-02-17T18:28:43.702545 - C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI2025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.702545 - ** ComfyUI Base Folder Path:2025-02-17T18:28:43.702545 -  2025-02-17T18:28:43.702545 - C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI2025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.702545 - ** User directory:2025-02-17T18:28:43.702545 -  2025-02-17T18:28:43.702545 - C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\user2025-02-17T18:28:43.702545 -
    2025-02-17T18:28:43.703541 - ** ComfyUI-Manager config path:2025-02-17T18:28:43.703541 -  2025-02-17T18:28:43.703541 - C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\user\default\ComfyUI-Manager\config.ini2025-02-17T18:28:43.703541 -
    2025-02-17T18:28:43.703541 - ** Log path:2025-02-17T18:28:43.703541 -  2025-02-17T18:28:43.703541 - C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\user\comfyui.log2025-02-17T18:28:43.703541 -
    2025-02-17T18:28:44.188606 -
    Prestartup times for custom nodes:
    2025-02-17T18:28:44.189603 -    1.4 seconds: C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager
    2025-02-17T18:28:44.189603 -
    2025-02-17T18:28:45.164972 - Checkpoint files will always be loaded safely.
    2025-02-17T18:28:45.240482 - Total VRAM 24564 MB, total RAM 32610 MB
    2025-02-17T18:28:45.240482 - pytorch version: 2.6.0+cu126
    2025-02-17T18:28:45.240482 - Set vram state to: NORMAL_VRAM
    2025-02-17T18:28:45.240482 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
    2025-02-17T18:28:45.886339 - Using pytorch attention
    2025-02-17T18:28:46.566311 - ComfyUI version: 0.3.14
    2025-02-17T18:28:46.580271 - [Prompt Server] web root: C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\web
    2025-02-17T18:28:46.807277 - ### Loading: ComfyUI-Manager (V3.22.1)
    2025-02-17T18:28:46.807277 - [ComfyUI-Manager] network_mode: public
    2025-02-17T18:28:46.885390 - ### ComfyUI Version: v0.3.14-27-g93c8607d | Released on '2025-02-15'
    2025-02-17T18:28:47.023610 -
    Import times for custom nodes:
    2025-02-17T18:28:47.024607 -    0.0 seconds: C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\custom_nodes\websocket_image_save.py
    2025-02-17T18:28:47.024607 -    0.2 seconds: C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager
    2025-02-17T18:28:47.024607 -
    2025-02-17T18:28:47.029348 - Starting server
    
    2025-02-17T18:28:47.029348 - To see the GUI go to: http://127.0.0.1:8188
    2025-02-17T18:28:47.097819 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
    2025-02-17T18:28:47.099815 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
    2025-02-17T18:28:47.127608 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
    2025-02-17T18:28:47.146270 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
    2025-02-17T18:28:47.163275 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
    2025-02-17T18:28:50.582157 - FETCH ComfyRegistry Data: 5/342025-02-17T18:28:50.583153 -
    2025-02-17T18:28:54.321366 - FETCH ComfyRegistry Data: 10/342025-02-17T18:28:54.321366 -
    2025-02-17T18:28:58.059407 - FETCH ComfyRegistry Data: 15/342025-02-17T18:28:58.059407 -
    2025-02-17T18:29:01.835115 - FETCH ComfyRegistry Data: 20/342025-02-17T18:29:01.835115 -
    2025-02-17T18:29:06.110947 - FETCH ComfyRegistry Data: 25/342025-02-17T18:29:06.110947 -
    2025-02-17T18:29:09.934324 - FETCH ComfyRegistry Data: 30/342025-02-17T18:29:09.935317 -
    2025-02-17T18:29:13.431040 - FETCH ComfyRegistry Data [DONE]2025-02-17T18:29:13.431040 -
    2025-02-17T18:29:13.462332 - [ComfyUI-Manager] default cache updated: https://api.comfy.org/nodes
    2025-02-17T18:29:13.486283 - nightly_channel: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/remote
    2025-02-17T18:29:13.486283 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json2025-02-17T18:29:13.486283 - 2025-02-17T18:29:13.593414 -  [DONE]2025-02-17T18:29:13.593414 -
    2025-02-17T18:29:13.609391 - [ComfyUI-Manager] All startup tasks have been completed.
    2025-02-17T18:31:05.415971 - got prompt
    2025-02-17T18:31:05.429546 - !!! Exception during processing !!! ERROR: Could not detect model type of: C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\models\checkpoints\5_flux_dev.safetensors
    2025-02-17T18:31:05.430537 - Traceback (most recent call last):
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 327, in execute
        output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 202, in get_output_data
        return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 174, in _map_node_over_list
        process_inputs(input_dict, i)
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\execution.py", line 163, in process_inputs
        results.append(getattr(obj, func)(**inputs))
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\nodes.py", line 570, in load_checkpoint
        out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 860, in load_checkpoint_guess_config
        raise RuntimeError("ERROR: Could not detect model type of: {}".format(ckpt_path))
    RuntimeError: ERROR: Could not detect model type of: C:\Users\Admin\Desktop\ComfyUI_windows_portable\ComfyUI\models\checkpoints\5_flux_dev.safetensors
    
    2025-02-17T18:31:05.431534 - Prompt executed in 0.01 seconds
    
    ```
    ## Attached Workflow
    Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
    ```
    {"last_node_id":17,"last_link_id":17,"nodes":[{"id":8,"type":"VAEDecode","pos":[630.4522705078125,163.40560913085938],"size":[210,46],"flags":{},"order":9,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":7},{"name":"vae","type":"VAE","link":8}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[9],"slot_index":0}],"properties":{"Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":5,"type":"EmptyLatentImage","pos":[628.5394287109375,8.803707122802734],"size":[315,106],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"LATENT","type":"LATENT","links":[2],"slot_index":0}],"properties":{"Node name for S&R":"EmptyLatentImage"},"widgets_values":[1024,1024,2]},{"id":16,"type":"Note","pos":[66.69378662109375,-21.272319793701172],"size":[210,271.9642333984375],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["nebula Galaxy, \nPlanets"],"color":"#432","bgcolor":"#653"},{"id":15,"type":"Note","pos":[-162.53887939453125,-300.0460510253906],"size":[210,561.7442626953125],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["\n\nsafe_pos, score_9, score_8_up, score_7_up, photorealistic, masterpiece, realistic, best quality, high quality, Hyper-realistic image"],"color":"#432","bgcolor":"#653"},{"id":3,"type":"KSampler","pos":[624.0723876953125,-299.4880676269531],"size":[315,262],"flags":{},"order":8,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":16},{"name":"positive","type":"CONDITIONING","link":4},{"name":"negative","type":"CONDITIONING","link":6},{"name":"latent_image","type":"LATENT","link":2}],"outputs":[{"name":"LATENT","type":"LATENT","links":[7],"slot_index":0}],"properties":{"Node name for 
S&R":"KSampler"},"widgets_values":[713284188028410,"randomize",20,8,"euler","normal",1]},{"id":13,"type":"LoraLoaderModelOnly","pos":[294.4691467285156,117.194580078125],"size":[315,82],"flags":{},"order":5,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":15}],"outputs":[{"name":"MODEL","type":"MODEL","links":[16],"slot_index":0}],"properties":{"Node name for S&R":"LoraLoaderModelOnly"},"widgets_values":["80sFantasyMovie - ArsMovieStill, 80s Fantasy Movie Still.safetensors",1]},{"id":7,"type":"CLIPTextEncode","pos":[333.19195556640625,-294.14166259765625],"size":[264.5650634765625,215.56060791015625],"flags":{},"order":6,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":5}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[6],"slot_index":0}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["text, watermark, "]},{"id":9,"type":"SaveImage","pos":[-155.470458984375,308.8005065917969],"size":[1849.377685546875,976.374755859375],"flags":{"collapsed":false},"order":10,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":9}],"outputs":[],"properties":{},"widgets_values":["ComfyUI"]},{"id":4,"type":"CheckpointLoaderSimple","pos":[294.3014221191406,-26.056175231933594],"size":[315,98],"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[15],"slot_index":0},{"name":"CLIP","type":"CLIP","links":[5,10],"slot_index":1},{"name":"VAE","type":"VAE","links":[8],"slot_index":2}],"properties":{"Node name for S&R":"CheckpointLoaderSimple"},"widgets_values":["5_flux_dev.safetensors"]},{"id":17,"type":"Note","pos":[958.62255859375,-291.99896240234375],"size":[210,553.1841430664062],"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["safe_pos, score_9, score_8_up, score_7_up, photorealistic, masterpiece, realistic, best quality, high quality, Hyper-realistic 
image"],"color":"#432","bgcolor":"#653"},{"id":6,"type":"CLIPTextEncode","pos":[62.43390655517578,-294.2704772949219],"size":[251.97311401367188,220.5193634033203],"flags":{},"order":7,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":10}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[4],"slot_index":0}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["Nebula Galaxy, many Colors"]}],"links":[[2,5,0,3,3,"LATENT"],[4,6,0,3,1,"CONDITIONING"],[5,4,1,7,0,"CLIP"],[6,7,0,3,2,"CONDITIONING"],[7,3,0,8,0,"LATENT"],[8,4,2,8,1,"VAE"],[9,8,0,9,0,"IMAGE"],[10,4,1,6,0,"CLIP"],[15,4,0,13,0,"MODEL"],[16,13,0,3,0,"MODEL"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.9529909115129217,"offset":[184.09577022855623,344.4653462749803]},"node_versions":{"comfy-core":"0.3.14"}},"version":0.4}
    ```
    
    ## Additional Context
    (Please add any additional context or steps to reproduce the error here)
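    A side note on the error itself: `CheckpointLoaderSimple` guesses the model type from the tensor names in the file, and a Flux-dev download that contains only the diffusion transformer (no CLIP text encoders, no VAE) cannot be identified that way; such files are normally placed under `models/unet` and loaded with the diffusion-model/UNET loader plus separate CLIP and VAE loaders. A small stdlib-only sketch (the helper name and the file path are illustrative, not ComfyUI code) for checking which key prefixes a `.safetensors` file actually contains:

    ```python
    # Illustrative helper: list the tensor names stored in a .safetensors file
    # by reading its JSON header. The format is: 8 bytes little-endian header
    # length, followed by that many bytes of JSON.
    import json
    import struct

    def safetensors_keys(path):
        """Return the tensor names from a .safetensors header."""
        with open(path, "rb") as f:
            (header_len,) = struct.unpack("<Q", f.read(8))
            header = json.loads(f.read(header_len))
        return [k for k in header if k != "__metadata__"]

    # Example (path is hypothetical):
    # for prefix in sorted({k.split(".")[0] for k in safetensors_keys("5_flux_dev.safetensors")}):
    #     print(prefix)
    ```

    If the prefixes are only transformer blocks (e.g. `double_blocks`, `single_blocks`) with no VAE or text-encoder keys, the file is a diffusion-model-only export, which would explain why `CheckpointLoaderSimple` cannot detect a model type.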
     
  2. If need be, just quickly upload it to YT as unlisted, only for the forum.
     
  3. BTW: those of you with your fancy 4090 cards could also try the full model at 22 GB. I can't load that one with my low-budget GPU.
     
  4. Is there any notable difference?

    PS: Can I also load multiple LoRAs? If so, do I just chain several "LoraLoaderModelOnly" nodes one after another?
     
  5. Yes, but as I said, it then gets very tricky with the weightings. I never really used more than 3, even though I could have fit more into memory.
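    Chaining works because each LoRA loader simply adds its own low-rank update to the model weights, scaled by its strength: W' = W + s1·(B1·A1) + s2·(B2·A2) + … A rank-1 toy sketch in plain Python (all names and numbers are illustrative) of why the strengths interact and get hard to balance:

    ```python
    # Toy model of stacking LoRA patches on a weight matrix, using plain lists.
    def outer(b, a):
        """Outer product b * a^T -> the rank-1 LoRA delta matrix."""
        return [[bi * aj for aj in a] for bi in b]

    def add_scaled(w, delta, strength):
        """Elementwise W + strength * delta."""
        return [[wij + strength * dij for wij, dij in zip(wr, dr)]
                for wr, dr in zip(w, delta)]

    def apply_loras(w, loras):
        """Apply a list of (B, A, strength) LoRA patches in sequence."""
        for b, a, strength in loras:
            w = add_scaled(w, outer(b, a), strength)
        return w

    base = [[1.0, 0.0], [0.0, 1.0]]
    patched = apply_loras(base, [
        ([1.0, 0.0], [0.0, 1.0], 0.5),   # first LoRA at strength 0.5
        ([0.0, 1.0], [1.0, 0.0], 0.25),  # second LoRA at strength 0.25
    ])
    print(patched)  # [[1.0, 0.5], [0.25, 1.0]]
    ```

    Since the patches are additive, the order of the nodes doesn't matter, but every strength shifts the same weights, which is why balancing more than a handful of LoRAs becomes fiddly.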
     
  6. I don't think you'll see a big difference. But give it a try.
     
  7. With the 22 GB version I get the error message again... :-/

    BTW: your workflow has no node with negative prompts at all. I've had good experiences with them (in my short career). Why do you do without them?
     
  8. Because the workflow doesn't offer that, i.e. there is no node in it that can process them. Right at the start I tried to read up on the topic, but I let it go and still got good results, so I didn't pursue it any further. Under A1111 I do still use them, but only when post-processing with Image2Image and Inpaint.
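    For context on what a negative prompt actually does: during sampling, classifier-free guidance steers the prediction toward the positive conditioning and away from the negative one (an empty negative prompt just means the unconditional prediction is used). A minimal sketch with illustrative scalar values in place of the real noise-prediction tensors:

    ```python
    # Classifier-free guidance combine step (scalars stand in for tensors).
    def cfg_combine(pred_pos, pred_neg, guidance_scale):
        """pred_neg + scale * (pred_pos - pred_neg); the negative-prompt
        prediction replaces the plain unconditional prediction."""
        return pred_neg + guidance_scale * (pred_pos - pred_neg)

    # The larger the scale, the harder the result is pushed away from
    # whatever the negative prompt describes.
    print(cfg_combine(1.0, 0.2, 8.0))
    ```

    This is why a workflow needs a dedicated conditioning input for the negative text: without a node feeding it, there is simply nothing for the sampler to steer away from.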
     
    #190 ToFu0815, 17 February 2025 at 21:38
    Last edited: 18 February 2025 at 04:21