My air-cooled DIY Dual #RTX3090 #NVLink #AI workstation. You can never have enough #VRAM with these exciting LLMs! #LLM #LLaMA #Falcon #finetuning #lora #gptq #inference #cuda #pytorch
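For anyone building something similar, here is a minimal sanity-check sketch (assuming a recent PyTorch build with CUDA support) to confirm that both 3090s show up and that peer-to-peer access over the NVLink bridge is enabled:

```python
# Quick check that both RTX 3090s and the NVLink peer link are visible to PyTorch.
import torch

assert torch.cuda.is_available(), "CUDA is not available"
num_gpus = torch.cuda.device_count()
print(f"Visible GPUs: {num_gpus}")

# Report name and total VRAM for each device.
for i in range(num_gpus):
    props = torch.cuda.get_device_properties(i)
    print(f"  cuda:{i} {props.name}, {props.total_memory / 2**30:.1f} GiB VRAM")

# With the NVLink bridge installed, peer-to-peer access between the two
# GPUs should be reported as True in both directions.
if num_gpus >= 2:
    print("P2P 0 -> 1:", torch.cuda.can_device_access_peer(0, 1))
    print("P2P 1 -> 0:", torch.cuda.can_device_access_peer(1, 0))
```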