Could a citizen-owned network in the style of SETI@home compete with GPT-4?
How many clients would we need before we match OpenAI's capabilities?
"Lambdalabs estimated a hypothetical cost of around $4.6 million US dollars and 355 years to train GPT-3 on a single GPU in 2020, with lower actual training time by using more GPUs in parallel."
#GPT #seti@home #citizencomputing #communitycomputing #setthetoolsfree #ai #LLM #openai #distributedcomputing