submitted by Niggly_Puff to AI on Apr 20, 2024 11:10:12 (+4/-0) (AI)
It seems on par with or better than the free version of ChatGPT. It's smart and fast, and it's remarkable to have this capability running locally on your own machine. Get the GGUF conversions here: https://huggingface.co/MaziyarPanahi/Meta-Llama-3-8B-Instruct-GGUF . If the internet ever suffers a catastrophic failure, you'll be happy to have it.
With a GGUF file, most machines should be able to run it. The main requirement is RAM and/or VRAM. If you have at least 10 GB of RAM or VRAM, download the 8-bit quantized version from the page I linked. If you don't have enough RAM, look into the 4-bit version instead.
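If you want to sanity-check those RAM numbers yourself, here's a back-of-the-envelope sketch. It counts weights only (the KV cache and runtime overhead add more, which is why ~10 GB of headroom is suggested for the 8-bit file rather than exactly 8 GB):

```python
# Rough weights-only memory estimate for an 8B-parameter model
# at different GGUF quantization levels. Ignores KV cache and
# runtime overhead, so treat these as lower bounds.

PARAMS = 8e9  # Llama 3 8B parameter count

def weight_gb(bits_per_weight: float) -> float:
    """Approximate size of the weights alone, in gigabytes."""
    return PARAMS * bits_per_weight / 8 / 1e9

print(f"8-bit (Q8_0): ~{weight_gb(8):.1f} GB")  # ~8 GB, hence the 10 GB recommendation
print(f"4-bit (Q4):   ~{weight_gb(4):.1f} GB")  # ~4 GB, fits much smaller machines
```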
[ - ] Cantaloupe 0 points Apr 20, 2024 16:01:13 (+0/-0)
[ - ] Crackinjokes 0 points Apr 20, 2024 14:43:30 (+0/-0)
[ - ] deleted 0 points Apr 20, 2024 18:57:37 (+0/-0)
[ - ] Niggly_Puff [op] 0 points Apr 20, 2024 18:57:45 (+0/-0)*
Then install the oobabooga text generation web UI from https://github.com/oobabooga/text-generation-webui
This will allow you to load and use the LLM.
It's pretty easy to get going.
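If you'd rather skip the web UI, a minimal sketch using the llama-cpp-python library can load the same GGUF file directly (the model filename here is a hypothetical local path; adjust it to wherever you saved your download):

```python
# Minimal sketch: load a local GGUF file with llama-cpp-python
# (pip install llama-cpp-python). MODEL_PATH is a placeholder for
# wherever you saved the quantized file from the Hugging Face page.
from llama_cpp import Llama

MODEL_PATH = "Meta-Llama-3-8B-Instruct.Q8_0.gguf"  # hypothetical local path

llm = Llama(model_path=MODEL_PATH, n_ctx=4096)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```

This runs entirely offline once the model file is on disk, which is the whole point of the setup described above.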
[ - ] AugustineOfHippo2 0 points Apr 20, 2024 13:30:48 (+0/-0)