Four options so far

So this is an update on my experimentation with LLMs.

There are two layers here: applications that let you access large language models (LLMs), and the models themselves. Each model is good at some things and bad at others. For example, ChatGPT is said to be bad at maths, whereas LLaMA is said to be better. Your mileage may vary.

So far, I've experimented with the following apps:

- PrivateGPT (with a range of models)

- LocalGPT (with a range of models)

- ChatGPT (with GPT-3.5)

- GPT4All (with a range of models)

The following table summarises the pro-con analysis:

| App | Pros | Cons |
| --- | --- | --- |
| ChatGPT (GPT-3.5) | Good for general information queries | Can't summarise your own local document repository |
| GPT4All | Easiest to install and use; choice of about 8 models | Tends to hallucinate; no GPU support |
| LocalGPT | Can answer questions from a local document repository | Couldn't get the GPU working; slower than GPT4All |
| PrivateGPT | Can answer questions from a local document repository; GPU worked for me | Entire model must fit in VRAM, so a small card forces a smaller model |

Depending on your purpose, I'd suggest ChatGPT if your goal is simply to get general information. If, however, you need summaries of your own documents (a local repository), look at the others.

GPT4All is the easiest to install and use: you install the app, it asks which LLM you want to download to answer queries, and it supports about eight models. However, I noticed that it really tended to hallucinate and give inaccurate answers. Of the models on offer, Falcon was generally the fastest, but it ignored the local repository and simply made things up. LLaMA was able to read the local repository but tended to censor anything controversial. There is also a "Hermes" variant of LLaMA, but it is actually slower than the standard LLaMA 2 7B, so don't use it if you need speed. GPT4All also didn't seem to use the GPU; I looked into it and it seems it can't. Even so, its performance was much better than LocalGPT's, and LocalGPT is known to not use the GPU either.
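Incidentally, GPT4All also ships Python bindings, so you don't have to go through the desktop app at all. Here is a minimal sketch; the model filename is only an example (use whichever model you actually downloaded into GPT4All's models folder):

```python
# Minimal sketch: querying GPT4All via its Python bindings instead of the desktop app.
# The model filename below is only an example; substitute the model you downloaded.
from gpt4all import GPT4All

model = GPT4All("ggml-model-gpt4all-falcon-q4_0.bin")  # example filename, adjust as needed
with model.chat_session():
    reply = model.generate("Summarise the main trade-offs between local LLM apps.", max_tokens=200)
    print(reply)
```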

LocalGPT and PrivateGPT are very similar. However, of the two, I was only able to get one to use the GPU, and even then it had a limitation: it (stupidly) tried to load everything into VRAM in order to do so. In other words, the entire model had to fit in VRAM. Since my VRAM is quite small, I had to use a smaller model (GPT Mini).
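If you want a rough idea in advance of whether a given quantised model file will squeeze into your card's VRAM, a quick back-of-envelope check like the one below is enough. This is just a sketch: the model path is a placeholder, and it ignores the extra memory the context/KV cache needs, so treat a "fits" answer as optimistic.

```python
# Rough check: compare the size of a downloaded model file against total GPU VRAM.
# The path is a placeholder; overhead for context/KV cache is ignored.
import os
import torch

model_path = "models/your-model-q4_0.gguf"  # placeholder: point at your model file
model_gb = os.path.getsize(model_path) / 1e9

if torch.cuda.is_available():
    vram_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
    print(f"Model file: {model_gb:.1f} GB, VRAM: {vram_gb:.1f} GB")
    print("Likely fits" if model_gb < 0.9 * vram_gb else "Probably too big for VRAM")
else:
    print("No CUDA GPU detected; the model will run on CPU anyway.")
```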

The more difficult options, PrivateGPT and LocalGPT, will be discussed in a subsequent post, but I suggest you go straight to PrivateGPT, which seems to be a fork/clone of LocalGPT. I could not get LocalGPT to work with the GPU.

