Installing a local LLM - Post 2 of 3
This post covers installing GPT4All from Nomic AI. The installation went reasonably smoothly; I found the desktop app easier to install than the Python code.
1. Install Ubuntu Linux 22.04 LTS (Jammy Jellyfish).
a. Go to https://ubuntu.com/download
b. Download the .iso file (an optional checksum check follows this list).
c. Write it to a flash disk using Etcher (https://etcher.io)
d. Insert the flash disk, reboot, and press F2 (or your machine's boot-menu key) to enter the BIOS; tell your PC to boot off the flash disk and follow the Ubuntu installer from there.
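Before writing the image, it is worth verifying the download. A minimal check, assuming the .iso and the SHA256SUMS file from the same download page were saved to ~/Downloads (adjust the filename to match the image you grabbed):
cd ~/Downloads
sha256sum -c SHA256SUMS 2>/dev/null | grep ubuntu-22.04   # should end in ": OK"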
Step 2. Clone the GPT4All repositories from GitHub:
mkdir ~/git
cd ~/git
git clone https://github.com/ParisNeo/Gpt4All-webui
git clone https://github.com/nomic-ai/gpt4all
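A quick check that both clones landed where the later steps expect them (paths assume the ~/git layout above):
ls ~/git   # should list Gpt4All-webui and gpt4all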
Step 3. Install the pygpt4all Python bindings and the XCB cursor library:
python3.10 -m pip install 'pygpt4all==v1.0.1' --force-reinstall
sudo apt install libxcb-cursor0
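To confirm both pieces are in place (the exact version string pip reports may differ):
python3.10 -m pip show pygpt4all        # should report Version: 1.0.1
dpkg -s libxcb-cursor0 | grep Status    # should report "install ok installed"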
Step 4. Install the Python dev tools, build tools and the requirements for both repositories:
sudo apt-get install python3-dev
python3.10 -m pip install --upgrade pip
sudo apt install build-essential
cd ~/git/Gpt4All-webui
python3.10 -m pip install -r requirements.txt
cd ~/git/gpt4all   # the nomic-ai clone, if you are not already in that folder
python3.10 -m pip install hnswlib
python3.10 -m pip install chromadb
then run the Chroma migration tool and install the remaining requirements:
python3.10 -m pip install chroma-migrate
chroma-migrate
python3.10 -m pip install -r requirements.txt
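As a quick sanity check that the Python dependencies import cleanly (plain import test only, no model loading):
python3.10 -c "import hnswlib, chromadb; print('hnswlib and chromadb OK')"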
Step 5: Download the official installer:
$ wget https://gpt4all.io/installers/gpt4all-installer-linux.run
Step 6: Make it executable:
$ chmod u+x gpt4all-installer-linux.run
Step 7: Run the installer. It needs libxcb-cursor0 from Step 3, so install that first if you skipped it:
$ sudo apt install libxcb-cursor0   # only if you skipped it in Step 3
$ ./gpt4all-installer-linux.run
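Once the installer finishes, the app can be started from the desktop entry it creates or directly from the install directory. For example, assuming you accepted the default install location of ~/gpt4all (adjust the path if you chose a different one):
$ ~/gpt4all/bin/chat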
However, you still have to open the app to download the LLM models, which are large files containing the weights of a model trained on a large body of data. They are at least 2 GB each, so do this on a fast (ideally fibre) connection. Different models have different levels of capability and knowledge, and you have to experiment to see which works best for you. So far I think OpenAI's models are the best, but you can't download them; you have to make do with LLaMA from Meta (Facebook), Falcon, or something else.
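The model downloads add up quickly, so keep an eye on disk space. A small check, assuming the app's default model folder on Linux (~/.local/share/nomic.ai/GPT4All/; the actual path is shown, and can be changed, in the app's settings):
$ df -h ~                                    # free space on your home partition
$ du -sh ~/.local/share/nomic.ai/GPT4All/    # total size of the downloaded models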