Some rough notes on FreedomGPT and PrivateGPT

The following are rough notes. I'll tidy them up when I get the steps 100% right.

Use case: WordPress viruses; recognising one.

API keys (redacted here; live keys should never be published):
- OpenAI: sk-[redacted]
- Hugging Face token: hf_[redacted]

PrivateGPT fix:
sudo apt-get install python3-dev
python3.10 -m pip install --upgrade pip
sudo apt install build-essential
python3.10 -m pip install hnswlib
python3.10 -m pip install chromadb
then
python3.10 -m pip install chroma-migrate
chroma-migrate
python3.10 -m pip install -r requirements.txt
pip install pycc

How to download the models: visit the Code Llama repository on GitHub and follow the instructions in the README to run the download script.

Manual and automatic parameters in AI training and output

Language models, such as GPT-3, have billions or even trillions of parameters that allow them to understand and generate human-like text. These parameters are the numerical weights the model uses to learn and represent language patterns during training. While some parameters, like the temperature of a generative model, are set explicitly to control behaviour, the vast majority are learned automatically through training; humans do not set them manually. You can read more about the other manually-configured parameters here. Adapted from the blog above, some of the common manually-set LLM parameters are: temperature, number of tokens, top-p, presence penalty, and frequency penalty. Temperature: a hyperparameter used in generative language models. It controls the randomness of the output: lower values make responses more deterministic, higher values make them more varied.
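As a rough sketch of what temperature does under the hood (this is an illustrative implementation, not the code any particular model actually runs): the model's raw scores (logits) are divided by the temperature before being converted to probabilities, so higher temperatures flatten the distribution and lower temperatures sharpen it.

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Convert raw logits to probabilities, scaled by temperature.

    Lower temperature -> sharper (more deterministic) distribution;
    higher temperature -> flatter (more random) distribution.
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy logits for three candidate next tokens
logits = [2.0, 1.0, 0.1]
cold = softmax_with_temperature(logits, temperature=0.2)
hot = softmax_with_temperature(logits, temperature=2.0)
# At low temperature the top-scoring token dominates;
# at high temperature probability mass is spread more evenly.
```

The same scaled distribution is what settings like top-p then truncate: sampling is restricted to the smallest set of tokens whose cumulative probability exceeds p.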

What are "tensors"?

In the context of AI and machine learning, a "tensor" is a fundamental data structure that represents multi-dimensional arrays of numerical values. Tensors can have various dimensions, including scalars (0-dimensional), vectors (1-dimensional), matrices (2-dimensional), and higher-dimensional arrays. Tensors are a core concept in libraries and frameworks commonly used for deep learning, such as TensorFlow and PyTorch. Here are some key points about tensors in AI:
- Data representation: tensors are used to represent data in a format that can be processed by neural networks and other machine learning algorithms. For example, in image processing a color image can be represented as a 3D tensor, where the dimensions correspond to height, width, and color channels (e.g. red, green, and blue).
- Mathematical operations: tensors are designed to facilitate mathematical operations, including addition, multiplication, and more complex operations like convolution and matrix multiplication.
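To make the dimensionality idea concrete, here is a small sketch using plain Python lists (real frameworks like TensorFlow and PyTorch provide proper tensor types, but the shape concept works the same way; the `shape` helper here is just for illustration):

```python
def shape(t):
    """Return the shape of a nested-list 'tensor' as a tuple.

    A scalar has shape (), a vector (n,), a matrix (rows, cols),
    and so on for higher dimensions.
    """
    if not isinstance(t, list):
        return ()
    return (len(t),) + shape(t[0])

scalar = 3.14                       # 0-dimensional
vector = [1.0, 2.0, 3.0]            # 1-dimensional, shape (3,)
matrix = [[1, 2], [3, 4]]           # 2-dimensional, shape (2, 2)
# A tiny 2x2 "image" with 3 color channels (R, G, B) per pixel:
image = [[[255, 0, 0], [0, 255, 0]],
         [[0, 0, 255], [255, 255, 255]]]  # 3-dimensional, shape (2, 2, 3)

print(shape(scalar))  # ()
print(shape(vector))  # (3,)
print(shape(image))   # (2, 2, 3)
```

This mirrors the image example above: a color image is a height × width × channels tensor.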

AI tests - post in progress

This post documents tests which are used to assess AI LLM performance. As I find more tests and understand them, I will post them here. AGIEval (AGI evaluation) is a test to see whether AIs are able to pass human tests. AGIEval focuses on advanced reasoning abilities beyond natural language, through inductive, deductive, and spatial reasoning questions. Models must solve challenges like logical puzzles using language. It appears in a paper on arXiv as follows: [Submitted on 13 Apr 2023 (v1), last revised 18 Sep 2023 (this version, v2)] AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models, by Wanjun Zhong, Ruixiang Cui, Yiduo Guo, Yaobo Liang, Shuai Lu, Yanlin Wang, Amin Saied, Weizhu Chen, Nan Duan. "Traditional benchmarks, which rely on artificial datasets, may not accurately represent human-level capabilities. In this paper, we introduce AGIEval, a novel benchmark specifically designed to assess foundation models in the context of human-centric standardized exams."

Which LLM?

I am experimenting with various LLMs. So far, there seems to be a tradeoff between intelligence, speed, and usability. Thus far:
- Falcon seems a bit stupid and gives some fake answers, but is reasonably fast.
- LLaMA is reasonably fast and seems quite smart, but refuses to answer research questions on anything remotely controversial, e.g. politicians. I tried asking it about certain fake news stories as well, and it got all sanctimonious and told me to look at reputable sources... er... I assumed you, LLaMA, were a reputable source. Apparently not. In short, it is censored, meaning it is useless.
- GPT4All is not as good as LLaMA, so far. I can't tell yet if it is censored.
- Others are either too slow or give stupid answers.
As I try them all out I will list pros and cons here.

Four options so far

So this is an update on my experimentation with LLMs. There are applications that let you access the models, and then there are the models (LLMs) themselves. Each model is good at some things and bad at others. For example, ChatGPT is said to be bad at maths, whereas LLaMA is said to be better. Your mileage may vary. So far, I've experimented with the following apps:
- PrivateGPT (with a range of models)
- LocalGPT (with a range of models)
- ChatGPT (with GPT-3.5)
- GPT4All (with a range of models)
The following table documents the pro-con analysis. Depending on your purpose, I'd suggest ChatGPT if your goal is merely getting information. If, however, you need summaries of actual documents (a repository), I suggest you look at the others. GPT4All is the easiest to use and install. However, I noticed that it really tended to hallucinate and give inaccurate answers. The way it works is that you install the app and then it asks you which LLM you want to download to answer queries. It supports

Getting Python to use a GPU

So apparently you need CUDA and Anaconda to use the GPU with Python. As I explore more I'll add notes here. (For those reading: the GPU is just faster with this stuff.) Unfortunately you have to have a specific type of Nvidia card.

0. Background steps:
a. Become root: sudo su
b. Install wget: apt-get install wget

1. Install CUDA:
wget
sudo mv /etc/apt/preferences.d/cuda-repository-pin-600
wget
sudo dpkg -i cuda-repo-ubuntu2204-12-2-local_12.2.2-535.104.05-1_amd64.deb
sudo cp /var/cuda-repo-ubuntu2204-12-2-local/cuda-*-keyring.gpg /usr/share/keyrings/
sudo apt-get update
sudo apt-get -y install cuda
sudo apt install nvidia-cuda-toolkit

If that doesn't work try: wget https://developer.d
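Once the install finishes, a quick way to sanity-check the setup from Python is to look for the toolkit and driver tools on the PATH. This is only a rough sketch (the helper names are my own); the definitive test is whether your framework reports the GPU, e.g. PyTorch's torch.cuda.is_available().

```python
import shutil
import subprocess

def cuda_toolkit_present():
    """Return True if the nvcc compiler is on PATH (rough proxy for a CUDA toolkit install)."""
    return shutil.which("nvcc") is not None

def gpu_visible():
    """Return True if nvidia-smi runs successfully (i.e. the driver sees a GPU)."""
    smi = shutil.which("nvidia-smi")
    if smi is None:
        return False
    return subprocess.run([smi], capture_output=True).returncode == 0

if __name__ == "__main__":
    print("CUDA toolkit on PATH:", cuda_toolkit_present())
    print("GPU visible to driver:", gpu_visible())
```

Both checks return False harmlessly on a machine without Nvidia hardware, so the script is safe to run anywhere.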