Image credit: Cameron Oelsen, BSD, via Wikimedia Commons
The llama.cpp project provides a plain C/C++ implementation for running Large Language Models locally. The llama-cpp-python package provides Python bindings for llama.cpp, which let me load a model and query it from Python.
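As a rough illustration of how the bindings are used (a minimal sketch, not the exact code from this post; the model path and parameter values are placeholders), loading a local GGUF model and asking it a question looks like this:

```python
# Minimal sketch: load a local GGUF model with llama-cpp-python and ask it a question.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # assumed local model file
    n_ctx=2048,       # context window size
    n_gpu_layers=1,   # offload layers to the GPU (Metal on Apple silicon)
)

output = llm("Q: What is the capital of England? A:", max_tokens=64, stop=["Q:"])
print(output["choices"][0]["text"])
```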
First response to "What is the capital of England?"
streamlit run app1a.py
Second response to "What is the capital of England?"
$ conda create --name llama jupyterlab ipykernel ipywidgets
$ conda activate llama
$ pip install -r requirements.txt
requirements.txt
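The exact contents of requirements.txt aren't reproduced here; at a minimum it needs the libraries used throughout this post, along these lines (package list is my assumption, versions omitted):

```
llama-cpp-python
langchain
streamlit
```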
app1a.py
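The full listing of app1a.py isn't reproduced here; a minimal sketch of this kind of Streamlit front end over llama-cpp-python, with an assumed model path, prompt format, and widget labels, looks something like:

```python
# Hypothetical sketch of a minimal Streamlit question-answering app over llama-cpp-python.
import streamlit as st
from llama_cpp import Llama

@st.cache_resource  # load the model once, not on every Streamlit rerun
def load_model():
    return Llama(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

st.title("Ask the Llama")
question = st.text_input("Your question", "What is the capital of England?")

if st.button("Ask"):
    llm = load_model()
    result = llm(f"Q: {question} A:", max_tokens=128, stop=["Q:"])
    st.write(result["choices"][0]["text"])
```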
I updated my LangChain medical report application to work with llama.cpp.
streamlit run app2a.py
hp4.ipynb
app2a.py
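Presumably the key change is pointing the chain at LangChain's LlamaCpp wrapper instead of a hosted model. A minimal sketch, assuming a recent langchain/langchain-community release and a placeholder prompt and model path:

```python
# Hypothetical sketch: driving llama.cpp through LangChain's LlamaCpp wrapper.
from langchain_community.llms import LlamaCpp
from langchain_core.prompts import PromptTemplate

llm = LlamaCpp(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # assumed local model file
    n_ctx=2048,
    temperature=0.1,
)

prompt = PromptTemplate.from_template(
    "Summarise the following medical notes as a short report:\n\n{notes}"
)

chain = prompt | llm  # LangChain Expression Language pipeline
print(chain.invoke({"notes": "Patient presents with a persistent cough..."}))
```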
I updated a Llama 2 Chat code sample to create a "Pirate Jack" application.
If I have 17 doubloons and spend 6 on a pistol, how many would I have left? (running on the Apple M2 silicon GPU)
streamlit run app3a.py
How much does a Cutlass cost? (from "memory"... it be costin' 3 Doubloons, savvy?)
Chat application's memory
app3a.py
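The "memory" works because the model itself is stateless: the app keeps the whole conversation, including the earlier doubloon exchange, and replays it to the model on every turn. A minimal sketch of that pattern using Streamlit session state and llama-cpp-python's chat API (the system prompt and model path are assumptions, not the actual app3a.py):

```python
# Hypothetical sketch of chat "memory": the full message history is stored in
# st.session_state and sent back to the model with every new user turn.
import streamlit as st
from llama_cpp import Llama

@st.cache_resource
def load_model():
    return Llama(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048, n_gpu_layers=1)

if "messages" not in st.session_state:
    st.session_state.messages = [
        {"role": "system", "content": "You are Pirate Jack. Answer like a pirate."}
    ]

user_input = st.chat_input("Say something to Pirate Jack")
if user_input:
    st.session_state.messages.append({"role": "user", "content": user_input})
    reply = load_model().create_chat_completion(messages=st.session_state.messages)
    answer = reply["choices"][0]["message"]["content"]
    st.session_state.messages.append({"role": "assistant", "content": answer})

for msg in st.session_state.messages[1:]:  # skip the system prompt when rendering
    st.chat_message(msg["role"]).write(msg["content"])
```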