New to Kernelcon this year is our Local AI instance. We have multiple large language models running for you to use, and they all run locally.
What does that mean? It means your queries, questions, and generated content stay here. They aren’t used for further training, and you aren’t contributing to a datastore someone else owns. After the con, these LLMs will be wiped.
The web interface for running queries is here:
http://10.17.1.156:8080
Username: [email protected]
Password: kernelcon
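If you'd rather script your queries than use the web UI, many local LLM front ends on port 8080 also expose an OpenAI-compatible chat endpoint. The sketch below assumes that is the case here — the endpoint path, model name, and API key are all assumptions, so verify them against the instance (or ask the organizers) before relying on it:

```python
import json
import urllib.request

BASE_URL = "http://10.17.1.156:8080"  # the con's local instance


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(model: str, prompt: str, api_key: str) -> dict:
    """Send one chat request to the local instance and return the JSON reply.

    The /api/chat/completions path is a guess at a typical local-LLM API
    route, not confirmed for this server.
    """
    payload = build_chat_payload(model, prompt)
    req = urllib.request.Request(
        BASE_URL + "/api/chat/completions",  # hypothetical endpoint
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # key issued by the UI, if any
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Everything stays on the con network either way — the request never leaves 10.17.1.156.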
The local server has the following stats, for reference:
Dual Socket AMD Genoa CPU – 96 cores each
1.5 Terabytes of RAM
2x Nvidia GPUs, 64 GB each