The rapid advancements in the field of artificial intelligence have captured the attention of tech giants and hobbyists alike. While new devices with NPUs boast impressive performance numbers, older hardware can still be repurposed for AI tasks like running language models and image generators. In this guide, we will walk you through the process of turning your old PC into a local AI-hosting machine.

To embark on this project, you will need to ensure that your old PC meets certain minimum requirements. For example, running a large model such as the 70B-parameter Llama 2 requires plenty of RAM and a capable GPU, while smaller quantized models run on far more modest hardware. Installing a lightweight Linux distribution, such as Pop!_OS, can also free up resources for AI workloads.
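Before picking a model, it helps to see what the old PC actually offers. The sketch below is a hypothetical pre-flight check (the script name and the 8 GiB rule of thumb are illustrative assumptions, not from the guide): it reads total RAM from /proc/meminfo and looks for an NVIDIA GPU, falling back to a CPU-only warning.

```shell
# Illustrative hardware check; thresholds are rough rules of thumb.
cat > check_specs.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
# Total RAM in GiB (MemTotal is reported in kB)
ram_gib=$(awk '/MemTotal/ {printf "%d", $2 / 1048576}' /proc/meminfo)
echo "RAM: ${ram_gib} GiB"
# Rough guide: a quantized 7B model wants ~8 GiB; 70B needs far more.
if [ "$ram_gib" -lt 8 ]; then
  echo "Consider a smaller model (e.g. a 3B or quantized 7B)."
fi
# Check for an NVIDIA GPU; otherwise expect CPU-only inference.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,memory.total --format=csv,noheader
else
  echo "No NVIDIA GPU detected; expect CPU-only inference."
fi
EOF
chmod +x check_specs.sh
./check_specs.sh
```

Running the script prints the machine's RAM and GPU status, which makes it easier to match a model size to the hardware before downloading gigabytes of weights.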

Once you have set up your operating system, the next step is to install and configure Ollama on your PC. A few terminal commands install Ollama and download the language model of your choice. Additionally, setting up a web UI like Open WebUI gives you a more polished, browser-based interface for running LLMs locally.
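The steps above can be sketched as a short setup script. This is a minimal sketch, assuming Ollama's official convenience installer, the stock `llama2` model name, and Open WebUI's documented Docker image; it writes the commands to a file and syntax-checks them rather than executing the downloads here.

```shell
# Hedged sketch of the Ollama + Open WebUI setup; verify URLs and
# model names against the projects' current documentation.
cat > setup_ollama.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
# Install Ollama via its official install script
curl -fsSL https://ollama.com/install.sh | sh
# Download a model; "llama2" pulls the 7B variant by default
ollama pull llama2
# Chat with the model directly in the terminal
ollama run llama2
# Optionally serve Open WebUI in Docker, pointed at the local Ollama API
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
EOF
chmod +x setup_ollama.sh
bash -n setup_ollama.sh  # validate syntax without downloading anything
```

After the Docker container starts, Open WebUI is typically reachable at http://localhost:3000 in a browser on the same machine.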

For users interested in generating images with AI, Stable Diffusion offers a local solution, though it demands more from your hardware than text generation, particularly GPU memory. After installing the necessary dependencies, you can link Stable Diffusion to the web UI used for Ollama and create AI renders, with prompt generators helping you craft inputs. Together, these steps enable you to generate both text and images from your self-hosted AI server.
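One common way to run Stable Diffusion locally is AUTOMATIC1111's web UI; the sketch below assumes that project, a Debian-based distro such as Pop!_OS, and its `--listen`/`--api` flags (check the project's README for your system before running).

```shell
# Illustrative Stable Diffusion setup via AUTOMATIC1111's web UI;
# package names and flags are assumptions to verify for your distro.
cat > setup_sd.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
# Dependencies on a Debian-based distribution
sudo apt update
sudo apt install -y git python3 python3-venv
# Fetch the web UI; the first launch downloads model weights
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git
cd stable-diffusion-webui
# --listen exposes the UI on the LAN; --api lets other front ends
# (such as the web UI used for Ollama) request image generations
./webui.sh --listen --api
EOF
chmod +x setup_sd.sh
bash -n setup_sd.sh  # syntax check only; the real run downloads weights
```

The `--api` flag is what makes it possible to wire image generation into a chat front end, so prompts typed in one interface can drive renders from the same box.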

While an aging PC will not match the performance of modern NPU-equipped devices, the ability to run AI models on older hardware opens up new possibilities for experimentation. Whether you choose to repurpose an old PC or explore running AI models on a single-board computer like the Raspberry Pi, the steps outlined in this guide can help you unlock the potential of AI on a variety of platforms and breathe new life into your old technology.