AI
This presentation provides a comprehensive overview for IT professionals, system administrators, and adventurous tinkerers looking to host AI models with Ollama in their infrastructure or home labs. We will look at using self-hosted models for data analysis, cybersecurity, code copilots, and more, helping keep our chats, datasets, and files private, and we will compare these self-hosted options to offerings from providers like OpenAI and Google. The session will delve into the technical and strategic aspects of self-hosting models, including hardware requirements, scalability, and integration with existing IT systems or home labs. We'll outline the key deployment phases, from setup and configuration through management and updates, and emphasize the operational aspects of running Ollama independently, such as sourcing different models from platforms like HuggingFace.
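To give a flavor of what self-hosting looks like in practice, here is a minimal sketch of querying a locally hosted Ollama instance over its HTTP API. It assumes Ollama is running on its default port (11434) and that a model has already been pulled (the model name "llama3" and the prompt are placeholders); nothing here is specific to the talk's own setup.

```python
# Minimal sketch: query a locally hosted Ollama instance over its HTTP API.
# Assumes Ollama is running on the default port (11434) and that a model
# (here "llama3", a placeholder) has already been pulled, e.g. `ollama pull llama3`.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

def ask(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to the local model and return its full response."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one complete JSON reply instead of a token stream
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    # Everything stays on your own hardware: the prompt and response never leave the host.
    print(ask("Summarize the key indicators of a phishing email in three bullets."))
```

Because the model runs entirely on local hardware, the same call pattern works for private datasets and internal tooling without sending anything to a third-party API.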
Cybersecurity Engineer at Northwestern Mutual, DC608 goon, and volunteer Incident Responder with the Wisconsin Cyber Response Team.