Zeek® is a powerful open-source network analysis tool that allows users to monitor traffic and detect malicious activity. Users can write packages to detect cybersecurity events, such as this GitHub repo, which detects command-and-control (C2) traffic from AgentTesla (a well-known malware family).
Automating summarization and documentation using AI is often helpful when analyzing Zeek packages. Instead of relying on external cloud-based services like the DeepSeek app, which poses potential privacy risks, we can run the deepseek-r1 large language model (LLM) locally on our machine using Ollama. This article demonstrates how to summarize Zeek package contents using Ollama and Open WebUI privately.
While cloud-based AI models provide convenience, they introduce serious privacy concerns, especially when handling sensitive data like Zeek network monitoring scripts. If you analyze your closed-source Zeek scripts using the DeepSeek AI app, you may be exposing your intellectual property and detection techniques to potential adversaries.
Several reports have highlighted the privacy risks and potential data leaks associated with cloud-based AI applications like DeepSeek.
Running deepseek-r1 locally with Ollama offers several advantages: all processing stays on your own machine, your scripts are never sent to a third party, and you retain full control over your data.
Ollama provides an easy-to-use way to run large language models locally. Open WebUI offers a user-friendly interface for interacting with these models. Follow these steps to set up a secure, local AI-powered Zeek summarization system:
On Linux, you can install Ollama with the following command:
curl -fsSL https://ollama.ai/install.sh | sh
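Once the script finishes, you can confirm the installation by checking the version:

ollama --version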
Once installed, start the Ollama service (if it has not already been started automatically as a systemd service):
ollama serve
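Ollama's API listens on http://localhost:11434 by default, so a quick request confirms the service is up:

curl http://localhost:11434

You should see the reply "Ollama is running".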
If you use a Mac, you can install Ollama via Homebrew instead:
brew install ollama
brew services start ollama
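You can confirm the background service is running with:

brew services list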
Next, we need to install Open WebUI, a web frontend for Ollama:
pip install open-webui
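If you prefer to keep Open WebUI's dependencies isolated, you can install it inside a virtual environment instead. A minimal sketch, assuming a Python 3.11 interpreter is available (the version the Open WebUI project recommends):

python3.11 -m venv openwebui-env
source openwebui-env/bin/activate
pip install open-webui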
You can start open-webui with:
open-webui serve
Open WebUI will be available at http://localhost:8080. Create your default admin account and log in.
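If port 8080 is already taken on your machine, the serve command can be pointed at a different port; the --port option shown here is an assumption, so confirm it with open-webui serve --help:

open-webui serve --port 8081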
Next, we will download the deepseek-r1 model into Ollama with the following command:
ollama pull deepseek-r1:14b
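Once the download finishes, you can verify the model is available and give it a quick local test:

ollama list
ollama run deepseek-r1:14b "Summarize what Zeek is in one sentence."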
Because the model runs entirely inside Ollama, all processing happens on your machine without reaching external servers.
Now select the deepseek-r1 model in Open WebUI and provide the following prompt along with the source code of the AgentTesla detector:
The deepseek-r1 model provides the following output:
The model begins its output with <think> … </think> tags that document its reasoning process. The content after these tags is the model's summary of the AgentTesla detector.
You can see that the model accurately determined that there are three Zeek signatures looking for AgentTesla C2 over FTP, SMTP (generic), and HTTP. Furthermore, the model's output highlights the phrases the detector looks for, such as "Time:", "User Name:", and "Computer Name:". The model also recognizes that the package fires a Zeek notice when potential AgentTesla C2 is discovered.
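The web interface is convenient, but the same local workflow can also be scripted against Ollama's REST API on port 11434. Here is a minimal sketch, assuming the detector's script has been saved locally as main.zeek (a hypothetical path) and that jq is installed to build the JSON payload safely:

# Build a JSON request that embeds the Zeek script, send it to the local
# Ollama API, and print only the model's response text.
jq -n --arg code "$(cat main.zeek)" \
  '{model: "deepseek-r1:14b", prompt: ("Summarize this Zeek package:\n" + $code), stream: false}' \
  | curl -s http://localhost:11434/api/generate -d @- \
  | jq -r .response

Because the request never leaves localhost, the script's contents stay on your machine.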
In this scenario, we demonstrate the analysis of an open-source AgentTesla detector, but what if this package was closed source? If the detection technique is proprietary, querying the DeepSeek app would send this package content to DeepSeek’s authors. There is no guarantee that DeepSeek’s authors would not share the data or use it to train newer models. Worse, if the DeepSeek API is compromised, you could send your Zeek detection logic directly to adversaries.
By running Ollama and Open WebUI locally, we keep our detection logic on our own computer, significantly reducing the risk of data leaks and unauthorized exposure.
The risks of using cloud-based AI applications like DeepSeek for cybersecurity-related code analysis are too significant to ignore. By running deepseek-r1 locally with Ollama and Open WebUI, security analysts maintain control over their data, reduce privacy risks, and ensure their sensitive information is not exposed to third parties. This method provides a secure, efficient, and privacy-preserving way to analyze Zeek scripts while eliminating reliance on untrusted external AI services.
Postscript: If you are interested in monitoring your network with the AgentTesla detector, it is already installed on Corelight sensors. You can also use zkg to install it into your open-source Zeek installation!
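If you go the zkg route, installation is a single command; the repository path below is a placeholder, so substitute the detector's actual GitHub location:

zkg install https://github.com/<org>/<agenttesla-detector>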