LLM+MCP+Kali-Linux = Pentesting

Oct 10, 2025

By pairing a local large language model (LLM), such as Mistral running under Ollama, with the Model Context Protocol (MCP), we can connect to an MCP server that interfaces with a Dockerized Kali Linux instance. This setup lets the AI execute penetration testing commands in a controlled environment, aiding tasks like vulnerability scanning and CTF challenges.

What is MCP and Why Use It for Pen Testing?

The Model Context Protocol (MCP) is an open standard for connecting AI models to external tools and data sources. It acts as a bridge, enabling LLMs to interact with systems securely. In penetration testing, MCP servers can expose tools like those in Kali Linux, allowing AI to assist in ethical hacking tasks without direct human intervention for every command.
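As a rough illustration of what travels over that bridge: MCP messages are JSON-RPC 2.0, and a client asks a server to run a tool via the tools/call method. The tool name and arguments below are hypothetical stand-ins for whatever the Kali-side server actually exposes:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_command",
    "arguments": { "command": "nmap -sV 10.0.0.5" }
  }
}
```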

Setting Up Ollama with Mistral

Ollama is a straightforward tool for running LLMs locally. Start by installing it on your host machine.

# Install Ollama (on macOS/Linux/Windows)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and run the Mistral model
ollama pull mistral
ollama run mistral

This starts an interactive session with Mistral. For API access, run ollama serve in the background to expose the HTTP API at http://localhost:11434.
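For scripted rather than interactive use, you can hit that API directly. A minimal sketch, assuming ollama serve is listening on the default port and the mistral model has already been pulled:

```shell
# Build a one-shot request for Ollama's /api/generate endpoint.
# "stream": false returns a single JSON object instead of a token stream.
PAYLOAD='{"model": "mistral", "prompt": "What does nmap -sV do?", "stream": false}'
echo "$PAYLOAD"

# Uncomment once `ollama serve` is running:
# curl -s http://localhost:11434/api/generate -d "$PAYLOAD"
```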

Setting Up Dockerized Kali Linux

# Pull the official Kali Docker image
docker pull kalilinux/kali-rolling

# Run Kali in interactive mode with necessary privileges
docker run -it --name kali-pen-test --privileged kalilinux/kali-rolling /bin/bash

Inside the container, update packages and install tools:

apt update && apt upgrade -y
apt install nmap curl wget gobuster -y
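After the install finishes, a quick sanity check confirms the tools the MCP server will later drive are actually on the container's PATH (a minimal sketch; the tool list matches the apt command above):

```shell
# Report ok/missing for each tool the AI will be allowed to invoke.
STATUS=$(for tool in nmap curl wget gobuster; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: ok"
  else
    echo "$tool: missing"
  fi
done)
echo "$STATUS"
```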

Installing and Configuring the MCP Server on Kali

Kali provides the mcp-kali-server package, which acts as an API bridge for MCP clients to execute commands. Inside the Kali Docker container:

apt install mcp-kali-server -y

Now, run the API server:

kali-server-mcp --port 5000 --debug

This starts the Kali API server on port 5000. Then, in another terminal (or background), run the MCP component:

mcp-server --server http://localhost:5000 --timeout 300 --debug

This setup allows MCP clients to connect and run commands like nmap through AI assistance. To expose the MCP server outside the Docker container, add port mapping to your Docker run command:

docker run -it -p 5000:5000 --name kali-pen-test --privileged kalilinux/kali-rolling /bin/bash
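With the port mapped, you can verify from the host that something is answering before pointing an MCP client at it. A minimal sketch; the path / is arbitrary, since all we care about here is whether the TCP port responds:

```shell
# Probe the mapped port; --max-time keeps this from hanging if nothing listens.
MSG=$(if curl -s -o /dev/null --max-time 3 http://localhost:5000/; then
  echo "port 5000: reachable"
else
  echo "port 5000: no response"
fi)
echo "$MSG"
```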

Integrating Ollama with MCP

Install mcphost (a Go tool for hosting MCP with local LLMs):

go install github.com/mark3labs/mcphost@latest

Create a config file (local.json) for MCP servers:

{
  "servers": [
    {
      "name": "kali-mcp",
      "url": "http://localhost:5000",
      "transport": "streamable-http"
    }
  ]
}
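A malformed config is an easy way to lose time here, so it is worth validating the file before handing it to mcphost. One way, using only Python's standard library (the heredoc just recreates the config shown above):

```shell
# Write the MCP server config, then confirm it parses as JSON.
cat > local.json <<'EOF'
{
  "servers": [
    { "name": "kali-mcp", "url": "http://localhost:5000", "transport": "streamable-http" }
  ]
}
EOF
python3 -m json.tool local.json >/dev/null && echo "local.json: valid JSON"
```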

Run the MCP host with Ollama and Mistral:

mcphost -m ollama:mistral --config ./local.json

This sets up Ollama as an MCP client, connecting to the Kali MCP server. Now, your AI can query the MCP server to run pen testing tools.

AI-Assisted Penetration Testing

With the setup complete, you can interact with the AI via Ollama’s interface or a custom app. For instance:

Prompt: “Scan the network for open ports using nmap.” The AI, via MCP, executes nmap on the Kali instance and returns the results.
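Behind the scenes, the model translates a prompt like that into a concrete command for the Kali container, something along these lines (the target address is a hypothetical example; only scan hosts you are authorized to test):

```shell
# The kind of command the LLM is likely to emit for a port-scan request.
TARGET="172.17.0.2"   # hypothetical in-scope address on the Docker network
SCAN="nmap -sV --top-ports 100 $TARGET"
echo "$SCAN"          # this string is what the MCP server runs inside Kali
```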

This enables high-level automation for tasks like initial reconnaissance or solving CTF challenges. As always, run these tools only against systems you are explicitly authorized to test.

Conclusion

Integrating Ollama with MCP and a Dockerized Kali Linux instance opens the door to AI-assisted penetration testing. The setup combines the flexibility of local LLMs with Kali's security tooling, making ethical hacking workflows more efficient. Experiment responsibly, and stay tuned for more advanced integrations! For more details, check out the official Kali tools documentation and MCP resources.