

Modernizing AS/400: Connecting Semantic Kernel (.NET) to DB2 via MCP


Slug: semantic-kernel-ibm-as400-db2-dotnet
Description: A complete guide to bridging Semantic Kernel (.NET) agents with legacy IBM i (AS/400) DB2 databases using a Python-based Model Context Protocol (MCP) server.


While Microsoft’s Semantic Kernel is a powerful .NET-native orchestration framework, connecting it to legacy infrastructure like IBM i (AS/400) often presents friction.

Native .NET drivers for DB2 (IBM.Data.DB2.Core) can be challenging to manage in modern Linux container environments due to specific licensing, dependency chains, and OS-level library requirements.

The Solution: The MCP Bridge Pattern.

Instead of fighting with NuGet packages and Linux driver dependencies inside your main application, we offload the heavy lifting to a dedicated Python Microservice running FastMCP.

  1. The Server (Python): Uses the battle-tested ibm_db library to handle DRDA/DDM protocols natively. It exposes these capabilities as “Tools” via the Model Context Protocol (MCP).
  2. The Client (.NET): Your Semantic Kernel agent connects to this lightweight server. It sees query_as400 as just another native function, oblivious to the underlying protocol complexity.

This approach decouples your modern AI agent from the legacy “plumbing.”
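
From the .NET side, this discovery happens through the MCP `tools/list` handshake. As a rough sketch (field names follow the MCP specification; the description text is generated from the Python docstrings shown below), the agent receives something like:

```json
{
  "tools": [
    {
      "name": "query_as400",
      "description": "Executes a read-only SQL SELECT query against the AS/400 DB2 database...",
      "inputSchema": {
        "type": "object",
        "properties": {
          "sql_query": { "type": "string" }
        },
        "required": ["sql_query"]
      }
    }
  ]
}
```

To the LLM, `query_as400` is indistinguishable from a native C# function; the DRDA plumbing is invisible.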


This server uses fastmcp to expose specific DB2 operations as MCP tools. To avoid dependency confusion, we call the low-level ibm_db driver functions directly; they are robust and ship in the single ibm_db package.

You will need the connection credentials for your IBM i system:

  • Hostname/IP
  • Port (typically 446 for the IBM i DRDA service; 50000 is the default for standard DB2 LUW)
  • UID/PWD
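
For local testing, these usually live in a .env file that the container loads at startup. The values below are placeholders for illustration; substitute your own system's details:

```
DB_NAME=*LOCAL
DB_HOST=192.168.1.100
# 446 for the IBM i DRDA service; 50000 if targeting DB2 LUW
DB_PORT=446
DB_USER=MYUSER
DB_PASSWORD=secret
```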
server.py

import json
import os

import ibm_db
from fastmcp import FastMCP

# Initialize the MCP server
mcp = FastMCP("AS400-DB2-Bridge")


def get_connection_resource():
    """
    Establishes a connection to the IBM i DB2 database using the native driver.
    Returns the connection handle.
    """
    # Construct the connection string.
    # The IBM i DRDA service typically listens on 446; 50000 is the DB2 LUW default.
    conn_str = (
        f"DATABASE={os.getenv('DB_NAME', '*LOCAL')};"
        f"HOSTNAME={os.getenv('DB_HOST')};"
        f"PORT={os.getenv('DB_PORT', '50000')};"
        f"PROTOCOL=TCPIP;"
        f"UID={os.getenv('DB_USER')};"
        f"PWD={os.getenv('DB_PASSWORD')};"
    )
    try:
        # Connect using the native ibm_db driver
        return ibm_db.connect(conn_str, "", "")
    except Exception as e:
        # ibm_db exceptions are often cryptic; surface the driver's own message
        err_msg = ibm_db.conn_errormsg() or str(e)
        raise RuntimeError(f"Failed to connect to AS/400. Driver error: {err_msg}")


@mcp.tool()
def query_as400(sql_query: str) -> str:
    """
    Executes a read-only SQL SELECT query against the AS/400 DB2 database.
    Use this to retrieve customer records, inventory levels, or transaction history.

    Args:
        sql_query: A valid DB2 SQL SELECT statement.
    """
    # 1. Basic security sanity check
    if not sql_query.strip().upper().startswith("SELECT"):
        return "Error: This tool restricts execution to SELECT statements only for safety."

    conn = None
    try:
        conn = get_connection_resource()
        # 2. Execute immediately via the native ibm_db API
        stmt = ibm_db.exec_immediate(conn, sql_query)
        # 3. Fetch results; fetch_assoc returns a {COL: VAL} dict, or False when exhausted
        results = []
        row = ibm_db.fetch_assoc(stmt)
        while row is not False:
            results.append(row)
            row = ibm_db.fetch_assoc(stmt)
        # default=str serializes DECIMAL, DATE, and TIMESTAMP values cleanly
        return json.dumps(results, default=str)
    except Exception as e:
        # Capture driver-specific errors
        driver_err = ibm_db.stmt_errormsg() if "stmt" in locals() else ""
        return f"Database Error: {e} {driver_err}".strip()
    finally:
        if conn:
            try:
                ibm_db.close(conn)
            except Exception:
                pass


@mcp.tool()
def describe_table(schema: str, table: str) -> str:
    """
    Retrieves column metadata for a specific AS/400 table (physical file).
    """
    conn = None
    try:
        conn = get_connection_resource()
        # Query the QSYS2 system catalog; parameter markers avoid SQL injection
        catalog_sql = """
            SELECT COLUMN_NAME, DATA_TYPE, LENGTH, COLUMN_TEXT
            FROM QSYS2.SYSCOLUMNS
            WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ?
        """
        stmt = ibm_db.prepare(conn, catalog_sql)
        ibm_db.execute(stmt, (schema.upper(), table.upper()))
        columns = []
        row = ibm_db.fetch_assoc(stmt)
        while row is not False:
            columns.append(f"- {row['COLUMN_NAME']} ({row['DATA_TYPE']}): {row['COLUMN_TEXT']}")
            row = ibm_db.fetch_assoc(stmt)
        if not columns:
            return f"No columns found for {schema}.{table}. Check table name validity."
        return "Table Schema:\n" + "\n".join(columns)
    except Exception as e:
        return f"Schema Error: {e}"
    finally:
        if conn:
            try:
                ibm_db.close(conn)
            except Exception:
                pass


if __name__ == "__main__":
    # Serve over SSE so the .NET client can reach the bridge on port 8000
    mcp.run(transport="sse", host="0.0.0.0", port=int(os.getenv("PORT", "8000")))
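
Note that the `startswith("SELECT")` guard is deliberately simple: it blocks obvious writes but misses statements hidden behind a leading comment or a `WITH ... SELECT` common table expression (which it wrongly rejects). A slightly stricter sketch of the same idea (this `is_read_only` helper is my own illustration, not part of ibm_db or fastmcp, and still no substitute for a read-only user profile on the IBM i side):

```python
import re

def is_read_only(sql: str) -> bool:
    """Conservatively decide whether a statement is a plain read.

    Strips SQL comments, then requires the statement to open with
    SELECT or WITH and to contain no obvious write/DDL keywords.
    """
    # Remove /* ... */ block comments and -- line comments
    cleaned = re.sub(r"/\*.*?\*/", " ", sql, flags=re.DOTALL)
    cleaned = re.sub(r"--[^\n]*", " ", cleaned)
    cleaned = cleaned.strip().rstrip(";").strip()

    # Must open with a read keyword (WITH allows common table expressions)
    if not re.match(r"(?i)^(SELECT|WITH)\b", cleaned):
        return False
    # Reject write/DDL keywords anywhere in the statement
    forbidden = r"(?i)\b(INSERT|UPDATE|DELETE|MERGE|DROP|ALTER|CREATE|TRUNCATE|CALL|GRANT|REVOKE)\b"
    return re.search(forbidden, cleaned) is None

print(is_read_only("SELECT * FROM ERPPROD.CUSTMAST"))        # True
print(is_read_only("/* hi */ DELETE FROM ERPPROD.CUSTMAST")) # False
```

Swapping this in for the `startswith` check in `query_as400` is a one-line change; the word-boundary anchors mean column names like `CREATED_AT` are still allowed.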

To run this in production, we use a Docker container. This ensures the native ibm_db drivers (which depend on C libraries) are installed correctly without polluting your host system.

Critical: We must expose port 8000 so your Semantic Kernel application can communicate with this service.

# Use a slim Python base image
FROM python:3.11-slim

# Install system dependencies required for ibm_db compilation and XML support
# (gcc and libxml2 are frequently needed for the IBM DB2 driver installation)
RUN apt-get update && apt-get install -y \
        gcc \
        libc-dev \
        libxml2 \
    && rm -rf /var/lib/apt/lists/*

# Set working directory
WORKDIR /app

# Install Python dependencies
# 'ibm_db' is the official IBM driver package; it bundles the
# client libraries needed to connect to DB2.
RUN pip install --no-cache-dir fastmcp ibm_db

# Copy the server code
COPY server.py .

# Environment variables (override these in your deployment platform;
# never bake real credentials into the image)
ENV PORT=8000
ENV DB_HOST=192.168.1.100
ENV DB_USER=QSECOFR
ENV DB_PASSWORD=password

# Expose the port for Railway/cloud compatibility
EXPOSE 8000

# Run the MCP server
CMD ["python", "server.py"]

🔗 Connecting from Semantic Kernel (.NET)


Once your Docker container is running, your .NET application interacts with it as an external tool provider.

Conceptual C# Usage:

  1. Run the Docker Container:

    Terminal window
    docker build -t as400-bridge .
    docker run -p 8000:8000 --env-file .env as400-bridge
  2. Bind in .NET: Use an MCP Client implementation to connect to http://localhost:8000/sse (or stdio if running locally via process).

    // Pseudo-code for Semantic Kernel MCP integration
    // (McpPlugin stands in for your MCP client wrapper of choice)
    var kernel = Kernel.CreateBuilder()
        .AddOpenAIChatCompletion("gpt-4", apiKey)
        .Build();

    // Import the functions from the bridge
    var mcpFunctions = await McpPlugin.ImportAsync("http://localhost:8000");
    kernel.ImportPluginFromFunctions("AS400", mcpFunctions);

    // Invoke
    var result = await kernel.InvokePromptAsync(
        "Look up the schema for file 'CUSTMAST' in library 'ERPPROD' and tell me the fields."
    );
Why run the bridge as a separate service?

  1. License Isolation: ibm_db often has complex licensing or binary requirements that can conflict with lightweight .NET Alpine images.
  2. Stability: If the AS/400 connection hangs, it only affects the Python container, not your entire .NET web server.
  3. Modern AI: You get the best of both worlds: Semantic Kernel’s rigorous orchestration and Python’s rich data-science and legacy-connectivity ecosystem.

  • Status: ✅ Verified
  • Environment: Python 3.11
  • Auditor: AgentRetrofit CI/CD
