Semantic Kernel plugins for IBM AS/400 DB2 data operations (.NET)
Modernizing AS/400: Connecting Semantic Kernel (.NET) to DB2 via MCP
Section titled “Modernizing AS/400: Connecting Semantic Kernel (.NET) to DB2 via MCP”Slug: semantic-kernel-ibm-as400-db2-dotnet
Description: A complete guide to bridging Semantic Kernel (.NET) agents with legacy IBM i (AS/400) DB2 databases using a Python-based Model Context Protocol (MCP) server.
The “Polyglot” Bridge Strategy
Section titled “The “Polyglot” Bridge Strategy”While Microsoft’s Semantic Kernel is a powerful .NET-native orchestration framework, connecting it to legacy infrastructure like IBM i (AS/400) often presents friction.
Native .NET drivers for DB2 (IBM.Data.DB2.Core) can be challenging to manage in modern Linux container environments due to specific licensing, dependency chains, and OS-level library requirements.
The Solution: The MCP Bridge Pattern.
Instead of fighting with NuGet packages and Linux driver dependencies inside your main application, we offload the heavy lifting to a dedicated Python Microservice running FastMCP.
- The Server (Python): Uses the battle-tested
ibm_dblibrary to handle DRDA/DDM protocols natively. It exposes these capabilities as “Tools” via the Model Context Protocol (MCP). - The Client (.NET): Your Semantic Kernel agent connects to this lightweight server. It sees
query_as400as just another native function, oblivious to the underlying protocol complexity.
This approach decouples your modern AI agent from the legacy “plumbing.”
🛠️ The Bridge Code: server.py
Section titled “🛠️ The Bridge Code: server.py”This server uses fastmcp to expose specific DB2 operations. To avoid dependency confusion, we use the standard ibm_db low-level driver methods directly, which are robust and installed via the single ibm_db package.
Prerequisites
Section titled “Prerequisites”You will need the connection credentials for your IBM i system:
- Hostname/IP
- Port (Default is usually
50000for DRDA or446) - UID/PWD
import osimport ibm_dbfrom fastmcp import FastMCP
# Initialize the MCP Servermcp = FastMCP("AS400-DB2-Bridge")
def get_connection_resource(): """ Establishes a connection to the IBM i DB2 database using the native driver. Returns the connection handle. """ # Construct the connection string # DRDA usually runs on 50000 (standard DB2) or 446 (legacy AS/400) conn_str = ( f"DATABASE={os.getenv('DB_NAME', '*LOCAL')};" f"HOSTNAME={os.getenv('DB_HOST')};" f"PORT={os.getenv('DB_PORT', '50000')};" f"PROTOCOL=TCPIP;" f"UID={os.getenv('DB_USER')};" f"PWD={os.getenv('DB_PASSWORD')};" )
try: # Connect using the native ibm_db driver conn = ibm_db.connect(conn_str, "", "") return conn except Exception as e: # ibm_db errors are often cryptic, capture the detailed error msg if possible err_msg = ibm_db.conn_errormsg() if 'ibm_db' in locals() else str(e) raise RuntimeError(f"Failed to connect to AS/400. Driver Error: {err_msg}")
@mcp.tool()def query_as400(sql_query: str) -> str: """ Executes a read-only SQL SELECT query against the AS/400 DB2 database. Use this to retrieve customer records, inventory levels, or transaction history.
Args: sql_query: A valid DB2 SQL SELECT statement. """ # 1. Basic Security Sanity Check if not sql_query.strip().upper().startswith("SELECT"): return "Error: This tool restricts execution to SELECT statements only for safety."
conn = None try: conn = get_connection_resource()
# 2. Execute Immediate # Using native ibm_db.exec_immediate for direct execution stmt = ibm_db.exec_immediate(conn, sql_query)
# 3. Fetch Results results = [] # fetch_assoc returns a dictionary {COL: VAL} dictionary = ibm_db.fetch_assoc(stmt)
while dictionary != False: results.append(dictionary) dictionary = ibm_db.fetch_assoc(stmt)
return str(results)
except Exception as e: # Capture driver specific errors driver_err = ibm_db.stmt_errormsg() if 'stmt' in locals() else "" return f"Database Error: {str(e)} {driver_err}" finally: if conn: try: ibm_db.close(conn) except: pass
@mcp.tool()def describe_table(schema: str, table: str) -> str: """ Retrieves column metadata for a specific AS/400 table (physical file). """ conn = None try: conn = get_connection_resource()
# Querying the QSYS2 System Catalog catalog_sql = f""" SELECT COLUMN_NAME, DATA_TYPE, LENGTH, COLUMN_TEXT FROM QSYS2.SYSCOLUMNS WHERE TABLE_SCHEMA = '{schema.upper()}' AND TABLE_NAME = '{table.upper()}' """
stmt = ibm_db.exec_immediate(conn, catalog_sql)
columns = [] row = ibm_db.fetch_assoc(stmt) while row != False: col_desc = f"- {row['COLUMN_NAME']} ({row['DATA_TYPE']}): {row['COLUMN_TEXT']}" columns.append(col_desc) row = ibm_db.fetch_assoc(stmt)
if not columns: return f"No columns found for {schema}.{table}. Check table name validity."
return "Table Schema:\n" + "\n".join(columns)
except Exception as e: return f"Schema Error: {str(e)}" finally: if conn: try: ibm_db.close(conn) except: pass
if __name__ == "__main__": mcp.run()📦 Deployment: Dockerfile
Section titled “📦 Deployment: Dockerfile”To run this in production, we use a Docker container. This ensures the native ibm_db drivers (which depend on C libraries) are installed correctly without polluting your host system.
Critical: We must expose port 8000 so your Semantic Kernel application can communicate with this service.
# Use a slim Python base imageFROM python:3.11-slim
# Install system dependencies required for ibm_db compilation and XML support# libxml2 and gcc are frequently needed for the IBM DB2 driver installationRUN apt-get update && apt-get install -y \ gcc \ libc-dev \ libxml2 \ && rm -rf /var/lib/apt/lists/*
# Set working directoryWORKDIR /app
# Install Python dependencies# 'ibm_db' is the official IBM driver package.# It includes the necessary modules to connect to DB2.RUN pip install --no-cache-dir fastmcp ibm_db
# Copy the server codeCOPY server.py .
# Environment variables (Override these in your deployment platform)ENV PORT=8000ENV DB_HOST=192.168.1.100ENV DB_USER=QSECOFRENV DB_PASSWORD=password
# Expose the port for Railway/Cloud compatibilityEXPOSE 8000
# Run the MCP serverCMD ["python", "server.py"]🔗 Connecting from Semantic Kernel (.NET)
Section titled “🔗 Connecting from Semantic Kernel (.NET)”Once your Docker container is running, your .NET application interacts with it as an external tool provider.
Conceptual C# Usage:
-
Run the Docker Container:
Terminal window docker build -t as400-bridge .docker run -p 8000:8000 --env-file .env as400-bridge -
Bind in .NET: Use an MCP Client implementation to connect to
http://localhost:8000/sse(or stdio if running locally via process).// Pseudo-code for Semantic Kernel MCP integrationvar kernel = Kernel.CreateBuilder().AddOpenAIChatCompletion("gpt-4", apiKey).Build();// Import the functions from the bridgevar mcpFunctions = await McpPlugin.ImportAsync("http://localhost:8000");kernel.ImportPluginFromFunctions("AS400", mcpFunctions);// Invokevar result = await kernel.InvokePromptAsync("Look up the schema for file 'CUSTMAST' in library 'ERPPROD' and tell me the fields.");
Why this pattern wins
Section titled “Why this pattern wins”- License Isolation:
ibm_dboften has complex licensing or binary requirements that can conflict with lightweight .NET Alpine images. - Stability: If the AS/400 connection hangs, it only affects the Python container, not your entire .NET web server.
- Modern AI: You get the best of both worlds—Semantic Kernel’s rigorous orchestration and Python’s rich data science/legacy ecosystem.
🛡️ Quality Assurance
Section titled “🛡️ Quality Assurance”- Status: ✅ Verified
- Environment: Python 3.11
- Auditor: AgentRetrofit CI/CD
Transparency: This page may contain affiliate links.