This blog post explores how to integrate an AI tool into a .NET Core application, using a case study of a writing assistant app — WriteIT. We’ll cover integration strategies, security best practices, and architectural decisions that make AI integration both effective and secure.
Step 1: Why Integrate AI into Your .NET Core App?
Integrating AI unlocks powerful features like:
| Use Case | Description |
| --- | --- |
| NLP Tasks | Spell check, tone adjustment, grammar correction |
| Smart Suggestions | Autocomplete, content summaries, writing prompts |
| Multimedia Support | Voice-to-text (via Whisper), image generation (via DALL·E) |
| Conversational UIs | Chatbots or AI writing assistants |
Step 2: Architecture Overview
Here’s a clean, scalable architecture for integrating AI:
Frontend (Blazor / React)
↓
ASP.NET Core Web API
↓
AI Integration Layer (AiService)
↓
External AI Provider (OpenAI)
↓
Response returned to frontend
Step 3: Building the AI Integration Layer
📁 AiService.cs — Central Service to Call OpenAI
```csharp
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

public class AiService
{
    private readonly HttpClient _httpClient;
    private readonly string _apiKey;

    public AiService(IConfiguration config)
    {
        _httpClient = new HttpClient();
        _apiKey = config["OpenAI:ApiKey"];
    }

    public async Task<string> GenerateTextAsync(string prompt)
    {
        // GPT-4 is a chat model, so we call the chat completions endpoint.
        var request = new HttpRequestMessage(HttpMethod.Post, "https://api.openai.com/v1/chat/completions");
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", _apiKey);

        var payload = new
        {
            model = "gpt-4",
            messages = new[] { new { role = "user", content = prompt } },
            max_tokens = 100
        };
        request.Content = new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json");

        var response = await _httpClient.SendAsync(request);
        response.EnsureSuccessStatusCode();

        // Returns the raw JSON response; parse out choices[0].message.content as needed.
        var result = await response.Content.ReadAsStringAsync();
        return result;
    }
}
```
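The controller in Step 4 injects AiService, so it also needs to be registered with the DI container. A minimal sketch (a singleton registration here; a typed HttpClient via AddHttpClient&lt;AiService&gt;() would be the more idiomatic production setup):

```csharp
// Program.cs: make AiService available for constructor injection.
// (Minimal sketch; in production, prefer a typed client registered with
// builder.Services.AddHttpClient<AiService>() so HttpClient lifetimes are managed for you.)
builder.Services.AddSingleton<AiService>();
```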
Step 4: Securing the Integration
🔐 Best Practices
| Area | Strategy |
| --- | --- |
| Authentication | Use JWT + ASP.NET Core Identity |
| Authorization | Secure endpoints using [Authorize] |
| Secrets | Store API keys in Azure Key Vault |
| Input Validation | Clean user input to prevent prompt injection |
| Rate Limiting | Use ASP.NET Core middleware or third-party libraries like AspNetCoreRateLimit |
📦 Example: Securing Endpoints

```csharp
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

[Authorize]
[ApiController]
[Route("api/[controller]")]
public class AiController : ControllerBase
{
    private readonly AiService _aiService;

    public AiController(AiService aiService)
    {
        _aiService = aiService;
    }

    [HttpPost("generate")]
    public async Task<IActionResult> Generate([FromBody] AiPromptRequest request)
    {
        // Basic input validation: reject empty or overly long prompts.
        if (string.IsNullOrWhiteSpace(request.Prompt) || request.Prompt.Length > 500)
            return BadRequest("Invalid prompt");

        var result = await _aiService.GenerateTextAsync(request.Prompt);
        return Ok(result);
    }
}
```
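The AiPromptRequest type used above isn't shown in the original snippet; a minimal assumed shape (the Prompt property is the only member the controller relies on):

```csharp
// Request body for POST api/ai/generate (assumed minimal shape).
public class AiPromptRequest
{
    public string Prompt { get; set; } = string.Empty;
}
```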
Step 5: Background Processing for Long-Running Tasks
Use BackgroundService for jobs like bulk content generation, summarization, etc.
🧠 AiBackgroundTask.cs
```csharp
public class AiBackgroundTask : BackgroundService
{
    private readonly ILogger<AiBackgroundTask> _logger;
    private readonly IServiceProvider _services;

    public AiBackgroundTask(ILogger<AiBackgroundTask> logger, IServiceProvider services)
    {
        _logger = logger;
        _services = services;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            using (var scope = _services.CreateScope())
            {
                // Pull queued tasks from DB or cache
                // Call AI service
                // Save results back to DB
            }

            await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
        }
    }
}
```
Register it in Program.cs:
builder.Services.AddHostedService<AiBackgroundTask>();
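The loop above needs something to pull work from. The post leaves the queue implementation open; one lightweight option is an in-memory queue built on System.Threading.Channels (a hypothetical AiJobQueue, not part of the original code):

```csharp
using System.Threading.Channels;

// Hypothetical in-memory job queue the background task could drain.
// Register as a singleton; swap for a DB- or Redis-backed queue if jobs must survive restarts.
public class AiJobQueue
{
    private readonly Channel<string> _prompts = Channel.CreateUnbounded<string>();

    public ValueTask EnqueueAsync(string prompt) => _prompts.Writer.WriteAsync(prompt);

    public ValueTask<string> DequeueAsync(CancellationToken ct) => _prompts.Reader.ReadAsync(ct);
}
```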
Step 6: Testing & Monitoring
🧪 Unit Testing AI Integration
- Use HttpClientFactory and mock HTTP responses (see the test sketch after this list)
- Test prompt generation logic and error handling
- Validate security (e.g., JWT tokens, input length, empty prompts)
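For example, a minimal xUnit sketch that fakes the outgoing HTTP call. It assumes an AiService constructor overload that accepts an HttpClient and an API key for testability, which is not shown in the earlier snippet:

```csharp
using System.Net;
using Xunit;

public class AiServiceTests
{
    // Fake handler that returns a canned OpenAI-style response instead of hitting the real API.
    private sealed class FakeHandler : HttpMessageHandler
    {
        protected override Task<HttpResponseMessage> SendAsync(
            HttpRequestMessage request, CancellationToken cancellationToken) =>
            Task.FromResult(new HttpResponseMessage(HttpStatusCode.OK)
            {
                Content = new StringContent(
                    "{\"choices\":[{\"message\":{\"content\":\"Hello from the fake\"}}]}")
            });
    }

    [Fact]
    public async Task GenerateTextAsync_ReturnsResponseBody()
    {
        var client = new HttpClient(new FakeHandler());
        var service = new AiService(client, "test-key"); // hypothetical ctor taking HttpClient + key

        var result = await service.GenerateTextAsync("Hi");

        Assert.Contains("Hello from the fake", result);
    }
}
```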
📈 Monitoring & Observability
| Tool | Use |
| --- | --- |
| Application Insights | Logs, metrics, performance data |
| Health Checks | Monitor OpenAI availability |
| Alerts | Notify on API failures or latency |
💡 Add telemetry when AI calls fail or take too long:
```csharp
_telemetry.TrackEvent("AI_Call_Slow", new Dictionary<string, string>
{
    { "PromptLength", prompt.Length.ToString() }
});
```
Step 7: Deployment
Deploying .NET Core Apps with AI Integration on Azure
🔹 1. Prepare for Production Deployment
✔️ Clean & Harden the Code
- Remove test routes, debugging logs, and unused services.
- Enforce HTTPS redirection and add security headers in middleware (see the sketch after this list).
- Sanitize user input to avoid prompt injection (especially when working with AI).
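A hedged sketch of the hardening middleware mentioned above (the header names are standard, but tune the values to your app):

```csharp
// Program.cs: enforce HTTPS and add a few common security headers.
app.UseHsts();
app.UseHttpsRedirection();

app.Use(async (context, next) =>
{
    context.Response.Headers["X-Content-Type-Options"] = "nosniff";
    context.Response.Headers["X-Frame-Options"] = "DENY";
    context.Response.Headers["Referrer-Policy"] = "no-referrer";
    await next();
});
```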
✔️ Add Environment-Specific Settings
- Use appsettings.Development.json and appsettings.Production.json for environment-specific configs.
- Use IWebHostEnvironment in code to detect the environment and handle logic accordingly.
if (_env.IsDevelopment()) { … }
else { … }
🔹 2. Use Azure App Service for Hosting
🏗️ Create App Service
- Go to Azure Portal → App Services → Create a new Web App.
- Choose the appropriate resource group and pricing tier (B1 is fine to start; pick Standard or Premium, e.g. S1 or P1V2, if you need autoscaling).
⚙️ Configure App Service Settings
- Set your .NET Core version (e.g., .NET 6/7/8).
- Enable deployment slots (e.g., Staging & Production) for safe rollout.
🔹 3. Secure Your Secrets with Azure Key Vault
🔐 Why Use Key Vault?
Hardcoding secrets like OpenAI API keys is insecure. Azure Key Vault protects them.
📌 Steps:
- Create a Key Vault resource in Azure.
- Add a secret like OpenAI-API-Key.
- Assign your App Service a Managed Identity.
- Grant that identity Key Vault Secrets User access.
- Fetch secrets in Program.cs:
```csharp
builder.Configuration.AddAzureKeyVault(
    new Uri("https://<YourVault>.vault.azure.net/"),
    new DefaultAzureCredential());
```
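Once the provider is registered, the secret is available through IConfiguration like any other setting (the key name matches the secret created above):

```csharp
// Reads the Key Vault secret named "OpenAI-API-Key" via the configuration system.
var openAiKey = builder.Configuration["OpenAI-API-Key"];
```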
🔹 4. Use Environment Variables in CI/CD Pipelines
🧪 Why?
Store config keys and secrets outside the codebase during automated builds/deployments.
🛠️ Tools:
- GitHub Actions
- Azure DevOps
- GitLab CI
📋 Example: GitHub Actions Workflow
```yaml
env:
  OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
  ASPNETCORE_ENVIRONMENT: 'Production'

steps:
  - uses: actions/setup-dotnet@v3
    with:
      dotnet-version: '8.0.x'
  - run: dotnet publish -c Release
  - run: az webapp deploy …
```
🔹 5. Enable Autoscaling and Performance Tuning
⚙️ Autoscale
- Use Azure App Service Plan with Scaling rules:
- Scale out on high CPU usage (e.g., >70%)
- Scale based on request count or memory
💡 Tips:
- Use Premium plan (P1V2 or higher) for better performance.
- Set min & max instance count to control cost.
🔹 6. Logging, Monitoring & Health Checks
📊 Application Insights
- Enable from Azure App Service settings.
- Automatically logs HTTP requests, exceptions, dependencies (like OpenAI), etc.
🧪 Health Checks
Add a /health endpoint to check dependencies like DB or AI provider.
app.MapHealthChecks("/health");
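Registering the checks might look like this (a sketch: the delegate check is a placeholder; real dependency probes for the OpenAI API or your database can be added here, for instance via packages such as AspNetCore.HealthChecks.Uris):

```csharp
using Microsoft.Extensions.Diagnostics.HealthChecks;

// Program.cs: register the checks that the /health endpoint will report on.
builder.Services.AddHealthChecks()
    .AddCheck("self", () => HealthCheckResult.Healthy("API is responding"));
```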
🔔 Alerts
Set Azure Alerts:
- When OpenAI API fails (track dependency failures)
- When response time exceeds threshold
🔹 7. Set Up Rate Limiting & Throttling
Avoid overuse or abuse of your AI endpoints.
🔧 Use AspNetCoreRateLimit NuGet package
Install-Package AspNetCoreRateLimit
Sample Config:
```json
"IpRateLimiting": {
  "EnableEndpointRateLimiting": true,
  "StackBlockedRequests": false,
  "RealIpHeader": "X-Real-IP",
  "ClientIdHeader": "X-ClientId",
  "HttpStatusCode": 429,
  "GeneralRules": [
    {
      "Endpoint": "*",
      "Period": "1m",
      "Limit": 10
    }
  ]
}
```
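The config above only takes effect once the services and middleware are wired up in Program.cs; a sketch following the AspNetCoreRateLimit setup pattern:

```csharp
using AspNetCoreRateLimit;

// Program.cs: bind the "IpRateLimiting" section and enable the middleware.
builder.Services.AddMemoryCache();
builder.Services.Configure<IpRateLimitOptions>(builder.Configuration.GetSection("IpRateLimiting"));
builder.Services.AddInMemoryRateLimiting();
builder.Services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();

var app = builder.Build();
app.UseIpRateLimiting();
```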
🔹 8. Blue-Green Deployment (Optional but Recommended)
🟩 What is it?
Deploy to a staging slot (Green) while production (Blue) runs normally. Swap after testing.
⚙️ Azure Setup:
- Add a Deployment Slot.
- Deploy new version to Staging.
- Validate manually or via smoke tests, then use the Swap feature in the Azure Portal.
🔚 Final Notes
| Task | Tool |
| --- | --- |
| Hosting | Azure App Service |
| Secrets | Azure Key Vault |
| CI/CD | GitHub Actions / Azure DevOps |
| Monitoring | Application Insights |
| Autoscaling | Azure Scale Rules |
| Security | HTTPS, Rate Limiting, Authorization |
📝 Final Thoughts
Integrating AI into a .NET Core app isn’t just about calling an API — it’s about building secure, scalable, and user-friendly experiences powered by intelligent services. With careful architecture and robust security, you can bring powerful language models like GPT-4 into your application in a reliable way.