# Security Best Practices

Comprehensive security guidelines for deploying and using Alactic AGI in production environments.
## API Key Security

### Secure Storage

**Never commit API keys to source control.**

```text
# .gitignore
.env
.env.local
config.ini
secrets.json
appsettings.json
```
**Use environment variables.**

```python
# Python
import os

api_key = os.getenv('ALACTIC_API_KEY')
```

```javascript
// Node.js
const apiKey = process.env.ALACTIC_API_KEY;
```

```csharp
// C#
var apiKey = Environment.GetEnvironmentVariable("ALACTIC_API_KEY");
```
**Azure Key Vault integration:**

```python
from azure.keyvault.secrets import SecretClient
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://your-vault.vault.azure.net/",
    credential=credential
)

api_key = client.get_secret("alactic-api-key").value
```
### Key Rotation

**Implement a regular rotation schedule:**
- Development keys: Every 30 days
- Production keys: Every 90 days
- After security incidents: Immediately
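The schedule above can be enforced with a simple age check. The sketch below is illustrative; `ROTATION_DAYS` and `rotation_due` are names invented here, not part of the Alactic AGI API.

```python
from datetime import datetime, timedelta, timezone

# Rotation windows from the schedule above, in days (hypothetical constants)
ROTATION_DAYS = {"development": 30, "production": 90}

def rotation_due(created_on: datetime, environment: str) -> bool:
    """Return True once a key is older than its rotation window."""
    max_age = timedelta(days=ROTATION_DAYS[environment])
    return datetime.now(timezone.utc) - created_on >= max_age
```

Run a check like this from a scheduled job and trigger the rotation procedure below when it returns `True`.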
**Rotation procedure:**
- Generate new key in Azure Portal
- Update Key Vault with new key
- Deploy updated configuration
- Monitor for errors
- Revoke old key after 24 hours
**Automated rotation script** (sketch; `generate_new_alactic_key()` is a placeholder for your key-generation call):

```python
import datetime

from azure.keyvault.secrets import SecretClient
from azure.identity import DefaultAzureCredential

def rotate_api_key(vault_url, secret_name):
    credential = DefaultAzureCredential()
    client = SecretClient(vault_url=vault_url, credential=credential)

    # Get current key
    current_secret = client.get_secret(secret_name)

    # Generate new key (placeholder for the Azure Portal / management API call)
    new_key = generate_new_alactic_key()

    # Store new key as the latest secret version
    client.set_secret(secret_name, new_key)

    # Expire the old version after 24 hours; the update must be sent back to
    # Key Vault (mutating the local properties object alone has no effect)
    expiry_date = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(days=1)
    client.update_secret_properties(
        secret_name,
        version=current_secret.properties.version,
        expires_on=expiry_date,
    )

    print(f"Key rotated successfully. Old key expires: {expiry_date}")
```
### Access Control

**Separate keys per environment:**

```shell
# Development
ALACTIC_API_KEY_DEV=dev_key_xxxxx

# Staging
ALACTIC_API_KEY_STAGING=staging_key_xxxxx

# Production
ALACTIC_API_KEY_PROD=prod_key_xxxxx
```
**Role-based access:**
- Developers: Development keys only
- QA Team: Staging keys only
- Operations: Production keys (limited access)
- Automated systems: Managed identities
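The role rules above can be captured in a small lookup table. This is a sketch; `ROLE_KEY_ACCESS` and `can_use_key` are hypothetical names for whatever authorization layer your deployment uses.

```python
# Hypothetical role-to-environment mapping enforcing the access rules above
ROLE_KEY_ACCESS = {
    "developer": {"development"},
    "qa": {"staging"},
    "operations": {"production"},
}

def can_use_key(role: str, environment: str) -> bool:
    """Check whether a role may retrieve keys for a given environment."""
    return environment in ROLE_KEY_ACCESS.get(role, set())
```

Unknown roles deliberately fall through to an empty set, so access defaults to denied.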
## Network Security

### HTTPS Only

All API communications must use HTTPS.

```python
import requests

session = requests.Session()
session.verify = True  # TLS certificate verification (the default; never disable it)

# Make request over HTTPS only
response = session.post(
    'https://your-deployment.azurewebsites.net/api/process',
    headers={'Authorization': f'Bearer {api_key}'},
    json=payload
)
```
### IP Whitelisting

Restrict API access to known IP addresses.

**Azure App Service configuration:**

```shell
az webapp config access-restriction add \
  --resource-group your-rg \
  --name your-app \
  --rule-name office-network \
  --action Allow \
  --ip-address 203.0.113.0/24 \
  --priority 100
```
**Application-level validation:**

```python
import ipaddress

from flask import request, abort

ALLOWED_IPS = ['203.0.113.0/24', '198.51.100.0/24']

def is_ip_allowed(client_ip, allowed_ranges):
    """Check the client address against the allowed CIDR ranges."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(cidr) for cidr in allowed_ranges)

def check_ip_whitelist():
    client_ip = request.remote_addr
    if not is_ip_allowed(client_ip, ALLOWED_IPS):
        abort(403)
```
### Azure Private Link

Connect to Alactic AGI over a private network.

```shell
# Create private endpoint
az network private-endpoint create \
  --name alactic-private-endpoint \
  --resource-group your-rg \
  --vnet-name your-vnet \
  --subnet your-subnet \
  --private-connection-resource-id /subscriptions/{sub}/resourceGroups/{rg}/providers/Microsoft.Web/sites/{app} \
  --connection-name alactic-connection
```
## Data Security

### Encryption in Transit

All data transmitted to/from Alactic AGI is encrypted using TLS 1.2+. To enforce a TLS 1.2 minimum on the client side, the SSL context must be attached to the transport adapter — creating a context on its own does not affect `requests`:

```python
import ssl

import requests
from requests.adapters import HTTPAdapter

class TLS12Adapter(HTTPAdapter):
    """Transport adapter that enforces TLS 1.2 as the minimum version."""

    def init_poolmanager(self, *args, **kwargs):
        context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        context.minimum_version = ssl.TLSVersion.TLSv1_2
        context.load_default_certs()
        kwargs['ssl_context'] = context
        return super().init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount('https://', TLS12Adapter(
    max_retries=3,
    pool_connections=10,
    pool_maxsize=10
))
```
### Encryption at Rest

**Azure Storage encryption**

All documents stored in Azure Storage are automatically encrypted using Microsoft-managed keys.

**Customer-managed keys:**

```shell
# Create Key Vault key
az keyvault key create \
  --vault-name your-vault \
  --name storage-encryption-key \
  --protection software

# Configure storage account
az storage account update \
  --name yourstorageaccount \
  --resource-group your-rg \
  --encryption-key-source Microsoft.Keyvault \
  --encryption-key-vault https://your-vault.vault.azure.net \
  --encryption-key-name storage-encryption-key
```
### Data Sanitization

**Remove sensitive information before processing:**

```python
import re

def sanitize_document(text):
    # Remove credit card numbers
    text = re.sub(r'\b\d{4}[\s-]?\d{4}[\s-]?\d{4}[\s-]?\d{4}\b', '[REDACTED]', text)
    # Remove SSNs
    text = re.sub(r'\b\d{3}-\d{2}-\d{4}\b', '[REDACTED]', text)
    # Remove email addresses
    text = re.sub(r'\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}\b', '[REDACTED]', text)
    return text

# Use before processing
sanitized_text = sanitize_document(original_text)
result = client.process_text(sanitized_text)
```
## Authentication Best Practices

### Managed Identity

Use Azure Managed Identity instead of API keys when possible.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Automatically uses managed identity when running in Azure
credential = DefaultAzureCredential()

vault_client = SecretClient(
    vault_url="https://your-vault.vault.azure.net/",
    credential=credential
)

api_key = vault_client.get_secret("alactic-api-key").value
```
**Enable managed identity:**

```shell
az webapp identity assign \
  --name your-app \
  --resource-group your-rg
```
### Azure AD Integration

Implement Azure AD authentication for user access.

```python
from msal import ConfidentialClientApplication

app = ConfidentialClientApplication(
    client_id="your-client-id",
    client_credential="your-client-secret",
    authority="https://login.microsoftonline.com/your-tenant-id"
)

# Acquire token
result = app.acquire_token_for_client(
    scopes=["https://your-deployment.azurewebsites.net/.default"]
)

if "access_token" in result:
    access_token = result["access_token"]
    # Use token for API calls
```
## Input Validation

### File Type Validation

**Validate file types before processing:**

```python
import magic
from werkzeug.utils import secure_filename

ALLOWED_EXTENSIONS = {'pdf', 'docx', 'txt'}
ALLOWED_MIMES = {
    'application/pdf',
    'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
    'text/plain',  # matches the 'txt' extension above
}
MAX_FILE_SIZE = 10 * 1024 * 1024  # 10 MB

def validate_file(file):
    # Check filename
    filename = secure_filename(file.filename)
    if not filename or '.' not in filename:
        raise ValueError("Invalid filename")

    ext = filename.rsplit('.', 1)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        raise ValueError(f"File type not allowed: {ext}")

    # Check file size
    file.seek(0, 2)  # Seek to end
    size = file.tell()
    file.seek(0)  # Reset
    if size > MAX_FILE_SIZE:
        raise ValueError("File too large")

    # Check MIME type from the file content, not just the extension
    mime = magic.from_buffer(file.read(1024), mime=True)
    file.seek(0)
    if mime not in ALLOWED_MIMES:
        raise ValueError(f"Invalid MIME type: {mime}")

    return True
```
### Content Sanitization

**Prevent injection attacks:**

```python
import bleach

def sanitize_input(user_input):
    # Remove HTML tags
    clean_input = bleach.clean(user_input, tags=[], strip=True)
    # Remove null bytes
    clean_input = clean_input.replace('\x00', '')
    # Limit length
    max_length = 10000
    if len(clean_input) > max_length:
        clean_input = clean_input[:max_length]
    return clean_input
```
## Monitoring and Auditing

### Audit Logging

**Log all API access:**

```python
import logging
from datetime import datetime

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

def log_api_access(user_id, action, resource, result):
    logger.info(
        f"API Access - User: {user_id}, Action: {action}, "
        f"Resource: {resource}, Result: {result}, "
        f"Timestamp: {datetime.now().isoformat()}"
    )

# Usage
log_api_access(
    user_id="user@domain.com",
    action="process_document",
    resource="contract.pdf",
    result="success"
)
```
### Azure Monitor Integration

**Send logs to Azure Monitor:**

```python
import logging
from datetime import datetime

from opencensus.ext.azure.log_exporter import AzureLogHandler

logger = logging.getLogger(__name__)
logger.addHandler(AzureLogHandler(
    connection_string='InstrumentationKey=your-key'
))

# Log security events (user_id and client_ip come from your request context)
logger.warning(
    "Failed authentication attempt",
    extra={
        'custom_dimensions': {
            'user_id': user_id,
            'ip_address': client_ip,
            'timestamp': datetime.now().isoformat()
        }
    }
)
```
### Alert Configuration

**Set up security alerts:**

```shell
# Create alert for failed authentications
# (--action references an action group, which routes notifications to email)
az monitor metrics alert create \
  --name failed-auth-alert \
  --resource-group your-rg \
  --scopes /subscriptions/{sub}/resourceGroups/{rg}/providers/Microsoft.Web/sites/{app} \
  --condition "count failed_authentications > 10" \
  --window-size 5m \
  --evaluation-frequency 1m \
  --action your-action-group
```
## Compliance

### Data Residency

Ensure data stays in the required geographic region.

```python
from azure.mgmt.web import WebSiteManagementClient

# Configure the required Azure region
AZURE_REGION = 'eastus'  # or required region

# Verify deployment region (credential, subscription_id, resource_group,
# and app_name come from your deployment configuration)
client = WebSiteManagementClient(credential, subscription_id)
app = client.web_apps.get(resource_group, app_name)

if app.location != AZURE_REGION:
    raise ValueError(f"App not in required region: {AZURE_REGION}")
```
### Data Retention

**Implement retention policies:**

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 90

def cleanup_old_documents():
    cutoff_date = datetime.now() - timedelta(days=RETENTION_DAYS)

    # Query documents older than the retention window
    # (query_documents and delete_document are application-specific helpers)
    old_docs = query_documents(created_before=cutoff_date)

    for doc in old_docs:
        delete_document(doc.id)
        logger.info(f"Deleted document {doc.id} (retention policy)")
```
### GDPR Compliance

**Implement the right to deletion:**

```python
def delete_user_data(user_id):
    # Delete all documents
    documents = get_user_documents(user_id)
    for doc in documents:
        delete_document(doc.id)

    # Delete processing history
    delete_processing_history(user_id)

    # Delete stored results
    delete_results(user_id)

    # Log deletion
    logger.info(f"Deleted all data for user {user_id} (GDPR request)")
    return True
```
## Incident Response

### Security Incident Procedure
- Detect: Monitor alerts and logs
- Contain: Revoke compromised keys immediately
- Investigate: Review access logs and audit trails
- Remediate: Fix vulnerabilities and update systems
- Document: Record incident details and response
- Review: Analyze and improve security measures
### Key Compromise Response

```python
from datetime import datetime

# Sketch of a response runbook; the helper functions are application-specific.
def respond_to_key_compromise(compromised_key):
    # 1. Revoke key immediately
    revoke_api_key(compromised_key)

    # 2. Generate new key
    new_key = generate_new_key()

    # 3. Update Key Vault
    update_key_vault(new_key)

    # 4. Notify administrators
    send_security_alert(
        "API key compromised and rotated",
        severity="high"
    )

    # 5. Audit recent access
    suspicious_access = audit_key_usage(
        compromised_key,
        last_hours=24
    )

    # 6. Document incident
    log_security_incident({
        'type': 'key_compromise',
        'key': compromised_key,
        'timestamp': datetime.now(),
        'actions_taken': ['revoked', 'rotated', 'audited']
    })
```
## Security Checklist

### Pre-Production
- API keys stored in Key Vault
- HTTPS enforced for all endpoints
- IP whitelisting configured
- Input validation implemented
- File type restrictions enabled
- Managed identity configured
- Audit logging enabled
- Azure Monitor configured
- Security alerts set up
- Incident response plan documented
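A few of the checklist items above can be verified programmatically at deploy time. The sketch below is a minimal example; the `ALACTIC_API_KEY` and `ALACTIC_ENDPOINT` variable names are assumptions about your configuration, not part of the product.

```python
import os

def preflight_checks() -> list:
    """Verify a few pre-production checklist items; return failure messages."""
    failures = []

    # Key must be injected via environment / Key Vault, never hard-coded
    if not os.getenv("ALACTIC_API_KEY"):
        failures.append("API key missing from environment / Key Vault lookup")

    # Endpoint must enforce HTTPS
    endpoint = os.getenv("ALACTIC_ENDPOINT", "")
    if not endpoint.startswith("https://"):
        failures.append("endpoint does not enforce HTTPS")

    return failures
```

Fail the deployment if `preflight_checks()` returns a non-empty list.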
### Regular Maintenance
- Rotate API keys (90 days)
- Review access logs (weekly)
- Update dependencies (monthly)
- Security vulnerability scan (monthly)
- Review user permissions (quarterly)
- Disaster recovery test (quarterly)
- Security audit (annually)
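The cadence above translates directly into due-date arithmetic, which a scheduled job can evaluate. This is illustrative; `MAINTENANCE_INTERVALS` and `next_due` are names invented for this sketch.

```python
from datetime import date, timedelta

# Review intervals in days, mirroring the cadence above (illustrative)
MAINTENANCE_INTERVALS = {
    "rotate_api_keys": 90,
    "review_access_logs": 7,
    "update_dependencies": 30,
    "vulnerability_scan": 30,
    "review_permissions": 90,
    "disaster_recovery_test": 90,
    "security_audit": 365,
}

def next_due(task: str, last_done: date) -> date:
    """Return the date a maintenance task is next due."""
    return last_done + timedelta(days=MAINTENANCE_INTERVALS[task])
```

Tasks whose `next_due` date is in the past are overdue and should be flagged to the operations team.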