OpenAI Compatible
Access 100+ models through OpenRouter, vLLM, LocalAI, and other OpenAI-compatible providers
Overview
Key Benefits
Supported Services
Service | Description | Best For
Quick Start
Option 1: OpenRouter (Recommended for Beginners)
1. Get OpenRouter API Key
2. Configure NeurosLink AI
3. Test Setup
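One way to verify the OpenRouter setup, independent of NeurosLink AI's own configuration API, is to hit OpenRouter's OpenAI-compatible REST endpoint directly. The sketch below uses only the Python standard library; the model ID `openai/gpt-4o-mini` and the `OPENROUTER_API_KEY` variable name are illustrative assumptions.

```python
import json
import os
import urllib.request

# OpenRouter exposes an OpenAI-compatible REST API at this base URL.
BASE_URL = "https://openrouter.ai/api/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a chat-completion request in the OpenAI-compatible wire format."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__":
    key = os.environ.get("OPENROUTER_API_KEY")  # assumed variable name
    if key:  # only touch the network when a key is actually configured
        req = build_chat_request("openai/gpt-4o-mini", "Say hello", key)
        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

If the request returns a `choices` array, the key and endpoint are working and the same base URL can be plugged into NeurosLink AI's configuration.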
Option 2: vLLM (Self-Hosted)
1. Install vLLM
2. Configure NeurosLink AI
3. Test Setup
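The vLLM steps above can be sketched as shell commands. The model name and port are placeholders, and `vllm serve` assumes a recent vLLM release (older versions use `python -m vllm.entrypoints.openai.api_server --model …` instead).

```shell
# Install vLLM (most models require a CUDA-capable GPU)
pip install vllm

# Start an OpenAI-compatible server; model name is an example placeholder
vllm serve Qwen/Qwen2.5-1.5B-Instruct --port 8000

# Smoke-test the endpoint from another terminal
curl http://localhost:8000/v1/models
```

A JSON list of model IDs from the `curl` call confirms the server is ready to receive NeurosLink AI traffic at `http://localhost:8000/v1`.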
Option 3: LocalAI (Privacy-Focused)
1. Install LocalAI
2. Configure NeurosLink AI
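For a quick LocalAI setup, the project's Docker image is the usual route. The image tag and port below follow LocalAI's published defaults but should be checked against the LocalAI docs for your version.

```shell
# Run LocalAI in the background; image name per the LocalAI project
docker run -d -p 8080:8080 --name localai localai/localai:latest

# LocalAI serves the OpenAI-compatible API on port 8080
curl http://localhost:8080/v1/models
```

Point NeurosLink AI's base URL at `http://localhost:8080/v1`; local servers typically accept any placeholder API key.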
Model Auto-Discovery
Discover Available Models
SDK Auto-Discovery
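Every OpenAI-compatible server advertises its models at `GET /v1/models`, which is what auto-discovery boils down to. A minimal stdlib sketch (the exact NeurosLink AI discovery call is not shown here):

```python
import json
import urllib.request

def parse_model_ids(payload: dict) -> list:
    """Extract model IDs from an OpenAI-compatible /v1/models response,
    which has the shape {"object": "list", "data": [{"id": ...}, ...]}."""
    return [m["id"] for m in payload.get("data", [])]

def list_models(base_url: str, api_key: str = None, timeout: float = 10.0) -> list:
    """Fetch and parse the model list from any OpenAI-compatible endpoint."""
    req = urllib.request.Request(f"{base_url.rstrip('/')}/models")
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return parse_model_ids(json.load(resp))
```

The same function works against OpenRouter (`https://openrouter.ai/api/v1`), a local vLLM instance, or LocalAI, since all three speak the same wire format.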
OpenRouter Integration
Available Models on OpenRouter
Model Selection by Provider
OpenRouter Features
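OpenRouter namespaces every model ID as `provider/model`, so selecting by provider is a prefix match. The catalog entries below are illustrative; check OpenRouter's live model list for current names.

```python
def models_by_provider(model_ids, provider):
    """Filter OpenRouter-style 'provider/model' IDs by provider prefix."""
    return [m for m in model_ids if m.split("/", 1)[0] == provider]

# Illustrative IDs -- consult OpenRouter's catalog for the current list
CATALOG = [
    "openai/gpt-4o",
    "anthropic/claude-3.5-sonnet",
    "meta-llama/llama-3.1-70b-instruct",
]
```

For example, `models_by_provider(CATALOG, "anthropic")` narrows the catalog to Anthropic-hosted models only.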
vLLM Integration
Starting vLLM Server
NeurosLink AI Configuration for vLLM
Multiple vLLM Instances
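Since each vLLM process serves one model, running several models means running several instances and routing by model name. A minimal routing table (ports and model names are assumptions for illustration):

```python
# Map each model to the vLLM instance that serves it.
# Ports and model names are illustrative placeholders.
VLLM_INSTANCES = {
    "Qwen/Qwen2.5-1.5B-Instruct": "http://localhost:8000/v1",
    "meta-llama/Llama-3.1-8B-Instruct": "http://localhost:8001/v1",
}

def base_url_for(model: str) -> str:
    """Return the base URL of the instance serving the requested model."""
    try:
        return VLLM_INSTANCES[model]
    except KeyError:
        raise ValueError(f"no vLLM instance serves {model!r}") from None
```

NeurosLink AI (or any client) can then direct each request to `base_url_for(model)` instead of a single fixed endpoint.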
SDK Integration
Basic Usage
With Model Selection
Streaming
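Under the hood, OpenAI-compatible streaming is Server-Sent Events: each `data:` line carries a JSON chunk whose `choices[0].delta.content` holds a text fragment, terminated by `data: [DONE]`. A sketch of the parsing side (the sample lines are hand-written in that format, not captured output):

```python
import json

def iter_stream_text(lines):
    """Yield text deltas from OpenAI-compatible SSE stream lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip comments, blank keep-alive lines, etc.
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            return  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]

# Hand-written sample in the OpenAI-compatible SSE format
sample = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print("".join(iter_stream_text(sample)))  # prints: Hello
```

An SDK's streaming mode wraps exactly this loop around the HTTP response body, yielding fragments as they arrive.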
Custom Headers
Error Handling
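The failure modes worth distinguishing are an HTTP error status (the server answered, e.g. 401 for a bad key) versus a transport failure (the server is unreachable). A stdlib sketch, not NeurosLink AI's own error API:

```python
import json
import urllib.error
import urllib.request

def safe_list_models(base_url: str, timeout: float = 5.0):
    """Return model IDs, or None if the endpoint errors or is unreachable."""
    try:
        url = f"{base_url.rstrip('/')}/models"
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return [m["id"] for m in json.load(resp).get("data", [])]
    except urllib.error.HTTPError as e:
        # Server responded with an error status, e.g. 401 for an invalid key
        print(f"HTTP {e.code} from {base_url}")
    except urllib.error.URLError as e:
        # Connection refused / DNS failure / timeout -- server likely not running
        print(f"cannot reach {base_url}: {e.reason}")
    return None
```

Catching `HTTPError` before `URLError` matters, since `HTTPError` is a subclass of `URLError`.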
CLI Usage
Basic Commands
OpenRouter-Specific Commands
Configuration Options
Environment Variables
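A typical environment-variable setup for OpenAI-compatible providers looks like the fragment below. The variable names are illustrative assumptions; check the NeurosLink AI configuration reference for the exact names it reads.

```shell
# Illustrative variable names -- confirm against the NeurosLink AI docs
export OPENROUTER_API_KEY="sk-or-your-key-here"               # OpenRouter key
export OPENAI_COMPATIBLE_BASE_URL="http://localhost:8000/v1"  # vLLM / LocalAI endpoint
export OPENAI_COMPATIBLE_API_KEY="not-needed-for-local"       # local servers often ignore this
```

Local servers such as vLLM and LocalAI generally accept any placeholder key, but most client SDKs still require the field to be set.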
Programmatic Configuration
Use Cases
1. Multi-Provider Access via OpenRouter
2. Self-Hosted Private Models
3. Cost Optimization
Troubleshooting
Common Issues
1. "Connection refused"
2. "Model not found"
3. "Invalid API key"
Best Practices
1. Model Discovery
2. Endpoint Health Checks
3. Cost Tracking
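The endpoint-health-check practice above reduces to probing `/v1/models` with a short timeout before routing traffic. A stdlib sketch (the function name is ours, not a NeurosLink AI API):

```python
import urllib.error
import urllib.request

def endpoint_healthy(base_url: str, timeout: float = 3.0) -> bool:
    """Probe an OpenAI-compatible endpoint; a 200 from /v1/models means it's up."""
    try:
        url = f"{base_url.rstrip('/')}/models"
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Running this periodically against each configured endpoint lets a client fail over to a healthy provider instead of timing out mid-request.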
Related Documentation
Additional Resources