Chat Widget
Embed a voice and text chat widget on any website with a single script tag. Same AI assistant, same natural conversation - right in your web app.
Quick Start
Add these two script tags to your HTML, just before the closing </body> tag:
```html
<!-- Add to your HTML -->
<script>
  window.voiceRailConfig = {
    assistantId: 'your-assistant-id',
    organizationId: 'your-organization-id'
  };
</script>
<script src="https://voicerail.ai/widget/voicerail-widget.js" async></script>
```
That's it! A chat bubble appears in the bottom-right corner. Users can click to open the widget and start a voice or text conversation with your AI assistant.
Features
Voice Mode
Tap to speak or enable voice activity detection for hands-free conversations. Uses your device microphone with real-time streaming.
Text Mode
Type messages when voice isn't practical. Toggle between modes at any time. Full chat history preserved.
Same AI Pipeline
Uses the same ConversationEngine as phone calls. Full persistence, billing, and conversation history.
Lightweight
Single 53KB (gzipped) JavaScript file with no external dependencies. CSS is inlined for zero layout shift.
Configuration Options
Customize the widget appearance and behavior through the config object:
```html
<script>
  window.voiceRailConfig = {
    // Required
    assistantId: 'asst_xxxx',
    organizationId: 'org_xxxx',

    // Appearance
    position: 'bottom-right',        // 'bottom-right' | 'bottom-left' | 'top-right' | 'top-left'
    theme: 'auto',                   // 'light' | 'dark' | 'auto'
    primaryColor: '#3b82f6',         // Any hex color
    bubbleSize: 60,                  // Pixels
    bubbleIcon: 'chat',              // 'chat' | 'mic' | 'headphones'

    // Behavior
    autoOpen: false,                 // Open widget on page load
    defaultMode: 'voice',            // 'text' | 'voice'
    voiceActivityDetection: true,    // Auto-detect speech for hands-free

    // Branding
    assistantName: 'Alex',           // Display name in widget header
    assistantAvatar: 'https://...',  // URL to avatar image

    // Callbacks
    onOpen: () => console.log('Widget opened'),
    onClose: () => console.log('Widget closed'),
    onMessage: (msg) => console.log('Message:', msg),
    onError: (err) => console.error('Error:', err),
    onConnectionChange: (connected) => console.log('Connected:', connected)
  };
</script>
<script src="https://voicerail.ai/widget/voicerail-widget.js" async></script>
```
Configuration Reference
Required
| Option | Type | Description |
|---|---|---|
| assistantId | string | Your VoiceRail assistant ID |
| organizationId | string | Your VoiceRail organization ID |
Appearance
| Option | Type | Default | Description |
|---|---|---|---|
| position | string | 'bottom-right' | Widget position on screen |
| theme | string | 'auto' | 'light', 'dark', or 'auto' (follows system) |
| primaryColor | string | '#3b82f6' | Accent color for buttons and highlights |
| bubbleSize | number | 60 | Size of the bubble button in pixels |
| bubbleIcon | string | 'chat' | 'chat', 'mic', or 'headphones' |
| assistantName | string | — | Display name in widget header |
| assistantAvatar | string | — | URL to avatar image |
Behavior
| Option | Type | Default | Description |
|---|---|---|---|
| autoOpen | boolean | false | Open widget automatically on page load |
| defaultMode | string | 'voice' | 'text' or 'voice' |
| voiceActivityDetection | boolean | true | Auto-detect speech for hands-free mode |
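As a quick illustration of combining these options, a text-first widget pinned to the bottom-left with a dark theme could be configured like this (the IDs and color are placeholders; options you leave out keep the defaults listed above):

```js
// Illustrative overrides only; unspecified options keep their defaults
window.voiceRailConfig = {
  assistantId: 'asst_xxxx',
  organizationId: 'org_xxxx',
  position: 'bottom-left',
  theme: 'dark',
  primaryColor: '#16a34a',   // match your brand color
  bubbleIcon: 'mic',
  defaultMode: 'text'
};
```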
JavaScript API
Control the widget programmatically after it loads:
```js
// Programmatic control (available after the widget loads)
window.VoiceRail.open();     // Open the widget panel
window.VoiceRail.close();    // Close the widget panel
window.VoiceRail.destroy();  // Remove the widget from the page
```
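For example, you might open the widget from your own call-to-action instead of waiting for users to find the bubble. A minimal sketch, assuming a button with the (hypothetical) id `talk-to-us` exists on the page:

```js
// Open the VoiceRail widget from a custom button.
// The button id "talk-to-us" is just an example for this sketch.
document.getElementById('talk-to-us').addEventListener('click', () => {
  if (window.VoiceRail) {
    window.VoiceRail.open();   // Only available once the widget script has loaded
  }
});
```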
Callbacks
React to widget events with callback functions:
```js
// The message callback receives transcript messages
window.voiceRailConfig = {
  // ...
  onMessage: (message) => {
    console.log({
      id: message.id,                          // Unique message ID
      role: message.role,                      // 'user' | 'assistant'
      text: message.text,                      // Message content
      timestamp: message.timestamp,            // Date object
      wasInterrupted: message.wasInterrupted   // True if the assistant was interrupted
    });

    // Example: send to your analytics
    analytics.track('voicerail_message', {
      role: message.role,
      textLength: message.text.length
    });
  }
};
```
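The other callbacks follow the same pattern. As a sketch, `onConnectionChange` and `onError` can drive a simple status indicator in your own UI (the element id `voicerail-status` is hypothetical):

```js
// Sketch: surface connection state and errors in your own UI.
window.voiceRailConfig = {
  // ...
  onConnectionChange: (connected) => {
    const status = document.getElementById('voicerail-status');
    if (status) {
      status.textContent = connected ? 'Connected' : 'Reconnecting…';
    }
  },
  onError: (err) => {
    // Forward errors to your own monitoring
    console.error('VoiceRail widget error:', err);
  }
};
```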
React / Next.js Integration
For React apps, load the widget in a useEffect hook and clean up on unmount:
```jsx
// React integration
import { useEffect } from 'react';

function App() {
  useEffect(() => {
    // Set config before the script loads
    window.voiceRailConfig = {
      assistantId: process.env.NEXT_PUBLIC_ASSISTANT_ID,
      organizationId: process.env.NEXT_PUBLIC_ORG_ID,
      theme: 'auto',
      defaultMode: 'voice'
    };

    // Load the widget script
    const script = document.createElement('script');
    script.src = 'https://voicerail.ai/widget/voicerail-widget.js';
    script.async = true;
    document.body.appendChild(script);

    return () => {
      window.VoiceRail?.destroy();
      script.remove();
    };
  }, []);

  return <div>Your app content</div>;
}
```
Browser Support
The widget supports all modern browsers that provide the WebSocket and Web Audio APIs:
- Chrome 80+
- Firefox 76+
- Safari 14.1+
- Edge 80+
Voice mode requires microphone permission. Text mode works without any permissions.
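If you want to check support before loading the widget (for example, to fall back to text mode in browsers without microphone access), a rough feature check might look like the following. This is a sketch using standard browser APIs, not part of the widget itself:

```js
// Rough capability check before loading the widget (not part of the widget API)
const supportsWidget =
  'WebSocket' in window &&
  ('AudioContext' in window || 'webkitAudioContext' in window);

// Voice mode additionally needs microphone access via getUserMedia
const supportsVoice =
  supportsWidget &&
  !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia);

if (supportsWidget) {
  window.voiceRailConfig = {
    assistantId: 'your-assistant-id',
    organizationId: 'your-organization-id',
    defaultMode: supportsVoice ? 'voice' : 'text'   // fall back to text-only
  };
  const script = document.createElement('script');
  script.src = 'https://voicerail.ai/widget/voicerail-widget.js';
  script.async = true;
  document.body.appendChild(script);
}
```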
Security Considerations
- The widget connects to voice.voicerail.ai via secure WebSocket (WSS)
- Audio is streamed in real time and is not stored locally in the browser
- Conversation transcripts are stored in your VoiceRail account
- CORS is configured to allow embedding from any origin
- Usage is billed to the organization ID in your config
Next Steps
- Create an assistant with the right personality for your use case
- Try the live demo to see the widget in action
- Add custom reasoning via webhook for business logic
- View call analytics in the dashboard