External Platforms
VAPI Agents
Integrate Trulience avatars with VAPI's voice AI platform
Overview
VAPI is a voice AI platform that handles the complete conversational pipeline (Speech-to-Text, LLM, and Text-to-Speech). By integrating VAPI with Trulience, you can add lifelike avatar visuals and lip-sync to your VAPI agents.
Working example: See our VAPI integration example for complete working code.
How It Works
VAPI manages the conversation intelligence while Trulience provides the avatar visualization:
- VAPI handles user speech recognition, conversation logic, and response generation
- Trulience renders the avatar and synchronizes lip movements with VAPI’s audio output
- The integration connects VAPI’s audio stream to your Trulience avatar via setMediaStream() (a minimal sketch follows this list)
- VAPI uses Daily.co internally for WebRTC communication
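At its core, the wiring is only a few lines. The following is a conceptual sketch, not runnable on its own: remoteAudioTrack and trulienceAvatar are placeholders for the VAPI audio track and Trulience avatar reference you will obtain in the integration steps below.
// Conceptual flow only; remoteAudioTrack and trulienceAvatar are placeholders
// obtained in the integration steps that follow.
const stream = new MediaStream([remoteAudioTrack]); // wrap VAPI's remote audio track
trulienceAvatar.setMediaStream(stream);             // the avatar plays the audio and lip-syncs to it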
Prerequisites
- A VAPI account with an agent configured
- A Trulience avatar configured for External Voice Platforms (see Dashboard Configuration)
- Basic knowledge of React or vanilla JavaScript
Dashboard Configuration
- Open your avatar’s settings in the Trulience dashboard
- Navigate to the BRAIN tab
- Select ‘3rd Party AI’ mode
- Set ‘Service provider or framework’ to ‘External Voice Platforms’
This configuration disables Trulience’s built-in STT, LLM, and TTS, allowing VAPI to handle these components.
Integration Steps
1. Install Dependencies
npm install @trulience/react-sdk @vapi-ai/web@^2.4.0
2. Set Up VAPI
Initialize the VAPI client with your public key:
import Vapi from '@vapi-ai/web';
const vapi = new Vapi('your-vapi-public-key');
3. Start VAPI Conversation
// Start a call with your VAPI assistant
await vapi.start('your-assistant-id');
4. Capture VAPI’s Audio Stream
Under the hood, VAPI uses Daily.co for WebRTC. The Daily call object isn’t immediately available after starting VAPI, so we need to poll for it:
// Start VAPI call first
await vapi.start('your-assistant-id');
// Wait for Daily.co object to become available
const waitForDaily = new Promise((resolve) => {
  const checkDaily = () => {
    const dailyCall = vapi.getDailyCallObject();
    if (dailyCall) {
      console.log('Daily call object available');
      resolve(dailyCall);
    } else {
      setTimeout(checkDaily, 100); // Poll every 100ms
    }
  };
  checkDaily();
});
const dailyCall = await waitForDaily;
// Now safe to attach event listeners
dailyCall.on('track-started', (event) => {
  // Check if this is the remote audio track (VAPI's voice)
  if (event.participant && !event.participant.local && event.track.kind === 'audio') {
    console.log('Received audio track from VAPI');
    // Create a MediaStream from the audio track
    const stream = new MediaStream([event.track]);
    // Route the audio to the Trulience avatar (trulienceRef references your TrulienceAvatar component; see the full React example below)
    trulienceRef.current.setMediaStream(stream);
    trulienceRef.current.getTrulienceObject().setSpeakerEnabled(true);
    // Mute VAPI's auto-created audio element to prevent double audio
    setTimeout(() => {
      const vapiAudio = document.querySelector(
        `audio[data-participant-id="${event.participant.session_id}"]`
      );
      if (vapiAudio) {
        vapiAudio.muted = true;
        console.log('Muted VAPI audio player to prevent double audio');
      }
    }, 100);
  }
});
Why polling? getDailyCallObject() returns null until Daily.co initializes internally, which happens approximately 100-300ms after vapi.start() is called.
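If you want to avoid polling indefinitely when something goes wrong, you can wrap the same pattern in a helper with a timeout. This is a sketch rather than part of the official example; the timeout and interval values are assumptions you should tune for your app.
// Hedged variant: the same polling pattern, but it gives up after a timeout.
function waitForDailyCall(vapi, timeoutMs = 5000, intervalMs = 100) {
  return new Promise((resolve, reject) => {
    const startedAt = Date.now();
    const check = () => {
      const dailyCall = vapi.getDailyCallObject();
      if (dailyCall) {
        resolve(dailyCall);
      } else if (Date.now() - startedAt >= timeoutMs) {
        reject(new Error('Timed out waiting for the Daily call object'));
      } else {
        setTimeout(check, intervalMs);
      }
    };
    check();
  });
}

// Usage:
// await vapi.start('your-assistant-id');
// const dailyCall = await waitForDailyCall(vapi);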
Complete React Example
Here’s a full working example using React:
import React, { useRef, useState } from 'react';
import { TrulienceAvatar } from '@trulience/react-sdk';
import Vapi from '@vapi-ai/web';
function VapiTrulienceIntegration() {
  const trulienceRef = useRef(null);
  const [vapi] = useState(() => new Vapi('your-vapi-public-key'));
  const [isCallActive, setIsCallActive] = useState(false);
  const [remoteStream, setRemoteStream] = useState(null);
  const startCall = async () => {
    try {
      // Start VAPI call
      await vapi.start('your-assistant-id');
      // Wait for Daily.co object to become available
      const waitForDaily = new Promise((resolve) => {
        const checkDaily = () => {
          const dailyCall = vapi.getDailyCallObject();
          if (dailyCall) {
            console.log('Daily call object available');
            // Listen for remote participant tracks
            dailyCall.on('track-started', (event) => {
              console.log('Daily track started:', event);
              if (
                event.participant &&
                !event.participant.local &&
                event.track.kind === 'audio'
              ) {
                console.log('Received audio track from VAPI');
                const stream = new MediaStream([event.track]);
                setRemoteStream(stream);
                // Attach to Trulience
                if (trulienceRef.current) {
                  trulienceRef.current.setMediaStream(stream);
                  const trulienceObj = trulienceRef.current.getTrulienceObject();
                  if (trulienceObj) {
                    trulienceObj.setSpeakerEnabled(true);
                  }
                }
                // Mute VAPI's audio player to prevent double audio
                setTimeout(() => {
                  const vapiAudioPlayer = document.querySelector(
                    `audio[data-participant-id="${event.participant.session_id}"]`
                  );
                  if (vapiAudioPlayer) {
                    vapiAudioPlayer.muted = true;
                    console.log('Muted VAPI audio player to prevent double audio');
                  }
                }, 100);
              }
            });
            resolve(dailyCall);
          } else {
            setTimeout(checkDaily, 100);
          }
        };
        checkDaily();
      });
      await waitForDaily;
      setIsCallActive(true);
    } catch (error) {
      console.error('Failed to start VAPI call:', error);
    }
  };
  const endCall = async () => {
    if (vapi) {
      await vapi.stop();
      setIsCallActive(false);
      setRemoteStream(null);
    }
  };
  return (
    <div className="relative min-h-screen">
      <div className="absolute inset-0">
        <TrulienceAvatar
          ref={trulienceRef}
          url={process.env.NEXT_PUBLIC_TRULIENCE_SDK_URL}
          avatarId="your-avatar-id"
          token="your-trulience-token"
          width="100%"
          height="100%"
        />
      </div>
      <button
        onClick={isCallActive ? endCall : startCall}
        className={`absolute bottom-6 left-1/2 -translate-x-1/2 px-6 py-3 rounded-lg text-white font-semibold ${
          isCallActive ? 'bg-red-600 hover:bg-red-700' : 'bg-blue-600 hover:bg-blue-700'
        }`}
      >
        {isCallActive ? 'End Call' : 'Start Call'}
      </button>
    </div>
  );
}
export default VapiTrulienceIntegration;
Key Integration Points
Timing Considerations
The integration requires careful timing:
- Start VAPI call first - Call vapi.start() to begin the connection
- Poll for Daily.co object - Use the polling pattern to wait for getDailyCallObject() to return a valid object
- Attach event listener - Once Daily.co is available, listen for the track-started event
- Route audio on track start - When the remote audio track arrives, route it to Trulience
The track-started event ensures VAPI’s audio is ready before routing it to Trulience. A condensed sketch tying these steps together follows.
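As a recap, here is a condensed sketch that performs the four steps in order. It reuses the waitForDailyCall helper sketched earlier and assumes trulienceAvatar exposes the same setMediaStream() and getTrulienceObject() methods used in the React example; treat it as a starting point rather than a drop-in implementation.
// Sketch: the full timing sequence in one function.
async function connectVapiToTrulience(vapi, trulienceAvatar, assistantId) {
  // 1. Start the VAPI call first
  await vapi.start(assistantId);

  // 2. Poll until the Daily.co call object is available
  const dailyCall = await waitForDailyCall(vapi);

  // 3. Attach the track listener once Daily.co is ready
  dailyCall.on('track-started', (event) => {
    if (event.participant && !event.participant.local && event.track.kind === 'audio') {
      // 4. Route the remote audio to the avatar when it arrives
      const stream = new MediaStream([event.track]);
      trulienceAvatar.setMediaStream(stream);
      trulienceAvatar.getTrulienceObject().setSpeakerEnabled(true);
    }
  });
}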
Preventing Double Audio
VAPI automatically creates audio elements for playback. To prevent hearing both VAPI’s audio and the avatar’s audio (see the sketch after this list):
- Route VAPI’s audio to Trulience via setMediaStream()
- Mute VAPI’s auto-created audio element
- Enable the avatar’s speaker with setSpeakerEnabled(true)
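The React example mutes VAPI’s audio element by matching the data-participant-id attribute shown above. Whether that selector matches in your VAPI/Daily version is an assumption worth verifying in your own DOM; the sketch below tries the exact selector first and falls back to muting all audio elements, which only works if your page does not rely on other audio elements for playback.
// Sketch: mute VAPI/Daily-created audio elements after routing audio to Trulience.
function muteVapiAudioElements(participantSessionId) {
  // Preferred: the attribute used in the example above
  const exact = document.querySelector(
    `audio[data-participant-id="${participantSessionId}"]`
  );
  if (exact) {
    exact.muted = true;
    return;
  }
  // Fallback: mute any remaining audio elements (adjust if your app has its own)
  document.querySelectorAll('audio').forEach((el) => {
    el.muted = true;
  });
}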
Troubleshooting
Issue: Hearing double audio
- Verify VAPI’s audio element is muted (check the querySelector is finding the element)
- Confirm setSpeakerEnabled(true) is called on the Trulience object
Issue: Avatar not lip-syncing
- Check that setMediaStream() is called with a valid MediaStream
- Ensure the avatar is configured for ‘External Voice Platforms’ in the dashboard
- Verify the audio track is active (check its state in the browser console; see the debug sketch at the end of this section)
Issue: Audio cutting out
- VAPI’s Daily.co connection may be unstable; check network conditions
- Ensure your VAPI subscription supports the required audio quality
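When debugging lip-sync or audio issues, it can help to log the state of the track you hand to Trulience. This small diagnostic sketch uses standard MediaStreamTrack properties; the log label is a placeholder.
// Sketch: log the state of the audio track before routing it to Trulience.
function logTrackState(track) {
  console.log('VAPI audio track state:', {
    kind: track.kind,             // should be 'audio'
    readyState: track.readyState, // should be 'live'
    enabled: track.enabled,       // should be true
    muted: track.muted,           // true here often means no audio is flowing yet
  });
}

// Usage inside the track-started handler:
// logTrackState(event.track);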
Example Code Repository
See our VAPI integration example for complete working code.
Next Steps
- Review the Integration Guide for general patterns
- Explore VAPI’s documentation for advanced agent configuration
- Check out our JavaScript SDK documentation for additional avatar control options