
Streaming AI Integration

Connecting the AIPrompt component to a streaming AI chat client service is a common scenario.

In this article, you will build a practical, end-to-end example that wires the AIPrompt to a lightweight AI chat client, which opens a chat session, sends the user prompt, and streams the model's response into the Output view.

Getting Started

To create an AI chat service that connects to the AIPrompt, follow these steps:

  1. Add a reference to the AI chat client script and initialize the client.

    JS
    <script src="~/ai-chat-service.js"></script>
    <script>
        const apiClient = new AIChatClient({
            // Replace with your AI service endpoint (for example, "https://api.yourdomain.com/ai").
            apiBaseUrl: "https://demos.telerik.com/service/v2/ai",
            defaultHeaders: { 'Content-Type': 'application/json' },
            aiId: kendo.guid()
        });
    </script>

    How does the chat client work?

    • Creates a chat session (/chat/conversations).
    • Sends prompts (/chat/{chatId}).
    • Reads the stream and invokes callbacks: onStart, onProgress, onComplete.
    • Supports cancelling through abortRequest().
  2. Define the AIPrompt component and enable the built-in SpeechToTextButton and TextArea components through the SpeechToText() and PromptTextArea() configurations.

    Razor
    @(Html.Kendo().AIPrompt()
        .Name("aiprompt")
        .SpeechToText(speechBtn => speechBtn
            .IntegrationMode("webSpeech")
            .Lang("en-US")
            .Continuous(false)
            .InterimResults(true)
            .MaxAlternatives(1))
        .PromptTextArea(txtArea => txtArea
            .Resize(TextAreaResize.Auto)
            .Rows(3)
            .Placeholder("Enter your prompt here...")
            .FillMode(FillMode.Outline)
            .Rounded(Rounded.Medium)
            .MaxLength(1000)
        )
        .ActiveView(0)
        .Views(views =>
        {
            views.Add().Type(ViewType.Prompt);
            views.Add().Type(ViewType.Output);
        })
    )
  3. Handle the following AIPrompt client events:

    • PromptRequest event—Fires when the user clicks the Generate or Retry button. Within the event handler, get the submitted text (e.prompt), switch to the Output view, and start the asynchronous call to the chat client with the following callbacks:

      • onStart: calls startStreaming();
      • onProgress(accumulatedText): calls updatePromptOutputContent(accumulatedText, currentOutputId);
      • onComplete: calls stopStreaming();
    • PromptRequestCancel event—Fires when the user clicks Stop during streaming and the prompt request is canceled. Abort the in-flight request by calling apiClient.abortRequest(); the streaming logic then calls the AIPrompt's stopStreaming() method and updates the output card.

    Razor
    @(Html.Kendo().AIPrompt()
        .Name("aiprompt")
        .Events(events =>
        {
            events.PromptRequest("onPromptRequest");
            events.PromptRequestCancel("onPromptRequestCancel");
        })
        ... // Additional configuration.
    )
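
The event handlers referenced in step 3 can be sketched as follows. Note that the sendPrompt() method name and the currentOutputId variable are illustrative assumptions, not part of the documented API; adapt them to whatever your ai-chat-service.js actually exposes.

```javascript
// Sketch of the page script for step 3 (assumptions noted below).
// `apiClient` is the AIChatClient instance from step 1.
// `sendPrompt()` is a hypothetical method name; `currentOutputId` is a
// hypothetical helper that tracks the output card being streamed into.
let currentOutputId = null;

function onPromptRequest(e) {
    const aiprompt = e.sender;
    currentOutputId = kendo.guid(); // Identifies the output card for this response.
    aiprompt.activeView(1);         // Switch to the Output view (index 1).

    // Start the asynchronous call and wire the streaming callbacks.
    apiClient.sendPrompt(e.prompt, {
        onStart: () => aiprompt.startStreaming(),
        onProgress: (accumulatedText) =>
            aiprompt.updatePromptOutputContent(accumulatedText, currentOutputId),
        onComplete: () => aiprompt.stopStreaming()
    });
}

function onPromptRequestCancel() {
    // Abort the in-flight request; the client then invokes onComplete,
    // which stops the streaming state and finalizes the output card.
    apiClient.abortRequest();
}
```

With this wiring, clicking Generate streams partial responses into the output card as they arrive, and clicking Stop aborts the request mid-stream.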

For the complete example, visit the Overview Demo of the AIPrompt component.
