Currently, the `message` event documented at https://docs.vapi.ai/sdk/web#message does not expose the LLM's responses; it only provides transcripts (https://github.com/VapiAI/web/blob/2cc10cf008740ed0993fe888b6571139f1b1c2a4/api.ts#L902). These transcripts can vary between runs and may contain transcription errors, resulting in discrepancies with the text the LLM actually generated. Additionally, the content of the 'conversation-update' message is sourced from these transcripts rather than from the LLM's responses. It would be good for 'conversation-update' to use the LLM's responses directly.