Could the results of sales calls benefit marketing?

Posted: Sun May 18, 2025 9:20 am
by abumottalib36
Could customer service queries benefit IT development? The answer to both is yes. And with this approach, you can unlock extremely valuable information sharing across the company.

“Organizations can focus on the data they already know and trust, and, most importantly, own — helping avoid the numerous pitfalls of copyright, toxicity, and unpredictability that so often undercut a generative AI deployment’s reliability,” said Silvio Savarese, Salesforce’s chief scientist leading its AI research team. “And because these datasets are so tightly focused on a domain-specific task, they can train powerful, purpose-built models that do things no general-purpose alternative can touch.”

An open-source LLM offers immediate value
Building and training an LLM can cost a company millions of dollars, depending on its size and needs. You have to gather and prepare vast amounts of data to train it; purchase and assemble computational resources like GPUs and storage to run it; hire data scientists and natural language processing and machine learning engineers to build and operate it; and more. For example, OpenAI CEO Sam Altman estimated it cost the company $100 million to train GPT-4, and it cost Google around $190 million to train Gemini Ultra, according to Stanford’s AI Index report. Then you have to account for the time it takes to train the LLM, which can be weeks or even months.

On the other hand, companies using an open-source LLM gain value faster because it’s a plug-and-play model. Adding your own data gives the LLM more context to generate better results. You do this through low-cost prompt grounding and retrieval augmented generation (RAG), an AI technique that automatically embeds your most current and relevant proprietary data directly into the LLM prompt. Because you skip the training costs described above, an open-source model essentially costs what you pay a provider in monthly or annual service fees, plus the cost of grounding and RAG, which is significantly less, often by hundreds of thousands or even millions of dollars, than training your own.
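
To make the grounding idea concrete, here is a minimal sketch of the RAG pattern in Python: retrieve the company documents most relevant to a question, then embed them directly into the prompt sent to the model. The document store, the bag-of-words scoring (a stand-in for real embeddings and a vector database), and the ask_llm() stub are illustrative assumptions, not any specific vendor’s implementation.

```python
# Sketch of retrieval augmented generation (RAG): retrieve relevant
# proprietary data and ground the LLM prompt with it.
# All names and data below are hypothetical examples.

from collections import Counter
import math

# Hypothetical in-house knowledge base (in practice: CRM notes,
# support tickets, product docs stored in a vector database).
DOCUMENTS = [
    "Q2 sales calls show customers asking for a usage-based pricing tier.",
    "Support tickets spiked after the March release due to login timeouts.",
    "The spring campaign drove a 12% lift in trial signups.",
]

def bag_of_words(text: str) -> Counter:
    """Tokenize into lowercase word counts (stand-in for real embeddings)."""
    return Counter(text.lower().split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question."""
    q_vec = bag_of_words(question)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine_similarity(q_vec, bag_of_words(d)), reverse=True)
    return ranked[:k]

def build_grounded_prompt(question: str) -> str:
    """Embed the retrieved company data directly into the prompt (grounding)."""
    context = "\n".join(f"- {doc}" for doc in retrieve(question))
    return (
        "Answer using only the company data below.\n"
        f"Company data:\n{context}\n\n"
        f"Question: {question}"
    )

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to an open-source LLM served by a provider."""
    return f"[LLM response to a {len(prompt)}-character grounded prompt]"

if __name__ == "__main__":
    question = "What pricing changes are customers asking for on sales calls?"
    print(ask_llm(build_grounded_prompt(question)))
```

In a production setup the retrieval step would typically use embeddings and a vector store rather than word counts, but the flow is the same: retrieve, ground the prompt, then call the hosted open-source model, so no retraining of the LLM itself is required.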