In today’s episode, Cathy asks, “What predictions do you have for the future of ChatGPT for the rest of this year?” I share about large language models in general, some useful applications, Microsoft’s interest, and more.
Can’t see anything? Watch it on YouTube here.
Listen to the audio here:
- Got a question for You Ask, I'll Answer? Submit it here!
- Subscribe to my weekly newsletter for more useful marketing tips.
- Subscribe to Inbox Insights, the Trust Insights newsletter for weekly fresh takes and data.
- Find older episodes of You Ask, I Answer on my YouTube channel.
- Need help with your company's data and analytics? Let me know!
- Join my free Slack group for marketers interested in analytics!
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for watching the video.
Trying something different today: using Adobe Podcast along with a wireless lavalier mic and the phone.
So let’s see how this turns out.
Because if this works really well, then it means that for folks who are content creators, you can do more than one thing at a time as long as you’re safe and aware of your surroundings.
So in today’s episode, Cathy asks, what do we think the implications of ChatGPT are for the rest of the year? That’s an impossible question to answer.
Here’s the thing.
I would like for us to stop referring to that specific product when we’re talking about the applications of large language models. ChatGPT is a great tool. It’s built on the GPT-3 model, which is by OpenAI, but that is not the only large language model out there.
In fact, there are many, many large language models, each with its own applications and its own parameters. EleutherAI, for example, made a whole bunch, trained on their dataset, the Pile.
Talking about large language models in terms of something like ChatGPT is like talking about Microsoft Word: what are the implications of Microsoft Word, as opposed to what are the implications of word processing? How is word processing going to change content creation? That sounds different, right? It sounds a little more broad.
We saw in the news this past week that Microsoft is looking at integrating the GPT-3 model, the large language model, into its search engine, and especially into some of its products like Word, PowerPoint, and Excel. This is a really good idea.
Because when you’re being creative, even nominally creative, inside an office suite, we’ve all had that experience: you sit down to write something and think, I don’t know what to write, I don’t know what to put on this slide, and so on and so forth.
And that’s where a large language model is a really good thing to have available. What should I name this presentation? Well, let’s take a look at all the notes and distill down some title suggestions. Or: I’ve written this chapter of my book in my word processor, so what should the chapter summary be? For those people who write serial pieces of fiction, choosing your summary, your snippet for promotion, is really difficult.
If you’re creating content and you want promotional material from it, this is something a lot of tools are starting to explore: you give it your content and say, make me some social posts, make me something I can put on Instagram, or Facebook, or LinkedIn, or whatever, and it will generate those things.
And those are really good applications of large language models, as adjuncts to the creative process.
And I think that’s an important part.
It’s an adjunct to the creative process.
That means it’s not doing the work for you. Well, it kind of is, but it’s not building the main corpus of the work.
It’s building all the promotional stuff around it.
And let’s face it, a lot of people who are really good creators, authors, filmmakers, musicians, don’t like making promotional material. Many of them don’t like pimping their stuff.
They feel awkward about it. Well, what if the machine just does it for you? Instead of struggling to make some promotional content, the machine says, don’t worry, here are the tweets you need to promote this thing.
That’s a great use of this technology.
That’s a great use of large language models.
So that’s what I foresee as being the useful use cases.
There’s a lot of novelty uses for these large language models.
One of my personal favorites is meeting notes summarization: I will feed it the long transcript of a 20-25 minute call and say, give me meeting notes and action items out of this.
Now I have a whole prompt written down to make sure it behaves the way I want it to behave.
But in doing so, it dramatically shortens the amount of time I need to get action items out of a call, particularly if it was a long call and I can’t remember everything. It’s a great way to summarize.
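As a sketch of what that kind of summarization prompt might look like, here is a minimal, illustrative version. The function name, wording, and output sections are assumptions for illustration, not the actual prompt described in the episode:

```python
def build_meeting_notes_prompt(transcript: str) -> str:
    """Assemble a meeting-notes prompt for a large language model.

    The instruction wording below is illustrative; a real prompt would be
    tuned to the specific model and the format of your calls.
    """
    instructions = (
        "You are an assistant that writes meeting notes. "
        "From the transcript below, produce:\n"
        "1. A short summary of the discussion.\n"
        "2. A bulleted list of decisions made.\n"
        "3. A bulleted list of action items, with owners if stated.\n\n"
        "Transcript:\n"
    )
    return instructions + transcript


# The assembled prompt would then be sent to whatever model you use.
prompt = build_meeting_notes_prompt(
    "Alice: Let's ship on Friday. Bob: I'll draft the release notes."
)
print(prompt)
```

The point of keeping the instructions in one place like this is the same point made above: writing the prompt down once means the model behaves consistently from call to call.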
So are there gonna be other applications? Of course; we see all sorts of things, like actual conversations.
I saw a piece of news this morning about how someone was testing it for mental health applications: experiments in a controlled laboratory setting, by qualified professionals, to see if these bots could act as therapy adjuncts. Not replacing therapists, because the liability on that alone would be impossible, but as a supplement to regular therapy. That’s where I think we’re going with these things.
It’s going to be interesting to see how it turns out.
One of the big questions right now, announced in the OpenAI Discord, was: hey, we’re thinking about commercializing this, what should the pricing be? What should the model be? A lot of people pitched their ideas, but the era of this stuff being free is coming to a close very fast.
So if there’s a lot you want to get done with it, do it now. Get it in now, especially if you don’t have the budget, because you’re probably not going to be able to expense it later.
OpenAI’s regular pricing is surprisingly affordable. It’s about two cents, I think, per 1,000 tokens. So if you put in a 1,000-word essay, it costs you roughly two pennies to process that thing and generate that much token data.
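To make that arithmetic concrete, here is a quick back-of-the-envelope estimate. The $0.02 per 1,000 tokens rate is the figure quoted above, and the 1.3 tokens-per-word ratio is a common rule of thumb for English text, not an exact conversion; both may have changed since this episode:

```python
def estimate_cost(word_count: int,
                  price_per_1k_tokens: float = 0.02,
                  tokens_per_word: float = 1.3) -> float:
    """Rough cost estimate for processing text with a per-token-priced model.

    Assumes ~1.3 tokens per English word (a rule of thumb, not exact);
    the $0.02 per 1K token rate is the figure quoted in the episode.
    """
    tokens = word_count * tokens_per_word
    return tokens / 1000 * price_per_1k_tokens


# A 1,000-word essay is roughly 1,300 tokens: about 2.6 cents.
print(round(estimate_cost(1000), 4))  # → 0.026
```

Even generous usage at these rates stays in the pennies-per-document range, which is why the pricing reads as surprisingly affordable.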
And we see with ChatGPT in particular that it doesn’t seem to take the input length into account nearly as much as the regular GPT-3 model does, so you can put in really long prompts and still get results out of the system.
How that changes once they start charging for it, we don’t know.
But it will not be free forever. That research preview, I believe, is coming to a close.
I don’t think OpenAI expected it to be the huge hit that it was.
But these are smart folks; they’ve got a bunch of doctorate degrees among them. They know they’ve got a hit on their hands, and they need to capitalize on it sooner rather than later.
Anyway, thanks for the question.
Talk to you soon.
If you liked this video, go ahead and hit that subscribe button.
You might also enjoy:
- Understand the Meaning of Metrics
- Marketing Data Science: Introduction to Data Blending
- How To Set Your Consulting Billing Rates and Fees
- It's Okay to Not Be Okay Right Now
- The Biggest Mistake in Marketing Data
Want to read more like this from Christopher Penn? Get updates here:
Get your copy of AI For Marketers