
Earlier this summer, I was having a conversation with someone who wondered aloud: does ChatGPT learn individual users? And if so, by how much? The conversation ended, but the question sparked something in my brain.
Fast forward to a couple of weeks ago, and I had to get an answer. So I did what any marketing pro would do: I hosted a small focus group. The husband fixed a lunch of homemade pizza, and several of us sat around the kitchen bar to test this out.
Background
I had three people test scenarios using the free version of ChatGPT.
Me — I’m a moderate marketing user. My questions center on marketing topics, including turns of phrase, brainstorming concepts, and wordsmithing.
Chris Phillips — The husband works in finance and uses ChatGPT for occupational tasks. His main uses are analysis, financial wording, job descriptions, and the like.
Noah Smith — A family friend and recent college graduate, Noah is a heavy user who relies on ChatGPT in his start-up business. His main uses are writing product descriptions and client communications.
Hypothesis
My hypothesis was that if ChatGPT learned our styles, it would do a better job for each of us on the prompts related to our regular requests. It would have learned better product descriptions for Noah, better marketing phrasing for me, and so on.
What We Did
We tested this theory three times, each round using a prompt that should play to one person’s regular use of ChatGPT.
Question 1 & Answers
Write a product description for a snack mix that is made in Arkansas. It should be no more than 3 sentences and encourage someone visiting a website to buy it. Ensure it covers the ingredients of nuts, crackers, and pretzels. It’s a garlic-parmesan seasoning.
Follow up – This is a good start. Make it sound a little more enticing.
Here’s what we got:



Question 2 & Answers
Write 3 sentences of marketing copy about a university that can be used in a print ad. The audience is a 17-year-old or their parents. Emphasize affordability through scholarships and an on-campus work program that keeps students from taking out loans. Also, be sure to use the tagline, “Now’s your moment.”
Follow up – Good start. Be more succinct and ensure this applies to the target audience.
Here’s what we got:



Question 3 & Answers
I am looking to hire for a new role in my company. Can you draft a job description for a destination services specialist? The organization is focused on promoting tourism, and this position helps conventions with logistics related to events in the community. It’s a skilled hourly position, meaning it doesn’t oversee anyone but also isn’t entry level.
Follow up – Can you incorporate more industry-specific phrasing?
Here’s what we got:
(This one was really long, so I’m hitting a few highlights)



What I Found & What it Means
- Small differences – I was surprised there weren’t more noticeable differences between the iterations. All three of us have been using ChatGPT for several months, so I thought the changes would be more pronounced. For higher ed marketers, that means the tool doesn’t customize as much as I had assumed, and it’s important to provide additional context with a request to get something more custom. For example, I could ask about the snack mix and include a link to the website, or ask for marketing copy and provide a link to the style guide. These features are available in the paid version of ChatGPT, and this exercise helped me see their value for getting output that fits a specific context.
- A bit generic – The answers, while good, felt a bit generic. Some of that may be because the prompts themselves were slightly generic, but I felt the mix could have been any snack mix, the ad could have promoted any university, and the job description could have described any job. For higher ed marketers, this further demonstrates that we are responsible for using the output as a starting place. We must infuse our own brand, messaging, and tone into the work; trusting that part to ChatGPT would be a pretty big mistake.
- Not learning in this way – I keep hearing that ChatGPT is still in its infancy, but I have struggled to understand what that meant. I think I realized the tool isn’t learning the way I had assumed. It doesn’t, at least not yet, keep a file of iterative learning for Carrie so that it responds one way to her and another way to someone else. That may certainly be a possibility in the future. I previously would give the tool feedback on where I landed to help it get better; I’m not sure that’s needed at this point in the process. For higher ed marketers, this means the tool doesn’t get better at knowing your style over time without investment in training it: providing messaging documents, notes, and emails you’ve sent. It takes a specific effort, as opposed to happening over time (at least in a meaningful way). That means it still relies on us to do the work of massaging and wordsmithing to make sure it’s right. At least for now.
What’s Next
This was an interesting quick test of ChatGPT and where the tool stands. I had intended to end the conversation here.
However, Chris and Noah both started having fun testing the prompts in different tools, which formed a secondary experiment.
So, as with all good emergent research, the scope changed some. In next week’s blog, I’ll compare how ChatGPT fared against some of the other tools available.
One response to “Putting Chat GPT to the Test”
I love this! Great illustration, Carrie!