Practical AI Engagement Strategies for Better Results

One of my goals this year has been to spend more time playing with AI. To be honest, I haven’t had as much time as I had hoped. However, every time I dedicate time to AI tools, I find the investment is meaningful, and I walk away with new insights.

With that said, I have shared several higher education use cases before, and the posts have helped take away some of the mystique of AI. At a conference recently, a colleague asked me to write another AI post with new ideas that would help her work better with AI.

For this post, I’m going to focus on strategies to improve how you engage with AI, rather than on specific ways to use it in your work. Even so, I think this will be incredibly beneficial.

*Shameless plug: read the last examples if you haven’t seen them.

Four New AI Takeaways:

Ask AI to Get Better

This is something I think is important to do every time you work with an AI tool. After you have something you’re pleased with from your AI tool, ask the tool to rate its response on a scale of 1-10 for how well it addressed the original prompt. I have found that ChatGPT does a pretty good job of assessing its own work. Then, ask your AI tool to improve its score. I also ask whether there is any specific information that would help improve the score. The first time I tried this, ChatGPT told me what information it needed to enhance the response. I was blown away by how adding a few details or instructions could change the outcome of the prompt. That also helped me become better at prompting, so it creates a win across the board.
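If you happen to work with the API rather than the chat window, here is a rough sketch of what that rate-then-improve loop looks like. It assumes the OpenAI Python SDK (openai>=1.0), and the model name and sample prompt are just placeholders; in the regular ChatGPT interface, you’d simply type the follow-up yourself.

```python
# A minimal sketch of the "rate yourself, then improve" loop.
# Assumes the OpenAI Python SDK (openai>=1.0) and an API key in OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o"  # assumption: swap in whatever model you actually use

messages = [
    {"role": "user", "content": "Draft a 150-word welcome email for admitted students."}
]
draft = client.chat.completions.create(model=MODEL, messages=messages)
messages.append({"role": "assistant", "content": draft.choices[0].message.content})

# Ask the model to score its own response and say what information would raise the score.
messages.append({
    "role": "user",
    "content": (
        "On a scale of 1-10, how well did that address the original prompt? "
        "Then improve your score, and ask me for any specific information "
        "that would help you improve it further."
    ),
})
improved = client.chat.completions.create(model=MODEL, messages=messages)
print(improved.choices[0].message.content)
```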

Focus on the Instructions

When I’m busy, I find myself not being specific enough with the tool I’m using. As a result, AI tools have a much harder time providing exactly what I’m looking for. They get close, but something is still off. In my work on enhancing my prompts, I am forcing myself to slow down and think about the end result before I ask AI for anything. Thinking of the end helps me give the tool better instructions and details. When I give specifics around length, tone, and messaging, I find the result is usually closer to what I was hoping for. It may not always be perfect, but it shows there is value in putting more quality into the prompt itself.
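One way to make that slowing down concrete is to treat length, tone, audience, and key message as required fields rather than afterthoughts. Here is a small, hypothetical sketch of that idea; the function and field names are just illustrations, not part of any tool.

```python
# A hypothetical prompt-builder that forces you to spell out the details
# that usually go missing when you're in a hurry.
def build_prompt(task: str, audience: str, tone: str, length: str, key_message: str) -> str:
    return (
        f"Task: {task}\n"
        f"Audience: {audience}\n"
        f"Tone: {tone}\n"
        f"Length: {length}\n"
        f"Key message to land: {key_message}\n"
        "If anything above is ambiguous, ask a clarifying question before writing."
    )

prompt = build_prompt(
    task="Write a reminder email about the housing application deadline",
    audience="Admitted first-year students",
    tone="Warm and encouraging, not salesy",
    length="Under 120 words",
    key_message="The application closes Friday at 5 p.m.",
)
print(prompt)
```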

Get Really Custom

I have built a few GPTs for the office and find they are incredibly helpful. However, after building a broad GPT for university-wide needs, I find it is too broad and would benefit from more specificity. Let me explain. Enrollment, alumni, and academic departments all use the same brand messaging pillars, but the execution is different. I speak differently to alumni than I do to students about just about every topic. As such, I need to go back into the instructions for the GPT and clarify how messages differ for those audiences. Additionally, I think there may be value in having separate instructions for marketing copy versus messages to prospective students. The latter should be helpful but not overly marketing-ish. That slight nuance matters to the recipient, so it’s important to get really custom so the GPT gets it right and I don’t have to constantly rework the copy.
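If you were wiring this up outside the custom GPT builder, the same idea amounts to keeping separate system instructions per audience instead of one broad set. This is only a rough sketch with made-up instruction text, not what my GPTs actually say.

```python
# Hypothetical per-audience system instructions: same brand pillars,
# different execution for each audience.
AUDIENCE_INSTRUCTIONS = {
    "prospective_students": (
        "Write for prospective students. Be helpful and direct; "
        "avoid language that reads like marketing copy."
    ),
    "alumni": (
        "Write for alumni. Emphasize pride, community, and giving back."
    ),
    "marketing_copy": (
        "Write promotional copy for campaigns. Lead with the brand pillars "
        "and a clear call to action."
    ),
}

def messages_for(audience: str, request: str) -> list[dict]:
    # Pick the audience-specific instructions instead of one broad prompt.
    return [
        {"role": "system", "content": AUDIENCE_INSTRUCTIONS[audience]},
        {"role": "user", "content": request},
    ]

print(messages_for("alumni", "Announce the new scholarship fund."))
```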

Avoid Sunk Cost Fallacy

When I first started using AI, I feared starting over. I would go down a string of prompts and sometimes find that I walked away frustrated and without anything valuable. I needed to learn that it’s okay to cut bait and try again. The model carries the memory of the conversation, so if the conversation is going in the wrong direction, making additional requests is simply an exercise in frustration. Instead, it’s better to start a new thread and ask your original prompt in a new way. I’ve found that if the responses still aren’t getting me what I want after a second round of feedback, I need to pivot, start again, and ask in a different way. Usually, when I go back to the drawing board and start over, I get something that’s much better because I reframe how I’m prompting. However, there’s something about that initial decision to change course that is still hard for me.
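In API terms, cutting bait just means starting a fresh message list with a reframed prompt instead of piling more corrections onto a conversation that has drifted. A small, purely illustrative sketch:

```python
# Starting over: drop the accumulated conversation and reframe the original ask,
# rather than appending a third or fourth round of corrections.
def fresh_thread(reframed_prompt: str) -> list[dict]:
    # No prior messages carried over, so earlier missteps can't steer the model.
    return [{"role": "user", "content": reframed_prompt}]

messages = fresh_thread(
    "Write a 100-word thank-you note to event volunteers. "
    "Plain, sincere tone; mention the record attendance; no exclamation points."
)
```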

What Else?

I’d love to hear your thoughts on what I’m missing and what other strategies you use to enhance your AI prompts. Please comment on how you use the tools, and we can keep this conversation going.

One response to “Practical AI Engagement Strategies for Better Results”

  1. Great post, Carrie. So many ways we can use these tools! Like you said, the best way to learn is just to spend some time experimenting and exploring…and listening to what others are doing.

    We’re finding all sorts of uses for custom GPTs including creating interactive personas that we can use as mini, on-demand focus groups, writing meta descriptions and alt tags, and aggregating and analyzing data to get deeper and more strategic insights.

    So many opportunities!
