🧠 Practice Only – Not for Submission

💬 AI Feedback Required


🎯 What You’ll Do

In this activity, you’ll explore how prompt specificity impacts the quality of AI responses. You’ll use the 6 Prompt Components framework to rewrite a vague task, then compare the AI’s output from a weak prompt (missing components) and a strong prompt (fully structured). This activity builds your fluency in crafting clear, effective prompts that drive XP-level results.


📘 Instructions

  1. Start with this vague task from your executive:

“Send them a reminder.”

As an XP, your job is to transform this ambiguity into a clear, actionable prompt. But how you frame the ask to the AI can dramatically change the outcome.

  2. Write two AI prompts: Prompt A, a weak prompt that leaves out most of the 6 Components, and Prompt B, a strong prompt that uses all 6.
  3. Paste both prompts into the AI and review the responses.
  4. Compare and contrast the outputs.
  5. Write a short reflection (3–4 bullet points).

🤖 AI Feedback Prompt (Required)

Use an AI tool (such as ChatGPT, Claude, or Gemini) to check your work with the prompt below:

I’m practicing AI prompting using the 6 Prompt Components. I tested how specificity affects AI output by comparing a weak prompt and a fully structured prompt.

Prompt A (Partial): [Paste here]

AI Output: [Paste here]

Prompt B (Full): [Paste here]

AI Output: [Paste here]

Reflection:

Please give me feedback on:

  1. Did my full prompt use the 6 Components effectively?
  2. Was the improvement in AI output clear and XP-aligned?
  3. What else could I adjust to prompt even more effectively?

Analyze the feedback: How can this feedback help me improve my output? Which parts of the feedback do I want to keep, and which do I want to toss?