It’s Not About Better Prompts. It’s About Better Structure

There’s been a lot of focus on “better prompts” over the last year — the idea that if you just find the right phrasing, the right magic words, the right formula, the output will suddenly become brilliant.

But communicators know something different: The quality of the output comes from the structure you give it.

The prompt still matters. But the thinking behind the prompt matters more.

When you break the work into steps, give context, and brief AI the way you’d brief a colleague, everything changes — the clarity, the accuracy, the usefulness.

Today’s edition looks at two tools communicators rely on — one used occasionally, one used daily — and shows how vague prompts fall short while structured prompts create reliable, repeatable results.

Why “better prompts” never solved the real problem

Early AI advice focused on clever phrasing:

  • “Start with ‘act as…’.”

  • “Use these magic words.”

  • “Try this secret formula.”

But communication work isn’t about tricks. It’s about clarity, context, and purpose.

When a prompt is vague, AI fills the gaps with assumptions — and you end up rewriting, correcting, or starting over.

The issue isn’t the tool. It’s the structure.

Two Everyday Tools (and Why Vague Prompts Fail)

To make this practical, here are two tools communicators use regularly:

  • Summaries / Outlines — the “I’ve tried this but don’t trust it” tool

  • Stress Testing — the “I use this constantly without realizing it’s prompting” tool

Below are four before/after examples showing how structure changes everything.

Tool 1: Summaries / Outlines

Old Prompt (vague)

“Summarize this meeting.”

What happens: You get a generic recap that misses the nuance you needed.

New Prompt (structured)

Context: This was a 45‑minute cross‑functional project meeting during a tight delivery window.

Goal: I need a summary I can share with leadership.

Steps:

  1. Identify decisions made

  2. Capture blockers that could affect timelines

  3. Note owners + next steps

Constraints: Keep it neutral, concise, and action‑oriented.

Output: 5–7 bullet points.

What happens: You get a clean, leadership‑ready summary that reflects the actual work and urgency.

Old Prompt (vague)

“Create an outline from this.”

What happens: You get a structure, but not the one you needed.

New Prompt (structured)

Context: I’m drafting a comms plan for an internal rollout that affects multiple departments.

Goal: I need a clear outline to guide the first draft and ensure I’m covering the essentials.

Steps:

  1. Identify the core message

  2. Break into phases (pre-launch, launch, follow-up)

  3. Add key audiences + channels

Constraints: Keep it simple; no filler sections.

Output: A 4‑part outline with bullets.

What happens: You get a usable outline that mirrors how communicators actually build rollout plans.

Tool 2: Stress Testing

Old Prompt (vague)

“Stress test this.”

What happens: AI gives generic feedback that doesn’t help you refine the piece.

New Prompt (structured)

Context: This is an internal update for managers during a busy operational period.

Goal: I want to make sure the message is clear, anticipates concerns, and doesn’t create unnecessary confusion.

Steps:

  1. Identify unclear sections

  2. Flag tone issues

  3. Surface questions a manager might ask their leaders

Constraints: Keep feedback concise and focused on clarity.

Output: A bullet‑point critique.

What happens: You get targeted, relevant feedback that reflects real managerial needs.

Old Prompt (vague)

“Does this make sense?”

What happens: AI says “yes” or gives surface‑level notes.

New Prompt (structured)

Context: This is a draft intro for a project update going to frontline teams who are already stretched.

Goal: I want to check clarity, tone, and whether the message sets the right expectations.

Steps:

  1. Identify the main point

  2. Flag anything confusing or potentially misinterpreted

  3. Suggest one improvement that would strengthen alignment

Constraints: No rewriting — feedback only.

Output: Three short bullets.

What happens: You get a quick, useful gut‑check that mirrors how a colleague would respond.

The Structured Prompt Framework

A simple way to brief AI like a colleague.

This is the core of today’s edition — the shift from “better prompts” to better structure.

1. Context

What the task is, who it’s for, and why it matters.

2. Goal

What you need the output to achieve.

3. Inputs

What you’re giving the AI to work with (notes, draft, transcript, bullets).

4. Steps

Break the work into the actions you want it to take.

5. Constraints

Tone, length, audience, format, or anything to avoid.

6. Output Format

How you want the result delivered (bullets, draft, options, outline).

This framework works across tools, tasks, and workflows — and it mirrors how communicators already think.
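For readers who build prompts programmatically, the six parts above map naturally onto a fill-in-the-blanks template. Here is a minimal sketch in Python — the function name, field wording, and example content are my own illustration, not a prescribed tool or API:

```python
# A minimal sketch of the six-part structured prompt framework.
# The field names follow the framework; everything else is hypothetical.

def build_prompt(context, goal, inputs, steps, constraints, output_format):
    """Assemble a structured prompt from the six framework parts."""
    numbered_steps = "\n".join(f"{i}. {s}" for i, s in enumerate(steps, 1))
    return (
        f"Context: {context}\n"
        f"Goal: {goal}\n"
        f"Inputs: {inputs}\n"
        f"Steps:\n{numbered_steps}\n"
        f"Constraints: {constraints}\n"
        f"Output: {output_format}"
    )

prompt = build_prompt(
    context="A 45-minute cross-functional project meeting during a tight delivery window.",
    goal="A summary I can share with leadership.",
    inputs="The meeting transcript (pasted below).",
    steps=[
        "Identify decisions made",
        "Capture blockers that could affect timelines",
        "Note owners + next steps",
    ],
    constraints="Keep it neutral, concise, and action-oriented.",
    output_format="5-7 bullet points.",
)
print(prompt)
```

The point of the sketch is the discipline, not the code: every prompt passes through the same six slots, so nothing essential gets skipped.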

A quick note on inputs: Always prepare or abstract what you share with AI. Remove names, identifying details, and anything confidential — the structure works best when the inputs are clean and non‑sensitive.

My Quick Win

The clearer you are about the work, the clearer the output becomes. Spending a little time naming the steps upfront changes everything — it’s the difference between hoping for a good result and setting the conditions for one.

Next week, we’ll look at how to integrate AI into your workflow in a way that actually saves time — not just in theory, but in practice.

The Structured Prompt Framework is also available as a downloadable resource.
