Figma surveyed thousands of designers and developers and published a finding worth sitting with: 78% say AI makes them more efficient. Only 47% say it makes them better at their role.
That 31-point gap is the whole conversation.
Most people chasing the efficiency number want to know what prompt to use, what template to follow, what tool to add. Those are the wrong questions.
The number isn't about prompting. It's about how you think before you ever open the tool.
The Gap Isn't Skill. It's Structure.
The developers and designers seeing the biggest gains aren't necessarily more technical. They're not using hidden prompt libraries or running better models. What they're doing differently is treating AI as the last step in a decision, not the first.
Before they type anything, they've already answered three questions:
What type of task is this? A creative task needs different handling than a logic task. Generating five headline variations is not the same workflow as debugging a component. If you treat them the same, you get mediocre results from both.
What does "done" actually look like? Vague output is almost always a product of a vague target. The more precisely you can describe the finish line before you start, the more useful the output becomes. Not "write me something about this" but "I need a 150-word intro that acknowledges the reader's skepticism and ends on a curiosity hook."
What constraints exist? Time, tone, audience, format, technical limits. The more of these you define ahead of time, the less cleanup you do on the back end.
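For the developers in that survey, the three questions above can be sketched as a tiny "brief" structure you fill in before prompting. This is purely illustrative; the type names and fields are hypothetical, not from any real tool or library:

```typescript
// A minimal sketch of the pre-prompt brief. All names are illustrative.

type TaskType = "creative" | "logic";

interface Brief {
  taskType: TaskType;     // creative and logic tasks need different handling
  doneLooksLike: string;  // the finish line, stated precisely
  constraints: string[];  // time, tone, audience, format, technical limits
}

// Only turn the brief into a prompt once every field is answered.
function toPrompt(brief: Brief): string {
  return [
    `Task type: ${brief.taskType}`,
    `Definition of done: ${brief.doneLooksLike}`,
    `Constraints: ${brief.constraints.join("; ")}`,
  ].join("\n");
}

const intro: Brief = {
  taskType: "creative",
  doneLooksLike:
    "a 150-word intro that acknowledges the reader's skepticism and ends on a curiosity hook",
  constraints: ["conversational tone", "skeptical reader", "plain text"],
};

console.log(toPrompt(intro));
```

The point isn't the code itself. It's that the structure forces the answers to exist before the tool is opened.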
This is workflow thinking. And it's what multiplies output.
Why Most People Skip This
AI tools are fast, and fast feels like efficient. You have an idea, you open a tab, you type. That impulse makes sense. The tool is right there.
But speed at the input doesn't equal speed at the outcome. Unstructured prompts produce unstructured results, and then you spend the next 20 minutes editing, re-prompting, and patching. The time cost just moved. It didn't disappear.
The people seeing compounding returns from AI have slowed down the thinking phase so they can run the execution phase faster. That's not a prompting skill. It's a problem-framing skill.
What This Looks Like in Practice
Before a recent client deliverable, I didn't start with "write me a LinkedIn post about this." I started by asking myself: Who is reading this? What do they already believe? What action do I want them to take? What tone fits this brand right now? Once those questions had answers, the actual generation took minutes and required almost no editing.
The framework wasn't in the prompt. The framework was in my head before I wrote a single word.
That's the shift. That's what the gap is measuring.
The Takeaway
If AI hasn't been delivering the results you expected, the fix probably isn't a better prompt. It's a cleaner brief. It's knowing your task type, your success criteria, and your constraints before you open anything.
The tool is only as good as the thinking that precedes it.
Want a Website That Works as Hard as You Do?
If you're building a business that takes AI seriously, your website should reflect that. I design and develop custom sites for brands that are ready to grow with intention. Let's talk about your project.