The Case for Prompt Design When Interacting with Generative AI

By Ana-Paula Correia (in collaboration with Sean Hickey)

Dall-E generated image. Curated by Sean Hickey.

With generative AI becoming an increasingly common part of our daily lives, a new area of knowledge is emerging: the ability to create an AI prompt, typically referred to as prompt engineering. When interacting with generative AI, “prompt” usually refers to a specific piece of text or instruction given to the AI model to generate a response or output. In its simplest form, this is the question or inquiry we type into the model. The prompt offers the model both a starting point and guiding criteria for generating the output.

Much has been written and said about the outputs of AI language models, and far less about their inputs – the prompts. We want to take a moment to reflect on the critical role of the prompt in showcasing human ingenuity and creativity when interacting with generative AI.

According to Google Cloud (2024), a prompt has four components, assembled into a single prompt in the sketch after this list:

  1. Task: what you want the model to do or respond to, which can take the form of a set of instructions or a question.

  2. System instructions: the criteria and constraints for the output, such as setting the length or the style of the response.

  3. Few-shot examples: the samples of an ideal response, which demonstrate features such as tone or style.

  4. Contextual information: the essential content required for the model to respond to the prompt. Contextual information might include text descriptions of the purpose or audience or a table with key data and figures.
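
To make these components concrete, here is a minimal sketch in Python of how the four parts might be assembled into a single prompt string. Every specific below (the workshop, the audience, the example announcement) is invented for illustration and is not drawn from the original post.

```python
# Hypothetical example of combining the four prompt components.

task = (
    "Write a short announcement inviting students to a workshop on citing sources."
)

system_instructions = (
    "Keep the announcement under 120 words and use a friendly, encouraging tone."
)

few_shot_example = (
    "Example of the desired style:\n"
    '"Join us Thursday at noon for a hands-on session on outlining essays. '
    'Bring a draft and leave with a plan!"'
)

contextual_information = (
    "Audience: first-year undergraduates in an introductory writing course.\n"
    "Workshop details: Tuesday, 3 p.m., Main Library, Room 210."
)

# Assemble the components into one prompt, separated by blank lines.
prompt = "\n\n".join(
    [system_instructions, contextual_information, few_shot_example, task]
)
print(prompt)
```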

Other considerations for creating a prompt, illustrated in the brief before-and-after example following this list, include:

  1. Clarity: Steer clear of vague or overly broad questions. To add clarity, break the question down into smaller, more manageable parts.

  2. Relevant Information: Include any essential details that may impact the answer to the question or topic discussed.

  3. Language: Avoid using slang, jargon, or overly technical terms that the model may not understand.

  4. Response criteria: If you have specific instructions or preferences for the model’s response, such as word length or reading level, be sure to include them in the prompt.
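
As a brief illustration of these considerations, the hypothetical before-and-after below contrasts a vague prompt with one that adds clarity, relevant information, plain language, and explicit response criteria; the topic and numbers are invented.

```python
# Hypothetical before/after showing the considerations above in practice.

vague_prompt = "Tell me about assessment."

refined_prompt = (
    # Clarity and relevant information: a narrower question with key details.
    "Explain three formative assessment techniques suitable for an online "
    "graduate seminar of fifteen students. "
    # Language: ask the model to avoid unexplained jargon.
    "Avoid discipline-specific jargon and define any technical terms you use. "
    # Response criteria: length, reading level, and layout.
    "Respond in about 300 words at a graduate reading level, formatted as a "
    "numbered list."
)

print(vague_prompt)
print(refined_prompt)
```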

Image created by Sean Hickey. Icons licensed by Dacian Galea/Shutterstock.

Prompt Design versus Prompt Engineering

Examining these components and considerations for writing prompts, we propose the concept of prompt design as a supplement to prompt engineering. While engineering and design are related fields, they are distinct in their focus and approach. At its core, engineering involves the application of scientific and mathematical principles to solve practical problems and create innovative solutions. On the other hand, design involves the application of creativity, problem-solving skills, and a deep understanding of the user to develop products, services, and experiences that are both functional and aesthetically pleasing.

“Prompt design” puts the focus on the intellectual activity, creativity, and expertise that go into creating a prompt that engages a large language model in a meaningful way, eliciting a high-quality, satisfying, and effective output commensurate with the effort invested in the input. By emphasizing the importance of prompt design, we hope to encourage a more nuanced and refined approach to working with AI language models, one that recognizes the crucial role of human expertise and creativity in shaping the output they produce.

In his post on Engineering vs. Design Thinking, Chinn (2017) argues that engineering and design thinking share similar processes and mindsets, such as critical thinking and project management, but they differ in crucial ways. Engineering relies on deductive reasoning, while design thinking uses inductive approaches. In design thinking, people are at the center, bringing unpredictability, whereas engineering deals with more stable elements like materials. The stakes also differ—engineering projects carry higher risks, like the collapse of a bridge, while design-thinking projects, such as app development, allow for faster prototyping with low-risk consequences. This difference in risk influences one’s approach to testing: engineering prioritizes technical feasibility first, while design thinking focuses on user desirability before considering technical challenges.

Can AI Models Generate Prompts to Produce the Output That Humans Desire?

One may argue that AI models can generate effective prompts. For instance, ChatGPT or the AI model Claude can be asked to create a prompt that will yield a desired output. But are these prompts as good as those created by humans?

Our argument is that an AI-model-generated prompt has limitations that the human prompt designer, or prompter, does not. For example, AI models cannot include contextual specifications pertaining only to the human prompter. The model is not familiar with our particular context of teaching at a university in the American Midwest, nor does it know anything about our learners’ profiles to include in the prompt. Even though AI models can generate a prompt with contextual information, only the human prompter knows the real context, such as the specific audience or any unique criteria or constraints.

To explore the capabilities of ChatGPT in generating its own prompts and the subsequent impact on output quality, we embarked on an experiment that yielded some revealing results.

We began by inputting a carefully crafted passage into ChatGPT and requested the model to produce a prompt based on that text. ChatGPT obliged, generating a prompt, which we then used to create a new passage. This new passage mirrored the original in theme and structure, which was anticipated. Intrigued, we decided to repeat the process: we fed the second passage back into ChatGPT, asking for another prompt. This time, the prompt it generated resembled the first but was noticeably more generic. With each iteration, as we continued this cycle, a clear pattern emerged—the prompts grew progressively shorter and less specific, and the resulting passages became increasingly vague and unfocused.

Repeating this experiment with a different passage yielded different results that were equally problematic. This time, with each iteration, the prompts became longer and more complicated, which led to longer, more detailed results. What started as a 250-word essay ballooned to more than 1500 words after only five iterations. The complexity and reading level also increased dramatically, leading to a nearly unreadable, overly technical report, only tangentially related to the topic of the original passage.
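
The post does not share the exact scripts or instructions used in this experiment. The snippet below is one way such a feedback loop could be scripted, assuming the OpenAI Python SDK (version 1.x), the gpt-4o model, and invented wording for the instruction that asks the model to write a prompt for a given passage.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def ask(text: str) -> str:
    """Send a single user message to the model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": text}],
    )
    return response.choices[0].message.content


# Placeholder for the carefully crafted starting passage.
passage = "Formative assessment gives instructors timely insight into ..."

for i in range(5):
    # Ask the model to write a prompt that would reproduce the current passage.
    generated_prompt = ask(
        "Write a prompt that, if given to a language model, would produce "
        f"the following passage:\n\n{passage}"
    )
    # Use that model-generated prompt to produce the next passage.
    passage = ask(generated_prompt)
    print(
        f"Iteration {i + 1}: prompt is {len(generated_prompt.split())} words, "
        f"passage is {len(passage.split())} words"
    )
```

Running a loop like this a handful of times is enough to see whether the model-generated prompts drift toward the generic or balloon in length, as described above.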

In both cases, throughout this iterative process, the quality of the ChatGPT-generated prompts did not improve, producing content that increasingly deviated from the original, leading to a loop of repetitive and uninspired content no longer useful for its intended purpose or audience. This phenomenon suggested that when ChatGPT is left to generate its own prompts without human intervention, it tends to recycle ideas without introducing new perspectives or depth. The outputs began to “inbreed,” limiting the exploration of the model’s full potential and failing to produce the rich, nuanced responses that can arise from more thoughtfully designed prompts.

Our experiment highlights a critical insight: the infusion of human ingenuity and creativity in crafting prompts is essential to unlocking the kinds of responses AI models are capable of generating. Human-designed prompts introduce fresh ideas, nuanced questions, and specific contexts that guide the model to produce more valuable and focused content. By relying solely on the model to self-prompt, we miss the opportunity to fully engage with its capabilities and to obtain unique, valuable outputs.

Despite AI models’ apparent ability to generate prompts, the prompts they generate lack the contextual details that are unique to the human prompter. For instance, the model cannot include contextual information about the audience, unique situations, or the prompter’s particular experience or skill. This can lead to output text that contains erroneous information. In the same way that a computer cannot write its own code, AI models need input to produce output. That input requires human ingenuity, creativity, and intelligence.

How Do You Design a Well-Crafted Prompt?

A well-crafted prompt is one that engages your creativity and is tailored to your unique circumstances and context. Consider the following five best practices for prompt design, followed by an example of a prompt that puts them into practice:

  1. Define Your Audience: Specify the context and the intended audience. Whether it is students, professionals, or a general audience, this influences the tone, language, and complexity of the response. 

  2. Leverage Personal Experience and Expertise: Incorporate your own experiences, skills, or knowledge into the prompt. This guides the AI to produce uniquely personal content that aligns with your perspective.

  3. Incorporate Unique Situations: Pose detailed, nuanced, and thought-provoking questions. Mention any unique challenges or scenarios that are specific to your context. This allows the AI to tailor its response to address those particular issues with more comprehensive answers.

  4. State Assumptions: Supply any necessary background details and identify any assumptions that the AI should consider. By including them in the prompt, you give the AI model insight into your thought processes so it can better align the response with your point of view.

  5. Specify Desired Tone, Style, and Format: Indicate the preferred tone (e.g., formal, conversational, academic) and format (e.g., essay, report, bullet points) to ensure the response matches your expectations and suits the context. Provide examples of your writing so the AI can model its response after your unique style.
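
As a single hypothetical example of these practices working together, the sketch below assembles a prompt from the five elements; every detail (the course, the learners, the participation figures) is invented for illustration.

```python
# Hypothetical prompt applying the five best practices above.

prompt = "\n\n".join([
    # 1. Define your audience
    "Audience: adult learners in a fully online instructional design "
    "certificate, most of whom work full time and study in the evenings.",
    # 2. Leverage personal experience and expertise
    "About me: I have taught this course for eight years and prefer examples "
    "drawn from corporate training rather than K-12 settings.",
    # 3. Incorporate unique situations
    "Situation: learners say the weekly discussion boards feel repetitive, "
    "and participation has dropped by half since week three.",
    # 4. State assumptions
    "Assume learners are comfortable with basic web tools but have no budget "
    "for paid software.",
    # 5. Specify desired tone, style, and format
    "Task: suggest four alternative asynchronous activities for the remaining "
    "five weeks. Write a conversational, practical memo of about 400 words, "
    "with one short heading per activity.",
])
print(prompt)
```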

By thoughtfully incorporating these practices, you infuse the prompt with the human ingenuity, creativity, and contextual awareness that AI models lack on their own, fully leveraging the capabilities of the AI model and maximizing the relevance and uniqueness of its response.

Prompt Design: More Than (Just) Engineering

Google Cloud (2024) distinguishes between prompt design and prompt engineering, viewing them as separate but related activities. While prompt engineering is an iterative process focused on the technical procedures of large-language model interaction, Google suggests that prompt design is a more artistic process aimed at eliciting the desired response from the AI model.

We assert that prompt design is the uniquely human process of writing prompts for generative AI models that produce accurate, high-quality, and human-desired responses. More than just plugging values into a formula, designing well-written prompts is an inherently creative, multifaceted skill that requires adaptation to each new audience and context. Prompt design is the skill with the greatest potential to truly unlock the “magic” of generative artificial intelligence.

Please cite the content of this blog:

Correia, A.-P., & Hickey, S. (2024, November 15). The case for prompt design when interacting with generative AI. Ana-Paula Correia’s Blog. https://www.ana-paulacorreia.com/blog/the-case-for-prompt-design-when-interacting-with-generative-ai
