Prompt Engineering Guide

Good prompting

Key principles for good prompting


Last updated 1 year ago

Effective prompting is essential when working with large language models (LLMs) such as ChatGPT. Crafting the right prompt can significantly influence the quality and relevance of the model's responses. This section explores techniques for creating effective prompts and highlights the importance of understanding and defining your questions clearly.

Know Your Question

Knowing your question is crucial when interacting with LLMs. A well-defined question helps the model understand the context and intent, leading to more accurate and relevant responses. By clearly stating what you need, you guide the model to focus on the specific aspects of your query, reducing ambiguity and improving the quality of the output.
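As a rough illustration, the difference between an ambiguous and a well-defined question can look like this. The topic, audience, and wording below are made up for the example and are not part of the guide:

```python
# A vague question: the model must guess the subject (the animal or the
# Python library?), the audience, and the desired length.
vague_prompt = "Tell me about pandas."

# A well-defined question states the subject, the audience, the scope,
# and the desired form of the answer.
specific_prompt = (
    "Explain the Python pandas library to a beginner data analyst. "
    "Cover DataFrames, reading CSV files, and basic filtering. "
    "Answer in three short paragraphs."
)
```

The second prompt leaves far less room for interpretation, so the response is much more likely to match what you actually need.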

Key Principles

There are several key principles and best practices involved in prompt engineering and good prompting:

  • Clarity and Specificity: Prompts should be clear, direct, and leave no room for ambiguity or misinterpretation. Using specific requirements, constraints, and examples helps guide the model.

  • Context and Framing: Providing relevant context about the background, purpose, and intended use of the prompt output allows the model to shape its response appropriately.

  • Task Formatting: The way a task is formatted and structured matters. Separating the prompt into distinct components like instructions, input data, and output requirements can improve performance.

  • Iterative Refinement: Prompt engineering is rarely a one-shot process. Analyzing model outputs, identifying weaknesses, and refining prompts iteratively leads to better results over time.

  • Technique Combinations: Different prompt engineering techniques like few-shot learning, chain-of-thought prompting, and constitutional AI can be combined for improved performance.

  • Platform and Model Choice: Every platform and every LLM has specific strengths and capabilities that make it more or less suitable for certain tasks. Get familiar with different platforms: there is more out there than ChatGPT, and choosing the right model for your task matters as much as the prompt itself.
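Several of the principles above can be combined in practice. The sketch below builds a prompt that separates instructions, context, few-shot examples, input data, and output requirements into distinct sections. The section labels, the helper function, and the sample task are illustrative assumptions, not a standard API:

```python
def build_prompt(instruction, context, examples, input_data, output_format):
    """Assemble a structured prompt from distinct components."""
    parts = [
        f"### Instruction\n{instruction}",
        f"### Context\n{context}",
    ]
    # Few-shot examples: each (input, output) pair shows the model the task.
    for i, (ex_in, ex_out) in enumerate(examples, start=1):
        parts.append(f"### Example {i}\nInput: {ex_in}\nOutput: {ex_out}")
    parts.append(f"### Input\n{input_data}")
    parts.append(f"### Output format\n{output_format}")
    return "\n\n".join(parts)

prompt = build_prompt(
    instruction="Classify the sentiment of the input sentence.",
    context="Sentences come from customer reviews of a software product.",
    examples=[("The app crashes constantly.", "negative"),
              ("Setup took two minutes, love it.", "positive")],
    input_data="The new update is fine, I guess.",
    output_format="One word: positive, negative, or neutral.",
)
print(prompt)
```

Keeping the components separate makes iterative refinement easier: you can tighten the instruction, swap examples, or change the output format independently and compare the results.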
