FAQ
How does model learning in Custom GPTs in ChatGPT work, exactly?
Custom GPTs in ChatGPT do not actually perform any model learning or training. They are pre-configured AI assistants that leverage OpenAI's existing language models, such as GPT-3 or GPT-4, to provide specialized capabilities[1][3].
The process of creating a Custom GPT involves:
- Defining the purpose, instructions, and capabilities of the AI assistant using natural language prompts[2][3]
- Optionally providing additional knowledge sources, such as documents or datasets, to enhance the GPT's expertise in a particular domain[2]
- Configuring the GPT's abilities, such as web browsing, image generation, or code execution[3]
However, the underlying language model itself is not retrained or fine-tuned when creating a Custom GPT. The model's parameters remain fixed, and the GPT's specialized behavior is achieved through carefully crafted instructions and prompts[2][3].
In summary, Custom GPTs in ChatGPT provide a user-friendly interface to leverage pre-trained language models for specific use cases, without requiring any actual model training or learning on the user's part.
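Since a Custom GPT is configuration rather than training, it can be thought of as a plain data record. The sketch below is illustrative only: Custom GPTs are set up through ChatGPT's UI, not through code, and all field names here are hypothetical.

```python
from dataclasses import dataclass, field

# Illustrative only: mirrors the three configuration layers described
# above (instructions, knowledge sources, capabilities). No model
# weights are touched anywhere -- the base model stays fixed.
@dataclass
class CustomGPTConfig:
    name: str
    instructions: str  # natural-language behavior prompt
    knowledge_files: list = field(default_factory=list)  # optional domain documents
    capabilities: list = field(default_factory=list)     # e.g. web browsing, code execution

research_helper = CustomGPTConfig(
    name="Research Helper",
    instructions="Answer questions using the attached papers; cite sources.",
    knowledge_files=["papers.pdf"],
    capabilities=["web_browsing", "code_interpreter"],
)
```

Everything that makes the GPT "custom" lives in fields like these; the underlying model's parameters are never part of the configuration.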
Sources
[1] ChatGPT Custom GPT Models: Tailoring AI for Specific Needs https://fastbots.ai/blog/chatgpt-custom-gpt-models-tailoring-ai-for-specific-needs
[2] How to create custom GPTs in ChatGPT - Pluralsight https://www.pluralsight.com/resources/blog/data/how-create-custom-gpts-chatgpt
[3] How to Make Custom ChatGPT Models: 5 Easy Steps to ... - DataCamp https://www.datacamp.com/tutorial/how-to-make-custom-gpts
[4] Introducing GPTs - OpenAI https://openai.com/blog/introducing-gpts
What file types are supported by OpenAI Custom GPTs?
According to the search results, OpenAI Custom GPTs support a variety of file types for knowledge sources and file uploads:
Knowledge sources can include many file types, such as PDF, text, CSV, and other data formats[1][2]. These files are used to extend the GPT's knowledge with domain-specific information.
For file uploads, all common file extensions for text files, spreadsheets, presentations, and documents are supported[3]. The maximum file size is 512MB per file, with a 2M token limit for text and document files (this limitation does not apply to spreadsheets)[3].
Images can also be uploaded, with a size limit of 20MB per image[3]. However, images embedded in documents or presentations are not yet supported[3].
Files uploaded as knowledge to a Custom GPT are retained indefinitely, while files used for other purposes (e.g., Advanced Data Analysis) are retained for a duration that varies based on the user's plan[3].
In summary, OpenAI Custom GPTs support a wide range of file types, including text files, spreadsheets, presentations, documents, and images, which can be used to extend the GPT's knowledge and capabilities for specific use cases.
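A simple pre-upload check based on the file types named above can catch unsupported files early. The allow-list below is illustrative, not OpenAI's authoritative list: it covers common text, spreadsheet, presentation, document, and image extensions, and the exact set OpenAI accepts may differ.

```python
from pathlib import Path

# Illustrative allow-list grouped by the categories described above.
TEXT_AND_DOCS = {".txt", ".md", ".pdf", ".docx", ".json"}
SPREADSHEETS = {".csv", ".xlsx"}
PRESENTATIONS = {".pptx"}
IMAGES = {".png", ".jpg", ".jpeg", ".gif", ".webp"}

ALLOWED = TEXT_AND_DOCS | SPREADSHEETS | PRESENTATIONS | IMAGES

def is_probably_uploadable(filename: str) -> bool:
    """Return True if the file extension is in our illustrative allow-list."""
    return Path(filename).suffix.lower() in ALLOWED

print(is_probably_uploadable("notes.pdf"))  # True
print(is_probably_uploadable("video.mp4"))  # False
```

Note that this only checks extensions; it cannot detect images embedded inside documents or presentations, which (as noted above) are not yet supported.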
Sources
[1] OpenAI Custom GPTs: What You Need to Worry About https://www.paloaltonetworks.com/blog/prisma-cloud/openai-custom-gpts-security/
[2] OpenAI GPT-3 API: Which file formats can be used for fine-tuning? https://stackoverflow.com/questions/75559672/openai-gpt-3-api-which-file-formats-can-be-used-for-fine-tuning
[3] File uploads FAQ - OpenAI Help Center https://help.openai.com/en/articles/8555545-file-uploads-faq
[4] Introducing GPTs - OpenAI https://openai.com/blog/introducing-gpts
What are the limits on the number of files and on file sizes?
Based on the search results, the key file size and upload limits for OpenAI Custom GPTs are:
- Maximum number of files per GPT: 20 files[3]
- Maximum size per file: 512 MB[2]
- Maximum total storage across all uploaded files: 100 GB per organization[2]
- Maximum token count for text/document files (excluding spreadsheets): 2 million tokens per file[2]
- Maximum image file size: 20 MB per image[2]
The search results also indicate that images embedded within documents or presentations are not yet supported[2].
Additionally, the files uploaded as knowledge sources to a Custom GPT are retained indefinitely, while files used for other purposes (e.g., Advanced Data Analysis) are retained for a duration that varies based on the user's plan[2].
In summary, the key limits are 20 files per GPT, 512 MB per file, and 100 GB total per organization, with additional restrictions on token counts and image sizes.
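The per-file and per-GPT limits above can be sanity-checked client-side before uploading. This is a sketch only: token counts and the 100 GB per-organization cap cannot be verified locally from file sizes alone, and the limits are hard-coded from the figures quoted above.

```python
# Limits quoted in the answer above.
MAX_FILES_PER_GPT = 20
MAX_FILE_BYTES = 512 * 1024 * 1024  # 512 MB
MAX_IMAGE_BYTES = 20 * 1024 * 1024  # 20 MB
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".gif", ".webp"}

def check_upload_batch(files: dict) -> list:
    """files maps filename -> size in bytes; returns a list of violations."""
    problems = []
    if len(files) > MAX_FILES_PER_GPT:
        problems.append(f"too many files: {len(files)} > {MAX_FILES_PER_GPT}")
    for name, size in files.items():
        ext = "." + name.rsplit(".", 1)[-1].lower() if "." in name else ""
        limit = MAX_IMAGE_BYTES if ext in IMAGE_EXTS else MAX_FILE_BYTES
        if size > limit:
            problems.append(f"{name}: {size} bytes exceeds {limit}-byte limit")
    return problems

print(check_upload_batch({"doc.pdf": 1_000_000, "big.png": 50_000_000}))
```

Here `doc.pdf` passes (well under 512 MB) while `big.png` is flagged, since images have the stricter 20 MB limit.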
Sources
[1] Number of files you can upload on a GPT - OpenAI Developer Forum https://community.openai.com/t/number-of-files-you-can-upload-on-a-gpt/523616
[2] File uploads FAQ - OpenAI Help Center https://help.openai.com/en/articles/8555545-file-uploads-faq
[3] GPTs knowledge capacity limits - OpenAI Developer Forum https://community.openai.com/t/gpts-knowledge-capacity-limits/492955
[4] What is the max amount of data an Azure OpenAI Gpt-4 model can ... https://learn.microsoft.com/en-us/answers/questions/1600250/what-is-the-max-amount-of-data-an-azure-openai-gpt
What is a token and how can it be calculated?
A token is the basic unit of text that large language models like GPT-3 and GPT-4 process[2][4]. It can be a single character, a group of characters, a whole word, or a punctuation mark[3].
To calculate the number of tokens in a given text, you can use a tokenizer tool or follow these general guidelines:
- In English, 1 token is approximately equal to 4 characters or 0.75 words[2][4]
- 100 tokens are roughly equivalent to 75 words[2]
- 1-2 sentences contain around 30 tokens, while 1 paragraph is about 100 tokens[2]
- 1,500 words are approximately equal to 2,048 tokens[2]
The exact number of tokens can vary depending on the language and the tokenization method used[2][4]. For example, in Spanish and French, 1 word is about 2 tokens[3].
Some key points about tokens:
- Punctuation marks and special characters count as individual tokens[3]
- Emojis range from 2 to 3 tokens[3]
- The tokenization process can split words into sub-word tokens based on context[2]
- Newer models like GPT-4 use different tokenizers than previous versions, resulting in varying token counts for the same input text[2]
To get an accurate token count for your text, you can use online tools like the GPT-4 Token Counter[3] or the Tokenizer tool provided by OpenAI[2].
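The rules of thumb above (roughly 4 characters or 0.75 words per English token) can be turned into a quick client-side estimator. This is a heuristic only; for exact counts, use an actual tokenizer such as OpenAI's Tokenizer tool.

```python
def estimate_tokens(text: str) -> int:
    """Rough English token estimate from the rules of thumb above."""
    by_chars = len(text) / 4            # ~4 characters per token
    by_words = len(text.split()) / 0.75 # ~0.75 words per token
    # Average the two estimates to smooth out either heuristic's bias.
    return round((by_chars + by_words) / 2)

sample = "Custom GPTs do not retrain the underlying model."
print(estimate_tokens(sample))
```

For other languages the constants differ (as noted above, a Spanish or French word is about 2 tokens), so a real tokenizer is the only reliable option for multilingual text.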
Sources
[1] Token Counter https://tokenizer.streamlit.app
[2] What are tokens and how to count them? - OpenAI Help Center https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them
[3] GPT-4 Token Counter Online - Quizgecko https://quizgecko.com/tools/token-counter
[4] What Are Tokens and How to Calculate Them - LinkedIn https://www.linkedin.com/pulse/what-tokens-how-calculate-them-rodrigo-andrade