Creating prompts is crucial to using AI in Salesforce. GPTfy offers a user-friendly, structured UI that simplifies prompt creation, minimizes clicks, and saves users significant time.
The prompt builder enables users to test prompts within the same interface, ensuring that expectations align with results.
All functionality is consolidated on one page, minimizing the hand-holding users or customers need.
Before You Begin:
- Connect your desired AI Model.
- Set up Data Context Mapping for your target object.
Step 1: Create a new prompt
When creating a new prompt, you are first asked to fill in five fields. Enter the information and click Save to create the prompt:
- Prompt Name: Give it a clear and descriptive title.
- Target Object: Select the object with which you want to use the prompt (e.g., Account).
- Data Context Mapping: Choose the specific mapping based on your needs.
- AI Model: Pick the AI Model that will power your prompt (e.g., GPTfy, OpenAI)
- Type: Select the type of prompt. The available types include JSON and text.
Text Prompt: Ideal for writing the prompt in your own words. Enter the desired prompt command in the text box during creation (e.g., “Summarize this Account and suggest next steps”).
JSON Prompt: Uses the product’s full potential with predefined actions. Follow this article for creating components.
After creating the prompt, the user will see the detailed Prompt Builder UI.
The Prompt Builder UI contains four main tabs, each with its own sections, explained below:
- Configuration
- Grounding
- Prompt
- Actions
Step 2: Configure the prompt
- Configuration: This tab contains the information needed to configure the prompt and is divided into four sections.
- Object: The target object of the prompt.
- Type: The prompt type (Text or JSON).
- AI Prompt Name: Shows the prompt name.
- Status: Shows the status of the prompt (Draft, Active, or Inactive).
- Description: A description of the prompt.
- Purpose: A multi-select picklist that shows the purpose of the prompt.
- Prompt Request Id: An auto-generated unique Prompt Request Id.
- Include Files: A checkbox that allows files to be included in the prompt during execution.
- How it Works: Explains why this prompt is needed and how it works. This field is also available on the GPTfy Console.
- Allow User Input: Lets users manually enter commands while running the prompt.
- Append Timestamp: When this checkbox is enabled, the date and time are stamped in the Target field.
Prompt Availability: This section provides visibility conditions and profile settings to restrict which users can run the prompt.
- Visibility Condition: This field acts like a filter, controlling when a prompt can be used on a specific record. Click the pencil icon as shown above to create the visibility condition.
Specify the conditions using the query builder; the Visibility Condition field auto-populates.
Visibility Exception Message: This text field lets Prompt Builder explain why the user cannot run the prompt on that particular record. The same message appears on the GPTfy Console when the user hovers over the Run GPTfy button.
Profile: This multi-select picklist allows the user to decide which profiles can access the prompt.
For example, if the prompt is restricted to the System Administrator profile, it will not appear in the “Select Prompt” dropdown on the record page for users on other profiles; it becomes visible only when the user’s profile matches one of the profiles given in the prompt configuration (see the sketch below).
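How GPTfy evaluates these settings is internal to the product, but conceptually the availability check combines the visibility condition with the allowed profiles. The following Python sketch is a simplified, hypothetical illustration only; the field names, record structure, and helper function are invented for this example and are not GPTfy’s actual implementation.

```python
# Hypothetical illustration of how prompt availability could be evaluated.
# Neither the data structures nor the logic come from GPTfy itself.

def is_prompt_available(record: dict, user_profile: str, prompt_config: dict) -> bool:
    """Return True if the prompt should appear in the "Select Prompt" dropdown."""
    # 1. Profile check: an empty list means no profile restriction.
    allowed_profiles = prompt_config.get("profiles", [])
    if allowed_profiles and user_profile not in allowed_profiles:
        return False

    # 2. Visibility condition: a filter on record fields, e.g. "Industry = Banking".
    for field, expected in prompt_config.get("visibility_condition", {}).items():
        if record.get(field) != expected:
            return False
    return True

# Example: a prompt restricted to System Administrators on Banking accounts.
prompt_config = {
    "profiles": ["System Administrator"],
    "visibility_condition": {"Industry": "Banking"},
}
record = {"Name": "Acme Corp", "Industry": "Banking"}

print(is_prompt_available(record, "System Administrator", prompt_config))  # True
print(is_prompt_available(record, "Standard User", prompt_config))         # False
```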
AI Model: This section contains the AI Model settings: Name, Temperature, Top P, and Max Output Tokens.
- AI Model: The user can add the AI Model that will power the prompt (e.g., GPTfy, OpenAI).
- Temperature: Controls the balance between creativity and predictability in the output (0-1). Higher values lead to more diverse responses.
- Top P: Controls how many candidate words the AI considers at each step, influencing the response’s diversity. Higher values lead to more varied results.
- Max Output Tokens: This field allows the user to set the maximum length of the generated response.
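GPTfy builds and sends the model request for you; the sketch below only illustrates how Temperature, Top P, and Max Output Tokens typically map onto a generative AI call, using the OpenAI Python SDK as a stand-in backend. The model name and parameter values are placeholders, and this is not the request GPTfy actually sends.

```python
# Illustrative only: how the three model settings above typically appear
# in a generative AI request. GPTfy builds its own request; the SDK,
# model name, and values here are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",                     # placeholder model name
    messages=[{"role": "user", "content": "Summarize this Account."}],
    temperature=0.7,   # 0-1: higher = more creative, less predictable
    top_p=0.9,         # how much of the probability mass the model samples from
    max_tokens=500,    # caps the length of the generated response
)
print(response.choices[0].message.content)
```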
Step 3: Grounding (Ethical & Content)
Grounding—This tab allows you to add “extra flavor” to your prompt command, making the response more tailored to your specific needs.
- Examples might include avoiding bias, promoting fairness, or staying on topic.
Sections:
- Grounding rules: These are multi-select picklist values. When selected and saved, they are appended to the prompt command during activation, influencing the AI’s generation process.
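Conceptually, activation appends the selected grounding rules to the end of the prompt command before it is sent to the AI. The Python sketch below is a simplified, hypothetical illustration of that assembly step; the rule wording and the helper function are made up for this example and are not GPTfy internals.

```python
# Hypothetical illustration of how grounding rules extend a prompt command.
# The rule wording and assembly logic are examples, not GPTfy's implementation.

def build_grounded_prompt(prompt_command: str, grounding_rules: list[str]) -> str:
    """Append each selected grounding rule to the prompt command."""
    rules_text = "\n".join(f"- {rule}" for rule in grounding_rules)
    return f"{prompt_command}\n\nFollow these rules:\n{rules_text}"

prompt_command = "Summarize this Account and suggest next steps."
grounding_rules = [
    "Avoid bias and promote fairness.",
    "Stay on topic and do not speculate.",
]
print(build_grounded_prompt(prompt_command, grounding_rules))
```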
Locale Configuration in GPTfy
GPTfy derives the locale dynamically based on how the prompt is called. This locale information is appended to the prompt command and sent to the AI, ensuring personalized and accurate responses based on context. Below are the details for configuring locale settings:
Locale Options and Usage
Locale – Automation
- Description: Used when the prompt is executed through AI Mass Processing, Flows, or REST API.
- Options:
- Company Default: Applies the standard organization-wide locale for consistent automated formatting.
- Record Owner: Adopts the locale of the record owner to personalize date, time, and currency formatting.
Locale – GPTfy Console
- Description: Used when the prompt is executed from the GPTfy User Console.
- Options:
- Active User: Uses the current user’s locale for a tailored experience.
- Company Default: Applies the standard organization-wide locale for consistency across operations.
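The locale choice matters because the same value renders differently per locale. The short Python example below uses the third-party Babel library (unrelated to GPTfy) to show how a date and a currency amount differ between, say, a Company Default locale of en_US and a record owner whose locale is de_DE; the values are illustrative.

```python
# Locale affects how dates and currency appear in AI responses.
# Babel is a third-party library used here purely for illustration.
from datetime import date
from babel.dates import format_date
from babel.numbers import format_currency

when = date(2024, 7, 4)
amount = 1234.56

for locale_code in ("en_US", "de_DE"):   # e.g. Company Default vs. Record Owner
    print(locale_code,
          format_date(when, format="medium", locale=locale_code),
          format_currency(amount, "USD", locale=locale_code))
# en_US -> Jul 4, 2024   $1,234.56
# de_DE -> 04.07.2024    1.234,56 $
```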
Step 4: Prompt Command and Quick Testing
Prompt: This tab shows the prompt command and enables users to search for a record and run the prompt.
- Prompt Command: A text box where the user can enter the command. If the prompt type is Text, a prompt command text box appears; if the prompt type is JSON, the prompt component list appears.
Users can search for the target object’s record in the lookup below and run GPTfy for quick testing, even if the prompt is inactive. A security audit record is created after execution.
Step 5: Actions
The Prompt Action Framework is a functionality within the Prompt Builder that allows users to perform various actions on a record when a prompt is run. These actions are specific to each prompt and can be configured in the Actions Tab of the prompt setup. For more details, Click Here.
Step 6: Save & Activate
Click “Save” to create your prompt. Once saved, a unique “Prompt Request ID” appears for reference.
Click “Activate” to make it available in the GPTfy Console dropdown. This will trigger a validation check. Upon successful validation, your prompt is ready to use!
Editing the Prompt: There is no separate button for editing the prompt. When the user deactivates the prompt, its status is set to “Draft” and all prompt fields become editable.
The user can then activate the prompt again, and can also clone it using the ‘Clone with Related’ button.
Step 7: Version Control and Audits
- Enable prompt versioning from the Preferences tab of the AI Setting to turn on this functionality.
Version Control: Every activation creates a JSON version that is stored under Files, so you can track changes and revert to previous versions if needed.
- The “Reinstate” button allows you to restore an older version.
Security Audit: The user can check security audit records next to the Versions tab. Whenever the user runs the prompt, a security audit record is added.