Generate a response with GPT3.5 turbo 16k
Written by Wrk Product
Updated over a week ago

Uses OpenAI's GPT-3.5 Turbo model with a 16k-token context window to generate human-like responses to a given prompt or question.

Common use cases

  • Generative AI

Application

  • OpenAI GPT

Inputs (what you have)

  • Prompt (Text (Long), required): The prompt to generate completions for. Input will be truncated if it exceeds the limit set by the API.

  • System message (Text (Long), optional): The message that sets the context and expectations for the conversation with the AI.

  • Temperature (Number w/ decimals, optional): Sampling temperature to use, between 0 and 2. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic.

  • Maximum length (Integer, optional): The maximum number of tokens to generate in the completion. Increasing the maximum length decreases the number of tokens available to the prompt.

  • Frequency penalty (Number w/ decimals, optional): A number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood of repeating the same line verbatim.

  • Presence penalty (Number w/ decimals, optional): A number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood of talking about new topics.

Note: the value of each input can either be set directly in the configuration of the Wrk Action within the Wrkflow, or come from a variable in the Data library section. Variables in the Data library are the outputs of previous Wrk Actions in the Wrkflow.
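The Wrk Action makes the API call for you, so no code is needed. For reference only, the sketch below shows roughly how these inputs map onto an OpenAI Chat Completions request. It assumes the official openai Python SDK; the model name, message contents, and parameter values are illustrative examples, not the Action's actual implementation.

```python
# Illustrative sketch only: the Wrk Action handles this call internally.
# Assumes the official `openai` Python SDK; values shown are examples.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo-16k",  # GPT-3.5 Turbo with a 16k-token context window
    messages=[
        # "System message" input: sets context and expectations for the conversation
        {"role": "system", "content": "You are a helpful assistant."},
        # "Prompt" input: the text to generate a completion for
        {"role": "user", "content": "Summarize the attached meeting notes."},
    ],
    temperature=0.2,        # "Temperature" input (0 to 2)
    max_tokens=512,         # "Maximum length" input
    frequency_penalty=0.0,  # "Frequency penalty" input (-2.0 to 2.0)
    presence_penalty=0.0,   # "Presence penalty" input (-2.0 to 2.0)
)

# The "Generated text" output corresponds to the assistant's reply.
generated_text = response.choices[0].message.content
print(generated_text)
```

In the Wrkflow itself, you only set the input values in the Wrk Action configuration; the request above is handled for you.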


Outputs (what you get)

  • Generated text (Text (Long), required): Text content created by artificial intelligence.
