Execute an AI prompt


Syntax

aiPrompt(string $prompt, ?array $params): mixed


Parameters

@param string $prompt The prompt text to send to the AI engine.

@param array|null $params Optional AI engine parameters (see the sketch after the list below).

[engine] - string (optional, default NULL) The AI engine to use.

[engine_params] - array (optional, default NULL) Engine-specific parameters:

  • [model] - string|null (optional, default NULL) The AI model to use, e.g. "gpt-3.5-turbo", "gpt-4o", "o1", "o1-mini", "o3", ... See: https://platform.openai.com/docs/models.
  • [temperature] - float (optional, default 1) The sampling temperature.
  • [top_p] - float (optional, default 1) The top-p (nucleus sampling) value.
  • [n] - int (optional, default 1) The number of responses to generate.
  • [stream] - boolean (optional, default FALSE) If TRUE, the response is streamed.
  • [stop] - string|null (optional, default NULL) The stop token.
  • [max_tokens] - int (optional, default 1000) The maximum number of tokens to generate.
  • [presence_penalty] - float (optional, default 0) The presence penalty.
  • [frequency_penalty] - float (optional, default 0) The frequency penalty.
  • [logit_bias] - float|null (optional, default NULL) The logit bias.
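
For illustration, passing engine parameters from Twig might look like the following sketch (the engine name "openai" is an assumption; the engine_params keys are those listed above):

{% set answer = hublify.aiPrompt('Summarize the history of Hamburg in two sentences.', {
    engine: 'openai',
    engine_params: {
        model: 'gpt-4o',
        temperature: 0.2,
        max_tokens: 500
    }
}) %}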



Return

Returns the direct result of the prompt: either a string, or an array if the answer was a JSON string (see the array answer example below).


Examples

In your Twig source code:

String Answer:

{% set answer = hublify.aiPrompt('Write a short poem about the beautiful city Hamburg in Germany.') %}
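
Array Answer:

If the prompt asks for JSON and the AI answers with a JSON string, aiPrompt returns it as an array (see Return above). The following is only an illustrative sketch; the prompt text and the branching on Twig's iterable test are examples, not required usage:

{% set answer = hublify.aiPrompt('List three districts of Hamburg as a JSON array of strings.') %}

{% if answer is iterable %}
    {# The answer was a JSON string, so aiPrompt returned an array #}
    {% for district in answer %}
        {{ district }}
    {% endfor %}
{% else %}
    {# The answer came back as plain text #}
    {{ answer }}
{% endif %}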