

🔧 LLM Function calling A.I. Chatbot from OpenAPI Document


News section: 🔧 Programming
🔗 Source: dev.to

Preface

You can create an LLM function calling A.I. chatbot from an OpenAPI document.

Just deliver your OpenAPI document as shown below, and LLM function calling schemas for every API endpoint will be automatically composed by @samchon/openapi. With these function calling schemas, let's create an A.I. chatbot that converses with your backend server.

LLM Function Calling and Structured Output

The LLM selects the proper function and fills its arguments.

Nowadays, most LLMs (Large Language Models), such as OpenAI's models, support a "function calling" feature. "LLM function calling" means that the LLM automatically selects a proper function and fills its parameter values from the conversation with the user (typically chat text).

"Structured output" is another LLM feature: the LLM transforms its output into a structured data format such as JSON, conforming to a schema you supply.
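To make the concept concrete before the @samchon/openapi code below, here is a minimal, hand-written function calling schema. The `get_weather` function is a hypothetical example (not part of @samchon/openapi or the shopping backend): the schema is just a name, a description, and a JSON Schema for the parameters, and the LLM answers with a JSON string of arguments conforming to it.

```typescript
// A minimal, hand-written function calling tool schema (hypothetical
// example): name, description, and a JSON Schema for the parameters.
const tool = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather for a city.",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name, e.g. Seoul" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] },
      },
      required: ["city"],
    },
  },
};

// For a prompt like "How hot is it in Seoul?", the LLM would return a JSON
// string of arguments matching the schema above, e.g.:
const args = JSON.parse('{"city":"Seoul","unit":"celsius"}');
console.log(tool.function.name, args.city);
```

The point of @samchon/openapi is that you never write such schemas by hand: they are derived from your OpenAPI document, one per API endpoint.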

import {
  HttpLlm,
  IHttpLlmApplication,
  IHttpLlmFunction,
  OpenApi,
  OpenApiV3,
  OpenApiV3_1,
  SwaggerV2,
} from "@samchon/openapi";
import typia from "typia";

const main = async (): Promise<void> => {
  // Read swagger document and validate it
  const swagger:
    | SwaggerV2.IDocument
    | OpenApiV3.IDocument
    | OpenApiV3_1.IDocument = JSON.parse(
    await fetch(
      "https://github.com/samchon/shopping-backend/blob/master/packages/api/swagger.json",
    ).then((r) => r.text()), // fetch raw text; r.json() here would double-parse
  );
  typia.assert(swagger); // recommended

  // convert to emended OpenAPI document
  const document: OpenApi.IDocument = OpenApi.convert(swagger);

  // compose LLM function calling application
  const application: IHttpLlmApplication<"chatgpt"> = HttpLlm.application({
    model: "chatgpt",
    document,
  });

  // You can get an LLM function calling schema by its endpoint
  const func: IHttpLlmFunction<"chatgpt"> | undefined =
    application.functions.find(
      (f) => f.path === "/shoppings/sellers/sale" && f.method === "post",
    );
  if (func === undefined) throw new Error("No matched function exists.");

  console.log(func);
};
main().catch(console.error);

@samchon/openapi

flowchart
  subgraph "OpenAPI Specification"
    v20("Swagger v2.0") --upgrades--> emended[["OpenAPI v3.1 (emended)"]]
    v30("OpenAPI v3.0") --upgrades--> emended
    v31("OpenAPI v3.1") --emends--> emended
  end
  subgraph "OpenAPI Generator"
    emended --normalizes--> migration[["Migration Schema"]]
    migration --"Artificial Intelligence"--> lfc{{"LLM Function Calling"}}
    lfc --"OpenAI"--> chatgpt("ChatGPT")
    lfc --"Anthropic"--> claude("Claude")
    lfc --"Google"--> gemini("Gemini")
    lfc --"Meta"--> llama("Llama")
  end

OpenAPI definitions, converters and LLM function calling application composer.

@samchon/openapi is a collection of OpenAPI types for every version, plus converters between them. Among these types is an "emended" OpenAPI v3.1 specification, which removes ambiguous and duplicated expressions for clarity. Every conversion is based on this emended OpenAPI v3.1 specification.

  1. Swagger v2.0
  2. OpenAPI v3.0
  3. OpenAPI v3.1
  4. OpenAPI v3.1 emended

@samchon/openapi also provides an LLM (Large Language Model) function calling application composer built on the OpenAPI document, with many strategies. With the HttpLlm module, you can perform LLM function calling extremely easily, just by delivering the OpenAPI (Swagger) document.

Function Calling Application

Function calling an e-commerce API operation.

I'll demonstrate LLM function calling against a backend server, via its OpenAPI document, using my shopping mall (e-commerce) backend project. As the target API operation, I selected the sale creation feature, with which a seller enrolls a product for sale on the market.

Also, I selected the "Microsoft Surface Pro 9" as the demonstration product to enroll in the e-commerce market. I'll deliver a markdown document describing the "Microsoft Surface Pro 9" to the LLM (OpenAI ChatGPT) to perform the function calling.

For reference, this is actually working code, and you can do the same with your own backend server. Note that every API operation can be called through LLM function calling, so you can easily create an LLM function calling A.I. chatbot with nothing but your OpenAPI document.

import {
  HttpLlm,
  IChatGptSchema,
  IHttpLlmApplication,
  IHttpLlmFunction,
  OpenApi,
  OpenApiV3,
  OpenApiV3_1,
  SwaggerV2,
} from "@samchon/openapi";
import OpenAI from "openai";
import typia from "typia";

const main = async (): Promise<void> => {
  // Read swagger document and validate it
  const swagger:
    | SwaggerV2.IDocument
    | OpenApiV3.IDocument
    | OpenApiV3_1.IDocument = JSON.parse(
    await fetch(
      "https://github.com/samchon/shopping-backend/blob/master/packages/api/swagger.json",
    ).then((r) => r.text()), // fetch raw text; r.json() here would double-parse
  );
  typia.assert(swagger); // recommended

  // convert to emended OpenAPI document,
  // and compose LLM function calling application
  const document: OpenApi.IDocument = OpenApi.convert(swagger);
  const application: IHttpLlmApplication<"chatgpt"> = HttpLlm.application({
    model: "chatgpt",
    document,
  });

  // Let's imagine that LLM has selected a function to call
  const func: IHttpLlmFunction<"chatgpt"> | undefined =
    application.functions.find(
      // (f) => f.name === "llm_selected_function_name"
      (f) => f.path === "/shoppings/sellers/sale" && f.method === "post",
    );
  if (func === undefined) throw new Error("No matched function exists.");

  // Get arguments by ChatGPT function calling
  const client: OpenAI = new OpenAI({
    apiKey: "<YOUR_OPENAI_API_KEY>",
  });
  const completion: OpenAI.ChatCompletion =
    await client.chat.completions.create({
      model: "gpt-4o",
      messages: [
        {
          role: "system", // system instruction, not an assistant turn
          content:
            "You are a helpful customer support assistant. Use the supplied tools to assist the user.",
        },
        {
          role: "user",
          content: "<DESCRIPTION ABOUT THE SALE>",
          // https://github.com/samchon/openapi/blob/master/examples/function-calling/prompts/microsoft-surface-pro-9.md
        },
      ],
      tools: [
        {
          type: "function",
          function: {
            name: func.name,
            description: func.description,
            parameters: func.parameters as Record<string, any>,
          },
        },
      ],
    });
  const toolCall: OpenAI.ChatCompletionMessageToolCall =
    completion.choices[0].message.tool_calls![0];

  // Actual execution by yourself
  const sale = await HttpLlm.execute({
    connection: {
      host: "http://localhost:37001",
    },
    application,
    function: func,
    input: JSON.parse(toolCall.function.arguments),
  });
  console.log("sale", sale);
};
main().catch(console.error);

Next Episode: from TypeScript Type

import { ILlmApplication, ILlmFunction, ILlmSchema } from "@samchon/openapi";
import typia from "typia";

// FUNCTION CALLING APPLICATION SCHEMA
// (BbsArticleController and IBbsArticle come from the demo project
//  covered in the next article)
const app: ILlmApplication<"chatgpt"> = typia.llm.application<
  BbsArticleController,
  "chatgpt"
>();
const func: ILlmFunction<"chatgpt"> | undefined = app.functions.find(
  (f) => f.name === "create",
);

console.log(app);
console.log(func);

// STRUCTURED OUTPUT
const params: ILlmSchema.IParameters<"chatgpt"> = typia.llm.parameters<
  IBbsArticle.ICreate,
  "chatgpt"
>();
console.log(params);

💻 Playground Link

Just by the TypeScript type.

You can also compose an LLM function calling application schema from a TypeScript class or interface type. Likewise, you can create a structured output schema from a native TypeScript type.
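Such a parameters schema can then be handed to OpenAI's structured output option. Below is a sketch of the request payload only: the inline `parameters` object is a hypothetical stand-in for what `typia.llm.parameters<IBbsArticle.ICreate, "chatgpt">()` would return, and no request is actually sent.

```typescript
// Hypothetical stand-in for a typia-generated parameters schema.
const parameters = {
  type: "object",
  properties: {
    title: { type: "string" },
    body: { type: "string" },
  },
  required: ["title", "body"],
  additionalProperties: false,
};

// Request body for client.chat.completions.create() using structured
// output via response_format; only constructed here, not sent.
const request = {
  model: "gpt-4o",
  messages: [
    { role: "user", content: "Write a short article about OpenAPI." },
  ],
  response_format: {
    type: "json_schema",
    json_schema: { name: "article", schema: parameters, strict: true },
  },
};
console.log(request.response_format.type);
```

With `strict: true`, the model's JSON output is constrained to match the supplied schema, so the reply can be parsed directly into the target type.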

In the next article, I'll show you how to utilize typia's LLM function calling schema composer, and how to integrate it into an A.I. chatbot application.
