
LLM Components

We provide components for LLMs (Large Language Models), so you can easily access and analyze financial data and real-time market data, and even let an AI submit orders for you.

Yes, you can do all of this through LongPort OpenAPI with our LLM components. Start today!

LLMs Text

The OpenAPI Docs follow the LLMs Text convention, providing an llms.txt file and a Markdown file for each document.

Each document is also available in Markdown format: when viewing a page, just add the .md suffix to its URL.

For example:
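If a document lives at https://open.longportapp.com/en/docs (an illustrative URL; the exact paths on the docs site may differ), its Markdown version is served at the same address with .md appended, and the site-wide index is available as llms.txt. A minimal sketch using curl, assuming those paths:

# Fetch the llms.txt index (assumed to sit at the docs root)
curl https://open.longportapp.com/llms.txt

# Fetch the Markdown version of a page by appending .md to its URL
curl https://open.longportapp.com/en/docs.md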

MCP

We are building an MCP implementation for LongPort OpenAPI (based on our SDK), which you can use on any AI platform that supports MCP.

It is also open source in our GitHub organization:

https://github.com/longportapp/openapi

Installation

Visit https://github.com/longportapp/openapi/releases to download the latest release.

Usage

After a successful installation, you will have a longport-mcp command-line tool.

NOTE: You must follow the Getting Started guide to configure your environment.

The environment variables LONGPORT_APP_KEY, LONGPORT_APP_SECRET, and LONGPORT_ACCESS_TOKEN must be set before you start the MCP server.
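For example, in a Unix-like shell you can export the credentials before launching the server. The values below are placeholders; use the App Key, App Secret, and Access Token from your LongPort developer account:

# Credentials required by the MCP server (placeholder values)
export LONGPORT_APP_KEY="your-app-key"
export LONGPORT_APP_SECRET="your-app-secret"
export LONGPORT_ACCESS_TOKEN="your-access-token"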

Configuring LongPort MCP in your AI Chat

This part shows you how to configure LongPort MCP in your AI chat (the screenshots use Cherry Studio).

Use STDIO mode:

Ensure you have already configured your environment variables and installed the longport-mcp command-line tool on your system.
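As a quick sanity check before wiring the server into your chat client, a sketch for a Unix-like shell:

# Confirm the longport-mcp binary is on your PATH
which longport-mcp

# Confirm the LongPort credentials are present in the environment
env | grep '^LONGPORT_'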

Use SSE mode:

You must start the SSE server first, using the following command:

longport-mcp --sse

Then configure your AI chat to use http://localhost:8000.