LLM API
LMstudio
Use this file: src\lib\lmstudio.ts
- async function callLmstudio(prompt: string): Promise<string> for running a plain text prompt
- async function callLMStudioAPIWithFile(imagePath: string, prompt: string): Promise<string> for sending an image file to the LLM along with a prompt
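A minimal usage sketch for the LM Studio wrappers, assuming both functions resolve to the model's raw text reply; the import path, prompts, and image path below are placeholders, not part of the existing code:

```ts
import { callLmstudio, callLMStudioAPIWithFile } from "../lib/lmstudio";

// Plain prompt: assumes the function resolves to the model's raw text reply.
async function summarizeNotes(): Promise<void> {
  const reply = await callLmstudio("Summarize the latest release notes.");
  console.log(reply);
}

// Prompt plus a local image file; the path is a placeholder.
async function describeImage(): Promise<void> {
  const reply = await callLMStudioAPIWithFile("assets/sample.png", "Describe this image.");
  console.log(reply);
}
```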
OpenAI
Use this file: src\lib\openai.ts
- async function callOpenAI(prompt: string): Promise<string> for running a plain text prompt
- async function callOpenAIWithFile(imagePath: string, prompt: string): Promise<string> for sending an image file to the LLM along with a prompt
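The OpenAI wrappers mirror the LM Studio ones. A short sketch of the file variant, again assuming a string result; the surrounding function and prompt text are illustrative only:

```ts
import { callOpenAIWithFile } from "../lib/openai";

// Sends a local image plus a prompt to OpenAI and returns the raw reply.
// Error handling is deliberately minimal; the prompt text is a placeholder.
async function extractTextFromImage(imagePath: string): Promise<string> {
  try {
    return await callOpenAIWithFile(imagePath, "Extract all visible text from this image.");
  } catch (err) {
    console.error("OpenAI call failed:", err);
    throw err;
  }
}
```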
Always construct the prompt so that calls to the LLM API for text generation return JSON. If nothing else is specified, append the following instruction to the given prompt:
Return the result in this format {"result":""}
Then extract the result field in the program you generate; do not change the original functions.
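A sketch of the intended flow: append the JSON instruction to the prompt, call the unmodified wrapper, and extract the result field in the caller. The helper name callForResult and the fallback to raw text are assumptions, not part of the existing code:

```ts
import { callOpenAI } from "../lib/openai";

// Instruction appended when the caller did not specify an output format.
const JSON_INSTRUCTION = 'Return the result in this format {"result":""}';

// Wraps an existing call* function without modifying it: the JSON instruction
// is appended to the prompt and the "result" field is extracted from the reply.
async function callForResult(
  prompt: string,
  call: (p: string) => Promise<string> = callOpenAI,
): Promise<string> {
  const raw = await call(`${prompt}\n${JSON_INSTRUCTION}`);
  try {
    const parsed = JSON.parse(raw) as { result?: string };
    return parsed.result ?? "";
  } catch {
    // Fall back to the raw text if the model did not return valid JSON.
    return raw;
  }
}

// Usage: const text = await callForResult("Write a haiku about TypeScript.");
```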