ollama

package
v0.0.6 Latest
Published: Dec 11, 2024 License: MIT Imports: 10 Imported by: 0

Documentation

Overview

Package ollama provides a generative chat bot.

Download the Ollama server here: <https://ollama.com/download/mac>. To run the model, execute `ollama run llama3.2` in a terminal. API primitives are taken from <https://dshills.medium.com/go-ollama-simple-local-ai-3a89be4bfbaf>.
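
A minimal quick-start sketch follows. The import path, server URL, and model name are placeholders and assumptions rather than values documented by this package; it presumes a local Ollama server is already running and that New's URL parameter takes the chat endpoint.

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"example.com/your/module/ollama" // placeholder import path; substitute this module's real path
)

func main() {
	// Assumes the default local Ollama server address and that New expects
	// the chat endpoint URL; adjust if New wants only the base URL.
	bot := ollama.New("llama3.2", "http://localhost:11434/api/chat")

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	reply, err := bot.SendMessage(ctx, "Say hello in one sentence.")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(reply)
}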

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type Message

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type Ollama

type Ollama struct {
	// contains filtered or unexported fields
}

func New

func New(model, URL string) *Ollama

func (*Ollama) JoinChat

func (o *Ollama) JoinChat(ctx context.Context, c *watermillchat.Chat, botName, roomName string)

func (*Ollama) SendMessage

func (o *Ollama) SendMessage(ctx context.Context, m string) (string, error)

type Request

type Request struct {
	Model    string    `json:"model"`
	Messages []Message `json:"messages"`
	Stream   bool      `json:"stream"`
}
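
Given the field tags above, a Request with a single user message marshals to JSON in the following shape; the concrete model name and message text are illustrative only:

{
	"model": "llama3.2",
	"messages": [
		{"role": "user", "content": "Say hello in one sentence."}
	],
	"stream": false
}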

type Response

type Response struct {
	Model              string    `json:"model"`
	CreatedAt          time.Time `json:"created_at"`
	Message            Message   `json:"message"`
	Done               bool      `json:"done"`
	TotalDuration      int64     `json:"total_duration"`
	LoadDuration       int       `json:"load_duration"`
	PromptEvalCount    int       `json:"prompt_eval_count"`
	PromptEvalDuration int       `json:"prompt_eval_duration"`
	EvalCount          int       `json:"eval_count"`
	EvalDuration       int64     `json:"eval_duration"`
}
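
The duration fields mirror the upstream Ollama chat API, which reports durations in nanoseconds; treat that as an assumption here, since the fields carry no documentation in this package. A minimal sketch of converting them to time.Duration values (the import path is again a placeholder):

package main

import (
	"fmt"
	"time"

	"example.com/your/module/ollama" // placeholder import path; substitute this module's real path
)

// summarize prints timing details from a completed chat response, assuming
// the duration fields are nanosecond counts as in the Ollama HTTP API.
func summarize(resp ollama.Response) {
	total := time.Duration(resp.TotalDuration)
	eval := time.Duration(resp.EvalDuration)
	fmt.Printf("model %s: %d tokens evaluated in %s (total %s)\n",
		resp.Model, resp.EvalCount, eval, total)
}

func main() {
	// Illustrative values only.
	summarize(ollama.Response{
		Model:         "llama3.2",
		Done:          true,
		EvalCount:     42,
		EvalDuration:  int64(1500 * time.Millisecond),
		TotalDuration: int64(2 * time.Second),
	})
}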
