# Tutorial 4: LLM-powered classification
# Reads support tickets from CSV and classifies them using an LLM
#
# Requires: ANTHROPIC_API_KEY, OPENAI_API_KEY, or GEMINI_API_KEY
# Or use a local Ollama model (no API key needed):
#   change provider to "local" and model to "llama3"
#
# Run: weaver apply docs/tutorials/04-llm-classification.yaml
name: LLMClassification
tag: tutorial
engine: local

dataSources:
  - id: tickets
    type: File
    config:
      path: docs/tutorials/data/sample_tickets.csv
      format: csv

transformations:
  - id: classified
    type: LLMTransform
    sources:
      - tickets
    config:
      provider: gemini
      model: gemini-2.0-flash
      prompt: |
        Classify this support ticket:
        Title: {title}
        Description: {description}
        Return JSON with:
        - category: bug, feature_request, question, or complaint
        - priority: low, medium, or high
        - sentiment: positive, neutral, or negative
      inputColumns: title,description
      outputSchema: category:string|priority:string|sentiment:string
      batchSize: "1"
      retryOnError: "3"

  - id: quality
    type: DataQuality
    sources:
      - classified
    checks:
      - row_count > 0
    onFail: warn

sinks:
  - id: report
    type: File
    source: quality
    config:
      path: output/classified_tickets
      format: json
      saveMode: Overwrite
      coalesce: "1"

profiles:
  local:
    engine: local
    transformations.classified.config.provider: local
    transformations.classified.config.model: llama3

tests:
  - name: "tickets classified"
    assert: report.row_count > 0