fin-summary is a lightweight Python package that extracts structured, actionable information from user‑provided text describing financial transaction issues (e.g., processing fees, settlement delays). It uses pattern matching combined with a language model to identify key details such as issue type, amount, timeline, and recommended steps, returning a concise summary that can be directly acted upon.
- Simple one-function API (`fin_summary`)
- Works out of the box with the default `ChatLLM7` model from `langchain_llm7`
- Plug-in friendly – you can provide any LangChain-compatible LLM (OpenAI, Anthropic, Google, etc.)
- Returns a list of extracted strings that match the supplied regex pattern
```shell
pip install fin_summary
```

```python
from fin_summary import fin_summary

# Example user description of a problem
user_input = """
I was charged an extra $15 processing fee on my $200
transfer that should have settled yesterday, but it still shows
as pending. What should I do?
"""

# Use the default ChatLLM7 model (requires an API key)
summary = fin_summary(user_input)
print(summary)
# -> ['Issue type: processing fee', 'Amount: $15', 'Original amount: $200', ...]
```

| Parameter | Type | Description |
|---|---|---|
| `user_input` | `str` | The free-form text describing the financial issue. |
| `llm` | `Optional[BaseChatModel]` | A LangChain LLM instance. If omitted, the package creates a `ChatLLM7` instance using the provided `api_key` or the `LLM7_API_KEY` environment variable. |
| `api_key` | `Optional[str]` | API key for LLM7. If not supplied, the package reads `LLM7_API_KEY` from the environment. |
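The pattern-matching half of the extraction is internal to the package, but the "list of strings matching a supplied regex" return shape noted above can be illustrated with the standard library alone. The `extract_amounts` helper below is a hypothetical sketch, not the package's actual implementation:

```python
import re

def extract_amounts(text: str, pattern: str = r"\$\d+(?:\.\d{2})?") -> list[str]:
    """Return every substring of `text` matching `pattern`.

    Illustrative only: mirrors the list-of-matching-strings return
    shape of fin_summary, not its internal logic.
    """
    return re.findall(pattern, text)

user_input = (
    "I was charged an extra $15 processing fee on my $200 "
    "transfer that should have settled yesterday."
)
print(extract_amounts(user_input))
# -> ['$15', '$200']
```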
You can pass any LangChain LLM that implements `BaseChatModel`. Below are examples with popular providers.
```python
from langchain_openai import ChatOpenAI
from fin_summary import fin_summary

llm = ChatOpenAI()
response = fin_summary(user_input, llm=llm)
```

```python
from langchain_anthropic import ChatAnthropic
from fin_summary import fin_summary

llm = ChatAnthropic()
response = fin_summary(user_input, llm=llm)
```

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from fin_summary import fin_summary

llm = ChatGoogleGenerativeAI()
response = fin_summary(user_input, llm=llm)
```

- Default LLM: `ChatLLM7` (from `langchain_llm7`), documented at https://pypi.org/project/langchain-llm7/
- Free-tier rate limits are sufficient for typical usage of this package.
- To increase limits, provide your own API key:

```shell
export LLM7_API_KEY="your_api_key"
```

or directly in code:

```python
response = fin_summary(user_input, api_key="your_api_key")
```

You can obtain a free API key by registering at https://token.llm7.io/.
If you encounter any problems or have feature requests, please open an issue on GitHub:
https://github.com/chigwell/fin-summary/issues
Eugene Evstafev
Email: hi@euegne.plus
GitHub: chigwell
Happy summarizing! 🚀