2 changes: 2 additions & 0 deletions .github/workflows/build.yaml
@@ -6,6 +6,8 @@ on:
- 'docs/*/**'
branches: [master]
pull_request:
paths-ignore:
- 'docs/*/**'

jobs:
continuous-integration:
33 changes: 33 additions & 0 deletions .github/workflows/docs.yaml
@@ -0,0 +1,33 @@
name: Build Docs

on:
push:
branches: [master]
pull_request:
branches: [master]

jobs:
build-docs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4

- uses: DeterminateSystems/nix-installer-action@main

- uses: DeterminateSystems/magic-nix-cache-action@main

- name: Build docs
run: nix build .#docs -L

- name: Upload docs artifact
uses: actions/upload-artifact@v4
with:
name: beam-docs
path: result/

- name: Deploy to GitHub Pages
if: github.ref == 'refs/heads/master'
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./result
6 changes: 0 additions & 6 deletions .github/workflows/nix-flake.yaml
@@ -17,12 +17,6 @@ jobs:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
# duckdb-ffi is marked as broken, and thus we cannot build
# beam-duckdb via nix at this time
- name: Disable beam-duckdb
run: |
# We must specify a backup extension on MacOS
sed -i.bak '/beam-duckdb/d' cabal.project && rm cabal.project.bak
- name: "Check `nix develop` shell"
run: nix develop --check
- name: "Check `nix develop` shell can run command"
77 changes: 40 additions & 37 deletions README.md
@@ -53,50 +53,24 @@ For questions, feel free to join
our [mailing list](https://groups.google.com/forum/#!forum/beam-discussion) or
head over to `#haskell-beam` on freenode.

## A word on testing

`beam-core` has in-depth unit tests to test query generation over an idealized
ANSI SQL-compliant backend. You may be concerned that there are no tests in
either `beam-sqlite` or `beam-postgres`. Do not be alarmed. The documentation
contains many, many examples of queries written over the sample Chinook
database, the schema for which can be found at
`beam-sqlite/examples/Chinook/Schema.hs`. The included `mkdocs` configuration
and custom `beam_query` python Markdown extension automatically run every query
in the documentation against a live database connection. Any errors in
serialization/deserialization or invalid syntax are caught while building the
documentation. Feel free to open pull-requests with additional examples/tests.

Tests are written as follows:

~~~markdown
!beam-query
```haskell
!example <template-name> <requirements>
do x <- all_ (customer chinookDb) -- chinookDb available under chinook and chinookdml examples
pure x
```
~~~

The `!beam-query` declaration indicates that this is a Markdown code block
containing beam query code. The `!example` declaration indicates that the
example should be built against applicable backends and included in the
rendered documentation. The `<template-name>` is either `chinook` or
`chinookdml` (depending on whether you are writing a query or a DML
statement). For `chinook`, the included code should
produce a `Q` query. For `chinookdml`, the included code should be a monadic
action in a `MonadBeam`. The `requirements` can be used to select which backends
to run this against. See the documentation for examples.

## Building the documentation

Beam uses [`mkdocs`](https://www.mkdocs.org/) for its documentation generation.

### Requirements
* Python installation with [`mkdocs` module](https://pypi.org/project/mkdocs/)
* Alternatively, open the Nix Flake shell via `nix develop`.

Then run `build-docs.sh`.
The dependencies to build documentation are packaged via Nix. You can build the
documentation using:

```console
nix build .#docs
```

TODO: define Nix package for docs bundle.
or, if you want to see what's going on in great detail:

```console
nix build .#docs -L
```

The documentation uses a custom Markdown preprocessor to automatically build
examples against the canonical Chinook database. By default, beam will build
@@ -128,3 +102,32 @@ to
enabled_backends:
- beam-sqlite
```

### Checking queries in documentation
The documentation contains many, many examples of queries written over the sample Chinook
database, the schema for which can be found at
`beam-sqlite/examples/Chinook/Schema.hs`. The included `mkdocs` configuration
and custom `beam_query` python Markdown extension automatically run every query
in the documentation against a live database connection. Any errors in
serialization/deserialization or invalid syntax are caught while building the
documentation. Feel free to open pull-requests with additional examples/tests.

Tests are written as follows:

~~~markdown
!beam-query
```haskell
!example <template-name> <requirements>
do x <- all_ (customer chinookDb) -- chinookDb available under chinook and chinookdml examples
pure x
```
~~~

The `!beam-query` declaration indicates that this is a Markdown code block
containing beam query code. The `!example` declaration indicates that the
example should be built against applicable backends and included in the
rendered documentation. The `<template-name>` is either `chinook` or
`chinookdml` (depending on whether you are writing a query or a DML
statement). For `chinook`, the included code should
produce a `Q` query. For `chinookdml`, the included code should be a monadic
action in a `MonadBeam`. The `requirements` can be used to select which backends
to run this against. See the documentation for examples.
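
For illustration, a hypothetical `chinookdml` example might look like the
following (the field accessors `customerCompany` and `customerId` are
illustrative guesses at the Chinook schema's record names, not verified
against `Schema.hs`):

~~~markdown
!beam-query
```haskell
!example chinookdml
-- Update a single customer's company; runs as a DML statement
-- inside a MonadBeam action.
runUpdate $ update (customer chinookDb)
                   (\c -> customerCompany c <-. just_ (val_ "Telus"))
                   (\c -> customerId c ==. val_ 14)
```
~~~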
5 changes: 5 additions & 0 deletions beam-duckdb/CHANGELOG.md
@@ -1,5 +1,10 @@
# Revision history for beam-duckdb

## 0.1.1.0 -- unreleased

* Fixed an issue with modeling boolean conditions, whereby checking whether a value is true was rendered as `... IS 1` (as for Sqlite) rather than `... IS TRUE` (as for Postgres).
* Added `FromBackendRow DuckDB Scientific`, `HasSqlEqualityCheck DuckDB Scientific`, and `HasSqlQuantifiedEqualityCheck DuckDB Scientific` instances, which are required to build the documentation.

## 0.1.0.0 -- 2026-02-26

* First version. Released on an unsuspecting world.
27 changes: 27 additions & 0 deletions beam-duckdb/beam-docs.sh
@@ -0,0 +1,27 @@
#!/usr/bin/env bash

set -e

. "${BEAM_DOCS_LIBRARY}"

DUCKDB_DB=$1

print_open_statement() {
echo "chinook <- open \"chinook.ddb\""
}

if [ -f "$DUCKDB_DB" ]; then
print_open_statement
exit 0
fi

beam_doc_status "Creating temporary $DUCKDB_DB..."

rm -f "$DUCKDB_DB.tmp"
duckdb "$DUCKDB_DB.tmp" < chinook-data/Chinook_DuckDB.sql

beam_doc_status "Success, creating $DUCKDB_DB"

mv "$DUCKDB_DB.tmp" "$DUCKDB_DB"

print_open_statement
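
The write-to-temp-then-rename pattern in the script above ensures a
half-built database is never mistaken for a finished one. A minimal
standalone sketch of the same pattern (file names hypothetical; `echo`
stands in for the real `duckdb` invocation):

```shell
#!/usr/bin/env bash
set -e

DB=demo.db

if [ -f "$DB" ]; then
  echo "reusing existing $DB"
else
  # Build into a temporary file first...
  rm -f "$DB.tmp"
  echo "pretend database contents" > "$DB.tmp"  # stand-in for: duckdb "$DB.tmp" < create.sql
  # ...then rename, so "$DB" only ever appears fully built.
  mv "$DB.tmp" "$DB"
fi
```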
3 changes: 2 additions & 1 deletion beam-duckdb/beam-duckdb.cabal
@@ -1,6 +1,6 @@
cabal-version: 3.0
name: beam-duckdb
version: 0.1.0.0
version: 0.1.1.0
synopsis: DuckDB backend for Beam
description: Beam driver for DuckDB, an analytics-focused open-source in-process database.
license: MIT
@@ -44,6 +44,7 @@ library
, duckdb-simple ^>=0.1
, dlist >=0.8 && <1.1
, free >=4.12 && <5.3
, scientific ^>=0.3
, text >=1.0 && <2.2
, time >=1.6 && <1.16
, transformers >=0.3 && <0.7
151 changes: 151 additions & 0 deletions beam-duckdb/docs/Chinook.sql
@@ -0,0 +1,151 @@
-- This script is adapted from
-- https://github.com/RandomFractals/duckdb-sql-tools/blob/main/data/chinook/duckdb/create.sql
-- to match the schema that is used for Postgres and Sqlite
drop table if exists Album;
drop table if exists Artist;
drop table if exists Customer;
drop table if exists Employee;
drop table if exists Genre;
drop table if exists Invoice;
drop table if exists InvoiceLine;
drop table if exists MediaType;
drop table if exists Playlist;
drop table if exists PlaylistTrack;
drop table if exists Track;
drop sequence if exists invoice_id_seq;

create table Album
(
AlbumId integer not null,
Title nvarchar(160) not null,
ArtistId integer not null,
constraint pk_album primary key (AlbumId)
);

create table Artist
(
ArtistId integer not null,
Name nvarchar(120),
constraint pk_artist primary key (ArtistId)
);

create table Customer
(
CustomerId integer not null,
FirstName nvarchar(40) not null,
LastName nvarchar(20) not null,
Company nvarchar(80),
Address nvarchar(70),
City nvarchar(40),
State nvarchar(40),
Country nvarchar(40),
PostalCode nvarchar(10),
Phone nvarchar(24),
Fax nvarchar(24),
Email nvarchar(60) not null,
SupportRepId integer,
constraint pk_customer primary key (CustomerId)
);

create table Employee
(
EmployeeId integer not null,
LastName nvarchar(20) not null,
FirstName nvarchar(20) not null,
Title nvarchar(30),
ReportsTo integer,
BirthDate date,
HireDate date,
Address nvarchar(70),
City nvarchar(40),
State nvarchar(40),
Country nvarchar(40),
PostalCode nvarchar(10),
Phone nvarchar(24),
Fax nvarchar(24),
Email nvarchar(60),
constraint pk_employee primary key (EmployeeId)
);

create table Genre
(
GenreId integer not null,
Name nvarchar(120),
constraint pk_genre primary key (GenreId)
);

create sequence invoice_id_seq start 1;

create table Invoice
(
InvoiceId integer primary key default nextval('invoice_id_seq'),
CustomerId integer not null,
InvoiceDate date not null,
BillingAddress nvarchar(70),
BillingCity nvarchar(40),
BillingState nvarchar(40),
BillingCountry nvarchar(40),
BillingPostalCode nvarchar(10),
Total numeric(10,2) not null
);

create table InvoiceLine
(
InvoiceLineId integer not null,
InvoiceId integer not null,
TrackId integer not null,
UnitPrice numeric(10,2) not null,
Quantity integer not null,
constraint pk_invoice_line primary key (InvoiceLineId)
);

create table MediaType
(
MediaTypeId integer not null,
Name nvarchar(120),
constraint pk_media_type primary key (MediaTypeId)
);

create table Playlist
(
PlaylistId integer not null,
Name nvarchar(120),
constraint pk_playlist primary key (PlaylistId)
);

create table PlaylistTrack
(
PlaylistId integer not null,
TrackId integer not null,
constraint pk_playlist_track primary key (PlaylistId, TrackId)
);

create table Track
(
TrackId integer not null,
Name nvarchar(200) not null,
AlbumId integer,
MediaTypeId integer not null,
GenreId integer,
Composer nvarchar(220),
Milliseconds integer not null,
Bytes integer,
UnitPrice numeric(10,2) not null,
constraint pk_track primary key (TrackId)
);

create index ifk_album_artist_id on Album (ArtistId);
create index ifk_customer_support_rep_id on Customer (SupportRepId);
create index ifk_employee_reports_to on Employee (ReportsTo);
create index ifk_invoice_customer_id on Invoice (CustomerId);
create index ifk_invoice_item_invoice_id on InvoiceLine (InvoiceId);
create index ifk_invoice_item_track_id on InvoiceLine (TrackId);
create index ifk_playlist_track_track_id on PlaylistTrack (TrackId);
create index ifk_track_album_id on Track (AlbumId);
create index ifk_track_genre_id on Track (GenreId);
create index ifk_track_media_type_id on Track (MediaTypeId);


-- Inserting just enough data for the documentation to build
INSERT INTO Customer (customerId, firstName, lastName, company, address, city, state, country, postalCode, phone, fax, email, supportRepId) VALUES
(14, N'Mark', N'Philips', N'Telus', N'8210 111 ST NW', N'Edmonton', N'AB', N'Canada', N'T6G 2C7', N'+1 (780) 434-4554', N'+1 (780) 434-5565', N'mphilips12@shaw.ca', 5)
8 changes: 8 additions & 0 deletions beam-duckdb/src/Database/Beam/DuckDB.hs
@@ -20,7 +20,9 @@
module Database.Beam.DuckDB
( -- * Executing DuckDB queries
runBeamDuckDB,
-- ** Executing DuckDB queries with debugging
runBeamDuckDBDebug,
runBeamDuckDBDebugString,

-- * Backend datatype
DuckDB,
@@ -109,6 +111,12 @@ runBeamDuckDBDebug :: (Text -> IO ()) -> Connection -> DuckDBM a -> IO a
runBeamDuckDBDebug debug conn action =
runReaderT (runDuckDBM action) (debug, conn)

-- | Like 'runBeamDuckDBDebug', but accepts a 'String' argument instead of 'Text'.
--
-- This is provided for compatibility with other backends.
runBeamDuckDBDebugString :: (String -> IO ()) -> Connection -> DuckDBM a -> IO a
runBeamDuckDBDebugString debug = runBeamDuckDBDebug (debug . Text.unpack)
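-- A hypothetical usage sketch (the connection setup is assumed; 'open' is
-- the duckdb-simple connection function shown elsewhere in these docs):
--
-- > main :: IO ()
-- > main = do
-- >   conn <- open "chinook.ddb"
-- >   -- Log every generated SQL statement to stdout:
-- >   runBeamDuckDBDebugString putStrLn conn (pure ())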

newtype BeamDuckDBParams = BeamDuckDBParams [SomeField]

instance ToRow BeamDuckDBParams where