~dmbaturin/soupault


SQLite as data source

Message-ID: <87a61o38in.fsf@arnes.space>
Hi,

I have a use case where I'd like to generate a static site from the
contents of an SQLite database, involving some joins and some data
manipulation. In the end, individual pages should be generated (mostly)
one per row of one of the tables.

Is this a use case that could easily be covered by soupault?

As far as I can tell from the docs, soupault assumes a preprocessor is
called individually per file of a given type and produces one output
file per call. Is that correct? If so, that directly contradicts my
idea of one file per row. Are there other assumptions that may make
this tricky, or do examples exist that could point me in the right
direction?

Thanks!
Message-ID: <87fsbg61gc.fsf@arnes.space>
In-Reply-To: <87a61o38in.fsf@arnes.space>
I guess the question extends to other use cases: Is it possible to pull
in external data (from APIs, for example) and generate a dynamic number
of pages from that?
Message-ID: <e7a38a46-aa6e-3a8d-8ba1-f72b1b09be2d@fo.am>
In-Reply-To: <87fsbg61gc.fsf@arnes.space>
On 2023-02-08 08:49, Arne wrote:
> I guess the question extends to other use cases: Is it possible to pull
> in external data (from APIs, for example) and generate a dynamic number
> of pages from that?

I've been using soupault for something similar, which involves a
pre-pre-processing stage (or a two-pass run of soupault).

Since soupault expects a one-to-one file-to-HTML conversion, you would
need to generate some sort of intermediate skeleton files from your
external data and use a template for the HTML conversion.

So you could use a script to extract the external data and write files
into the site directory (e.g. a Python script which makes API requests
or SQL queries and outputs individual files with formatted results).
The output files could either be further processed by another
preprocessor, or you could export HTML fragments directly.
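
For the SQLite case from the original question, a rough sketch of such
a generator script might look like the one below. The database file,
table and column names (content.db, items, slug/title/body) are
invented for illustration and would need to match your own schema.

# generate_pages.py -- hypothetical pre-pre-processing script, run
# before soupault itself; every name below is an assumption.
import pathlib
import sqlite3

out_dir = pathlib.Path("site/items")
out_dir.mkdir(parents=True, exist_ok=True)

conn = sqlite3.connect("content.db")
rows = conn.execute("SELECT slug, title, body FROM items")

for slug, title, body in rows:
    # One file per row: soupault should pick up each file in site/ as
    # a page and insert the fragment into the usual page template.
    fragment = f"<h1>{title}</h1>\n<div>{body}</div>\n"
    (out_dir / f"{slug}.html").write_text(fragment, encoding="utf-8")

conn.close()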

In the soupault.toml file, add something like this:

[settings]
  # extensions soupault will treat as page files
  page_file_extensions = ["txt", "html", "py", "sh"]

[preprocessors]
  # commands used to convert each extension to HTML
  sh = "zsh"
  py = "/usr/bin/env python"
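
If I remember the preprocessor mechanism correctly, soupault runs the
configured command with the page file's path as its argument and uses
whatever the command prints to stdout as the page source, so a
generated .py page only needs to print HTML. A hypothetical example:

# site/example.py -- hypothetical page file; with the config above,
# soupault would run "/usr/bin/env python site/example.py" and use the
# printed HTML as the page content.
print("<h1>Example row</h1>")
print("<p>Rendered from data pulled in by the pre-pre-processing step.</p>")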