Hi,
I have a use case where I'd like to generate a static site from the
contents of an SQLite database, involving some joins and some data
manipulation. In the end individual pages should be generated (mostly)
for the rows in one of the tables.
Is this a use case that could easily be covered by soupault?
As far as I can tell from the docs, soupault assumes a preprocessor is
called once per file of a given type and produces one output file per
call. Is that correct? If so, that directly contradicts my idea of one
file per row. Are there other assumptions that might make this tricky,
or are there examples that could point me in the right direction?
Thanks!
On 2023-02-08 08:49, Arne wrote:
> I guess the question extends to other use cases: Is it possible to pull
> in external data (from APIs for example) and generate a dynamic amount
> of pages from that?
I've been using soupault for something similar, which involves a
pre-preprocessing stage (or a two-pass run of soupault).
Since soupault assumes a one-to-one file->HTML conversion, you would
need to generate some sort of intermediate skeleton files from your
external data and use a template for the HTML conversion.
So you could use a script to extract the external data and write files
into the site directory (e.g. a Python script that makes API requests or
SQL queries and outputs one file per result). The output files could
either be further processed by another preprocessor, or you could emit
HTML fragments directly.
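As a rough illustration of that extraction step, here is a minimal sketch of a generator script. The table names, columns, and join (`articles` joined to `authors`) are assumptions for the example, not anything from your actual schema; it just writes one HTML fragment file per row into the site directory for soupault to pick up.

```python
# Hypothetical generator: one HTML fragment per row of an "articles" table.
# Schema (articles, authors, the join) is invented for illustration.
import pathlib
import sqlite3


def generate_pages(db_path, site_dir):
    site = pathlib.Path(site_dir)
    site.mkdir(parents=True, exist_ok=True)
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT a.id, a.title, a.body, u.name "
        "FROM articles AS a JOIN authors AS u ON a.author_id = u.id"
    )
    for rid, title, body, author in rows:
        # Each fragment becomes a page; soupault can wrap it in your template.
        fragment = f"<h1>{title}</h1>\n<p>by {author}</p>\n<div>{body}</div>\n"
        (site / f"article-{rid}.html").write_text(fragment)
    conn.close()
```

You would run this once before invoking soupault, so the "dynamic amount of pages" is decided by the query at build time.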
In the soupault.toml file, add something like this:
[settings]
page_file_extensions = ["txt", "html", "py", "sh"]
[preprocessors]
sh = "zsh"
py = "/usr/bin/env python"
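With that config, soupault treats `.py` files in the site directory as pages: it runs each one through `/usr/bin/env python` and uses whatever the script prints to stdout as the page content. A toy example of such a page script (file name and output are purely illustrative):

```python
# Hypothetical page script, e.g. site/about.py. soupault runs it via the
# "py" preprocessor above; its stdout becomes the page source.
def render():
    # Illustrative fragment; soupault can wrap it in your page template.
    return "<h1>Generated page</h1>\n<p>Built when soupault runs.</p>"


print(render())
```

This is the "further processed by another preprocessor" route: instead of pre-generating plain HTML, each skeleton file is itself a small program that produces its page at build time.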