~zzkt

Recent activity

Re: SQLite as data source 2 years ago

From nik gaffney to ~dmbaturin/soupault

On 2023-02-08 08:49, Arne wrote:
> I guess the question extends to other use cases: Is it possible to pull
> in external data (from APIs for example) and generate a dynamic amount
> of pages from that?

I've been using soupault for something similar, which involves a 
pre-pre-processing stage (or a two-pass run of soupault).

Since soupault requires a single file->html conversion, you would need 
to generate some sort of intermediate skeleton files from your external 
data and use a template for the html conversion.

So you could use a script to extract the external data and output the 
skeleton files.
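A minimal sketch of that first pass, assuming the external data has already been fetched into a local JSON file and the generated Markdown skeletons are then converted by soupault in the second pass (the file layout and field names here are hypothetical):

```python
#!/usr/bin/env python3
"""Generate one Markdown skeleton page per record in an external
data file, for soupault to pick up on its normal build pass.
Field names ("slug", "title", "body") are illustrative only."""
import json
from pathlib import Path


def generate_skeletons(data_file: str, out_dir: str) -> list[str]:
    """Write <out_dir>/<slug>.md for every record; return the paths."""
    records = json.loads(Path(data_file).read_text())
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    for rec in records:
        page = out / f"{rec['slug']}.md"
        page.write_text(f"# {rec['title']}\n\n{rec['body']}\n")
        written.append(str(page))
    return written
```

Since the skeletons are regenerated on every run, the number of pages tracks the external data automatically.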

Re: caching and preprocessors 2 years ago

From nik gaffney to ~dmbaturin/soupault

On 2023-01-27 13:28, Daniil Baturin wrote:
> I'm not sure if it's worthwhile, though. I'm by no means against caching 
> asset processor outputs, just not sure if trying to embed an asset 
> management
> system inside of soupault is a good idea or not.

It's probably more trouble than it's worth. That said, if asset 
processing adds a significant build overhead, it might be a good idea 
to include some suggestions and/or examples in the docs.

I can look at cleaning up the image-processing scripts I'm currently 
using as a starting point.

Re: caching and preprocessors 2 years ago

From nik gaffney to ~dmbaturin/soupault

On 2023-01-27 05:07, Daniil Baturin wrote:
>  >do you have any plans to add caching for asset_processors?
> 
> That's complicated. With pages, it's simple since the output path is 
> decided by soupault itself.
> With asset processors, the user specifies a template for generating a 
> complete command.
> That is required to accommodate commands with peculiar syntax that makes 
> it impossible to just append the output file path,
> and to allow original and processed files to have different extensions.
> However, it also means that soupault doesn't actually know the output 
> path and cannot replicate what the user-given command would do.

Re: caching and preprocessors 2 years ago

From nik gaffney to ~dmbaturin/soupault

On the topic of caching, do you have any plans to add caching for 
asset_processors?

I'm currently using an external script for the asset_processors which 
checks pre- and post-run checksums. It would certainly simplify things 
if that was part of the standard build process.
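The wrapper is roughly this shape (a simplified sketch, not the actual script; the cache layout and the processing command are placeholders): record a checksum of the source file after processing, and skip the command on later runs when the checksum is unchanged.

```python
#!/usr/bin/env python3
"""Skip an asset-processing command when the source file is
unchanged since the last run, judged by its SHA-256 checksum.
The cache layout and the command are illustrative only."""
import hashlib
import subprocess
from pathlib import Path


def process_if_changed(src: str, cache_dir: str, command: list[str]) -> bool:
    """Run `command` with `src` appended, unless the cached checksum
    matches. Returns True if the command ran, False on a cache hit."""
    digest = hashlib.sha256(Path(src).read_bytes()).hexdigest()
    stamp = Path(cache_dir) / (Path(src).name + ".sha256")
    if stamp.exists() and stamp.read_text() == digest:
        return False  # source unchanged since last run
    subprocess.run(command + [src], check=True)
    stamp.parent.mkdir(parents=True, exist_ok=True)
    stamp.write_text(digest)  # record checksum only after success
    return True
```

Writing the stamp only after the command succeeds means a failed run is retried next time rather than being treated as cached.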

Also, is there a way to ensure asset_processors are run after the 
preprocessors have completed?

Re: caching and preprocessors 2 years ago

From nik gaffney to ~dmbaturin/soupault

Thanks! That fixed it.

On 2023-01-26 03:49, Daniil Baturin wrote:
> Hi Nik,
> 
> I fixed the problem. It was a funny case of missing parentheses that 
> made bits of actual logic interpreted
> as a part of a debug log function body: 
> https://codeberg.org/PataphysicalSociety/soupault/commit/599f0f921c32b0d5daf41e5ba4fa369f55acb15c
> 
> Could you try building again and let me know if it works for you without 
> debug now?
> 

Re: caching and preprocessors 2 years ago

From nik gaffney to ~dmbaturin/soupault

The site isn't public yet, but I'll try making a minimal example to see 
if the bug can be replicated.

On 2023-01-25 11:54, Daniil Baturin wrote:
> This is odd. There shouldn't be a need for any other settings, so you 
> have likely found a bug.
> 
> Is your site source public so that I could test it myself on the same data?
> 
> On 1/25/23 10:25, nik.srht@fo.am wrote:
>> I've been trying to get the new caching feature to work, but keep 
>> getting errors about missing hash files.
>>

caching and preprocessors 2 years ago

From nik gaffney to ~dmbaturin/soupault

I've been trying to get the new caching feature to work, but keep 
getting errors about missing hash files.

[WARNING] Cache directory for page site/crystal/Calibrating Future 
Experiences.odt does not contain a page source hash file 
(.page_source_hash), cache will be discarded!

The config includes these settings:

   caching = true
   cache_dir = ".soupault-cache"

Pages are generated in the preprocessors stage using an external pandoc 
script. The site is generated as expected and looks like the site
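For context, the relevant config sections look roughly like this (a sketch based on the soupault docs; the pandoc wrapper path is hypothetical):

```toml
[settings]
  caching = true
  cache_dir = ".soupault-cache"

[preprocessors]
  # convert .odt pages to HTML with an external pandoc wrapper
  odt = "scripts/pandoc-convert.sh"
```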