Script to build OP Curations and cache the output #1196
Conversation
Going to take a more thorough look later, but I think checking in the cached content is generally still the way to go, similar to how our other pages derived from external data work. A possible future approach to minimize the amount of data, like what we do with reference and contributor docs, is to have the script transform the data into a more directly usable format instead of storing it verbatim, e.g. just the titles and sketch size or something. But storing the full data does give us a bit more flexibility to change what we do with it, so it's also not a bad approach at all.
Caching at build time makes sense. Maybe we could reduce repo bloat by transforming the OpenProcessing response and only storing the fields the site actually uses. Also, would a single consolidated op-curations.json file make updates cleaner than multiple generated files?
Hello, following up on an email conversation I had with @ksen0 on this. A few notes: I will soon enforce Bearer tokens on API requests on OP, so I can imagine it will create some extra hassle in this repo to manage and pass around those tokens. Given that, it makes the most sense to me to create a specific endpoint for you to pull all the necessary JSON information in a single shot. This could also be cached on my backend to prevent resource clogging whenever multiple parties are building the website at the same time. I will check the code, try to extract the data you need, and bring this into a single call, such as: You can also cache the output of this in case something goes wrong on the OP site during a build, and skip the data refresh on error (5XX, 4XX responses).
Relates to #1187
This is marked as a draft because it is not clear if a different solution might be possible. Alternatives that I am investigating with OP in the next week-ish:
If we keep the build process, we need a better approach; even with caching, this particular script generates a large number of checked-in JSON files, which is not ideal.
Full disclosure on AI use: Copilot was used for the script, since I was on a time crunch for the fix.