[davidmead]importing a WordPress export back into WordPress is missing the last 2 years of content. This blog move is not going as planned. Owning your content is a pain in the arse 😉
[davidmead]thx GWG. Looking at the XML file, there is a message: Fatal error: Out of memory (allocated 242221056) (tried to allocate 130968 bytes), which is great to know 2 days after wiping it
Zegnat[davidmead] what was the backup process you did to get that XML file? Maybe we can warn people on the wiki if the backup process doesn’t tell you when it fails
AkyRhO, wolftune, friedcell, swentel and [Vincent] joined the channel
[Vincent]I also wanted to make it a feed option / newsletter generator. The idea being that people who just want to “keep updated” can do it that way, rather than following a daily stream
krychu_ and [relapse] joined the channel; friedcell left the channel
[relapse][davidmead] I believe the default MySQL backup is a bunch of SQL statements, so you should be able to slice the file up and import it in blocks.
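A rough sketch of the slicing [relapse] describes, assuming a plain-text mysqldump-style file where each statement ends with a ";" at the end of a line. The filenames and chunk size are made up, and the end-of-statement check is deliberately naive:

```python
# Hedged sketch: cut a large SQL dump into smaller files that can be
# imported one at a time. Assumes a plain-text dump where statements
# end with ";" at end of line; all filenames are placeholders.

CHUNK_STATEMENTS = 500  # statements per output file; adjust to taste

def split_dump(path="backup.sql"):
    chunk, count, part = [], 0, 0
    with open(path, encoding="utf-8", errors="replace") as dump:
        for line in dump:
            chunk.append(line)
            # Crude end-of-statement check: a trailing ";" on the line.
            # This would misfire on a ";" inside a quoted string, so
            # eyeball the output rather than trusting it blindly.
            if line.rstrip().endswith(";"):
                count += 1
            if count >= CHUNK_STATEMENTS:
                part += 1
                with open(f"backup.part{part:03}.sql", "w", encoding="utf-8") as out:
                    out.writelines(chunk)
                chunk, count = [], 0
    if chunk:  # flush whatever is left after the loop
        part += 1
        with open(f"backup.part{part:03}.sql", "w", encoding="utf-8") as out:
            out.writelines(chunk)

if __name__ == "__main__":
    split_dump()
```

Each resulting part file can then be fed to the database in turn, keeping every individual import small.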
[metbril]Would it be possible to upload an archive of old mentions and comments to webmention.io or webmention.herokuapp.com to have one single dataset?
[relapse]I'm not sure about uploading into webmentions or similar as I'm not the author of the comment content, so it's not "mine" per se. I ended up importing them at the bottom of my article with a header "Imported Comments". It probably should be outside of my base h-entry entirely though.
[metbril][relapse] Thanks. That might be the most appropriate solution. I am still hesitating whether or not to open legacy comments on my site, for those that do not 'do' IndieWeb or something like Twitter replies. If so, the historical ones could be merged into a regular comments section. However, if I ever decide to switch webmention services, there is no easy way to merge these. (I'm thinking of silos here.)
[metbril]sknebel can this be done from any site or do the endpoints check the webmention source? In other words, can I script a bunch of curl commands?
[kevinmarks]also, it caches sent webmentions too, so if you send webmentions through it they are there for sites that don't have support themselves yet.
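As a sketch of [metbril]'s "bunch of curl commands" idea in script form: per the Webmention spec, a send is just a form-encoded POST with `source` and `target` parameters, and receiving endpoints do verify that the source page really links to the target before accepting the mention. The endpoint and URLs below are placeholders:

```python
# Hedged sketch of scripting webmention sends instead of hand-typing curl.
# All URLs below are placeholders; the real endpoint comes from each
# target page's Link header or <link rel="webmention"> tag.
import requests

ENDPOINT = "https://webmention.io/example.com/webmention"  # placeholder

mentions = [
    # (source, target) pairs to replay; both URLs are invented examples
    ("https://other-site.example/comment-1", "https://example.com/my-post"),
    ("https://other-site.example/comment-2", "https://example.com/my-post"),
]

for source, target in mentions:
    # Per the Webmention spec this is a form-encoded POST of source+target;
    # 201 or 202 means the endpoint accepted the mention for processing.
    r = requests.post(ENDPOINT, data={"source": source, "target": target})
    print(source, "->", target, r.status_code)
```

Because receivers re-fetch and verify the source, replaying old mentions this way only works while the source pages still exist and still link to the targets.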
metbril, jgmac1106 and [jgmac1106] joined the channel
[jgmac1106]!tell manton trying to stay a bit disconnected between now and 1/22; email me at jgregmcverry@gmail.com for any Austin planning stuff, and I am still checking the wiki
[Rose]I'll temporarily subscribe to a high traffic feed and see if that kickstarts my stuff. It's on my server rather than living with a Single Point of Aaron, so I probably messed something up
Zegnat[Vincent]: are you using hosted Aperture? There was a problem on the server that crashed the indexing queue, so no new items are being crawled right now
[Vincent]Hmm, Monocle is still showing the old feeds (and is not marking things read). Aperture is showing the correct feeds now after logging out then back in.
[Vincent]I suspect I deleted a feed but clicked “keep everything”. Except there is no way to then view that data. When I archive the channel all the content vanishes.
chrisaldrichGiven what cleverdevil has, it's a much better presentation of a /Now page, though those often have some additional "story" which isn't automated.
chrisaldrichThough if you've got these sorts of monthly archives, they could likely be automated to show the past N days in much the same way, to give a simulacrum of a /Now page.
chrisaldrichI love the idea of how these provide semi-regular updates that friends and family could use to check in with you, or which could be emailed out monthly to friends/family who don't spend much time on social media or don't want all the notifications.
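A minimal sketch of the automation chrisaldrich suggests: treat the last N days of archived posts as the /Now page content. The post records here are invented placeholders; a real site would pull them from its archive or CMS:

```python
# Hedged sketch: derive a /Now-style digest from recent archive entries.
# The posts list is a stand-in for whatever the CMS actually stores.
from datetime import date, timedelta

N_DAYS = 30  # how far back the "now" window reaches

posts = [
    (date.today() - timedelta(days=3), "Walked along the coast"),
    (date.today() - timedelta(days=12), "Started a new book"),
    (date.today() - timedelta(days=90), "Old entry, falls outside the window"),
]

cutoff = date.today() - timedelta(days=N_DAYS)
now_page = [(d, t) for d, t in posts if d >= cutoff]

for d, t in sorted(now_page, reverse=True):
    print(d.isoformat(), "-", t)
```

The same filtered list could just as easily be rendered to HTML or dropped into a monthly email to the friends and family mentioned above.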
[tantek]Can you share how big your database export was (from how many years of blogging / number of posts) and what the exact result was of attempting to import it? (what errors etc.)
[cleverdevil]If you're on a shared host which aggressively limits memory consumption, or have not configured PHP with a high enough memory limit, it will crash.
[asuh]^^ I’ve run into this same problem trying to import an export, and had to get my shared hosting support to temporarily increase the memory limit so I could import a large SQL file
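One way around the memory ceiling, besides raising PHP's `memory_limit`, is to split the WordPress WXR export itself so each import stays small. A rough sketch, assuming the usual export formatting where each `<item>` opening and closing tag sits on its own line; filenames and chunk size are invented:

```python
# Hedged sketch: split a WordPress WXR export into several smaller files
# so each one imports comfortably under the PHP memory limit. Assumes
# <item> and </item> tags sit on their own lines, as in typical exports.

ITEMS_PER_FILE = 100  # posts per output file; tune to the host's limits

def split_wxr(path="export.xml"):
    with open(path, encoding="utf-8") as f:
        lines = f.readlines()

    # Everything before the first <item> is the shared header; everything
    # after the last </item> (the closing </channel></rss>) is the footer
    # each part needs in order to stay a valid document.
    first = next(i for i, line in enumerate(lines) if "<item>" in line)
    last = max(i for i, line in enumerate(lines) if "</item>" in line)
    header, body, footer = lines[:first], lines[first:last + 1], lines[last + 1:]

    # Group the body into whole <item>...</item> blocks.
    items, current = [], []
    for line in body:
        current.append(line)
        if "</item>" in line:
            items.append(current)
            current = []

    for start in range(0, len(items), ITEMS_PER_FILE):
        part = start // ITEMS_PER_FILE + 1
        with open(f"export.part{part:03}.xml", "w", encoding="utf-8") as out:
            out.writelines(header)
            for item in items[start:start + ITEMS_PER_FILE]:
                out.writelines(item)
            out.writelines(footer)

if __name__ == "__main__":
    split_wxr()
```

Each part file keeps the original header and closing tags, so the importer sees a sequence of small, valid exports instead of one huge one.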
[Rose]Hmm, I think I can see how this would work with Twig filters in Grav. However, that is a project for another time. Tomorrow I need to get my Micropub endpoint in place, add destinations, and go crazy with it.