#[davidmead]importing a WordPress export back into WordPress is missing the last 2 years of content. This blog move is not going as planned. owning your content is a pain-in-the-arse 😉
#[davidmead]thx GWG. Looking at the XML file, there is a message: <b>Fatal error</b>: Out of memory (allocated 242221056) (tried to allocate 130968 bytes), which is great to know 2 days after wiping it
#[cleverdevil]Still only rendering summaries for a few things... photos, watched tv/movies, posts, and interactions.
#[cleverdevil]I still need to add recipes, github interactions, statuses (might skip this, actually... too noisy otherwise), and a few other things.
barpthewire joined the channel
#Zegnat[davidmead] what was the backup process you did to get that XML file? Maybe we can warn people on the wiki if the backup process doesn’t tell you when it fails
AkyRhO, wolftune, friedcell, swentel and [Vincent] joined the channel
#[Vincent][cleverdevil] nice, I’ve been thinking of doing something like this for a while 🙂
#[Vincent]I also wanted to make it a feed option/newsletter generation. The idea being people who just want to “keep updated” can do it that way, rather than a daily stream
krychu_ and [relapse] joined the channel; friedcell left the channel
#[relapse]Oh that is a magnificent page, [cleverdevil]. Big kudos.
#[relapse][davidmead] I believe the default MySQL backup is a bunch of SQL statements. So you should be able to slice the file up and import it in blocks.
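A minimal sketch of that slice-and-import approach, assuming shell access and a dump named `backup.sql` (the filenames, sizes, and credential variables are illustrative, not from the conversation):

```shell
# Stand-in dump so the example is self-contained; replace this
# with your real backup.sql from mysqldump.
seq 1 200000 | sed 's/^/INSERT INTO wp_posts VALUES (/; s/$/);/' > backup.sql

# Split at line boundaries (-C) so no SQL statement is cut in half;
# pieces of at most 1 MB here — pick a size your host can handle.
split -C 1m backup.sql backup_part_

# Import each piece in order. Shown as an echo here; swap in the
# commented mysql invocation with your own credentials.
for part in backup_part_*; do
  echo "importing $part"
  # mysql -u "$DB_USER" -p "$DB_NAME" < "$part"
done
```

Note that `split -C` is GNU coreutils; on BSD/macOS, `split -l` with a line count is the closest equivalent. Byte-based splitting (`-b`) would risk cutting an INSERT statement in half.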
grdryn, catsup and [metbril] joined the channel
#[metbril]Would it be possible to upload an archive of old mentions and comments to webmention.io or webmention.herokuapp.com to have one single dataset?
#[metbril]If not, does someone have an example of how to integrate legacy comments and old mentions with one of these services?
#[metbril][cleverdevil] that summary url is not hackable. There is no summary for /summary/2018/ yet. 404 instead.
#[relapse]I'm not sure about uploading into webmentions or similar as I'm not the author of the comment content, so it's not "mine" per se. I ended up importing them at the bottom of my article with a header "Imported Comments". It probably should be outside of my base h-entry entirely though.
#[metbril][relapse] Thanks. That might be the most appropriate solution. I am still hesitating whether or not to open legacy comments on my site, for those that do not 'do' IndieWeb, or something like Twitter replies. If so, the historical ones could be merged into a regular comments section. However, if I ever decide to switch webmention services, there is no easy way to merge these. (I'm thinking of silos here.)
#[relapse]Concur, I am unaware of any easy merge either.
#[metbril]sknebel can this be done from any site or do the endpoints check the webmention source? In other words, can I script a bunch of curl commands?
#[kevinmarks]mention.tech will send an arbitrary one, or scan a h-entry for outbound links
#[kevinmarks]also, it caches sent webmentions too, so if you send webmentions through it they are there for sites that don't have support themselves yet.
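To the scripting question above: the Webmention protocol is just a form-encoded POST of `source` and `target` to the target's advertised endpoint, so a batch of curl commands does work. A dry-run sketch with placeholder URLs (everything below is an assumption to fill in; drop the leading `echo` to actually send):

```shell
# Placeholder endpoint and pages; substitute your own. Discover the
# real endpoint from the target page's Link rel="webmention" header
# or its HTML <link>/<a> elements.
ENDPOINT="https://webmention.example/endpoint"
SOURCE="https://mysite.example/2014/03/old-post/"
TARGET="https://theirsite.example/article/"

# Dry run: prints the command instead of executing it. Remove the
# leading echo once the URLs are real, and loop over a list of
# source/target pairs to replay an archive of old mentions.
echo curl -si "$ENDPOINT" \
  --data-urlencode "source=$SOURCE" \
  --data-urlencode "target=$TARGET"
```

`--data-urlencode` makes curl send a POST with the parameters safely percent-encoded, which is exactly the request shape the endpoint checks before verifying that `source` really links to `target`.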
metbril, jgmac1106 and [jgmac1106] joined the channel
#[jgmac1106]!tell manton trying to stay a bit disconnected between now and 1/22 email me at jgregmcverry@gmail.com for any Austin planning stuff and I am still checking wiki
#[Vincent]It might be Monocle. When I log in, nothing has been updated from Aperture
#[Vincent]It hasn’t removed the feeds I deleted in Aperture etc. So it doesn’t look like they are speaking to each other
#[Rose]My Aperture is empty. I only added feeds an hour ago though. But I get the feeling that the Watchtower install is not actually talking to it
#[Rose](The feeds table is empty, which is not a good sign I think)
AkyRhO joined the channel
#[Vincent]I just tried accessing it from a different client (Indigenous) and that immediately falls over and crashes.
#[Rose]I'll temporarily subscribe to a high traffic feed and see if that kickstarts my stuff. It's on my server rather than living with a Single Point of Aaron, so I probably messed something up
#Zegnat[Vincent]: are you using hosted Aperture? There was a problem on the server that crashed the indexing queue, so no new items are being crawled right now
#[Rose]I am not using the hosted one, so I would assume that doesn't affect me. Unless Aperture has become sentient.
#ZegnatJust wanted to make sure that the “nothing has been updated” problem is known :)
#ZegnatIf you’re not using the hosted one, all problems you experience are of your own making [Rose] ;)
#[Vincent]@zegnat yes, I assume that’s the issue thanks
#aaronpkAperture should be back up and fetching feeds again as of yesterday
#[Vincent]Hmm, Monocle is still showing the old feeds (and is not marking things read). Aperture is showing the correct feeds now after logging out then back in.
#[Vincent]I’ve tried logging out and in etc for Monocle and Indigenous and they both show the old feeds only. So at least they are consistent
#aaronpkMonocle doesn't store anything itself other than the list of channels
#aaronpki wonder if you have two accounts? Maybe an http vs https thing?
#[Rose]Hmm, a good test feed is probably Aaron's all feed 😛
#[Vincent]I think I know what is happening with mine
#[Vincent]I suspect I deleted a feed but clicked “keep everything”. Except there is no way to then view that data. When I archive the channel all the content vanishes.
#[Rose]Ooh, I just found a http/s mismatch in my config. Though I somehow doubt that's the issue
#aaronpkshe has pretty good automated systems, they just aren't also automatically published on the web
#chrisaldrichGiven what cleverdevil has, it's a much better presentation of a /Now page, though those often have some additional "story" which isn't automated.
#jackyit could be a note made within the month with the tag "now" or maybe overloading the "post-status" to be "x-now"
#jackyso you'd just pick the most recent one to display
#chrisaldrichThough if you've got these sort of monthly archives, they could likely be automated to show the past N days in much the same way to give a simulacrum of a /Now page.
#chrisaldrichI love the idea of how these do semi-regular updates that friends and family could use to check-in with you, or which could be emailed out on a monthly basis to friends/family who don't spend much time on social media or don't want all the notifications.
#[tantek]Can you share how big your database export was (from how many years of blogging / number of posts) and what the exact result was of attempting to import it? (what errors etc.)
#[tantek]Since you ran into problems with that (with MySQL in particular) I feel we should document this as one of the problems with depending on MySQL
#[tantek](exports don't "just work" as they're expected to)
[asuh] joined the channel
#[cleverdevil]This wasn't a MySQL problem, I don't think.
#[cleverdevil]It was a PHP running out of memory problem.
#[cleverdevil]When running the WordPress export, it tends to suck up a lot of memory while it is generating the export.
#[cleverdevil]If you're on a shared host which aggressively limits memory consumption, or have not configured PHP with a high enough memory limit, it will crash.
#[cleverdevil](This is also a WordPress problem... its export functionality clearly needs some work...)
#[asuh]^^ I’ve run into this same problem trying to import an export, and had to get my shared hosting support to temporarily increase the memory limit to import a large SQL file
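For anyone hitting the same wall: one common workaround, assuming you can edit `wp-config.php`, is to raise the memory ceiling WordPress asks PHP for before running the export. The 512M figure is illustrative, and the host's own `php.ini` `memory_limit` must allow it too:

```php
// In wp-config.php, above the "That's all, stop editing!" line.
define( 'WP_MEMORY_LIMIT', '512M' );      // front-end limit
define( 'WP_MAX_MEMORY_LIMIT', '512M' );  // admin-side limit, which
                                          // is what the exporter
                                          // runs under
```

On shared hosts that override these values at the server level, asking support to bump the limit temporarily (as above) may be the only option.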
#[Rose]Hmm, I think I can see how this would work with Twig filters in Grav. However that is a project for another time. Tomorrow I need to get my micropub endpoint in place, add destinations, and go crazy with it.