lumpen 🍉 III @scrum@wanderingwires.net
Private
fedi moonbat
he/they
anarcho whatever
lumpen 🍉 III @scrum@wanderingwires.net
2w
@0x4d6165 well i think json is a helluva lot easier to parse than XML lmao, and I am an xml enthusiast. trying to do server-to-server or client-to-server stuff in xml is not a good idea; xml is better suited for relatively static documents or one-way publishing like rss/atom feeds. that jankiness is probably part of what held back early attempts at decentralized social media, like diaspora or OStatus, which are largely xml based and use the cursed and now abandoned Salmon protocol

i think ActivityPub stands on the shoulders of giants, as there was a solid decade+ of web ontology development that was basically abandoned around 2012-14 because the tech oligarchs didn't think it was going anywhere. the only people interested in semantics n such in the 2010s were indie web people

i didn't know parsing json-ld was a hassle in other languages, i'm sorry to keep shilling php here but there's this built-in function json_decode(): you take a json string and it turns it into an array (or an object, depending on the flag you pass), so you just work with an array, and if you want to turn it back into json you do json_encode() and that's that. like the whole backend of this forum i'm working on is just folders with json files in it until i get the sql tables figured out, and even then i kinda like the flatfile json idea
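roughly what that looks like, as a minimal sketch ( the file path and field names here are just made up for illustration ):

<?php
// load a post from a flat json file into an associative array
// (the second argument `true` is what makes json_decode return an array instead of an object)
$raw  = file_get_contents(__DIR__ . '/posts/0001.json');
$post = json_decode($raw, true);

// work with it like any other array
$post['replies'][] = [
    'author'  => 'scrum',
    'content' => 'hello from the flatfile backend',
];

// and serialize it back out, pretty-printed so the files stay hand-readable
file_put_contents(
    __DIR__ . '/posts/0001.json',
    json_encode($post, JSON_PRETTY_PRINT | JSON_UNESCAPED_SLASHES)
);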

Eventually what I'm working on should serve as the basis of a minimal fedi instance. i've been playing around with a bunch of old fedi backups, and the weird thing i've noticed is that mastodon, pleroma/akkoma and *key variants all format their fedi backups slightly differently, like the way mastodon does it is highly redundant ( the more I learn about the technical stuff, the more i hate mastodon and its consequences )

but i'm not even gonna fuck with activitypub directly, gonna use this library which handles most of that ( landrok.github.io/activitypub/ ) and i'm basically gonna just rip off the Misskey SQL tables lol
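for reference, using that library goes roughly like this ( going from memory of its readme, so treat the exact class and method names as assumptions rather than gospel ):

<?php
require __DIR__ . '/vendor/autoload.php';

use ActivityPhp\Type;

// build an ActivityStreams Note without hand-writing any json-ld
$note = Type::create('Note', [
    'content'      => 'hello fedi',
    'attributedTo' => 'https://wanderingwires.net/users/scrum',
]);

// wrap it in a Create activity, which is what actually gets federated
$create = Type::create('Create', [
    'actor'  => 'https://wanderingwires.net/users/scrum',
    'object' => $note,
]);

// serialized activity, ready to sign and POST to an inbox
echo $create->toJson();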
lumpen 🍉 III @scrum@wanderingwires.net
2w
having studied most of the major web ontologies, it's hard to deny the elegant simplicity of activity streams 2 www.w3.org/TR/activitystreams-vocabulary/#activity-types

people talk shit about this and apub, but it's not actually that which is the problem, it's the fedi devs having corpo brain worms and weirdly limited visions for what is possible. like there's nothing in the spec that prevents them from, say, having robust media management or integrating both micro-blogging and forums
lumpen 🍉 III @scrum@wanderingwires.net
2w
things dreamed up by the absolutely deranged: the AtomAPI bitworking.org/projects/atom/draft-gregorio-09/#Edit
lumpen 🍉 III @scrum@wanderingwires.net
2w
case in point, the PROV ontology, which is interested primarily in mapping relations of influence, like this could be used to compile a RICO case www.w3.org/TR/2013/REC-prov-o-20130430/
lumpen 🍉 III @scrum@wanderingwires.net
2w
basically all these web ontologies are tools of Power, for surveillance and command and control, like Foucault would probably love RDF
lumpen 🍉 III @scrum@wanderingwires.net
2w
gonna write the ultimate SQL table oh fuck
lumpen 🍉 III @scrum@wanderingwires.net
2w
@0x4d6165 people are just going to complain regardless... my solution to this has just been offloading everything to the backend, using $_SESSION for a lot of stuff that I probably shouldn't, and using weird CSS hacks excessively. php helps one out with this approach: writing switches that pick templates based on GET variables, pagination schemes, all that is quite trivial

But then doing that, the user has to click a lot more links and load a lot more pages, and they're gonna have to refresh every time they want notifications. maybe that's a better way to do things, but no one wants that, so people will opt back into javascript anyway
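the GET-variable switch i mean looks roughly like this ( just a sketch, the view names and template paths are invented ):

<?php
session_start();

// pick which template to render from a GET variable, defaulting to the index
$view    = $_GET['view'] ?? 'index';
$page    = max(1, (int)($_GET['page'] ?? 1));
$perPage = 20;

switch ($view) {
    case 'thread':
        $threadId = (int)($_GET['id'] ?? 0);   // used by the included template
        include __DIR__ . '/templates/thread.php';
        break;
    case 'notifications':
        // no javascript polling: the user refreshes this page to see new notifications
        include __DIR__ . '/templates/notifications.php';
        break;
    default:
        // simple offset pagination over whatever backs the index
        $offset = ($page - 1) * $perPage;
        include __DIR__ . '/templates/index.php';
}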
lumpen 🍉 III @scrum@wanderingwires.net
2w
okay so figured out that cookie problem, it's always just one little line, one little typo that is the issue, probably not good practice but I will just save a cookie forever
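( 'forever' just meaning a far-future expiry, something like this; the cookie name is a placeholder )

<?php
// a ten year cookie scoped to the whole site, which is 'forever' for our purposes
setcookie('board_uid', bin2hex(random_bytes(16)), [
    'expires'  => time() + 10 * 365 * 24 * 60 * 60,
    'path'     => '/',
    'httponly' => true,
    'samesite' => 'Lax',
]);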
lumpen 🍉 III @scrum@wanderingwires.net
2w
this damn router script keeps deleting the damn cookie
lumpen 🍉 III @scrum@wanderingwires.net
2w
pseudo code, not screen readable

the trick is handling sessions for non logged in ( anonymous ) users, since my board allows anonymous posting. what you can do is 'poison' a session by banning the cookie OR adding something to the session variable like $_SESSION['userType'] = 'banned'

banning the IP of course works too, but both of these are trivial to get around, so what i'm gonna do is implement a 'hashcash' system, where you go to /post_office and request the minting of a stamp. we create a jpeg and the user gives us the hash value, and that counts as a $_SESSION['stamp'] = id_of_stamp, and so the user can browse or post as long as the stamp is valid
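in actual php that might look something like this ( all the function names are hypothetical, just the shape of the idea ):

<?php
session_start();

// 'poison' the session of a banned anonymous user; the flag sticks around
// for as long as their session cookie does
function ban_current_session(): void {
    $_SESSION['userType'] = 'banned';
}

// /post_office mints a stamp: we generate a jpeg, the user hands back its hash,
// and that hash becomes their stamp id for the session
function issue_stamp(string $jpegHash): void {
    $_SESSION['stamp'] = $jpegHash;
}

// a visitor can post only with a clean session and a still-valid stamp
function can_post(array $validStamps): bool {
    if (($_SESSION['userType'] ?? '') === 'banned') {
        return false;
    }
    $stamp = $_SESSION['stamp'] ?? null;
    return $stamp !== null && in_array($stamp, $validStamps, true);
}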
lumpen 🍉 III @scrum@wanderingwires.net
2w
finally got a handle on cookies and sessions, not that complicated actually
lumpen 🍉 III @scrum@wanderingwires.net
3w
torn between the 'write everything from scratch' and 'just use a library' approach
lumpen 🍉 III @scrum@wanderingwires.net
4w
please consider throwing some peeps in Honduras a few bucks if you can
lumpen 🍉 III @scrum@wanderingwires.net
4w
so I made some kind of tiny typo and my whole project got broked and I have to start over... I should really start using git lmao
lumpen 🍉 III @scrum@wanderingwires.net
4w
@GuerricHache so shiny
lumpen 🍉 III @scrum@wanderingwires.net
1mo
poor salmon protocol, too good for this world web.archive.org/web/20160729044615/http://www.salmon-protocol.org/
lumpen 🍉 III @scrum@wanderingwires.net
1mo
web ontology rant

the semantic web is such a fucking psyop, all i wanted to do was write a little fuckin forum and now i'm on the wayback machine reading about an rdf schematization of reuters news categories and abandoned foaf extensions

pretty wild how much work was put into these web ontologies up until 2012 or so, when they were all basically abandoned or absorbed into schema.org. i guess this coincides with the general monopolization of the internet that happened. weird that this is the moment the indieweb stuff picked up, as i guess kind of a holdover of the old xml way of thinking by people with money who didn't have to worry about chasing SEO trends

it does ultimately lead to a lot of wasted time, because ontology is itself a philosophical problem right? the representation of things as information when the information is the thing. the ontology of computer science is idealist, as there's no outside of the system: an owl:Thing is not the thing-in-itself, it is necessarily the phenomenon, its being-for-others, and the other being is the machine. it is only its representation, it is appearance without essence ( i suppose you could say its essence, its material basis, is the machine code, but that has as little relevance to how the words "semantic" or "ontology" are used here as the physical medium -- pencils, paint, paper, canvas -- has to the meaning of writing or art ). so however one defines the ontology, you are making the reality, no? like you set up the rules of a game, and that is the game. with web ontology, the semantic rules in place dictate the structure of any navigation or UI. to lay the existential horror on the table: the being of this ontology does not exist prior to our definition of it, our definition and elaboration of web ontology is what brings it into being, so whatever vocabulary you choose, it is always a stopping-short of the unmet potentialities of further definition, which are potentially endless, or find their limits at the imaginative possibilities of human language. the only choice you have is to settle on a set of terms that are good enough --> tho it never will be --> you just have to settle on conventions.

So the corpo project is destroying the internet and all human knowledge, what use do they really have for these web ontologies with their ever expanding horizons of meaning? all they want to do is spy on people so they can sell them things, exploit their labor or murder them if it's advantageous to do so... so schema.org via microdata, that suits their ends. I recommend people read thru the schema.org terms because it's such an arbitrary and incomplete assortment of things, it reminds me of the medieval systems of thought Foucault describes in "The Order of Things": categories are not thoroughly explored or enumerated, while trivial things are given excessive attention, because that's all capitalism needs it to be, not a general web ontology capable of describing and categorizing the endless possibilities of human thought, but a simplified language of capitalism.

Can't help but think that, had the semantic web not been abandoned and these problems dealt with fully, had the various incomplete schemas been integrated in a comprehensive way by serious academics looking to create a universal human library or something, the current iteration of LLMs would be much smarter, like they'd be trained on human-curated semantic data instead of SEO slop. just a thought.
lumpen 🍉 III @scrum@wanderingwires.net
1mo
this one I like, quite useful for semantically marking up historical info web.archive.org/web/20250614051123/https://vocab.org/bio/
lumpen 🍉 III @scrum@wanderingwires.net
1mo
truly the most arbitrary web ontology i've ever seen

web.archive.org/web/20220901040802/http://vocab.org/open
Calendar ( open.vocab.org/terms/calendar ) -- A calendar associated with this resource
Canonical URI ( open.vocab.org/terms/canonicalUri ) -- Denotes the canonical URI that should be used to refer to this resource
Cheese ( open.vocab.org/terms/Cheese ) -- Cheese is a food made from milk, usually the milk of cows, buffalo, goats, or sheep, by coagulation
lumpen 🍉 III @scrum@wanderingwires.net
1mo
php problems

need to refactor my shitty ass php because it's nothing but crappy inline functions. apparently i should use interfaces and objects and traits n shit and not just write the same function over and over again, and I should rewrite that before I start working on new parts of the code. doing this will probs make extending things easier, so
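the kind of thing people mean by that, as a sketch ( the class and method names here are invented, not anything from my actual codebase ):

<?php
// one interface for anything that can be stored as a flatfile json record
interface Storable {
    public function toArray(): array;
    public static function fromArray(array $data): static;
}

// a trait so the json (de)serialization is written once instead of per class
trait JsonFile {
    public function save(string $path): void {
        file_put_contents($path, json_encode($this->toArray(), JSON_PRETTY_PRINT));
    }

    public static function load(string $path): static {
        return static::fromArray(json_decode(file_get_contents($path), true));
    }
}

// each model then only has to describe its own fields
class Post implements Storable {
    use JsonFile;

    public function __construct(public string $author, public string $content) {}

    public function toArray(): array {
        return ['author' => $this->author, 'content' => $this->content];
    }

    public static function fromArray(array $data): static {
        return new static($data['author'], $data['content']);
    }
}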