NodeBB

I have deeply mixed feelings about #ActivityPub's adoption of JSON-LD, as someone who's spent way too long dealing with it while building #Fedify.

General Discussion
Tags: fedify, jsonld, fedidev, activitypub
106 Posts 24 Posters 3 Views
hongminhee@hollo.social
#1

    I have deeply mixed feelings about #ActivityPub's adoption of JSON-LD, as someone who's spent way too long dealing with it while building #Fedify.

    Part of me wishes it had never happened. A lot of developers jump into ActivityPub development without really understanding JSON-LD, and honestly, can you blame them? The result is a growing number of implementations producing technically invalid JSON-LD. It works, sort of, because everyone's just pattern-matching against what Mastodon does, but it's not correct. And even developers who do take the time to understand JSON-LD often end up hardcoding their documents anyway, because proper JSON-LD processor libraries simply don't exist for many languages. No safety net, no validation, just vibes and hoping you got the @context right. Naturally, mistakes creep in.

    But then the other part of me thinks: well, we're stuck with JSON-LD now. There's no going back. So wouldn't it be nice if people actually used it properly? Process the documents, normalize them, do the compaction and expansion dance the way the spec intended. That's what Fedify does.

    Here's the part that really gets to me, though. Because Fedify actually processes JSON-LD correctly, it's more likely to break when talking to implementations that produce malformed documents. From the end user's perspective, Fedify looks like the fragile one. “Why can't I follow this person?” Well, because their server is emitting garbage JSON-LD that happens to work with implementations that just treat it as a regular JSON blob. Every time I get one of these bug reports, I feel a certain injustice. Like being the only person in the group project who actually read the assignment.

    To be fair, there are real practical reasons why most people don't bother with proper JSON-LD processing. Implementing a full processor is genuinely a lot of work. It leans on the entire Linked Data stack, which is bigger than most people expect going in. And the performance cost isn't trivial either. Fedify uses some tricks to keep things fast, and I'll be honest, that code isn't my proudest work.

    Anyway, none of this is going anywhere. Just me grumbling into the void. If you're building an ActivityPub implementation, maybe consider using a JSON-LD processor if one's available for your language. And if you're not going to, at least test your output against implementations that do.

    #JSONLD #fedidev

hazelnoot@enby.life
#2

      @hongminhee@hollo.social boosting this for the excellent points, even though I'm one of the people not using JSON-LD and frequently producing malformed documents.

(And honestly, I don't think I'll change that soon. Sharkey only uses JSON-LD on one single code path, and even that's been enough to introduce critical bugs. I'm planning to remove the JSON-LD lib entirely from the Campfire fork.)

      ((And that's not even getting into the security problems with every JSON-LD lib I've ever audited...))

rimu@piefed.social
#3

        JSON-LD is a trap. Sorry you fell in.

silverpill@mitra.social
#4

          @hongminhee

          >There's no going back.

We absolutely must go back. Either we have a vibrant ecosystem where building stuff is a pleasant experience, or the fediverse slowly dies while linked data cultists harass developers about nonresolvable URLs in @context.

JSON-LD adds nothing to ActivityPub; it only creates problems. Time to move on.

mat@friendica.exon.name
#5

            @hongminhee @julian I'm a true believer in RDF from back in the day, so I'm hardly neutral. But...

There are essentially no interesting ActivityPub extensions right now. Even Evan's chess example never caught on: no one's actually using AP to play chess. It's just ActivityStreams plus a few cute tricks now and then. And even if there were extensions, existing AP servers chop off and throw away data they don't understand, so none of these extensions could work.

            I feel like most of the "WTF am I learning JSON-LD for" criticisms are coming from this status quo. That includes "if someone wants to add a gallery thing or whatever, can't they make a FEP?" The way things work now, your extension either a) works only in your software or b) has to be painfully negotiated with the whole community. We're all gonna have a big fight about it on this forum anyway. Let's not pretend JSON-LD helps us.

            But if we add two things to the mix, the situation looks different. Those are 1. server software that "keeps all the bits", and 2. a whitelabel extensible app. That would make it very easy to spin up crazy new experiences for a sizeable existing userbase. Developers should not be forced to endure a FEP process, and they should not have to attract a userbase from nothing. They should be able to just build, without even worrying if they're stepping on toes. And of course, Fedify and libraries in other languages are a load-bearing part of that world, including enforcement of the JSON-LD rules.

            That world does not exist at all today, but JSON-LD does, so it's pretty valid to describe this design as premature optimisation. I dunno though, we don't seem that far away.

sl007@digitalcourage.social
#6

              @julian @mat

Do you know the background of the immers project?

> "no-one's actually using AP to play chess"

The reason we have no AP chess service _anymore_ is #uspol …

This all feels very unfair somehow, because I know the background, but anyway …
Since we had a long thread two days ago about our use of chess games, I'll link the video from that thread: https://digitalcourage.social/@sl007/116023149133783002

immers, with its federated locations and positional audio etc., was super nice for playing chess!
Our use is fairly similar and straightforward, like the chess demo at the Social CG meeting in 2018 and at rc3 (usually 18,000 people physically, but virtual that year because of the pandemic): https://socialhub.activitypub.rocks/t/rc3-chaos-communication-congress/1202

Maybe it would really be fair if people who are new looked into the 20 years of Social CG history, where some volunteers put in a lot of work 🙂
🧵 1/2

mat@friendica.exon.name
#7
@julian I don't know as much as I'd like about AT Lexicons. That is, not so much how they work as what the grand idea is. I don't even understand whether Bluesky imagines them being mixed and matched, JSON-LD style. I think not?
sl007@digitalcourage.social
#8

                  @julian @mat

We implemented this standard, and you can create / describe your rooms [Place, `redaktor:fictional`]; the chessboard is just a geohash, as described in the geosocial CG, so the use is the same, just `redaktor:fictional` too.
You load the Collection of chess figures (pawn1 …) and can name them; they `Travel` over the chessboard, and the `Arrive` describes the `result`.
As always, you can get very detailed with wikidata properties and entities, but the bare AS vocabulary is enough.
In the end you have a Collection of the Travels, which is your played game, which you can replay or do whatever with.
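A hypothetical sketch of what one move could look like as an AS2 `Travel` activity, following the description above (`Travel`, `Place`, `origin`, and `target` are real Activity Vocabulary terms; the `pawn1` actor and the square names standing in for geohash coordinates are invented for illustration):

```python
import json

# Hypothetical sketch: one chess move as an AS2 Travel activity.
# The piece (actor) travels from an origin Place to a target Place;
# square names stand in for the geohash coordinates described above.
move = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Travel",
    "actor": {"type": "Object", "name": "pawn1"},
    "origin": {"type": "Place", "name": "e2"},
    "target": {"type": "Place", "name": "e4"},
}

print(json.dumps(move, indent=2))
```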

But you can still install immers - it is worth a try: https://github.com/immers-space

The reasons for its end are the same as for the gup.pe groups, and I hope people know about them …

sl007@digitalcourage.social
#9

@mat There is a reason; I just wrote it to @julian in a DM, since I didn't want to post it publicly.

Not sure if you visited the link. This _was_ the community approval …
Immers was famous, and we had some official Social CG meetings; I linked one that thousands of community people attended.
The W3C Social CG _is_ the community.
Meanwhile even the Public Spaces Incubator uses it, which is to my best knowledge the largest upcoming implementor by far.
Apart from that, it is pretty obvious after the meeting where we talked about "factual" vs "fictional".
Mastodon has nothing to do with this. The majority of projects count in a democracy. We had a demo playing chess between 4 different pieces of software.

Having run the official AP Conf and been elected Policy Lead, I have always asked the community. For 20 years 😞
https://conf.tube/c/apconf_channel/videos

Not sure if anyone has read the "Conformance" section of ActivityPub: https://www.w3.org/TR/activitypub/#conformance
It is section 2 - you have to support "the entirety" …
If Mastodon does not, it is not ActivityPub.

sl007@digitalcourage.social
#10

                      @mat
Just btw, this is 7 years old, https://www.reddit.com/r/chess/comments/94ubnd/chess_over_activitypub/ but anyway.

However, given that I have at least 3 apps, including immers and redaktor, where I can use the first chess spec:
if more than 2 implementations also support this second chess specification, I will do so too.

                      @julian

varpie@peculiar.florist
#11

@hongminhee I have the same feeling. The idea behind JSON-LD is nice, but tooling for it isn't widely available, so developing with it becomes a headache: do I write a JSON-LD processor myself, spending twice the time I wanted to, or do I just treat it as plain JSON for now and hope someone will make a JSON-LD processor soon? Often the answer is the latter, because it's a big task that we weren't looking for when creating fedi software.

kopper@not-brain.d.on-t.work
#12
@hongminhee Take this part with a grain of salt, because my benchmarks for it are with dotNetRdf, which is the slowest C# implementation I know of (hence my replacement library), but JSON-LD processing is slower than RSA signature validation, which is one of the pain points around authorized fetch scalability.

                          wetdry.world/@kopper/114678924693500011
kopper@not-brain.d.on-t.work
#13
@hongminhee If I can give one piece of advice to devs who want to process JSON-LD: don't bother compacting. You already know the schema you output (or you're just passing through what the user gives and it doesn't matter to you), so serialize directly to the compacted representation, and only run expansion on incoming data.

Expansion is the cheapest JSON-LD operation (since all other operations depend on it and run it internally anyhow), and this will get you all the compatibility benefits of JSON-LD with few downsides (beyond more annoying deserialization code, as you have to map the expanded representation to your internal structure, which will likely be modeled after the compacted one).
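To illustrate that deserialization step, here's a toy stdlib-only sketch of mapping an already-expanded node onto an internal model (the `Note` dataclass and the helper names are invented for this example, not any library's API):

```python
from dataclasses import dataclass

AS = "https://www.w3.org/ns/activitystreams#"

@dataclass
class Note:
    name: str
    content: str

def from_expanded(node: dict) -> Note:
    """Map one expanded JSON-LD node onto the internal model.

    Expanded form spells out property IRIs in full and wraps every
    value in a list of {"@value": ...} objects, so each property
    needs unwrapping.
    """
    def value(prop: str, default: str = "") -> str:
        entries = node.get(AS + prop, [])
        return entries[0].get("@value", default) if entries else default

    return Note(name=value("name"), content=value("content"))

# An incoming document after running it through JSON-LD expansion:
expanded_node = {
    "@type": [AS + "Note"],
    AS + "name": [{"@value": "Hello"}],
    AS + "content": [{"@value": "Hi there"}],
}

note = from_expanded(expanded_node)
print(note)  # → Note(name='Hello', content='Hi there')
```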
sl007@digitalcourage.social
#14

                              @julian

Manu, the maker of JSON-LD, who also helped with the AP Confs, made this nice video: https://www.youtube.com/watch?v=vioCbTo3C-4

JSON-LD is a normative reference of ActivityPub. The context of AP is only 1 line, maybe 4 if you support the official extensions. It does not make anything much larger.

It is important, for example, if you want to consume the federated wikipedia, wikidata, the European Broadcasting Union, or these public broadcasters https://www.publicmediaalliance.org/public-broadcasters-create-public-spaces-incubator/ but also to know that e.g. mobilizon uses schema.org for addresses.

To give you an example: if you include

"mz": "https://joinmobilizon.org/ns#",
"wd": "https://www.wikidata.org/wiki/Special:EntityData/",
"wdt": "https://www.wikidata.org/prop/direct/"

in your context, then you know about the mobilizon extension but also the whole common knowledge of the world …
I like that: now you can support the whole vocabulary of wikipedia and wikidata, which is just JSON-LD.
You get it in all the languages of the world, including the property names.
No problem if others don't support it, but sad for users.

                              @hongminhee
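To show concretely what those prefixes buy you, here's a toy illustration of how a processor resolves compact IRIs against that context (a real JSON-LD processor does far more; this only covers the simple prefix case):

```python
# Prefix definitions as they would appear in the @context above.
prefixes = {
    "mz": "https://joinmobilizon.org/ns#",
    "wd": "https://www.wikidata.org/wiki/Special:EntityData/",
    "wdt": "https://www.wikidata.org/prop/direct/",
}

def resolve(compact_iri: str) -> str:
    """Expand a compact IRI like 'wdt:P31' to its full IRI.

    Anything without a known prefix is returned unchanged.
    """
    prefix, _, suffix = compact_iri.partition(":")
    return prefixes[prefix] + suffix if prefix in prefixes else compact_iri

print(resolve("wdt:P31"))  # → https://www.wikidata.org/prop/direct/P31
```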

sl007@digitalcourage.social
#15

                                @julian @hongminhee

PS: I am using the official JSON-LD processor by Manu and contributors. If support in any language is lacking, we just speak to the JSON-LD group (glad the 2 web intents are coming together now as well) …
Because we are social …

kopper@not-brain.d.on-t.work
@hongminhee if i can give one piece of advice to devs who want to process JSON-LD: don't bother compacting. you already know the schema you output (or you're just passing through what the user gives and it doesn't matter to you), so serialize directly to the compacted representation, and only run expansion on incoming data

                                  expansion is the cheapest JSON-LD operation (since all other operations depend on it and run it internally anyhow), and this will get you all the compatibility benefits of JSON-LD with little downsides (beyond more annoying deserialization code, as you have to map the expanded representation to your internal structure which will likely be modeled after the compacted one)
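A toy sketch of that "expand incoming, emit compacted" strategy. This is NOT a conforming JSON-LD processor (no IRI compaction, no nested or remote contexts, only the simple term-to-IRI context form); in practice you would use a real processor such as pyld or jsonld.js for the expansion step.

```python
# Toy sketch: expand incoming documents to full IRIs, then look up
# the IRIs you care about. Outgoing data is just written directly
# in your known compacted shape, so no compaction pass is needed.

AS = "https://www.w3.org/ns/activitystreams#"

def expand_terms(doc: dict) -> dict:
    """Resolve top-level terms to full IRIs using the document's
    inline @context; handles only the simple {"term": "iri"} form."""
    ctx = doc.get("@context", {})
    mapping = ctx if isinstance(ctx, dict) else {}
    expanded = {}
    for key, value in doc.items():
        if key == "@context":
            continue  # the context is consumed, not forwarded
        expanded[mapping.get(key, key)] = value
    return expanded

# An incoming document whose author chose their own term names:
incoming = {
    "@context": {"text": AS + "content", "type": "@type"},
    "type": "Note",
    "text": "hello",
}

expanded = expand_terms(incoming)
# After expansion the property is addressed by IRI, regardless of
# which term the sender used in their compacted form:
print(expanded[AS + "content"])
```

The deserialization side is indeed the annoying part: your internal model is likely shaped like the compacted form, so you map expanded IRIs back onto it by hand.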
natty@astolfo.social
#16

@kopper@not-brain.d.on-t.work @hongminhee@hollo.social expansion is actually really annoying because the resulting JSON has annoyingly similar keys to look up in a hashmap, wasting cache lines and CPU time
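To illustrate the point: in expanded form, every ActivityStreams property is keyed by its full IRI, so hash-map keys share a long identical prefix before they differ (a sketch of what expanded output looks like):

```json
{
  "@type": ["https://www.w3.org/ns/activitystreams#Note"],
  "https://www.w3.org/ns/activitystreams#content": [{ "@value": "hello" }],
  "https://www.w3.org/ns/activitystreams#summary": [{ "@value": "hi" }]
}
```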

kopper@not-brain.d.on-t.work
@hongminhee take this part with a grain of salt because my benchmarks for it are with dotNetRdf, which is the slowest C# implementation i know of (hence my replacement library), but JSON-LD processing is slower than RSA validation, which is one of the pain points around authorized-fetch scalability

                                    wetdry.world/@kopper/114678924693500011
kopper@not-brain.d.on-t.work
#17
@hongminhee i put this in a quote, but people reading the thread may also be interested: JSON-LD compaction does not really save that much bandwidth over having all the namespaces explicitly written in property names if you're gzipping (and you are gzipping, right? this is JSON. make sure your nginx gzip_types includes ld+json and activity+json)

                                    RE:
                                    not-brain.d.on-t.work/notes/aihftmbjpxdyb9k7
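A minimal nginx sketch of that advice (gzip and gzip_types are standard nginx gzip-module directives; the exact MIME list is up to you):

```nginx
# Compress JSON responses; nginx does not gzip the AP media types
# unless they are listed explicitly in gzip_types.
gzip on;
gzip_types application/json application/ld+json application/activity+json;
```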
sl007@digitalcourage.social
#18

                                      @kopper
                                      @julian
                                      @hongminhee

hm, we really need to differentiate between users' responsibility and devs' responsibility.

Not sure if Hong saw the draft about the AP kv thing; it supports either JSON-LD fields _or_ as:attachment / as:context …
                                      wtf do I want to say.

                                      user story:
We are working full-time on 2 major and 3 other projects, which are
                                      - federation of wikibase / wikidata
                                      - federation of Public Broadcasters https://www.publicmediaalliance.org/public-broadcasters-create-public-spaces-incubator/
                                      and these https://codeberg.org/Menschys/fedi-codebase

Let's say we want to federate a Country; then all the knowledge is sent in `attachment`, with the fully qualified wikidata URL in `context` [as:context, not @context! This is so confusing :)]
For example, the corresponding entries from the PressFreedomIndex `collection` (co-founder of freelens here 🙂)

But anyway, the idea behind having
"wd": "https://www.wikidata.org/wiki/Special:EntityData/",
"wdt": "https://www.wikidata.org/prop/direct/"
in the `@context` was that any user can consume and federate wikibase,
incl.
                                      🧵 1/2
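To illustrate the distinction the post flags, a hypothetical activity object: the top-level `@context` is the JSON-LD keyword, while the plain `context` key is AS2's as:context property (the wikidata URL and attachment are illustrative):

```json
{
  "@context": "https://www.w3.org/ns/activitystreams",
  "type": "Note",
  "context": "https://www.wikidata.org/wiki/Special:EntityData/Q183",
  "attachment": [
    { "type": "Document", "name": "Press Freedom Index entry" }
  ]
}
```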

sl007@digitalcourage.social
#19

                                        @kopper @julian @hongminhee

                                        incl.
                                        - the properties in all the languages of the world
                                        - the knowledge of the world in all the languages
- the wikidata relations and qualified statements, including the nameMap etc., and all the URLs to all wikiprojects incl. their languages and knowledge

How else could I tell other software: if you want all of a user's qualified data, use the wikidata vocabulary?
wikipedia, wikidata, EBU, Public Broadcasters, taxi data: it is _all_ JSON-LD …

kopper@not-brain.d.on-t.work
#20
                                          @sl007 @hongminhee @julian i feel like you're falling into a trap i've seen a lot around AP spaces: just because the data can be contorted to represent something does not mean software will interpret it as such.

any software that wants to support wikidata statements and relations will have to go out of its way to implement that manually, with or without JSON-LD in the mix, and interoperability between those implementations will have to specify how that works. in your specification you can indeed make it so simply linking to the wikidata JSON-LD can work (which i don't believe wikidata provides out of the box; it does for XML, Turtle, and N-Triples, if we're talking about RDF. if not, their bespoke JSON format is just as authoritative), but i'd say using the Qxxx and Pxx IDs and letting the software figure out how to access it would be better!

if you have the dream of making an as:Note and having its as:attributedTo be the wikidata entity for alan turing... sorry, nobody other than maybe your own software will even attempt interpreting that
