• Aceticon@lemmy.world
    1 year ago

    Being confident whilst caring not one jot whether you’re right or wrong is the essence of selling in modern society, especially when it comes to ideas (and politics above all).

    Large language models are great at producing text with all the subtle cues that trigger the reader’s subconscious sense of “this is somebody who knows what he’s talking about”, and none of the subtle cues that make people suspect the writer is unsure of what he wrote, or even deceitful (in a way, LLMs are the perfect sleazy politician).

    That said, I do agree with you that in expert communities populated by people with actual domain knowledge, confidently delivered bullshit doesn’t go far. It will, however, very likely go far in the more generic communities, which individually seem to have the most subscribers. So an “influencer” strategy selling to “consumers” (B2C rather than B2B) might make sense, if most of Reddit’s population are non-specialists using it as a portal to the Internet.

    • rolaulten@lemmy.world
      1 year ago

      So under this hypothesis, Reddit becomes a trove of B2C bots/influencers pitching outrage/dopamine (and the occasional consumer good) while the professional communities are left to die.

      This raises the question: how much of Reddit’s traffic comes from those technical communities (technical as in deep knowledge in a field, not necessarily tech)? And how much of Reddit’s value proposition to large language model makers lies in those same communities (as a source of training data)?