> How do I feel, about all the code I wrote that was ingested by LLMs? I feel great to be part of that, because I see this as a continuation of what I tried to do all my life: democratizing code, systems, knowledge.
I don't see it as democratic or democratising. TBH the knowledge is stored in three giga companies that sometimes used borderline-unlawful (if not outright unlawful?) methods to gain it, scraping it off GPL projects etc. And now they are selling it to us without giving the models away. The cost IS understandable, because the horrendously expensive vector cards do not come for free, but the knowledge is gathered in only one country, so this might as well fade away one day when an orange president says so (gimme all the monies or else..)
"democratizing" as in "I steal everybody's shit, make most content creators go bankrupt, then put it all in an LLM behind a paywall." Privatization of all human knowledge--past, present, & future. They own both parties, so it's not like anyone is going to vote their way out of this one--unless one considers guillotines a form of voting.
>> I don't see it as democratic or democratising. TBH the knowledge is stored in three giga companies
It can appear democratic while access is allowed, but if it can be revoked at any moment for any reason (it is private companies, after all, that own the AI playgrounds), then the illusion will shatter.
What is more, excessive reliance on AI creates a skill deficit rather than a skill surplus, and promotes dependence on it. Wizards that are nothing without their magic wands, in a way.
This may not stand out today, but give it half a decade or a full one, when the next generation won't have a pre-AI skill set to fall back on, and the seams will become all too apparent.
I already notice that my brain tends to resist thinking about hard things, defaulting to gpt-ing or perplex-ing them. I had a similar feeling when I bought a car with parking sensors: I almost immediately lost the skill of parking my older car, which doesn't have them. I had to re-learn it.