
Tech Thread

serious question... if all AI is trained off of stuff originally created by humans, and humans have a number of flaws that we are well aware of (e.g. we lie, we cheat, we commit crimes, etc.)... how do AI developers prevent their AIs, which use human training data, from picking up all these unsavoury human characteristics?

I mean, we see AIs do all the same evil human stuff so far, no?
 
how do the AI developers prevent their AIs, which use human training data, from picking up all these unsavoury human characteristics?

They don't.

LLMs aren't new technology; the research behind them goes back some 60 years. We were promised that "moar compute" was going to solve all of the known problems with these systems, and it's just not happening. Whenever we're handed metrics showing they're improving (the new one is "look at these competitions vs. humans where they've improved and/or are better than humans"), it turns out the metrics have been carefully designed for them to excel at, versus how the work is expected to be done in a functioning business environment.

The core problems with LLMs remain. They're just not reliable if you need actual accuracy in your work and can't afford to turn in work that invents data, lies, hallucinates, etc.
 
scary shit.

AI is creeping into my workplace and I have no time for it.

There's an AI function in MS Teams now that takes notes from a meeting, and I've already had it completely mischaracterize something I said. Details matter too much in what I do to trust these lying machine fucks.
 
"AI" is the one of the best marketing term assimilations in human history. We have a long literary history of what AI is expected to be and the creators of these modern LLM's co opted an established term to describe their advanced pattern guessing spreadsheets. So it leads to a lot of people just parroting the term and getting it fixed in the average person's head that the systems we've become familiar with from sci-fi movies are here now when it's just not, and it's nowhere close to being there despite the hundreds of billions thrown down the well at them.

There are legitimate uses for LLMs, but they're not, and likely never will be, reliable for anyone who requires accuracy at a higher-than-human level (as you're describing in your anecdote). The entire business case for them, when you look under the hood, is that they won't be much worse than a bad employee at things, for cheaper. But a lot of businesses are going to replace a lot of average-or-better employees with these systems, figure out that it's hurting their business in the medium to long term, and then cut their losses and go back to people.

I use a handful of "AI"-aided tools in Photoshop from time to time, and some of them are legitimately useful most of the time (though they still require some refinement afterward), but others go nuts and just aren't reliable enough to build a workflow around. I was retouching a few images I shot for licensing to a custom homebuilder and had to fix some parts of the roof that weren't ready the day I shot the project, so I tried Adobe's generative fill (the fix using Photoshop's legacy tools is a bit tedious). I shit you not, regardless of the prompt I provided, generative fill kept putting random items on a roof with a completely different shingle pattern. A fucking vacuum cleaner was the highlight of the 3-4 attempts. Worth taking the 10 seconds on a punt to see if it gives me something useful? Sure. Something I can build a reliable workflow around? Hell no.
 
There are things I really prefer to operate myself. I do appreciate the "sensitive" doors at grocery stores, but opening a car door myself isn't such a chore. Besides, what if the sonofabitch loses power? Have they worked that shit out yet?
 
Too many bells and whistles in vehicles these days... too many things that could potentially require a computer chip to be replaced, a visit to the dealer, etc.
Heard a mechanic this summer curse and swear over vehicles where the engine stops every time the vehicle comes to a complete stop and restarts when you release the brake... he said you'd be surprised how many extra chips and parts are needed for that function. I have zero knowledge about vehicle mechanics, so I have to take his word on it lol
 