Not sure I agree that we should highlight the suffering of these folks over those who have been doing even worse stuff for years.
For example, there are roughly 360k+ people working in the US prison system, probably observing killings, rapes, and all kinds of awful things regularly.
And in a way that is probably more intense than just reading it... because they need to smell it, hear it, and sometimes even clean up after it.
There is sacrifice all around us. People choose what level of sacrifice to expose themselves to based on economic benefit, their other options, etc. For better or worse, we call it capitalism.
"Today, ChatGPT refuses to produce the explicit scenes the team helped weed out, and it issues warnings about potentially illegal sexual acts."
A responsible company would have tracked and documented this in a database to inform relevant authorities in the countries where it was witnessed, helping weed out this material. Disgracefully, just using it to "clean" an AI for the "benefit" of its users is akin to watching someone being attacked in front of you and turning your head away.
This reminds me of how social media moderators have to view horrifying images/videos/texts and remove them.
There’s always a human in the loop behind the amazingly seamless models that we see. But those humans are not well paid and are almost always anonymous.
Outsourced for cheap labor.
Wow
This may well be the most important newsletter Big Technology has published.
It left me thinking...
Insightful as always. Fascinating article on higher education.