Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzYnCbJI…`: "Do we not remember Terminator or I robot. Robots taking over the world is not go…"
- `rdc_kzk233s`: "This is a major victory for the working class and for all workers in general. Ho…"
- `ytc_Ugy4TjQIq…`: "They can toot the horn for robots all they want but they suck! This truck is a w…"
- `ytc_Ugwt_XufW…`: "@vintagesonic1 You’re points are fallacious and misguided. First part: Dave ga…"
- `ytc_UgzZsnE98…`: ""Gahibly" "Miyazeki" version of an MCT picture coping and seething because plagi…"
- `ytr_UgxZ8OHlp…`: "ai cannot be emotional, ai has no emotion. ai music has no soul, has no deep set…"
- `rdc_ohnb9jz`: "The US Air Force recently finished testing a new semi-autonomous jet drone calle…"
- `ytr_UgynXzG73…`: "And some have them and others lack certain ones / Always been that way seemingly …"
Comment
For the ethical dilemmas laid out in this video and also other considerations I believe self-driving vehicles will never become mainstream in most countries. These vehicles have only been allowed in some cities for about 1½ years (San Francisco) and already there have been numerous accidents in the US, with around 80 fatalities. I am a retired computer programmer and I know just how unlikely it is to create foolproof software. There are too many unforeseen situations and driving conditions for any programming team to provide fault-free applications. The ONLY way the risk could be minimised (but not eliminated) is by having dedicated "roads" carrying solely purpose-designed autonomous vehicles from A to B. This, of course, would cost a veritable fortune, which is why aficionados of the self-driving madness shy away from that idea and continue to talk up the assurances that we can trust the vehicles to keep us safe on any road. Would you let your 7-year-old kids ride to school in a self-driving bus? Storms? Ice? Snow? Floods? Nah!
Platform: youtube | Topic: AI Harm Incident | Posted: 2024-11-12T12:5… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwPo3I9c1aJuEUAWw14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxxCd2_7Imv-373_cl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwPy25-rwcx3gii3o94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx5AWGSEpIjDvbJPgh4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyxLkv6259QYZ8EuvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgycarQoiltLq8eLUil4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwlNH37CKxkOJ5lSq14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsEN3PcrJga3AxUDl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzmT_1bNmQjtq3dtw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwze5JrAofRguQ0unl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
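The "look up by comment ID" feature above amounts to parsing this JSON array and keying each record by its `id`. A minimal sketch of that step, assuming the raw model output is valid JSON with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields shown above (`raw_response` here holds just two of the ten entries, copied verbatim; the function name `index_by_id` is illustrative, not the tool's actual API):

```python
import json

# Excerpt of the raw LLM response above: a JSON array of coded comments.
raw_response = '''[
  {"id":"ytc_UgwPo3I9c1aJuEUAWw14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxxCd2_7Imv-373_cl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
rec = codings["ytc_UgxxCd2_7Imv-373_cl4AaABAg"]
print(rec["responsibility"], rec["policy"], rec["emotion"])  # company ban fear
```

In practice a parser like this would also need to tolerate malformed output (e.g. the model wrapping the array in prose), which `json.loads` alone does not handle.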