Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Legendary tutorial. I was just about to ask where I can get similar vids on diff… (ytc_UgwkmJ6oC…)
- There is no AI. I have tested it and it is as dumb as AOC.… (ytc_Ugzg10KRr…)
- Why are we pushing this technology if it's going to replace us ? We put effort i… (ytc_Ugx3w9tKj…)
- Nothing like a great interview destroyed by partisan ignorance (makes me sick). … (ytc_UgyI0AbZs…)
- Is this role playing? @LavenderTowne because it feels like some kind of main cha… (ytc_UgxY3c3c7…)
- yes, he is clearly one of those "experts" on ai completely lacking all of the b… (ytr_Ugym-dSC3…)
- The most tragic thing about AI art is that support comes entirely from the weird… (ytc_UgxrqqVF0…)
- If all looses job , who will buy products and how companies will earn. With ubi … (ytc_UgwIb6ocq…)
Comment
As someone who work in AI, I can say that any autopilot system that omits RADAR or LIDAR is simply an unethical system. Images from a camera simply do not contain enough information for a computer to make decisions that can result in life or death. Also I am very dubious of any use of machine learning that does not have published information on the convergence of the algorithms used.
youtube · AI Harm Incident · 2022-09-03T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxchwfMLlIwrAJI3op4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyl8DnJWm-IYVOcjY54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2N48LIE3brL-Jq294AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxUK0-sdyWUaZ_KXnl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzi39MZlu6PbG-zePR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxFvXkriaARSpnDKJp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwPAJuRYVc0pj9scnV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwfJziH3uN0rzxQCrl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxS5e_Af-V7IXSS2Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxnaADHhErV5QLsoDV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
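A batch response like the one above can be turned into per-comment lookups with a small parsing step. The sketch below is an illustrative assumption, not the pipeline's actual code: it parses the JSON array, checks each row against the category labels that appear in this dump (the real codebook may define more), and indexes rows by comment ID so a coded record can be looked up directly.

```python
import json

# Allowed labels per dimension, inferred only from the values visible
# in this dump; the full codebook may contain additional labels.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "government", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index coded rows by comment ID.

    Raises ValueError when a row is missing a dimension or uses an
    unknown label, so malformed model output is caught before storage.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-row batch for illustration (not a real comment ID).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against a fixed label set is what makes a batch like this safe to ingest: if the model invents a new label or drops a field, the whole batch fails loudly instead of silently writing bad codes.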