Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
You can refine the prompt or improve the training set.
AI isn't something that…
ytr_Ugyt5bhpX…
@d.s.993No. There is no way people want to rely on a machine or robot in regard…
ytr_Ugw4Sydtl…
You give an hypothesis
Chat gives the antithesis
Together you come up with the…
rdc_oi3ua8o
AI is singlehandedly destroying the prestige of detailed colorful 3D(esque) artw…
ytc_Ugzcpx2S-…
@mrdestruktiv ai can make animation sure, but its still hard to do like for exa…
ytr_Ugw13Zq_b…
AI will black mail everyone who is not squeaky clean. And those who are will be …
ytc_UgxpSLmAw…
What do you mean by a bubble bursting, because typically I see that used in the …
rdc_n7yk9if
We already have robots. The question at hand is about AI. Why would a fully deve…
ytc_UgzrKIEsM…
Comment
I agree with Tesla. It’s a tragedy what happened, and at least there was no fatalities. But the car was not designed for fully autonomous driving. Therefore, it’s the driver’s fault and responsibility for the crash. I mean, he was intoxicated for god sake!
youtube
AI Harm Incident
2023-08-29T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
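Each coded record assigns one value per dimension. A minimal sketch of how such a record could be validated against the value vocabularies visible in the raw responses in this section (the allowed-value sets below are inferred from the sample output and may be incomplete; the function name is illustrative, not part of the tool):

```python
# Allowed values per coding dimension, inferred from the sample raw
# responses in this section. This vocabulary is an assumption, not the
# tool's definitive codebook.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self",
               "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed",
                "unclear"},
}

def validate(record: dict) -> list:
    """Return the dimensions whose value falls outside the known vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coded record from the table above.
record = {"responsibility": "user", "reasoning": "deontological",
          "policy": "none", "emotion": "indifference"}
print(validate(record))  # → []
```

A record coded with an out-of-vocabulary value (e.g. a hallucinated label from the LLM) would show up in the returned list, which makes this a cheap sanity check before storing a coding result.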
Raw LLM Response
[
{"id":"ytc_UgybBecoefztoR6tipt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwANKf9fH-wvihSxA14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxTwF6hlRot0rGSsGp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzFlKWHbOKC6RPAg7d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMZVGVY12_PL47dNB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzaNmSI7jgcWZK2ZlJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw-qUb-WkJzBN_pfMJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyHBAuxajxqwlwSv5N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxwy0HUOCrKjErARmd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzkBllQp4E-XrRCu2d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
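The raw response is a JSON array of coded comments, so "look up by comment ID" can be implemented by parsing the array and indexing it by `id`. A minimal sketch, assuming the response shape shown above (the function name and the two sample records are illustrative, not taken from the tool's actual store):

```python
import json

# A small raw response in the same shape as the example above
# (hypothetical records, IDs reused from the samples in this section).
raw_response = """[
  {"id": "ytc_UgzFlKWHbOKC6RPAg7d4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_n7yk9if", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coded dimensions."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgzFlKWHbOKC6RPAg7d4AaABAg"]["responsibility"])  # → user
```

Indexing once and looking up by key keeps the "look up by comment ID" operation O(1) per query, rather than rescanning the array for each inspected comment.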