Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@EyehatePersona5 My uncle has a Tesla and when he makes sharp turns it tells it …
ytr_UgyBQ4XOb…
It's curious to me that no one has asked the question of who actually made the A…
ytc_UgwYo7mLe…
pentagon so stupid. what if, when they give claude firing control, and they give…
rdc_o79td8x
@johnbrown1867 not sure where you're pulling 10 trillion out of thin air from, …
ytr_UgyAUkAet…
in the words of jacksfilms (paraphrasing), "Every time you generate an ai image,…
ytc_UgxL_aM1T…
It's more, "I want AI to do the writing and drawing for me so I can make money w…
ytr_Ugw7kr_RU…
If you dig deeper into how a Large Language Model works, all of this becomes way…
ytr_UgzmmNMm4…
The AI itself would be the "artist" in this situation. He would be considered an…
ytc_UgwOsYdKX…
Comment
I wonder what would happen if a human was driving in that scenario. It’s dark, car driving on highway, obstacle appears with no warning sign. Really tough situation for human or AI. Besides, there is a long warning message before enabling FSD stating that driver should be prepared to take over at anytime and that it is supervised self driving.
Platform: youtube
Category: AI Harm Incident
Timestamp: 2024-12-29T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGwM90wJSt5FOIAc94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdQ1EaPN2XqD7ppGx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugwrzf5_LQUYX5yHOZ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzS3tjfrBp7NAMhPNJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwZd7SDyrerhmJONAt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfCZSPC7l1gOlrd354AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyjtmbwKUNNqbGY0Wt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzA6E9eo9WOURBEyhN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwdcn7GI8gphgOK4Q14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxT6yMvL5XGMKZG_Pd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
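A response in this shape can be parsed and indexed by comment ID for the lookup described above. The sketch below assumes only the JSON schema shown in the raw response (an array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields); the variable names `raw_response` and `by_id` are illustrative, and the two embedded records are copied from the sample output.

```python
import json

# Raw LLM response in the format shown above: a JSON array where each element
# codes one comment on four dimensions. Two records copied from the sample.
raw_response = """[
  {"id": "ytc_UgwGwM90wJSt5FOIAc94AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzS3tjfrBp7NAMhPNJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

records = json.loads(raw_response)

# Index by comment ID so the coding for a single comment can be looked up
# directly, mirroring the "Look up by comment ID" feature.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgwGwM90wJSt5FOIAc94AaABAg"]
print(rec["responsibility"], rec["emotion"])  # distributed indifference
```

In practice the model's output may contain malformed JSON or extra text around the array, so a production loader would want to validate the parse and the presence of each expected field before indexing.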