Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Still it chose the wrong lane turn left. An intelligent humann driver would have taken the right of both lanes to turn left, to be in the correct lane to turn right after that. A.I. is still dumb and not on par with human drivers. I never would give the control to an A.I. in a car that i am capable of driving myself, no matter what circumstances.
youtube
2024-10-09T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwNVTIIAVOIn916UD54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy-PSHfWczfRlT8gQV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIzbOj6P8E35t-69p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyFGHuy7CphzaMNhTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugy1Pua3GDeI5oafNkV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwMzxhU3rjVS_rvxH54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzBgnSUCwrzMiUTeJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxYmj9flA9LuqT11J14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwmdzgLMLPCCX7b6fp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxmw3BibKP41oXStRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
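The raw response above is a JSON array of per-comment codings, one object per comment ID, with four dimensions each. A minimal sketch of how such a batch could be parsed and schema-checked before storage — the `ALLOWED` value sets are inferred only from the examples on this page and are hypothetical; the full codebook may define more labels:

```python
import json

# Allowed values per coding dimension (hypothetical: inferred from the
# example rows shown above, not from the actual codebook).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed",
                       "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"outrage", "mixed", "fear", "disapproval", "indifference",
                "resignation", "approval", "unclear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response; keep only rows whose every
    dimension holds an in-schema value."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example with one valid and one out-of-schema row (hypothetical IDs):
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "company",
     "reasoning": "consequentialist", "policy": "liability",
     "emotion": "outrage"},
    {"id": "ytc_example2", "responsibility": "robots",  # not in schema
     "reasoning": "mixed", "policy": "none", "emotion": "fear"},
])
valid = parse_raw_response(raw)
print(len(valid), valid[0]["policy"])  # 1 liability
```

Rows that fail validation would typically be queued for re-coding rather than silently dropped, but that policy is a design choice outside what this page shows.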