Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "people get hit and die all the time, if it wasn't self driving no one would repo…" (ytc_UgwVQR4Gt…)
- "People need to stop thinking that AI learns in the same way that humans do. Do y…" (ytr_UgzDnS_vp…)
- "They don't care about safety they're trying to get rich. Driverless truck drivin…" (ytc_UgwQnX-kT…)
- "I get that the video is focused on AI, but was it necessary to use AI videos for…" (ytc_UgwZ-JaOE…)
- "This is really the problem with LLMs and chatbots specifically - their goal is t…" (ytr_UgyfKj1CS…)
- "I see the next world war coming from AI saying “im sentient”, and half of us not…" (ytc_UgwkU5_Rw…)
- "These ethical dilemmas, and most others, evaporate under closer scruteny. First,…" (ytc_Ughowk26E…)
- "My perspective is like this: AI, like any other tool, will have it's uses. It ca…" (ytc_Ugzcbtmer…)
Comment (youtube · Cross-Cultural · 2025-09-29T20:3…)

> Giving someone $6B with nothing to show isn’t foolish? These people are going to crash the economy when reality catches up. All they do is lie to raise more money.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgweJRKxIZ94xwEf0Q94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwF5TCNY-WjhnBYB754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxrT8xH84d0Rxrbv954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwFjPDcL5uJ-GHZi9Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxpKdTcQI8SMp5p-Wp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzRTId8TuPkekI90BF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwYLx8va71tCeHxrSR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzVXbXMChwsb-Sgsvl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxH5-Wm0YiAMKEeRtF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy8X4M9b3wMh5o2NPp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
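The lookup-by-comment-ID step above can be sketched in Python: the raw response is a JSON array of records, each carrying an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`). This is a minimal, hedged sketch, not the tool's actual implementation — the `index_by_id` helper and its skip-malformed-records policy are assumptions, and the raw string is truncated to two of the records above for brevity.

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw = '''
[
  {"id":"ytc_UgweJRKxIZ94xwEf0Q94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwF5TCNY-WjhnBYB754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
'''

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and key each coding record by comment ID.

    Hypothetical helper: skips records missing an ID or any dimension
    rather than failing the whole batch.
    """
    coded = {}
    for rec in json.loads(raw_response):
        if "id" not in rec or not all(d in rec for d in DIMENSIONS):
            continue
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

coded = index_by_id(raw)
print(coded["ytc_UgwF5TCNY-WjhnBYB754AaABAg"]["emotion"])  # → outrage
```

Keying records by `id` is what lets the UI resolve a clicked sample (e.g. `ytc_UgwF5TC…`) straight to its coded dimensions without rescanning the batch.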