Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I hate ai it’s stupid we’re acting like we haven’t seen terminator or heard of i…" (`ytc_UgyMY17S-…`)
- "The super intelligent AI argument is plausible but they would need to have or be…" (`ytc_Ugwhq5pAB…`)
- "I would want the same AI School Plan that Elon Musk uses for his Children!…" (`ytc_Ugx8D9JP9…`)
- "Your argument about stealing/inspiration is spot on but I would like to add, the…" (`ytc_Ugwnz1r-p…`)
- "Why not using both radar and AI cameras and if they disagree, just make a sound …" (`ytc_UgyiVTXmN…`)
- "As a Black proud Man i have no more to say that that is one hell of a Based A.I.…" (`ytc_UgzVQxOx2…`)
- "I can name tons of jobs that can't be replaced by AI. this is making a mountain …" (`ytc_Ugz0XWdQE…`)
- "As someone working in a similar space, his view is very similar to how I would j…" (`ytc_UgwG7kGHv…`)
Comment

"In 2025 the human race was standing at the AI cross road . The right road is Gene Roddenberrys Star Trek the left road is James Cameron’s Terinator . The way human nature is it will be the left road ."

youtube · Cross-Cultural · 2026-02-07T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
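Each coded row should only contain labels from the codebook. The sketch below validates one row against the label sets observed in this page's sample; the real codebook may define more labels, so `ALLOWED` is an assumption, not the tool's actual schema:

```python
# Label sets observed in this page's sample responses; the full codebook
# may be larger (assumption for illustration).
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "regulate", "unclear", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "mixed", "approval"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row (empty list = valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The row shown in the Coding Result table above.
row = {"responsibility": "distributed", "reasoning": "deontological",
       "policy": "unclear", "emotion": "fear"}
print(validate(row))  # → []
```

Running this over every row of a raw response catches hallucinated labels before they reach the coded dataset.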
Raw LLM Response
[
{"id":"ytc_Ugywft4lSCbgyQk4CAJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypxhoSWxvwn1sOm0R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxMIvqBSiH-2KAct_Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzUhcWFR45BybQ4oVN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8FracifhSasQU7pR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwbV2LlZ0eQPmLQFz94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwzCT6S4k8Hloj03rN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyOOTot7Hwu21MaEfl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyc7IouW5ftoTm4fQF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyCH738S2q03cQ-2xF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
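The lookup-by-ID view above can be reproduced in a few lines: parse the JSON array and index it by comment ID. This is a minimal sketch, not the tool's implementation; the two rows are copied from the raw response above:

```python
import json

# Two rows copied from the raw LLM response; the real payload has one
# object per coded comment.
raw = """[
  {"id": "ytc_Ugywft4lSCbgyQk4CAJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwbV2LlZ0eQPmLQFz94AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]"""

# Index the rows by comment ID so any coded comment is an O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

code = codes["ytc_UgwbV2LlZ0eQPmLQFz94AaABAg"]
print(code["emotion"])  # → fear
```

The looked-up row matches the Coding Result table for the displayed comment (distributed / deontological / unclear / fear).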