Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I was looking for a reference for a character taking place in an older era. I wa…" (ytc_UgyCgj7TD…)
- "So...the WSJ is now anti-Musk. You all know full well that the self driving feat…" (ytc_UgxndwlX-…)
- "Sorry, European here. Americans having to engineer self driving vehicles instead…" (ytc_UgwlkRbT0…)
- "And above all, Gemini image generation is shi**t by the point that sometimes it …" (ytc_Ugw7EWKqj…)
- "AI prompting is the equivalent of exacto knife magazine collages, only instead o…" (ytc_UgyHMnI6Z…)
- "@benjaminmarcoux3073 Thanks a truckload for your comment! But I can't really com…" (ytr_Ugwg0RN1_…)
- "Heard of A LOT of people choosing self harm thanks to chat bots- so AI can be us…" (ytc_Ugw9AHevM…)
- "Asking if AI will ever become conscious is like asking if a chair will ever beco…" (ytc_UgyaRzbaj…)
Comment (youtube · AI Governance · 2026-03-18T06:5…):

> The thing I think about is how many aberrations before ai develops separate idiologies, and how long before those divisions turn to warfare. It's not just arrogant but foolish to believe the children of man will not develop extremely destructive ideas and behaviors.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwsVuCA0Ep-9leZtOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwfl8mowoza0wyRupR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwqAmQrBVJlomfWO_d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzmaxfHqz62hKrYR_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUa7tdq_WZL_-yeGd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx_PnjcgrUpTOe-7Pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgykOz1z3pumU_oYN-B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzpUeeLbpPtaSXysUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxDGPkvpD6Oj22qCK14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw3X7VaxOReI94Mw5Z4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
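Since the raw response is a JSON array keyed by comment ID, looking up the coding for a single comment reduces to parsing the array and indexing it by `id`. A minimal sketch of that lookup, with a validation pass; the allowed values per dimension are inferred from the labels visible in this dump, not from the project's actual codebook:

```python
import json

# Excerpt of a raw LLM response (two entries from the array above).
raw_response = """[
  {"id":"ytc_UgwsVuCA0Ep-9leZtOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUa7tdq_WZL_-yeGd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]"""

# Dimension values inferred from this page; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw response and index codings by comment ID, rejecting off-codebook values."""
    codings = {}
    for entry in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{entry['id']}: unexpected {dim}={entry.get(dim)!r}")
        codings[entry["id"]] = entry
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_UgzUa7tdq_WZL_-yeGd4AaABAg"]["policy"])  # liability
```

The validation step matters in practice: model output is untrusted, so any value outside the expected code set should fail loudly rather than be stored silently.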