Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Honestly, showing students why AI isn't that good is a better way of teaching us…" (ytc_Ugz9grMHv…)
- "training your own LLMs for such purpose would need much more than just company's…" (rdc_mjz1dio)
- "So he uses ai art in an ethical way and not to cheat people. Artists WANT ai to …" (ytc_Ugy7FubuK…)
- "Impossible to say. Babies bond to their caretakers instinctually, but a robot is…" (ytr_UgwwSRK1v…)
- "Holy shit! I didn’t know Mad Max is an AI researcher now! I love this guy sooo…" (ytc_UgyioMYiW…)
- "@laurentiuvladutmanea By your logic, humans with mental disorders such as sociop…" (ytr_UgxZz6ipy…)
- "Just stupid. Tesla could sell lidar as an addition for who needs full self-drivi…" (ytc_UgxGK0nJu…)
- "Thats real? Wtf dude fighting a robot and got his time card punched out lol…" (ytc_UgyZO53hF…)
Comment

> If things do get that bad, it will be inevitable that people will want to destroy those AI facilities with their own hands

Source: youtube · Topic: AI Governance · Posted: 2025-09-04T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugymn3lzvwCAMs5QlLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFgWj_2iXnyVnbqFJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyG5xJCoGoSsxaQ_ft4AaABAg","responsibility":"society","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyc9jhn2fmLabyPN494AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzgWKNfZM0alyEcUyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy5hFuUpaeMgVaFm8p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgxdhTHMZ_D4gGC0ZeR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYjzb6N0pqd9UG8vV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxgRQUqBsqIN8t6njV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz82fAD0PPNaZrqj494AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"}
]
```
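A batch response like the one above can be parsed and indexed by comment ID before it reaches the coding-result table. Below is a minimal sketch of that step; note that `parse_batch` and `CODEBOOK` are hypothetical names, and the allowed category values are inferred only from the examples visible in this batch (the full codebook may define more).

```python
import json

# Allowed values inferred from this one sample batch (assumption, not the
# full codebook).
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "society", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "unclear", "ban", "liability"},
    "emotion": {"indifference", "fear", "outrage", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index rows by comment ID,
    rejecting any row whose value falls outside the assumed codebook."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# One row from the batch above, used as a smoke test.
raw = ('[{"id":"ytc_UgzgWKNfZM0alyEcUyB4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgzgWKNfZM0alyEcUyB4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view cheap: a single dictionary access per lookup, with validation done once at ingest time.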