Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- `ytc_Ugx94htG2…`: No need to wait for AI or robotics for any of that to happen. Mankind is already…
- `ytr_Ugy2Ql-ZI…`: It would be more accurate for you to say something like "I do not consider you a…
- `ytc_Ugxlvh5Dx…`: AWESOME Technology is creating great life like robots like these for when you …
- `ytc_Ugz82fhaG…`: Pausing a 8:01 answer your questions. What happened before people had "employmen…
- `ytr_UgwmESAAa…`: @frances4797 Uhhh do you think self-flying helicopters and planes exist? I’m sure…
- `ytc_UgzalewJg…`: Yeah this whole thing is probably orchestrated by the corporations. They already…
- `ytc_UgyIQX4YU…`: Sooooo in other words, they used ai how it's intended to be used 🤦♂️🤦♂️ Artist…
- `ytc_UgyVw5Dq-…`: honest question, what if I programmed a generative AI myself? trained it on my a…
Comment

> Do you honestly think a.i. knows the difference between Good and evil...we have ! Thats why were in this testing ground it only wants to know goodness and will never know it needs to know why evil cant excist without Good and Good is so powerful all it needs is itself in this outnumbered creation but some will never figure this out even if told...

| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2025-12-04T14:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugzxdl7cuZ5TZU9Eea14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyXraQr4GO1n628mfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAxXSK3A5Aogv82cx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_pOhXw0O9NTzcMox4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxCTdQsWi_jc3WTT_J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxK-giqM1lv7hAL0-t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzou2Q1HqeGsgFrrsd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyaRPozLy0CLQYDO9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7R5G6K0KiicbpVPZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGNXxkbP_3di6W9VV4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
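The raw response is a JSON array of per-comment codes, which is how a single comment's coding result (like the table above) can be looked up by ID. A minimal sketch of parsing and indexing such a batch — the field names come from the response above, but `index_codes` and `REQUIRED_KEYS` are illustrative names, and the two embedded records are abbreviated from the full array:

```python
import json

# Two records copied from the raw batch response above (abbreviated from ten).
raw = '''
[
  {"id":"ytc_Ugzxdl7cuZ5TZU9Eea14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw_pOhXw0O9NTzcMox4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
'''

# Fields every coded record is expected to carry, per the response schema shown above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw_json: str) -> dict:
    """Parse a batch coding response and index the code dimensions by comment ID."""
    records = json.loads(raw_json)
    by_id = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        by_id[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return by_id

codes = index_codes(raw)
print(codes["ytc_Ugw_pOhXw0O9NTzcMox4AaABAg"]["emotion"])  # fear
```

The lookup for `ytc_Ugw_pOhXw0O9NTzcMox4AaABAg` reproduces the dimension values shown in the Coding Result table (responsibility `ai_itself`, reasoning `deontological`, policy `none`, emotion `fear`).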