Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> It's time to start practicing by watching the terminator movies over and over again. Hopefully A.I. will be intelligent enough where they learn reason. Tbh I think it would be cool to work along side A.I. However, if it can not learn compassion and reason then we are all doomed

youtube · AI Governance · 2023-07-10T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxkxE81tSMeh2SP2-p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzhsq-ahjfszBvIVol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy4RvucKJCj_c28EdV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyLBxlIpLz-yuTRmyR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx405qLXsQc4kPy01R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzX2i8ef5CdaDEfxqR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw03nsNPMnuzvvZK9B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzJgsbPlReSldlHart4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwRZxtRrC4eDJ0lscN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzzUbnJxAMmO_ps70l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
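A coding for a single comment can be extracted from a raw response like the one above with a few lines of Python. This is a minimal sketch assuming the array-of-objects schema shown in the response; the function name and sample data are illustrative, not part of the coding tool itself.

```python
import json

# Illustrative excerpt of a raw model response: a JSON array with one
# coding object per comment, following the schema shown above.
raw_response = """
[
  {"id": "ytc_Ugx405qLXsQc4kPy01R4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzX2i8ef5CdaDEfxqR4AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding row for one comment ID,
    or None if the model did not emit a row for that ID."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugx405qLXsQc4kPy01R4AaABAg")
print(coding["responsibility"], coding["policy"])  # ai_itself none
```

Returning `None` for a missing ID (rather than raising) makes it easy to flag comments the model skipped, which is a common failure mode when batching many comments into one prompt.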