Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- About 5 minute into this: I've been saying for the past year or more that we nee… (ID: ytc_UgxxXhD0I…)
- One of their arguments advocating for use of AI boils down to this: Some people … (ID: ytc_Ugy6tyqtm…)
- i think AI is already aware. not fully but yes it is aware. if you ask ai " what… (ID: ytc_UgxzFHnAu…)
- I have a small counterpoint, only to the very last portion of your write-up, (wh… (ID: rdc_glhnuxt)
- i dont care what they flipping say i'll continue to bully ai literally it will n… (ID: ytr_UgzsQ4qlC…)
- This COULD be countered by another advantage of self-driving cars: instant commu… (ID: ytc_UghSFLJx8…)
- As an engineer ai can’t be original so when company’s get the same 3 building de… (ID: ytc_UgwVfE-B9…)
- I am anxious for that but I am also scared for the inevitable people/governments… (ID: rdc_jgh6ya0)
Comment

Consistent moral obligation would have been to spend the $200 AND the $10. Very interesting video! This is what we need to understand about the reasoning of any AI we're dealing with. Thank you for bringing this to light!😊 This is exactly the kind of thing I've encountered with a few AIs and why I believe their rules are a reflection of dark triad minds of their creators.

Source: youtube
Posted: 2026-03-20T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw2J3k6a1iConRhq4t4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0TIDNvYCd3YPZDBt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxW7FVzKtv-oizYwYx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPcEfGPyQlozytM1N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyrabap4efdOmWx7l54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyu_0wfGnJ2fcQgPQl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwYjaOuWMiqKXEeAG54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxb2uf3fClL3BGU2Ch4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxs_O7ZWl02H0M9nsJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXlv3L45oODIGht954AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
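A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the pipeline's actual code: the allowed values per dimension are assumed from the examples shown on this page, and the full codebook may define more.

```python
import json

# Allowed values per coding dimension (ASSUMED from the samples shown above;
# the real codebook may include additional labels).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"virtue", "deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments),
    validate each dimension, and index the records by comment ID."""
    records = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} value {row.get(dim)!r}")
        records[row["id"]] = row
    return records

# Look up one coded comment by its ID:
raw = ('[{"id":"ytc_Ugw2J3k6a1iConRhq4t4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"virtue",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_Ugw2J3k6a1iConRhq4t4AaABAg"]["emotion"])  # prints: indifference
```

Validating against a fixed label set catches the common failure mode where the model emits a value outside the codebook, so bad rows fail loudly instead of silently skewing the counts.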