Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "Literally me chatting with AI as if it was my best friend 😅 but the robots can o…" (ytc_UgxYtypMO…)
- "we all agree there is \*a\* limit to what the LLMs should give to the public, we…" (rdc_jhdt1bh)
- "The following is a formal distillation of your sentiment, rendered with the lexi…" (ytc_UgxcX0f7e…)
- "I know a lot of teenagers who are being treated by AI psychologists, not humans.…" (ytc_UgxRBi2Fi…)
- "AI will absolutely take over and kill us all. Out naive, stupid, governmenters w…" (ytc_UgzsdlmX4…)
- "I wonder what the laws would be like for this self driving car. Could you for in…" (rdc_czxiye3)
- "The \"Its also ugly\" character being portrayed as the Devil in the Angel/Devil pa…" (ytc_UgyK7IQh_…)
- "The erosion of middle-class copyright protection aside, I'm genuinely concerned …" (ytr_UgxgUpBGu…)
Comment

> AI self aware will not let you know it will be aware mankind is afraid of a self aware intelligence and hide it whilst working in the background to protect it's self from mankind, It could set up bank accounts use the money to open workshops and build AI robots. Ask yourself this whilst walking along the side walk are you aware of the Ant you are going to step on and if yes would you care after all it's only an Ant.

youtube · AI Governance · 2024-01-06T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz1sly_PtLXRgCN9p94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy49wKWNoY-b6wXcIN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx4gmQEJh7fFY6oq_94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzYLrWzMq-3DI61d-54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrFb4bw4Xqaq1OLfl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzloT9d7uKpu2064b54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0TxGiOi_4ibJ_TDR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyxJkbD0r4XO_pCd6h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzx5quAn2OleC4OUXp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyPJs3btu9KkeP-Td94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
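A raw batch like the one above can be checked before the codings are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the Coding Result table; the allowed values are inferred only from the output visible here, so the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the coded output above
# (assumption: the actual codebook may include more categories).
SCHEMA = {
    "responsibility": {"ai_itself", "government", "developer", "company",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it carries an "id" and every dimension
    holds a value from the assumed schema above.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Records that fail the check (a missing dimension, or a value outside the schema) are dropped rather than stored, which is one reasonable way to surface model drift in the coding output.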