Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I blame the idiots at openAi and stabilityAI for releasing this thing on the web…" (ytc_UgzDSLkKh…)
- "AI already have rights. I know plenty of people who take care of their PC better…" (ytc_Ugg2Aon9j…)
- "I very much dislike Ben Goertzel's lack of respect for these AI. He gets close b…" (ytc_Ugx_e5AZJ…)
- "SO... If humankind workforce will be sub by AI... no jobs ...simple jobs..comple…" (ytc_UgyB_Ftby…)
- "18:00 sooo, youre telling me AI basically predicts information based off of what…" (ytc_UgzoqF7cc…)
- "I'm all for support of human artists over ai, but to say "never" is a bold claim…" (ytc_Ugw-Nd5Lf…)
- "how do they make so that you need C02 OF ALL THINGS for a ROBOT???…" (ytc_UgxlCIEgh…)
- "That card will be easier to be replicated by ai, than something simpler but toug…" (ytc_UgzKMHSLT…)
Comment
> Yeah let’s make AI and remove all the guardrails and constraints. It’s large language model. Remove all the parameters guardrails and just let it say whatever it wants 😅😅😅 why don’t you let it learn stupid from bad data 😅😅.

Source: youtube · Posted: 2026-03-27T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzj8jyH3sf3dJesF8Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxm_dp9q7kwV4Or0_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyDZxlv1YHWttUK1lt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCYfU2YYXjQri-0El4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwtiX9ol1tVKunw78l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy-c025rRBaUnU7BoF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzPC0HVkAKE_gLQpXZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwmxUM1cqURqpyOLvp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugynxd_8BA1FbR_DGnR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy0L9gEvdwLduee_Pp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
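A raw response like the one above can be parsed and screened in a few lines. The sketch below is a minimal, hypothetical validator: the allowed value sets for each dimension are only inferred from the labels visible in the records above and may not match the full coding scheme.

```python
import json

# Dimension vocabularies inferred from the visible records (assumption:
# the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    keep only records whose values fall inside the allowed vocabularies."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example with one well-formed record and one with an out-of-vocabulary label.
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "developer",
     "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
    {"id": "ytc_example2", "responsibility": "robots",  # not in ALLOWED
     "reasoning": "mixed", "policy": "none", "emotion": "fear"},
])
kept = parse_raw_response(raw)
print([rec["id"] for rec in kept])  # only the valid record survives
```

Indexing the kept records by their `id` field then gives the "look up by comment ID" behavior the page describes.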