Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
“Human values?” lol, humans do terrible things to get and maintain all sorts of …
ytc_Ugy9rIy6O…
AI is different than other tech innovations of our past, for sure. But like othe…
ytc_UgzemRghA…
We don’t need John Connor. We need the AI version of Wikipedia, in that it’s a f…
rdc_k9i243g
So we’ve really taught AI to be like us and it turns out we don’t want super hum…
ytr_UgzAFidDe…
Listening to these men who say they are warming us about AI is like listening to…
ytc_UgyK__KdW…
My question is as a aspiring comic book creator who lacks structure, i use chatg…
ytc_UgwDoe84n…
Give the computer the award, it did all the work. All the so-called AI artist di…
ytc_Ugx4INMB7…
our art teacher got caught using ai art (very obvious) for a event poster for ou…
ytc_UgxtAmR7i…
Comment
Ai would need to become a lot more intelligent and become cognitive before it replaces people. It's just a giant collecter of statistics it seems to me at the moment and even then it's answers can be incorrect. My Google smart speaker hasn't even figured out I ask it to tune into the same radio station if it doesn't hear my voice clearly. Go check out AI hallucinations and perhaps be a bit more sceptical. Browsers are trying to force us to read AI answes first. As for the future well who knows but simulation sounds like a con to me. "There is a thin line between madness and genius" and "you can fool some of the people some of the time but not all of the people all of the time" spring to mind. Unplug and go for a walk.
youtube
AI Governance
2025-09-04T22:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyqo93NdBxmXjtjIzt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwUZaCD1HiqSAMX7_F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugyqs0oNEQXWLRDmSIF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxGGGrQzSkAEENJ7bp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx2ROtkNd4utYHyMmV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyzGghUHc_WC5Z6ubd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxyINZ2NYG32HjBPz54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxmE_G9WlOVoECyO-F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyN7rGjpYF2mVPZ8-94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzLVn7gHYggDj6DprV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
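The "look up by comment ID" step above can be sketched in a few lines of Python: parse the raw model output as a JSON array and index each coding record by its `id`. This is a minimal sketch, not the tool's actual implementation; the `index_by_id` helper and the `DIMENSIONS` tuple are assumptions based on the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion).

```python
import json

# Sketch input: a raw LLM response in the same shape as the array above
# (two sample records reproduced from the dump).
RAW_RESPONSE = """
[
  {"id": "ytc_Ugyqo93NdBxmXjtjIzt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxyINZ2NYG32HjBPz54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

# Assumed coding schema: the four dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Skip records missing any coding dimension rather than guessing values.
        if all(dim in rec for dim in DIMENSIONS):
            coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_UgxyINZ2NYG32HjBPz54AaABAg"]["responsibility"])  # ai_itself
```

Keying on the comment ID makes the lookup O(1) per inspection, which is what a "look up by comment ID" box in the UI needs; the prefix (`ytc_`, `ytr_`, `rdc_`) presumably encodes the source platform and is kept as part of the opaque key.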