Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "when the news runs the same headlines word for word.. its programming. AI can ne…" (ytc_UgzzTqOQs…)
- "to be even nerdier, we have been using AI forever, every monster in ever game ev…" (ytr_UgzmRVErH…)
- "The trouble as far as I understand it, is if anyone wants to programme AI to be …" (ytc_Ugxupc​Z14…)
- "Perhaps ethical AI is not desirable because it ends up increasingly in autonomou…" (ytc_UgxSI-cVT…)
- "Or code AI to actually help others. Remember AI is codes a human put into it… th…" (ytr_Ugyxb5dCd…)
- "if people want to entitle to the benefits of ai shouldnt they also absorb the ri…" (ytc_Ugw7RqZ8T…)
- "How do even get those responses from grok or any AI? I never get that...…" (ytc_Ugyp7JE_X…)
- "So he wants youtube regulated and censored, but others are ok because they align…" (ytc_Ugw0s9FMT…)
Comment

> Stupid humans just want to show off and to make money by making AI robots but they don’t care about the future generations on this earth. They want to base everything on science, but they don’t even look at the Bible when God already predicted the future. Most humans are like dummies compared to future robots. The dummy humans create them for them to dominate and destroy humanity. Some dumb guy said “all you have to do is turn off the switch “ but they’re going to be so smart , so much smarter than the smartest humans on this planet, because they have all the information saved and programmed into their computer (AI) , they’re not going to allow you to turn off the which, what are you gonna do about it????

- Platform: youtube
- Topic: AI Governance
- Posted: 2023-08-22T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw0QDSJlRQIerImH8d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxusdu9kf0-l1oqNV94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxcIa9NlAucx5Js-HZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGfn_d2GVUunCOlZJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyahnMx8ymoVRZpDfR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyPFwM-mrgBjO7SYap4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxh2jfme18ztVbDRb94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQbGfIlv2poUQ-3Kh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyOYEOjr31ISR8kLOd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyPAZjxiO7gBrTvYD94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
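The raw response above is a plain JSON array, one object per coded comment, so the "look up by comment ID" view can be reproduced by indexing the parsed array by `id`. Below is a minimal sketch of that idea; the two records are copied from the raw response shown above, and everything else (variable names, the printed fields) is illustrative rather than part of the actual tool.

```python
import json

# Sketch: parse a batch coding response and index it by comment ID,
# mirroring the "Look up by comment ID" view. The records are a subset
# of the raw LLM response above; the surrounding code is hypothetical.
raw_response = """
[
 {"id":"ytc_UgyahnMx8ymoVRZpDfR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyOYEOjr31ISR8kLOd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
"""

# Build an ID -> record map for O(1) lookup of any coded comment.
coded = {row["id"]: row for row in json.loads(raw_response)}

result = coded["ytc_UgyahnMx8ymoVRZpDfR4AaABAg"]
print(result["policy"], result["emotion"])  # regulate outrage
```

With the full array loaded the same map serves both the sample picker (iterate over values) and the single-comment card (one key lookup).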