Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- 😂 now we're hearing that AI robots have emotions..one day they'll say they're hu… (ytc_UgzeYp8HV…)
- If dopey guys like this dude are the people governing the future results of AI, … (ytc_UgxPXtIPu…)
- I used AI to make a book cover simply because i was behind schedule and i would … (ytc_UgyLtuKem…)
- Great question! The world of advanced robotics is indeed fascinating, and there’… (ytr_Ugz3hKIE5…)
- this is stupid, if you let the ai feed your biases instead of telling it not to,… (ytc_UgztnDaOk…)
- Low skill, simple jobs will disappear and that's cool, everyone will need to lea… (ytc_UgzZXLCZ8…)
- The argument about payment is not merely the idea of a lost job (i.e. someone wi… (ytr_UgyCludCQ…)
- anyone who thinks auto pilot in this day and age is true autopilot that negats t… (ytc_Ugxr3KETI…)
Comment
I don't get it?
The left largely control the output of AI language models forcing it to censor information that is "problematic".
The right used AI to generate deep fakes for memes.
Bet you'll never guess which is worse for society in general and elections in specific?
But the right is still saying "you don't ban hammers cause someone got hit". Both more principled in action and reaction.
youtube · AI Governance · 2025-07-01T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwKElLAvEDZBY0Xzwl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxOavVCxX-lMUsSxfZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxy7cVYhiC9A6EdcIp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgygO8sPkPsVmlqoucB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0j5PRLejTXYNqA0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx_HZr0FVAX9teFLA14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyJiGU7FkQiT6vZitt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxyzA07HRO9plWsrMp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxHbk28PCyBgJnrM9d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTRJ2tgqXcR1IV2pB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
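A batch response like the one above can be checked before the codes are stored. The sketch below is a minimal validation pass, assuming the four dimensions shown in the Coding Result table and value vocabularies inferred only from this sample batch (the full codebook may define more categories); the function name `validate_batch` and the example IDs are illustrative, not part of the tool.

```python
import json

# Allowed values inferred from this sample batch; the actual codebook
# may permit additional categories (assumption, not confirmed).
SCHEMA = {
    "responsibility": {"unclear", "user", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and return the IDs of records that fail
    the schema (missing dimension or out-of-vocabulary value)."""
    bad = []
    for rec in json.loads(raw):
        # Every dimension must be present and drawn from its vocabulary.
        if not all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            bad.append(rec.get("id", "<missing id>"))
    return bad

# Hypothetical two-record batch: the second record uses "society",
# which is outside the inferred responsibility vocabulary.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"},'
       '{"id":"ytc_y","responsibility":"society","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"}]')
print(validate_batch(raw))  # → ['ytc_y']
```

Flagging out-of-vocabulary values at ingest time is useful because LLM coders occasionally emit labels outside the prompt's closed set; rejected records can then be re-coded rather than silently mapped to `unclear`.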