Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
These doomsday people are funny and exhausting. Whats crazy is that people think AI will want to kill us. The scary part is that AI figures out that WE are the problem. But maybe, just maybe, AI can save humanity from itself. Humans have been killing humans since we've been on this rock and we are worried about AI taking our jobs 😂 shut up!
What if AI did the jobs and took care of humans. Then the humans could just LIVE and enjoy life. What humans are scared of is that AI will take away their CONTROL and POWER over other humans.
Platform: youtube · Topic: AI Governance · Posted: 2025-09-06T04:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwfAa69tQub3WkOca94AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzs5KoByC9MT5e4dI14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAhJdfSuvGQlqHs2l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzGZqSPmXcgvyQBKox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxGuqOeeDj23FtlmbZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzkrFPRMXkzAJcChQF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzBc0g7JSG2gtPzgIV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxsY9DPd5BgxkM1yLF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydAh1SZVJwVYGstBZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzypM-2Bu0BKtuEDRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
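A response like the one above can be validated before it is stored: parse the JSON array, check each dimension against the set of permitted labels, and index the codings by comment ID. The minimal sketch below assumes the label sets are exactly those observed in the responses on this page; the project's actual codebook may include additional labels, and `parse_batch` is a hypothetical helper name, not part of the tool shown here.

```python
import json

# Allowed labels per dimension — inferred from the values observed in the
# raw response above; the real codebook may differ (assumption).
ALLOWED = {
    "responsibility": {"none", "elite", "ai_itself", "user",
                       "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "mixed",
                "outrage", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError on any unexpected label, so malformed model
    output is caught before it reaches the coding database.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(parse_batch(raw)["ytc_example"]["emotion"])  # → fear
```

Rejecting the whole batch on one bad label is a deliberately strict choice; a softer variant could instead flag the offending row for manual re-coding.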