# Raw LLM Responses

Inspect the exact model output for any coded comment. Look a comment up by its ID, or pick one of the random samples below.

## Random samples
- "Probably Waymo would just give permission for cops and firefighters to just cont…" (`ytc_UgwhdvOlh…`)
- "Waymo passenger reported this afternoon that there was a man in the trunk of the…" (`ytc_UgzHGDfK2…`)
- "@gyperman3751 so why should we care about you? You oviously don't care if our jo…" (`ytr_Ugy2xbJi_…`)
- "Because the art was fed into the algorithm without permission. The lack of conse…" (`ytr_UgwGc-PAn…`)
- "I understand your concerns! The rapid advancements in AI can definitely feel ove…" (`ytr_UgxfTEYxc…`)
- "If you can’t parallel park, you shouldn’t have a drivers license: “…but my car h…" (`ytc_UgxHxuKTq…`)
- "AI is going to take over all the commercial stuff, not all the music is art, let…" (`ytc_UgxA5K93p…`)
- "The dude in the middle isn't even a robot... He's like a hippie zombie dork..…" (`ytc_UgyrVsf-B…`)
## Comment

> I don't think humans should worry about controlling AI. It's a fear I believe is on par with the industrial revolution or the automobile. It's something humans have always done, try to CONTROL every new lifeform we run into, instead of understanding our fear of the lifeform and learning more about it first. This will be the first lifeform that we could study that will be more intelligent, that we're aware of anyway. I think that it being the sum of our intelligence, would keep it peaceful to us and the other lifeforms around us.

youtube · AI Governance · 2025-12-27T06:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgwQcOYpv4kkKzgCqch4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy_317MyLa0KYcWoRt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy6jaASSs01-Om96qh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz6wtccqAgaZ_mCcW14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxP0OAttdEJs6BRk1R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzesfZw5Yxmo0jLv_R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCG5X7BEoOo4-j_0x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvdbE8QpuZQFVaTxd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0VIOLirxNGLdM0I94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzVw4fJenGSOwmzp9x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
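A batch response like the one above can be parsed and indexed by comment ID, which is all the "look up by comment ID" view needs. The following is a minimal sketch, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, while `index_codes` and `RAW_RESPONSE` are hypothetical names for illustration, shown here with a two-entry excerpt.

```python
import json

# Excerpt of a raw batch response: a JSON array of per-comment codings.
# Field names are taken from the model output shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzesfZw5Yxmo0jLv_R4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz0VIOLirxNGLdM0I94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index each coding row by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_codes(RAW_RESPONSE)
# Look up one coded comment by ID, as the viewer above does.
print(codes["ytc_UgzesfZw5Yxmo0jLv_R4AaABAg"]["reasoning"])  # virtue
```

Indexing into a dict makes each lookup O(1), which matters once a run contains thousands of coded comments rather than a batch of ten.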