Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- When you choose to work for evil - you will turn into deceit sooner or later. Th… (ytc_Ugxze1GQB…)
- I was watching one of those 24 hours in Police Custody episodes where they're sa… (ytc_UgwcVTqNW…)
- Seems more like the fella who runs a custom software company tried to make his o… (ytc_UgyZ379BL…)
- No, the ai "artist" is selling it, taking away real jobs. That's why people hate… (ytr_UgxKIzW3q…)
- acc it seems fixed on chatgpt 4 (bing) but it's easily recreated on an older ver… (ytc_UgwUw7SBe…)
- Too slow for construction ig. They want to rapidly build and work on Artificial … (ytr_Ugwjp8jE-…)
- The human brain can change it’s mine. At ideas at a split second AI can’t do th… (ytc_UgxCVEolD…)
- I literally had to write a paragraph prompt about a similar topic earlier today … (ytc_UgyU_YYTi…)
Comment
If I'm Stevens dog looking up, where a home, comfort, and love exists, what would Stevens' motives be for ending me.
This is what i have not yet understood about Super AI. Why would it want to end the human race. What are the incentives.
Is this a reasonable question even if we are unable to see or even understand what's around the corner.
youtube
AI Governance
2025-09-12T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwFk-gqkruVoG89q2h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxfNSolMFBjqGYKl9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgykAn5gveT8s9rrkbF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzcZY-s67WpOq8SOnF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwlNosaOnXt-Q9Y4UN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugxzh_NbuTFmUamxhcR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz7K79X8T2RhB1BZT14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxFhaDWP3kBgqnj76l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzpmVzUfIDMaYshX2d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzi_fOHx3En1fvCo8N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
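The raw response above is a JSON array of per-comment codings, one object per comment ID, with one value for each of the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed, validated, and indexed for the look-up-by-ID feature — assuming the allowed category values are exactly those appearing in the examples here (the real scheme may define more) and using a hypothetical `index_codings` helper:

```python
import json

# Allowed values per dimension, inferred from the examples on this page;
# this is an assumption, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "government",
                       "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and key each coding by comment ID,
    rejecting any record with an out-of-scheme value."""
    out = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        out[row["id"]] = row
    return out

# Usage with a single hypothetical record in the same shape as above:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_example"]["emotion"])  # → fear
```

Indexing by ID also makes it easy to spot comments the model skipped or duplicated in a batch, by comparing the dictionary's keys against the IDs that were sent.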