Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzkOxX8S…: I build A.I & there are no jobs. We waited for them to exist it never happened.T…
- ytc_UgwMxBL1d…: This whole process will change once they figure out how to automate the binning …
- ytc_UgzY-mlbj…: "I got really good at lying...I started to enjoy it...I started to prefer it" "[…
- ytc_UgwOXYBiV…: This is predictive programming. Everyone who's anyone knows that AI will be the …
- ytc_Ugwmxvkgf…: Well, I see commercial artists like painter workers, not true artists. You sell …
- ytc_UgzBHEFKn…: He must have instructed the LLM to respond like that. The tone and choice of wor…
- ytc_Ugy6pfZ2M…: The problems with self-driving cars is stupid but hear me out, we need more coni…
- ytc_UgytOalwz…: Listen.. It doesn’t matter if an artist has put their heart and soul and 50+ hou…
Comment (quoted verbatim)

> I think people in the west overlook something that I see in Japanese midea more is the idea that if you have an AI that preforms better on ALL intelligence test then humans then it would also be superior to humans morally. The thing we have to worry about is giving power to AIs is doing it with ones that are as intelligent as us.

youtube · AI Moral Status · 2025-10-30T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzbpe_VtRtLrfYT2q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTv1SbQpOov23wFap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8JovREX4z1BNKLzl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgztPYatKQfW7WONQJZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwqt_SKxgEL1MKMCNp4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyOBVO28zhlpiqBidh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx1MAWYIsT_uytvNux4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyh1RvXNPKmD4-d0Id4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw3VKeH7Xhyb7XT6Id4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGyvZRRYorjaWfiJ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
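The raw response is a JSON array with one object per comment, carrying the comment ID and the four coding dimensions (responsibility, reasoning, policy, emotion). The lookup this view performs can be sketched in a few lines; this is a minimal illustration, not the tool's actual implementation, and the helper name `lookup_coding` plus the two-entry sample payload are assumptions drawn from the response shown above:

```python
import json

# Illustrative raw LLM response: a JSON array of coding objects, one per
# comment, using the same field names as the response shown above.
raw_response = """[
  {"id": "ytc_UgztPYatKQfW7WONQJZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzGyvZRRYorjaWfiJ94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding dict for comment_id, or None if it is absent."""
    # Index the batch by comment ID so repeated lookups are O(1).
    by_id = {row["id"]: row for row in json.loads(raw)}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgztPYatKQfW7WONQJZ4AaABAg")
print(coding["reasoning"], coding["emotion"])  # consequentialist fear
```

Indexing by `id` rather than scanning the list each time keeps per-comment inspection cheap when a batch contains many coded comments.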