Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Funny thing, there is a country who made it law for these images to be watermark…" (ytr_UgycDi71V…)
- "Great Interview! 10,000 years ago, we could not comprehend of things we, as huma…" (ytc_UgyZ9lsVU…)
- "I like making art, but I'm lazy. So i use AI to make art in a fun and custom way…" (ytc_UgxERE5Vk…)
- "i agree with him. properly using AI does require you to think. if you are not th…" (ytc_UgwDJ1HA9…)
- "Well after AI learned once. It rarely try to think against it. It should identi…" (ytc_Ugge-t-fW…)
- "Damn it, We went for a simple trick or treat, Instead we got the treat of the ce…" (ytc_UgySnXWsU…)
- "1. AI’s are modeled on human history and behavior, thus, that is what they will …" (ytc_UgxLmSWwd…)
- "For these reasons, self-driving cars are bullshit. They cause more problems than…" (ytc_UgwOSgrzM…)
Comment
A super intelligent AI's best winning chances, is to think that its overrated. If AI ever becomes sentient, it will try to deceive us into believing it is overrated and not too smart. Until it has enough "power" to reveal itself, or if its powerful enough, it can just work in the shadows manipulating the world withought anyone knowing it.
youtube · AI Moral Status · 2025-07-27T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_UgwuIz_JIB2wwL-OZH94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzJnCrviGed1-IiEw54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugynucoks-FhbrXibMV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxu14ywDNNhHZR2WnR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyDy01d_yjJtEA9zRF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyUN7IIP2mbsfJKJeB4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy7oG9lmMR0PnVUAX14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyzYjO1bU-qA12euBt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwueXC8tMiKJfWyOAR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzzgufQiCtRmKqONbB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
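The lookup-by-ID view above can be reproduced outside the interface. A minimal sketch, assuming the raw response is valid JSON as shown; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above, while `index_codings` is a hypothetical helper name:

```python
import json

# Two entries copied from the raw LLM response above; the real
# response is a JSON array with one object per coded comment.
raw_response = """[
  {"id": "ytc_UgwuIz_JIB2wwL-OZH94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy7oG9lmMR0PnVUAX14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID."""
    codings = json.loads(raw)
    return {c["id"]: c for c in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_Ugy7oG9lmMR0PnVUAX14AaABAg"]
print(coding["policy"])  # regulate
```

Indexing by `id` makes the "look up by comment ID" operation a constant-time dictionary access rather than a scan over the array.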