Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
I have an idea, what if we just don't post any art for anyone to steal, yes it w…
ytc_UgyFP1UiG…
Hey there! It seems like you're feeling a bit concerned after watching the video…
ytr_UgzwZIX_o…
It is clear we have a class war coming with full force if the billionaires think…
ytc_Ugw8Sgys1…
I went to a private university that nit only encouraged, but required using AI. …
ytc_UgzkdCumI…
Wow. I will give a good argument for it. ai art shouldnt be made and say this is…
ytc_UgzhZjFRI…
All depends on HOW one uses the tool. An experienced artist can indeed use their…
ytr_UgyNjS6eI…
All I can say is I’m not scared of ai. Imagine being a horse and finally being r…
ytc_UgxXGzzAN…
If those robots are working as ChatGPT dos today, than there is no reason to be …
ytc_UgwwCpmuU…
Comment
The "AI" doesnt need to become sentient. It doesnt need conciousness to be dangerous to us. It needs power (It already has a lot). It needs to be a system capable of doing harm, then it can do harm eventually. Either by human hand or it's own. It will fulfill it's goal in a way we will not anticipate. It will not "know" it's doing harm.
Damn now i just disobeyed my previous comment :-D
youtube · AI Moral Status · 2025-11-05T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
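A coding result like the one above can be modeled as a small record type. This is a minimal sketch, not the tool's actual data model; the field names mirror the table, and the example value vocabularies in the comments are assumptions inferred from the samples on this page (the real codebook may allow more values):

```python
from dataclasses import dataclass

# Hypothetical record mirroring the four coded dimensions shown in the table.
# The example vocabularies below are inferred from this page's samples only.
@dataclass(frozen=True)
class Coding:
    comment_id: str
    responsibility: str  # e.g. "ai_itself", "company", "user", "none"
    reasoning: str       # e.g. "consequentialist", "mixed", "unclear"
    policy: str          # e.g. "unclear", "none"
    emotion: str         # e.g. "fear", "approval", "indifference", "mixed"

# Hypothetical example matching the dimension values in the table above;
# the comment ID is a placeholder, not the displayed comment's real ID.
coded = Coding(
    comment_id="ytc_EXAMPLE",
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="unclear",
    emotion="fear",
)
print(coded.emotion)  # -> fear
```

Making the record frozen keeps a stored coding immutable once it has been written, which matches the audit-style "Coded at" timestamp.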
Raw LLM Response
```json
[
  {"id":"ytc_UgyQKT6kzZVoc2QxDm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxe50EBD7FHSZ6Nhu14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_QuKjPy71SJGN7Rd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyNM8wkbjbGCqsJHbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyNZa5u6M-vnXb-EMF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz_xDYo6m1eFI7Bj8J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw01Zav9nqo4Y_3DUF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgziA0WUTordL0gwzN94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFYB0MnlH3VGh4myB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxDt6RjKgTHCwaG-Qd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
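A raw response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a hypothetical validator, not the tool's actual pipeline; the allowed-value sets are assumptions inferred from the samples on this page, and the real codebook may define a larger vocabulary:

```python
import json

# Assumed vocabularies, inferred only from the samples shown on this page.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"unclear", "none"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def parse_codings(raw):
    """Parse a raw LLM response (a JSON array of objects) into
    {comment_id: {dimension: value}}, rejecting entries with a missing
    id or an out-of-vocabulary dimension value."""
    out = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            raise ValueError("entry without id: %r" % (entry,))
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError("%s: bad %s=%r" % (cid, dim, entry.get(dim)))
        out[cid] = {dim: entry[dim] for dim in ALLOWED}
    return out

raw = ('[{"id":"ytc_Ugxe50EBD7FHSZ6Nhu14AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugxe50EBD7FHSZ6Nhu14AaABAg"]["emotion"])  # -> fear
```

Failing loudly on an unknown value is deliberate: when the model drifts from the codebook (a common LLM-coding failure), the batch is flagged instead of silently stored.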