Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugyh44b6M…` — "i think the self driving feature should be used as a safety feature wear a human…"
- `ytc_Ugy5-wtyN…` — "I do take AI seriously that's why I don't use chat GBT or stuff like it.…"
- `ytc_UgxiAeiYC…` — "I would be more than fine without the need to work because I can focus on my hob…"
- `ytr_UgwrCvKtq…` — "Great question! It would be interesting to hear Sophia's thoughts on that. If yo…"
- `ytc_Ugx5wxvEq…` — "As soon as I realized that I was 'talking' to AI more than I was talking to real…"
- `ytc_Ugx0I6r0k…` — "This is funny... the judge caught this guy using chatGPT... but the funny thing …"
- `ytr_Ugzo5aesP…` — "I agree those roles seem safe now. The challenge is we are in a system where a f…"
- `ytc_Ugyo7oTPa…` — "All we need to do is get a few vehicles in front of it. To slow down to a stop w…"
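A lookup like the one described above can be sketched as a dict index over coded records. This is a minimal illustration, not the tool's implementation; the field names mirror the raw LLM response shown further down, and the prefix-matching behavior (for truncated IDs like those in the sample list) is an assumption.

```python
# Hypothetical sketch of an ID-based lookup over coded comments.
# Field names ("id", "responsibility", ...) follow the raw LLM response
# in this document; the lookup semantics are illustrative only.
coded = [
    {"id": "ytc_Ugy34YRYeQHpzz1z4st4AaABAg",
     "responsibility": "none", "reasoning": "consequentialist",
     "policy": "none", "emotion": "approval"},
]

# Index once for exact-ID lookups.
by_id = {row["id"]: row for row in coded}

def lookup(comment_id: str):
    """Return the coded record for an exact ID, or a unique prefix match."""
    if comment_id in by_id:
        return by_id[comment_id]
    # Truncated UI IDs end in an ellipsis; strip it before prefix matching.
    prefix = comment_id.rstrip("…")
    matches = [r for r in coded if r["id"].startswith(prefix)]
    return matches[0] if len(matches) == 1 else None

print(lookup("ytc_Ugy34YRYeQ")["emotion"])  # → approval
```

Prefix matching only resolves when exactly one record matches, so a truncated ID copied from the sample list above is still unambiguous.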
Comment
It always amazes me when the smartest people in society forget that a movement in this direction could make money moot. There would be no need to make money, pay taxes, etc. if AI can provide everything for humanity, they are tireless, self-resurrecting and intelligent. The whole ideology and "purpose" of society could stop being that of "survival", and movie towards "living".
Source: youtube · Topic: AI Moral Status · Posted: 2026-04-09T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugws3COwC8d4FufEc3p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy34YRYeQHpzz1z4st4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMjVlKmhiOmKPAfEJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyUKgSrcB0-U54ubT54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_r_WtMzRQdj1ADap4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwKDrtH7jWDImYhAXB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyI1QzhU9jg5rveMHV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy9kHMazm7r56flNm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgziIvpeKp5x_pxTWTt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxtDFDvZiN_D4zZXMp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
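Turning a raw response like the one above into per-dimension codings requires a parse-and-validate step. The sketch below is an assumption about how that step might look; the allowed values are inferred only from the outputs visible in this document, and the real codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from the coded outputs above.
# This is a guess at the codebook, not its actual definition.
ALLOWED = {
    "responsibility": {"none", "unclear", "company", "developer",
                       "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows whose values are in-schema."""
    rows = json.loads(raw)
    return [row for row in rows
            if all(row.get(dim) in vals for dim, vals in ALLOWED.items())]

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(len(parse_codings(raw)))  # → 1
```

Dropping out-of-schema rows (rather than raising) is one possible design choice; it keeps a single malformed coding from discarding the whole batch.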