Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples
- "Two things: Just cause you say it's the future and it isn't going away, doesn'…" — `ytc_UgwDVHo-P…`
- "8:27 DO NOT ENCOURAGE USING AI FOR THERAPY, it has been shown time and time agai…" — `ytc_UgyKmWQKG…`
- "Great discovery is that you can once again force AIs to say the things they were…" — `ytc_UgzhHjhvv…`
- "A haunting reflection on two decades of work, / On a coming AI, and a future gone …" — `ytc_UgxMh1uKx…`
- "Yeah it's real, the 2nd point, I myself asked one day out of curiosity, even the AI itself…" (translated from Hinglish) — `ytc_Ugw6CmICP…`
- "It is foolish to trust AI companies and any government that supports them.…" — `ytc_UgyBRiTTb…`
- "It will just stop using power brakes... The car behind will stop too... And self…" — `ytc_UgzjIPWV9…`
- "Women: Imagine a world without men! we don't need men! / Man: Makes a robot girlfr…" — `ytc_UgxdJ9_0V…`
Comment
> If Sasha used AI properly, it would tell her that CO2 does not have the impact that she is concerned about. Basic Science + Facts.

Source: youtube | AI Responsibility | 2025-08-25T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyfN0Ed2ixcNrYGQ1d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzb-jUFkJtv6aGYhFB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy0CLUNXNYo1Vtpe6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxunGyKpIWwOG0kmSt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-q7SCGKoTFzmSq3B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw4LhLfwhvXiguj0_14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzHK4L4L0XmJpA13IB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyJiGNaihSJSgYLLtN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKc6Sfo9mUzZi5Rjh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx8f7M9UMBNDEnsICx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
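The lookup-by-comment-ID workflow can be sketched in a few lines: parse the raw JSON array the model returns and index the records by their `id` field. This is a minimal sketch, not the dashboard's actual implementation; the function name `index_by_comment_id` is illustrative, and the two records are copied from the response above.

```python
import json

# Two records copied from the raw LLM response shown above; field names
# follow the coding schema: responsibility, reasoning, policy, emotion.
raw_response = '''[
  {"id":"ytc_UgyfN0Ed2ixcNrYGQ1d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzb-jUFkJtv6aGYhFB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgyfN0Ed2ixcNrYGQ1d4AaABAg"]["responsibility"])  # developer
```

With the records keyed by ID, inspecting any coded comment is a single dictionary lookup rather than a scan of the full response.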