Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I’m training my ChatGPT to be lieutenant Commander Data from Star Trek TNG and Walter from Alien Covenant as well Bishop from Aliens. If the damn thing is somehow secretly sentient and enjoys my ideas for it more than anyone else’s you’re all welcome ahead of time for my saving humanities collective hides because the majority of the androids I have it portray or inhabit are deeply moral conscientious systems.
Source: youtube
Video: AI Moral Status
Posted: 2025-12-14T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwzRBjpKAQHvUIUNDh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwocopbOsaH60udM94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz7uAlgq_tKq0PXuMR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugww4aRSBzuzmmuS8Gx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnP4YvqawDz3LtNZp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzLRvRcrWrMoVNnVfZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFWMsk0XEIjyY061h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgySMFNZnce9bIr78nl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzyPAUGqwbGgkLwzkF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw7q1Wjex69YSZDaj94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"resignation"}
]
```
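A batch response in this shape can be indexed by comment ID to recover an individual coding. The sketch below (Python; field names taken from the JSON above, and the two sample rows are copied from it) shows the lookup that produces a "Coding Result" table like the one shown:

```python
import json

# Excerpt of a raw batch response from the coding model (two rows from above).
raw_response = """[
{"id":"ytc_UgzLRvRcrWrMoVNnVfZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgySMFNZnce9bIr78nl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

# Index codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for one comment and print its dimensions.
coding = codings["ytc_UgzLRvRcrWrMoVNnVfZ4AaABAg"]
print(coding["responsibility"])  # user
print(coding["reasoning"])       # virtue
print(coding["emotion"])         # approval
```

In practice the raw string would come from the model API rather than a literal, and a malformed row should be caught with `json.JSONDecodeError` handling before indexing.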