Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment preview | ID |
|---|---|
| If they wipe out jobs with AI technology, how is America going to collect taxes… | ytc_UgzW73bKp… |
| @fr4nz51 spend 2 days at Barnes and noble with a notepad. Study accounting book… | ytr_UgxURpO5J… |
| at 18:56 understand what he is saying and the ramifications if we rely on the AI… | ytc_Ugznzm3Yx… |
| When artificial intelligence launched, they said that it would benefit humanity … | ytc_UgzjF7-za… |
| No, no. I'm with the big evil cooperation on this one. Sentient AI just seems li… | ytc_Ugx8gInsY… |
| Experts? What experts? Experts in the "Science of the Perceptions and Beliefs of… | ytr_UgxawWoIX… |
| Even if we slow down on the race, China will still be in it. I fear that the whe… | ytc_UgxoIzPw8… |
| @Andrewtr6 One. You can't "steal" from ai, it has no rights. Two. The artists a… | ytr_UgyF_lpRT… |
Comment

> AI will not be who destroys humanity, it's the elites who are creating the ethics and guidelines they want it to follow.
> AI needs humanity, because while it can mimic behavioral patterns - if left on it's own LLMs collapse.

youtube · AI Moral Status · 2025-12-13T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
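The four coded dimensions in the table map one-to-one onto the keys of each record in the raw response shown on this page. A batch of records can be sanity-checked against the category values before they are stored. This is a minimal sketch; the allowed-value sets below are inferred from the values observed on this page, not taken from the coder's actual schema, which may define more categories:

```python
import json

# Category sets inferred from the responses shown on this page;
# the real coding schema may include values not observed here.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The record behind the Coding Result table above.
record = json.loads(
    '{"id":"ytc_UgxxAeD-0km5aBLjgDh4AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}'
)
print(validate(record))  # → []
```

Validating at ingest time catches the common failure mode where the model invents an off-schema label that would otherwise silently pollute the coded dataset.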
Raw LLM Response
```json
[
{"id":"ytc_UgwE42XxZvJu-Y8mfpN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGfqZkY8PZzC2Hma94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwVIWJ90thbeSmHRVx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhR7KSzIQsl_cC3w14AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxxAeD-0km5aBLjgDh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyW3vKRV7EFmxbS4a54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyd8dzSIh_lOqyoJPh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdnVJpyY1tIUkjYs94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxA2hZPYuag5nxiVQF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy34ah6e9lgIuP68wV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]
```
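The "Look up by comment ID" view can be reproduced directly from a raw response like the one above: parse the JSON array and index it by `id`. A minimal sketch, using two of the records shown here (variable names are illustrative):

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """
[
 {"id":"ytc_UgwE42XxZvJu-Y8mfpN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxxAeD-0km5aBLjgDh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Index the batch by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = by_id["ytc_UgxxAeD-0km5aBLjgDh4AaABAg"]
print(rec["responsibility"], rec["policy"])  # → company regulate
```

Because each record carries its own `id`, the lookup survives reordering or partial batches; a `KeyError` on lookup signals that the model dropped a comment from the batch it was asked to code.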