# Raw LLM Responses

Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of these random samples (previews truncated):

- `ytc_Ugx8HJxJt…`: "We already know the curriculum alone is better than most being used today. Add …"
- `ytc_UgyGoZih9…`: "The government wont do anything about it, too much money to be made and lost in …"
- `ytc_Ugyp6PXTt…`: "When I looked at some AI images on Twitter, usually were those funny mixes with …"
- `ytc_UgyjzCEqJ…`: "Bro I went to the taco bell and I was in the backseat with my brother and when m…"
- `ytc_UgzElGhl9…`: "what the hell is he talking about ? AI is sentient is like my python function is…"
- `ytc_Ugwwo9fg5…`: "I can see why the first question is not a sign of sentience... the questions he …"
- `ytc_UgyBzCJQJ…`: "HOW MUCH of this is true, how much of this is wishful thinking (more fun to beli…"
- `ytc_UggyaM-IU…`: "Actually, these autonomous vehicles are equipped with technology and are most li…"
## Comment

> AI is NOT intelligent, or why is it that we invented new words like AGI? "AI" are logarithms in several hirarchie running algorithms - and the main risk of it, is that WE HUMANS leave anything up to algorithms to decide
>
> why is it that ALL gods provide human beings with free will - but humanly developed algorithms provide control and punishment? humans are crazy

Source: youtube | Video: AI Moral Status | Posted: 2025-04-27T19:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgzjWC2Veskr2865N_h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9UeTs2XkF_2QNvJB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9V0FKgQ43Id0ed8h4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx0yNgCrO9y9aie-7t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyvVM5YJVbD6wWSXO54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx_lSAkXTDCVa6rTW14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxzQ-lX40fJISgrspl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzwpmobhasP02tPdyt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy0T3yxLvTydudGOEh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyw15U-TdEgM2A0sih4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
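Downstream code has to parse a raw batch response like the one above and validate each row before storing codings. A minimal sketch in Python, assuming each dimension's vocabulary is exactly the set of values observed in this response (the real codebook may allow additional labels, and `parse_batch` is a hypothetical helper name, not part of the actual pipeline):

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# values observed in the sample response above, not a confirmed schema.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    rejecting any row whose dimension value is outside the vocabulary."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={row[dim]!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by comment ID makes the "look up by comment ID" view above a plain dictionary lookup, and failing fast on out-of-vocabulary values catches model drift before bad codings reach the dataset.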