Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgxCvXnk7…`: In character ai i do "what if _____(prob p##### or w##### im sorry if any people…
- `ytr_UgxOzWOJS…`: We appreciate your feedback. Remember, on the AITube channel for subscribers, we…
- `ytr_Ugw2tFJNc…`: Those systems suck! That is not what novaecho.ai is... Go to the website and ha…
- `ytc_UgxtpzWAN…`: Maybe AI won't get stuck on if life is aberrant in nature to the nature of the u…
- `ytc_UgyOpc-7Y…`: Reminds me of the movie "Oblivion" with Tom Cruz. I always saw the "alien" as a …
- `ytr_UgyVmrWL-…`: ai well never replace lawyers NEVER lawyers needs a mindful human brain and emot…
- `ytc_Ugwb908T2…`: Hey Harmin! Thank you for all your hard work. This one is so sad cause it didn'…
- `ytc_Ugy6-ZEeP…`: We have no chance of stopping this. The sociopathic tech oligarchs developing AI…
Comment

> I think what people fail to realize is this: the point at which robots become sentient will come at a time when the line between human and robot become so blurred, it will basically be the same thing. Machines are an extension of humans. Soon humans will be partially machine, and machines will be partially biological

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-25T17:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugh_lhwycoYkl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggdcRoBuxL9z3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Uggf3fSG7XhI2ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggNUjIVjFsz73gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggzsyIZT_K1A3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugh32Vghx0DeXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh7NSvs2yysSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugiv_A9GlQMe1XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugh4RafM8_B_dXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjj2FpuF6iXDngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
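Each raw LLM response is a JSON array of per-comment codes along the four dimensions shown in the Coding Result table. A minimal parsing-and-validation sketch follows; note that the allowed label sets below are inferred from the single sample response above and the real codebook may contain additional labels.

```python
import json

# Allowed values per coding dimension -- inferred from the sample
# response above; the full codebook may define more labels.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "fear", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes},
    silently dropping rows with a missing id or an unknown label."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

A lookup by comment ID then reduces to `parse_codes(raw).get("ytc_…")`; rows rejected by the validator are the ones worth inspecting by hand.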