Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgyymEPoH…: "Governments are the kings in slight of hand, using distractions to accomplish go…"
- ytc_Ugwi4EqRl…: "These AI programs injecting themselves into “art” isn’t even the only case in wh…"
- ytc_UgwhyYroJ…: "The possibilities described here are not what happens when students use AI to co…"
- ytr_Ugy-lFVSd…: "Exactly, this is the paradox, I get its going to replace jobs, but the companies…"
- ytc_UgwUVJmo4…: "Nope, nope nope. Even more than before, I don't want a driverless car. "Sorry,…"
- ytr_UgwTYJeGk…: "@RedOneM Our brains are not machines, biological or otherwise. If you genuinely …"
- ytc_Ugy2VaRrr…: "So wait… the leader of AI safety is just a whistleblower saying we can’t make it…"
- ytc_Ugx_Rokia…: "AI can't operate itself it has to be given order ....but we can't understand how…"
Comment
youtube · AI Moral Status · 2023-05-01T23:0… · ♥ 5

AI doesn't need to change ITS ability to read our emotions, what it is going to do is CHANGE the way we project them. It's already happening in how we interact with automated systems like Siri and Alexa. WE are the one that is going to bend to the whims of the robots and AI systems. Our direct interaction with these systems more and more will change how we interact with each other.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQY1iuRYpn1fTo0eV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwIkjWWaaEThEaUCzp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx5S4HjnzkfXUQaLJB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz54IlsrP8TuMonZ294AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxX8n-g2kwAAybIYep4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwb1MdBsyTStzB69TV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwrdVfHifUxYuEtUGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzn8hc-QmfLr0f0kMx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyi-uBCLnQa3Hckx6d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwHNccXKOlTdxolVXF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
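A raw response like the one above can be parsed and indexed by comment ID to support exactly this kind of lookup. The sketch below is a minimal illustration, not the page's actual backend: the field names come from the JSON shown here, but the per-dimension value sets are only those observed on this page, and the full codebook may allow more values.

```python
import json

# Dimension values observed in the raw responses on this page.
# Assumption: the real codebook may define additional values.
OBSERVED_VALUES = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects)
    and index the codings by comment ID for lookup."""
    by_id = {}
    for record in json.loads(raw_response):
        comment_id = record["id"]
        # Flag any value outside the observed sets instead of crashing,
        # since the model output is not guaranteed to be well-behaved.
        for dim, allowed in OBSERVED_VALUES.items():
            if record.get(dim) not in allowed:
                print(f"{comment_id}: unexpected {dim}={record.get(dim)!r}")
        by_id[comment_id] = {k: v for k, v in record.items() if k != "id"}
    return by_id

# One record from the raw response above, used as a small demo payload.
raw = '''[
  {"id":"ytc_Ugwb1MdBsyTStzB69TV4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''
index = index_codings(raw)
print(index["ytc_Ugwb1MdBsyTStzB69TV4AaABAg"]["emotion"])  # → fear
```

The looked-up record matches the Coding Result table above (responsibility ai_itself, consequentialist reasoning, policy unclear, emotion fear).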