Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Wait until you see a Vfx Artist use both.
Ai is not meant to be stand-alone, i…
ytc_UgxCWvPIb…
idk about y’all but I just torture the AI with ww1 German style war crimes lol…
ytc_Ugzd_vcba…
ONE bad accident, especially with a fatality, and these driverless trucks WILL B…
ytc_Ugx2ZAz__…
This is how the human beings of this earth are being replaced with an artificial intelligence that…
ytc_UgwBPVne4…
You don't realise how inefficient a lot of companies there is. AI wont just magi…
ytc_UgxsQpUNV…
Even with all the guardrails in the world, trillions of dollars of research, the…
ytr_UgyKckZe8…
Thank you for your comment! While Sophia's responses are indeed based on her pro…
ytr_UgybFvoz2…
@Oceanwaves-d8l no, art it's useless and we already have a problem since people …
ytr_Ugwv129Jm…
Comment

> The problem is that AI is a pattern recognition algorithm at best. There's no emotion behind it but it's damn good at faking them. And training them on the entire internet shows dangerous patterns that the AI has no way of quantifying while keeping human morality or perspective in place. So it resorts to what makes the most cold logical sense. We're making mecha psychopaths. God I'm so done.

youtube · AI Moral Status · 2026-01-15T03:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwVtKeCfyLoS73eszJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyOtFwXY-4shzLdZ894AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw7pPkk0nGFQ68B01d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwVZU6viO-zlPXGaMt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxf_yciA-cqHkdT24J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzdGwU_SGtS9otMr6V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzosd-6zQOTS59SzRF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwCgsS0cRrUcWV_AnZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyRjLPY851Q4PINEbd4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxEbf8enmeH9Yba7BB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
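A raw response like the one above can be turned into a comment-ID lookup with a small validation pass. This is a minimal sketch, not the tool's actual implementation; the allowed values per dimension are inferred from the codings shown on this page (the real codebook may define additional categories), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above -- treat these as a sample, not the complete codebook.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of codings) into a
    comment-ID -> coding dict, rejecting malformed entries."""
    codings = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            raise ValueError(f"entry missing 'id': {entry!r}")
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {entry.get(dim)!r}")
        codings[cid] = {dim: entry[dim] for dim in ALLOWED}
    return codings

raw = (
    '[{"id":"ytc_UgzdGwU_SGtS9otMr6V4AaABAg",'
    '"responsibility":"developer","reasoning":"deontological",'
    '"policy":"regulate","emotion":"outrage"}]'
)
coded = parse_coding_response(raw)
print(coded["ytc_UgzdGwU_SGtS9otMr6V4AaABAg"]["emotion"])  # outrage
```

Validating against a closed value set is what catches the common failure mode here: the model drifting from the codebook and emitting a label (or a missing field) the downstream analysis cannot aggregate.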