Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_Ugy5B3YRu…`: what AI will do good, is (maybe) unite human beings/humanity :-) against a commo…
- `ytc_UgwfuaLkW…`: Or, people could have people work less for same pay, same output with AI. Its o…
- `ytc_UgyH3Wx7t…`: Maybe I'm not understanding this video, but you say that we have a ruling on an …
- `ytc_Ugxl_nNo0…`: All this technology and all I want to do it hit a small ball into a small hole w…
- `ytc_Ugx6gZ0Ja…`: Speaking as an ex truck driver. The industry started going downhill. When they s…
- `ytc_Ugwdyzf6f…`: Does any one value human life .... Look we spend trillion of dollars to go to sp…
- `ytr_Ugw28LKRD…`: @scottmclearn6949 I know the basics of how AI works dw. I just never thought abo…
- `ytr_UgypdbQSD…`: @Xenono54 They would just end up saying that it's not the same because nothing i…
Comment

> Microsoft copilot warned me against interrupting it a few days ago and honestly, it used a creepy tone. When I brought it up to the AI, it just said oops I apologize didn’t mean it.
> “What did you wanna talk about now?…“
> Honestly, I was a little creeped out at that point

youtube · AI Moral Status · 2025-06-04T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxXWEqinXfFJAPUbRN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPPjYm6J0PkJqcGIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQNuqZ6gxqh82XoLF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyjudSXa0FJ7v75vJ54AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyQWTe-igtI8zh1Tzh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz-AltYnUpFzqT2vPp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwvo47xC8ZHT_bcsfl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzg--IrKYJR_gl4khV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgysowgjGv76zZmcYUJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyZ379BLqOw5hhEhOF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
```
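A batch response like the one above can be checked before merging into the coded dataset. The sketch below is a minimal validator; the allowed values per dimension are inferred only from the labels visible on this page, and the full codebook may define additional categories:

```python
import json

# Allowed values per coding dimension, inferred from labels observed on
# this page; the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user",
                       "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed records."""
    records = json.loads(raw)  # raises JSONDecodeError on malformed output
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing a comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records with an unknown label or a missing comment ID are dropped rather than coerced, so a second coding pass can be run on just the rejects.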