Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgxcvhfuG…: "You watched too many sci-fi movies, mate. An autonomous vehicle can barely drive…"
- ytc_Ugw959Hpx…: "I can't get ai to make consistent sprite sheets with detailed directions, and th…"
- ytc_UgxSg1gjn…: "Kabuki-talk galore .....................! …"
- ytc_UgyXeLzjy…: "Ai is demonic and I will never use it! Satan will use anything to try and pull …"
- ytc_Ugx_vSrl8…: "So who’s going to maintain the bullsh*t code that your bullsh*t AI has generated…"
- rdc_oi0nhb7: "Why are you guys hiring juniors, should be able to just have your mid and senior…"
- rdc_g54urul: "I’m ignorant about this but why would Amazon do this? like what do they gain out…"
- ytc_UgxUeMpT1…: "Regardless of how good AI is or isn’t, we do get a choice in whether we continue…"
Comment
"The teaching you is cool but I don't think that's going to be a benefit because the robot will be able to learn faster. And do more things. I think the teaching of commands and us giving ideas to the robot to make our life better in regards to business life and whatever you need help with is where it will be at"

youtube · AI Moral Status · 2025-07-25T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyFlK4-_bBUE1MyfNt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwAjV90fdAi98T4vox4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxpyIW55rEqSHMiMaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyCDgwxNWJY6UNKPw14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz_c_tq5xSNba9qXNp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz0ZempszscfWXwy9B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyRIxS0SKDZ8yK3ywd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvKWyi4x0kBhqWNHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwbPxTWJRecIor6hBt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnwsohVQo9zyAz0Jh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
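Parsing a raw batch response like the one above back into per-comment codes can be sketched as follows. This is a minimal illustration, not the tool's actual pipeline: the four dimension keys mirror the JSON above, but the allowed-value sets are inferred only from the values visible on this page and are almost certainly an incomplete subset of the real codebook.

```python
import json

# Dimension keys and value sets observed in the response above.
# ASSUMPTION: these value lists are illustrative, not the full codebook.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: {codes, problems}},
    flagging rows with missing keys or values outside the observed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip malformed rows the model emitted without an id
        codes, problems = {}, []
        for dim, allowed in DIMENSIONS.items():
            value = row.get(dim)
            if value is None:
                problems.append(f"missing {dim}")
            elif value not in allowed:
                problems.append(f"unexpected {dim}={value!r}")
            codes[dim] = value
        coded[cid] = {"codes": codes, "problems": problems}
    return coded
```

Keying the result by comment ID matches how this page looks records up ("Look up by comment ID"), and the `problems` list makes it easy to spot rows where the model drifted from the expected schema.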