Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- What is going to happen is they actually think they will always be able to be in… (ytc_Ugwc3d23j…)
- Investors love confident douchebags in t-shirts babbling about AI this, AI that.… (ytc_UgysHI1L0…)
- If he copied the ai then yes he should get banned but if the ai art was done aft… (ytc_Ugx3FhEG3…)
- I think we'll create AI that will manage AI, AI consciousness testers, AI ethica… (ytc_UgypRRiNL…)
- AI doesn't have "common horse sense"! It's ability will be limited when difficu… (ytc_UgxIUL0kf…)
- Ai Needs to be banned. It's absolutly dangerous and causing a LOT of damaged to … (ytc_Ugy0pVmXE…)
- Robots don’t get tired robots don’t get attitudes robots don’t have marital prob… (ytc_UgzIlfnGn…)
- So it depends on how you want to look at this. Cause the animation that is "done… (ytc_Ugz8BKXSM…)
Comment (youtube · AI Moral Status · 2025-06-05T10:4…)

> This is all B.S AI has not reach this level. This is all advertising to get money. Current AI are comparative systems and they are not sentient. Just look at all the AI system on how they produce things. Why need huge databases for AI if they are sentient. Needed if you are a comparative system(variation of a search engine and extrapolation rules). This is the Y2000 of our time.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id": "ytc_Ugww-P3BN8A4bNchrGt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyOoD4xTnRdoEdB_G94AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy8GxDoc9OFH6Mc8e94AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyTVs9amzXIPDD5t794AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyJmj3oeR_onadNnSB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxcE2XHUo3NQm2bXlh4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzU9TvD-_Dymrva6rx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxWUcCIKMMoM7Z-aep4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz8NnU_UvIqofKRYZt4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwKJVyjM1sRlZS4Nfl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
```
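The "look up by comment ID" step above can be sketched in a few lines: parse the model's JSON array and index the coding dimensions by comment ID. This is a minimal sketch, not the tool's actual implementation; the `index_by_id` helper name is hypothetical, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the two sample IDs are taken from the response shown above.

```python
import json

# A trimmed copy of the raw LLM response: a JSON array in which each
# object carries the comment ID plus the four coding dimensions.
raw_response = """
[
  {"id": "ytc_Ugww-P3BN8A4bNchrGt4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxcE2XHUo3NQm2bXlh4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and index the codes by comment ID.

    Hypothetical helper: returns {comment_id: {dimension: value, ...}},
    dropping the redundant "id" key from each row.
    """
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgxcE2XHUo3NQm2bXlh4AaABAg"]["policy"])  # -> ban
```

Keeping the response as a list of objects (rather than a dict keyed by ID) lets the model emit rows in any order; the indexing step then makes per-comment inspection an O(1) lookup.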