# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples
- He is misinformation AI ROBOT I GIVE LESS THAN 4years and they will be far more … (`ytc_Ugwg4a9Bz…`)
- Its called prompt engineering actually, which is a little bit of a skill but not… (`ytc_Ugy3KW8V-…`)
- Our school has given three options to us 1) information technology 2) artificia… (`ytc_UgxyFBf8J…`)
- Interestingly, as a species we've had the technical ability to clothe, feed and … (`ytc_UgxpUkME3…`)
- Ai "art" is ehh / I have done all 3 of them / Traditional, Digital, and Ai / I like… (`ytc_Ugz0GJXrm…`)
- Also, fun fact, it can't ever get better at what it does. Because the art it can… (`ytc_UgwcDJhck…`)
- This discussion feels like the difference between Frankenstein's monster and giv… (`ytc_Ugz5e7EHo…`)
- As someone who is a fan of ethically sourced AI for making accessable skills I d… (`ytc_UgwqVK2Z4…`)
## Comment

> @zelle8651 if you want to give up, you can. If you want to fight, it will delay the inevitable. Whether AI takes over in 2 years, 10 years, or 100 years is still up in the air.

Source: youtube · AI Moral Status · 2025-06-27T08:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
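A coding result like the table above can be sanity-checked against the dimension values that actually appear in this page's raw response. A minimal sketch, assuming the value sets below (they are only the values observed here; the real codebook may define more categories, and `validate_coding` is a hypothetical helper, not part of the tool):

```python
# Dimension values observed in the raw response on this page.
# Assumption: the real codebook may include additional categories.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"resignation", "fear", "outrage", "mixed", "indifference", "approval"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding passes."""
    problems = []
    for dimension, allowed in OBSERVED_VALUES.items():
        value = coding.get(dimension)
        if value is None:
            problems.append(f"missing dimension: {dimension}")
        elif value not in allowed:
            problems.append(f"unexpected {dimension} value: {value!r}")
    return problems
```

For the coding shown above, `validate_coding({"responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"})` returns an empty list.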
## Raw LLM Response

```json
[
{"id":"ytr_UgxetQysDQj9_Q2xckF4AaABAg.AIxPYFx4QiYAJp3ryBuMn_","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw4nzYNIQEQ5juTJBx4AaABAg.AIxOmVudEl4AIxaYDix11J","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugx3iYiKFHOGB5EjQ_J4AaABAg.AIxN5TEdFlJAIxcC2e9iIM","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzSTGvcRHkkbjtzo6d4AaABAg.AIxMvz7yH0oAIxNOcl55dn","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugxq7-j9ziOIIivmij14AaABAg.AIxMW61t5GJAJZWcxYU07I","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgzWujMYveyHzHSWLbZ4AaABAg.AIxLTWqGG5hAJrulJeb95H","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugz_jQzq2Hsc4FyqE3l4AaABAg.AIxLPr4JaLIAIxOM1AWydp","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugz_jQzq2Hsc4FyqE3l4AaABAg.AIxLPr4JaLIAIxRPKpM2MQ","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxEIlC-DNKKPe6pwd94AaABAg.AIxLKKl28qPAIyzJmsOx_h","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugz5Cmz2kdC5ml_KXS14AaABAg.AIxL8X-jkPdAIxM_wAy4Wn","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```