Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Neil is a great genius but he can handle numbers the 35k deaths will remain same…" (ytc_UgxAcV3-j…)
- "IF AI COMPANIES CAN USE OUR INFORMATION…THEN WE SHOULD ALL GET PAID…NOT JUST THI…" (ytc_UgwEAUpDs…)
- "NO the billinairs incharge of ai could whipe out the working class get it right…" (ytc_UgzEsi35b…)
- "Let's be real : humans are way more better becuz AI only copies art style does n…" (ytc_Ugxoz9V-C…)
- "Consciousness is one of the biggest mysteries of humanity yet many people believ…" (ytc_Ugz_FOSV-…)
- "This is unfortunate. It shows a complete lack of real understanding of how the t…" (ytc_Ugy2_1uSy…)
- "The only thing AI can be used for is to translate text from one language to anot…" (ytc_UgzUAHibt…)
- "It's hilarious to see just how humanity is creating it's own cage... day after d…" (ytc_UgwXCVF_V…)
Comment
"Robots don't need this "to survive" thing programmed into them because if they were to be broken or destroyed, all you would do is replace them."
"Any robot deserving of rights would not be a good robot to force to do labor, so the robots doing labor will not have the capacity for suffering so we don't have to worry about it. "
I wouldn't be so sure about that. More often than not, you want your robot to be able to avoid damage and destruction, because you don't always have the luxury of being able to replace or repair it at will. Most obvious example being self-driving cars - typically you want the car to get from A to B reliably and to be able to reuse it afterwards. There difference between "avoiding crashes due to programming" and "avoiding crashes due to fear of feeling pain or failure" is very hazy and possibly non-existent.
| Field | Value |
|---|---|
| Source | youtube |
| Title | AI Moral Status |
| Date | 2017-02-23T16:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PKbc3nzIAa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PKyy545rVu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ughgv7iY07dgTHgCoAEC.8PKQh0lPF8S8PL3H-oWz5B","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgguDvg9CgsPlXgCoAEC.8PKQg1Pf6gK8PKc_pBnsUA","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgjA4P2zvsANW3gCoAEC.8PKQ9Hag29m8PKZKqNPj6-","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKTwojH3zq","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKU72pei1P","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UggjLJg0B5wF13gCoAEC.8PKQ1GdiHQu8PKUhss0zjQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiF1GtuwNWeLngCoAEC.8PKQ-ggL5F38PKWOWuvUml","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UghhM8kfbs8KNngCoAEC.8PKPW_ZjQHb8PKWuhrMaot","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"indifference"}
]
```
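A batch like the one above is only useful if every record carries a valid value on each of the four coding dimensions. Below is a minimal validation sketch, assuming the allowed values are exactly those seen in the responses on this page (the real codebook may define more); the `ytr_example…` IDs in the usage example are hypothetical, not real comment IDs.

```python
import json

# Allowed values per coding dimension, inferred from the visible responses.
# This is an illustrative subset, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "approval"},
}

def validate_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and check each record against the schema.

    Raises ValueError on any malformed record, so a bad batch is caught
    before it is stored alongside the comments.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %r value %r" % (rec["id"], dim, value))
    return records

# Usage: a two-record batch in the same shape as the raw response above.
batch = validate_coded_batch(
    '[{"id":"ytr_example1","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytr_example2","responsibility":"developer","reasoning":"virtue",'
    '"policy":"regulate","emotion":"approval"}]'
)
print(len(batch))  # 2
```

Validating before storage keeps the "Coding Result" table trustworthy: a hallucinated label (say, `"emotion": "joyful"`) fails loudly instead of silently entering the dataset.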