Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples:

- ytr_UgySqiep8…: "Given how things are it could actually be a bot trying to disrupt the conversati…"
- ytc_UgxKxuyfJ…: "Absolutely mind-blowing! This documentary truly highlights the dynamic landscape…"
- ytc_UgwZIQCIM…: "Why worry about battlefield weapons. AI will take your job which will effect you…"
- rdc_jcjjyej: ">New York is the only major American city to require businesses to post signs…"
- ytc_UgzDMmioV…: "Hmmm so you telling me every single AI that they gets fed data and statistics ev…"
- ytc_UgzVdgmFV…: "I don’t think AI makes “art” because art is ultimately an expression of the self…"
- ytc_Ugy980eEW…: "Autopilot is not full self driving. Autopilot does not stop at lights or stop s…"
- ytc_UgxeJcJGB…: "This is fantastic! As an adult who loves learning and writing , I have found Cha…"
Comment

> I think that the fundamental difference between AI and a normal program is that whereas normal programs are sets of instructions an AI is a set of goals, simulated by a program which is still just a set of instructions. To get a non artificially intelligent robot to do something you have to give it specific instructions, to get an artificially intelligent robot to do something you somehow communicate to it the goal you want achieved and it figures out how to achieve it by itself. So teaching these robot ethics and treating them ethically would be simply practical, I mean you could torture an AI into doing what you want but wouldn't it be so much easier and safer to make it want to serve you?

Platform: youtube
Video: AI Moral Status
Posted: 2017-02-23T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
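Each coding assigns one value per dimension. As a minimal sketch of how a coded record could be checked, the sets below contain only the labels visible in this dump; the full coding scheme may allow more values, so treat them as assumptions:

```python
# Label sets observed in this dump only -- the real coding scheme
# may define additional values for each dimension.
OBSERVED_VALUES = {
    "responsibility": {"none", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "fear", "approval"},
}

def check_coding(row: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if row.get(dim) not in allowed]

coding = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "indifference"}
print(check_coding(coding))  # [] -> consistent with the observed scheme
```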
Raw LLM Response
```json
[
  {"id":"ytc_UgjUKMnhflFwrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiltTSEWD_SEXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggEVfo-0BT3v3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggxUeCR4fvePngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj-C0VSwgP-VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugiu3igcszow23gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg9D6n1e0Y6IngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiLfvLZG9z0PHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugg8zOaOKpgfSXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggejCERUBBXa3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
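The raw response is a JSON array with one object per coded comment, keyed by the comment's ID. A minimal sketch of looking a coding up by ID, using a two-record excerpt of the batch above:

```python
import json

# Excerpt of a raw batch response (same shape as the full array above).
raw = '''[
  {"id": "ytc_UgjUKMnhflFwrHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiltTSEWD_SEXgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse a raw batch response and index its codings by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw)
print(codings["ytc_UgiltTSEWD_SEXgCoAEC"]["policy"])  # ban
```

Indexing once and looking up by ID avoids rescanning the array for each inspected comment.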