Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
A.I. though have really advanced cannot still process complex human emotions, ev…
ytc_UgzUd3kx_…
I'm calling Airtel customer care and it's an AI that picks up 😢😢😢…
ytc_Ugyt-WtJ-…
How much is the cost and what are the restrictions on attending?
I don't believ…
ytc_Ugys28VWf…
Killing others requires an emotional motive. Computers don't lust for control be…
ytc_Ugz_ybguf…
It sucks that a few people determine the fate of billions. People that understan…
rdc_degn6lf
With the drones being jammed AI has the potential to pull up other assets nearby…
ytc_Ugzm3fZcj…
Ai should only ever be used for things like those live translation glasses that …
ytc_Ugzc28gU9…
What you said has nothing to do with anything here. These programs do not learn …
ytr_UgxW1JP0Y…
Comment
When making robots, I expect the key will be to give them motivations. Robots would have to be given motivations in order to be autonomous, because a logical being wouldn't really do anything unless it had a goal. We would have to program a goal into a robot, and I expect that most robots would be given goals seen by us humans as "virtuous." I expect these goals could be made to be open to change, but in humans that allows for extremists and such.
youtube
2013-07-02T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlymtF7AZIsF8hs8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy5ELZrEi8odmijOFZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyQK9QtiCZymnsajzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgxCf61kfFLddphnoc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgziRsg2tmEBgr54Rhx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzN2SDs4-ukmmWP7K14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgytIx0dEEb_9BWBLxt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_8PwpPqJMkTVv31h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLetMJJYZK8PPsqkt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwpQdgG9z2dYci7a3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"unclear"}
]
```
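The raw response above is a JSON array of coded records, one per comment, each carrying the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and sanity-checked downstream (this helper is an illustration, not part of the tool itself; the two sample records are copied from the response above):

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_UgzN2SDs4-ukmmWP7K14AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzLetMJJYZK8PPsqkt4AaABAg","responsibility":"distributed",
   "reasoning":"virtue","policy":"regulate","emotion":"fear"}
]
"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

records = json.loads(raw_response)
for rec in records:
    # Each record needs a comment ID plus all four coded dimensions.
    # The ID prefixes (ytc_/ytr_/rdc_) appear in the samples above.
    assert rec["id"].startswith(("ytc_", "ytr_", "rdc_"))
    assert all(dim in rec for dim in DIMENSIONS)

# Simple aggregate: tally the coded emotions.
by_emotion = Counter(rec["emotion"] for rec in records)
print(by_emotion)  # prints Counter({'fear': 2})
```

Validating every record before aggregation catches malformed model output early, since an LLM response is not guaranteed to follow the requested schema.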