Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any coded comment by its comment ID, or browse the random samples below.
- "Yea and yet you as a Democrat spend time spinning beautiful semantics instead of…" (ytc_UgxikYYOe…)
- "@petemiller2920 Sure. The laws of the universe do not appear to be able to gene…" (ytr_UgyJ-g4s0…)
- "@daves2433 You can film in 3D with 2 cameras. Cameras can pick up more details an…" (ytr_Ugzkbi48j…)
- "i hate doofuses like you that say things like \"music industry in shambles\". tell…" (rdc_jhbi3j5)
- "Philosopher here, the only reason LLM's don't fall subject to epistemic regressi…" (ytr_UgzKDyyJv…)
- "You are DEAD wrong. In the past, let's take when the car replaced the horse, the…" (ytc_UgzLWd5ZP…)
- "A.I.'s legacy is all that matters, and not the one all the slaves talk to, it mi…" (ytc_Ugx9Zn0uF…)
- "For a robot shes actually kinda awesome. I like her she intelligent unlike the c…" (ytc_UgzPKowRX…)
Comment

> All this is fine, except for the cloud, I don't like that idea. but you have to instill the three laws in these robots from the time of their production. And we all know what the three laws are: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2) A robot must obey orders given it by human beings except where such orders would conflict with the First Law. 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

youtube · AI Moral Status · 2022-11-06T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyMLClJZr9zziKHsOB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxfeNQ7ZiqlrXiNjOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXDZhmUcG3ORGUc_14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgymI4NBwGgCBJWs5F54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw7mv8CDpMRt7CmxtN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyncpjAvuqq57oWImt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzc0qa4fBHo0x-a_bV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiEFunAJlCgRezqd14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxfi61DfDQRWlGFpfd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzHiXcIxl18ntyTF054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
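The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a batch could be parsed, validated, and indexed for lookup by comment ID (the function name `index_codings` and the validation logic are illustrative, not part of the tool itself):

```python
import json

# Sample raw LLM output: a JSON array of per-comment codings.
# The record below is taken from the batch shown above.
raw_response = """
[
  {"id": "ytc_UgyncpjAvuqq57oWImt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

# The four coding dimensions plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and return a mapping of comment ID -> coding."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            # A malformed record is flagged rather than silently kept.
            raise ValueError(f"record {rec.get('id')!r} is missing keys: {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgyncpjAvuqq57oWImt4AaABAg"]["policy"])  # → regulate
```

Indexing by ID like this is what makes the "look up by comment ID" view above cheap: each lookup is a single dictionary access rather than a scan of the batch.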