Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgwCnLaks…`: "I'm grateful for folks like Steve and Louis Rossmann, along with their crews and…"
- `ytr_UgynJLZmW…`: "@EVA_IS_NOT_MY_REAL_NAME2763You can make money with ai, He do the art for your …"
- `ytr_UgyQTPL4K…`: "Art is about expressing you're self, everything else takes a backseat to that. …"
- `ytc_UgyQG4pQU…`: "The white collars may find that they had better learn a skilled trade. These wi…"
- `ytc_UgymNI807…`: "I would like to formally state as an avid AI art enjoyer that we do not claim hi…"
- `ytc_UgzRNqXrX…`: "For creating the placeholders or the draft or not having to cite facts just make…"
- `ytr_UgxAHc2d-…`: "@ariandoggoi sympathize with you but what can we even do bro these AI companie…"
- `ytc_UgxedkiXJ…`: "Not sure whether the copywriting error is intended or not. FYI those red lines d…"
Comment
IIRC SIRI already jokes about these things, does that make Apple's SIRI dangerous? No!, Has IBM's Watson become Skynet already? No!.
Most people don't even have a notion of how the personal assistant in their phone works, no wonder this video is going viral as a way to discredit robotics. AI is not as you see it in movies. I don't even think Sofia fully understand what "humans" are at the point of the interview. Perhaps she can learn eventually with help of her creators. But we are not yet at the point were robots can program or develop their own AI by themselves alone. So if anything can kill someone, it will be humans using robots to do so.
youtube
AI Moral Status
2016-03-24T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjW0kMAOvpWxngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggMlZs-8dwoCngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgggMndQdvdfPXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgiitKinJW3cNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjU4AUl8gNG-3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghnKF6FpqHR4ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiqkwnuaM937HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UghCpRkW_EVKfHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiOQL2As6Fhg3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UghJmS-oFW6qWXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```