Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "OKAY SERIOUS IS THIS A REAL ROBOT? NO WAY. SHE LOOKS LIKE A GORGEOUS MODEL. A RE…" (ytc_UgyRr6Fyl…)
- "we have nothing to offer to well design robot that does not rust, does not need …" (ytr_UgzwomBKc…)
- "Wow , it seems undeveloped countries that are founded by agriculture will start …" (ytc_UgyZNWDEQ…)
- "Bernie, We the People must replace the fascist Technocratic Oligarchy with a Tec…" (ytc_UgyO95N1q…)
- "There's this study which has found that if you give AI a task and add that it's …" (ytc_UgzPaDE6d…)
- "Real question, what happens if someone draws something ai generated, without pub…" (ytc_UgzAmBVek…)
- "Why are there so many people blaming the victim? If it wasn’t a self driving car…" (ytc_Ugzv81s39…)
- "Teslas Robotaxi have cameras inside. So it should detect the mess and put it not…" (ytr_UgzKO6KQp…)
Comment
AI in computers still relies on pre-defined code. You get the illusion of AI because as computers hold and process more data, they can react to more situations. As far as what goes on in Sci-Fi with robots turning against humans on their own, I don't see that happening any time soon. I would think that would require a processor "brain" to be able to rewire itself to create new code, kind of how your brain works. As you age, your brain rewires itself with experience. Wires etched into silicon can't do that. To avoid what they are talking about in the video, you just need a fail safe for every decision you program into the machine. You have to do this with just about any software you create.
Source: youtube · Posted: 2015-07-30T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgjHpoi4MMGqgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Uggc-9bes9wUWXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj9aX1JiUSK3XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggnJEnC7z1pzHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugg4you0I9WF0XgCoAEC","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugh984wo3xCWJngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggnR24j2_LMwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggNnprVproRXXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghVP7t4IjdXLHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UggSCIMbCmQoD3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
```
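The raw response is a JSON array with one coded record per comment, keyed by comment ID. A minimal sketch of parsing such a batch and looking up a record by ID, assuming this array shape; the dimension names and allowed values below are inferred from the entries shown on this page, not from a published schema:

```python
import json

# Allowed values per coding dimension, inferred from the entries
# shown above (an assumption, not a published schema).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "company",
                       "government", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"approval", "fear", "outrage", "resignation",
                "indifference", "mixed"},
}

def parse_codes(text: str) -> dict:
    """Parse a batch response and index validated entries by comment ID."""
    by_id = {}
    for entry in json.loads(text):
        cid = entry.get("id")
        if not cid:
            continue  # skip malformed entries that lack an ID
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={entry.get(dim)!r}")
        by_id[cid] = entry
    return by_id

# Example with a single entry copied from the raw response above.
raw = ('[{"id":"ytc_UgjHpoi4MMGqgHgCoAEC","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
codes = parse_codes(raw)
print(codes["ytc_UgjHpoi4MMGqgHgCoAEC"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up a comment by its ID" workflow above an O(1) dictionary access rather than a scan of the array.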