Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
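The "look up by comment ID" view above can be sketched as a plain dictionary index over parsed records. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the raw output shown on this page; the two records below are copied from it as a minimal example, not the full dataset.

```python
# A minimal sketch of looking up a coded comment by its ID, assuming the
# records have already been parsed from the raw model output below.
records = [
    {"id": "ytc_Ugw0KrwPBWoRnrX5vCV4AaABAg", "responsibility": "ai_itself",
     "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
    {"id": "ytc_UgxNWtyTETXD9ssUYR14AaABAg", "responsibility": "user",
     "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
]

# Index once; after that any coded comment is retrievable by its ID.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgxNWtyTETXD9ssUYR14AaABAg"]
print(rec["responsibility"])  # user
```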
- `ytc_UgylSr_8q…`: Really interesting discussion, but I think there are a couple of important point…
- `ytr_UgwG2dA3f…`: "who's going to validate that the code does what it claims it does and confirm t…
- `ytc_Ugw1P5t9c…`: I dont agree with this demo and conclussion, DAN act/response accordingly, becau…
- `ytc_UgzGfiMUf…`: I also had early jobs like that .. one .. ironically? .. at amazon and similar t…
- `rdc_hj4fm32`: Sick of seeing this guys face..How do I get reddit recognition of this for my ne…
- `ytc_Ugw5G9l0n…`: Ai will never have the intention and passion the humans have to spend years lear…
- `ytc_Ugy3UXIT-…`: ...and she totally looks like a robot. They made them this human-like 30_ years…
- `ytc_Ugyx7tqGt…`: i dont think this "works" ai is made to please people , it doesnt have beliefs t…
Comment
Sooner or later, you’ll never know, if the person you’re with, is real or a robot…
(Imagine if robots went against humanity and you don’t know who to trust because they look human 😃 Giving emotions to robot can get them to feel hatred and anger 💀)
youtube · AI Responsibility · 2024-07-16T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugw0KrwPBWoRnrX5vCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwNQMaLELCV3kEqdfl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBs-OJPp32jSw5yCx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8ZNWgNj6_zGb4owJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5dlBvhgbOtmE8VZB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2HzMhsZ_POe8tYBl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNWtyTETXD9ssUYR14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyxcGe-j_Pguk58yu54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwTABjsYDhhQEi6K4d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyyi6XtMftQ7EOoKaR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]