Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugx8tfAPd…`: Why are we instinctively blaming the AI? It’s following what WE gave it. Humanit…
- `ytc_UgwFRe1XW…`: Interviewer: "Do you want to destroy humans? Please say no..." Sophia: "Okay. …
- `ytc_Ugzbz4I6l…`: All this because of some porn I was just said damn that’s crazy laughed it off a…
- `ytc_UgzOb-mB8…`: It shouldn’t be free and no one should be credited lol it’s a distinct product o…
- `ytc_Ugy7FubuK…`: So he uses ai art in an ethical way and not to cheat people. Artists WANT ai to …
- `ytc_UgzIzbOj6…`: you heard here first, AI is flawed. but musky boy gonna let it take over the wor…
- `ytc_Ugwk4kgiJ…`: Hope someone makes a law where you have to get a license to use advance AI syste…
- `ytc_Ugyn1O5-p…`: I doubt threat hunting jobs will go away. I think learning how to program AI pro…
Comment
Hmm the Eagle 🦅 doth protest too much here. This was mostly about human error unsurprisingly - as lawyers are often computer illiterate. And these are the early days for ChatGPT-like tech. I for one welcome A.I. putting bad lawyers out of work.
youtube · AI Responsibility · 2023-06-11T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzF5EcgYJ9F4oTAF7l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXqjdeunxkSq02cLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz3DAITwKJDcAk-6GV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugwt80nqRCC8O4IIYMx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7int5xfY9YJyPtmp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzA3ubUQBC4fPmDMJ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxju3omli0ZJzfx2u94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZyx6HuAQU2ekNNM14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz80jrcP-H2uDzvDEZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXGWq9drkiPM0BRpx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
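A raw response like the one above is a JSON array with one object per coded comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The sketch below shows one way the lookup-by-comment-ID step could work; the `index_codings` helper is hypothetical, not part of any tool shown here.

```python
import json

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of per-comment
    codings) and index it by comment ID for single-comment lookup."""
    codings = json.loads(raw_response)
    return {coding["id"]: coding for coding in codings}

# A minimal example using the first coding from the response above.
raw = '''[
  {"id": "ytc_UgzF5EcgYJ9F4oTAF7l4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

by_id = index_codings(raw)
print(by_id["ytc_UgzF5EcgYJ9F4oTAF7l4AaABAg"]["emotion"])  # prints "approval"
```

Because the model returns codings in batches, indexing by `id` once and reusing the dict is cheaper than rescanning the array for each inspected comment.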