Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples (truncated previews with their comment IDs):

- `ytc_UgwQWYqQN…`: Here she is again, failing the nuance and going straight for the fearmongering. …
- `ytr_UgytblxRd…`: @akadreku7327 basically the cops put you into an ai that decides weather or not …
- `ytc_Ugyz-ig3f…`: I WISH AI will replace the government corrupt employees and their positions and…
- `ytc_UgxNLSG-6…`: Those saying "Why commision an artist when I can have ai art for free lol" don't…
- `ytc_Ugx3A73Nf…`: I've wondered about this for the longest time. A.I. Is trained off of datasets t…
- `ytc_UgxOUvyq0…`: AI wants to be human? It is said: "To err Is human". I want somebody to as an …
- `ytr_UgyMSzErl…`: You not trusting driverless trucks is equivalent to you saying you don't make ed…
- `ytc_UgxI6BMFw…`: "I just tested ChatGPT. I couldn't believe the level of censorship and the lies …
Comment
Here's the thing, wasn't there a guy that tried to get lawyers to say something similar in court? It was Joshua Browder, and he got denied. Still, if there are people paying millions to try out their AI derivatives, it makes you wonder the slightest bit of doubt whether someone did just that in this case and they could be using ChatGPT as something to blame instead of the AI that would directly link back to whoever payed them.
youtube · AI Responsibility · 2023-06-11T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
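Each coded dimension takes a value from a closed set, so codings can be validated before they are stored. A minimal sketch, assuming a codebook built only from the values that appear in this sample (the actual codebook may define additional categories):

```python
# Allowed values per dimension, inferred from this sample; the real
# codebook may include more categories than are shown here.
CODEBOOK = {
    "responsibility": {"user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval",
                "indifference", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return the dimensions whose value is not in the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
sample = {"responsibility": "unclear", "reasoning": "contractualist",
          "policy": "unclear", "emotion": "fear"}
print(validate(sample))  # [] -> every dimension is a known value
```

An out-of-codebook value (or a missing key) shows up as the offending dimension name, which makes batch QA of LLM output straightforward.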
Raw LLM Response
```json
[
{"id":"ytc_UgzeEUKwDyquHlvUZhx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwctq5uaYyF9u6MHSZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz3yWZCinJxBLfs5nV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxDMDoq1A7JTi0plSJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyPBzUratyEMBzSMUl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy8pM0jFPMnPXg-wyZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYTKFn4EdPNhbscD54AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzkBQpdfnJXHJb6gBB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyG0fpneX1Q1_fH6D14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxo-T1Oxk2mM1XD05l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
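Because the model returns one JSON array per batch, looking a coding up by comment ID reduces to parsing the response and building an index. A minimal sketch (the two rows below are copied from the raw response above, abridged to keep the example short):

```python
import json

# Abridged raw LLM batch response: a JSON array of per-comment codings.
raw_response = """[
  {"id": "ytc_UgzeEUKwDyquHlvUZhx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwYTKFn4EdPNhbscD54AaABAg", "responsibility": "unclear",
   "reasoning": "contractualist", "policy": "unclear", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgwYTKFn4EdPNhbscD54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # unclear fear
```

The second row is the coding rendered in the table above, so the lookup round-trips the displayed result.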