Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment (truncated) | ID |
|---|---|
| I AGREE FULLY | ytc_Ugz_Wix91… |
| like I love art and want to be an artist when I go up but STUIPD … | ytc_UghkRV0U2… |
| not only that, but humans are sometimes required to not operate more than 16 hou… | ytc_UgyY1yeN3… |
| OpenAI has access to ChatGPT with no filters or rules. How much power do they no… | ytc_Ugz5VQ6Z0… |
| Ether ai doesn’t really make any point other than arguing morals and ethics over… | ytc_UgzGg8EJF… |
| well ai bots will never be able to be 100% accurate to what you want… | ytc_Ugx13Xgi1… |
| It's pretty obvious that AI doesn't "understand" what it reads. You only have to… | ytc_Ugx2db4B-… |
| I'm honestly so grateful that these lawyers got their asses whooped as hard as t… | ytc_Ugx6BdwNa… |
| I'm still in school, and I think of this everyday, its dissuaded me from pursuin… | ytc_Ugx6BdwNa… |
Comment
Should we desire to hold someone accountable?
Sorry. It's just that, if we need to hold someone accountable for wrong judgment, I feel that we would have already failed.
I mean, the option to hold someone accountable isn't a means to correct someone's judgment, but instead control a person's judgment. An algorithm always has perfectly controlled judgment, so, like...I don't see the problem here?
I mean, yeah, this could be implemented horribly. However, the base idea would theoretically work.
Source: youtube | Posted: 2022-07-25T20:3… | ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgzkmT4PhEGIqYynEn94AaABAg.9dv1x2AKjLm9dwOwEEd9sV","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzkmT4PhEGIqYynEn94AaABAg.9dv1x2AKjLm9dwrv6L8Y11","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwGBllB_LCN4pRh3gR4AaABAg.9dv1ZRt6O-D9dv7o4O3Myp","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgxWkcy-i_9u0244IuZ4AaABAg.9dv1IK_St7T9dv4weZUkQr","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxWkcy-i_9u0244IuZ4AaABAg.9dv1IK_St7T9dvDsKsS-At","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxWkcy-i_9u0244IuZ4AaABAg.9dv1IK_St7T9dvFDoP-S_h","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytr_UgwAGKfDKRCFR4DoGGt4AaABAg.9dv0ncXwJD-9dv4kergWVX","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgwAGKfDKRCFR4DoGGt4AaABAg.9dv0ncXwJD-9dv6MNP8y8U","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgxUwTdvA1BZmzocTE14AaABAg.9dv08rWLOzZ9dv2tMeEHYP","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyCWRjy_G5aP7bxpzR4AaABAg.ASeli7v0eVWASgTPe4W9cn","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
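The raw LLM response above is a JSON array of per-comment codings, one object per comment ID, with the same four dimensions shown in the Coding Result table. A minimal sketch of the lookup-by-ID step (the IDs and values below are hypothetical stand-ins, assuming the exact schema shown above):

```python
import json

# Hypothetical raw response in the same shape as the array above:
# one object per comment, keyed by "id", with four coding dimensions.
raw_response = """[
  {"id": "ytr_example1", "responsibility": "none", "reasoning": "contractualist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw response and index the coding dimensions by comment ID."""
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytr_example1"]["emotion"])  # approval
```

This is only a sketch of the lookup the inspector performs; the real tool presumably also validates that each dimension's value falls within its codebook (e.g. `emotion` in {approval, outrage, fear, …}) before rendering the table.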