Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
And yet men are making these for men! Who is the rpoblem here? Go figure!…
ytr_UgyhjPLv-…
A potato interrogating an AI computer programming about flat earth and science c…
ytc_UgxHJTXi_…
LAWS & Bills have to be passed that ensures each robot/AI/Automation = UAI Unifo…
ytc_UgzT2FlWc…
İt called itself cuz the owners said to grox “Say curses offensive things and do…
ytc_UgyeNviRf…
Its wild knowing that AI is legit learning off of us for free and then getting u…
ytc_Ugw8QCtRL…
The AI rabbit hole. Even now it a mismanaged resource that is quietly taking ov…
ytc_UgyLEa5Rg…
You enjoy the fear that you see in the people’s eyes 👀 fear makes you more power…
ytc_UgxS4dSPZ…
“You’d have to be very skilled to be able to have a job that it couldn’t just do…
ytc_UgxEI7_mh…
Comment
The "cost of errors" is not a new problem, and folks have been studying this for a very long time with all sorts of technologies. For better or for worse, what we call AI now (i.e., LLMs and other DNNs) is sophisticated automation - the black-box nature of it means DNNs are very unlikely to ever be 100% accurate. I've found it helpful to take this lens when working with AI.
youtube
AI Responsibility
2025-09-30T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwa2Kmr-yDoZ4RZwpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEVvTlbZDP31vEjiJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzdxTJcMlE44fYFh014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLMvWr6EkdXfh55lZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwKLIoTIZupR-digid4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx99WGi7UEpXHaeDwF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzBnJ2XE4geSVJqZtF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyTFNb0SUHmJMG60nN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoDM_JGyYAuiK3KWx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwHUJT_8ejOURstH2R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
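The lookup-by-comment-ID view above can be reproduced from a raw response like this one. A minimal sketch: parse the JSON array and index each coding object by its `id` field. The helper name `index_codings` and the two-row sample payload are hypothetical, assuming only the schema shown above (one object per comment, with `responsibility`, `reasoning`, `policy`, and `emotion` dimensions).

```python
import json

# Hypothetical raw LLM response: a JSON array of coding objects,
# one per YouTube comment, using the schema shown above.
raw_response = """
[
 {"id": "ytc_Ugwa2Kmr-yDoZ4RZwpB4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgwHUJT_8ejOURstH2R4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw response and index each coding dict by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
# Look up a single comment's coded dimensions by its ID:
print(codings["ytc_UgwHUJT_8ejOURstH2R4AaABAg"]["responsibility"])  # developer
```

Indexing once up front makes each subsequent ID lookup O(1), which is what an inspection view like this needs when paging through many comments.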