# Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by selecting one of the random samples below.
- "@mac_mcleod No? The automation was an artificial human (doing a human task) that…" (`ytr_UgzRM6Iin…`)
- "Was this embarrassing as all get-out? Absolutely. Could it prove that AI lawyer…" (`ytc_UgyS-jfdV…`)
- "Too bad the government is in the pocket of corporations, and will do nothing abo…" (`ytc_UgzuDKckw…`)
- "What AI are they using?? Gpt still kinda sucks on even just medium sized coding …" (`ytc_UgyNzNFwE…`)
- "Especially when they start asking for robot rights equal to human rights in the …" (`ytr_Ugx6KnyGw…`)
- "You reprogram them so they don't make those mistakes. Robots do What they are pr…" (`ytc_UgxVTDG_A…`)
- "They are all acting in a big worlwide stage.They will take us our lands,our food…" (`ytc_UgzMZpy-Y…`)
- "@fernandomaron87 yep.. I see that as a win. We have kids and move on, or we h…" (`ytr_Ugyk1sknY…`)
## Comment

> I also know why I can’t make a decision because it’s in its programming. There’s something in the programming that does not allow AI to search everything pros cons things that came out. Oh, this is a bad person and then things that came out afterwards that proved you know what not a bad person still won’t make a definitive decision. It’s in its programming, which is human.

youtube · AI Responsibility · 2026-03-03T03:3… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
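The dimension/value pairs above correspond to one record in the raw JSON array shown on this page. A minimal sketch of validating such a record before it is rendered, assuming a codebook inferred from the category values visible in these samples (the `ALLOWED` sets and the helper name are illustrative and may be incomplete, not the tool's actual code):

```python
# Hypothetical codebook, inferred from the values seen in this page's samples.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"resignation", "fear", "outrage", "approval",
                "indifference", "mixed"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The record that produced the Coding Result table above.
record = {"id": "ytc_UgwaH5y5I8JVz2_Uish4AaABAg",
          "responsibility": "developer", "reasoning": "deontological",
          "policy": "liability", "emotion": "resignation"}
print(validate_record(record))  # → []
```

A check like this catches the common failure mode of LLM coders drifting outside the codebook (e.g. inventing a new emotion label) before the record reaches the dashboard.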
## Raw LLM Response
```json
[
  {"id":"ytc_Ugx_w3gupnaqWxLCw-54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyMTdEYhGDBa1U-i2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzEyDLZwR3e8NGg02B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgycR16IkMQHLlVk2hl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzNHo5KnZeRQRmEvBx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyVyM2sxpbwudFfblh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugykc3RO2ljbzK2eCuZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzzyz0fFqHEZ-g5Trd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxuk6MUPDWc1dWHonN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwaH5y5I8JVz2_Uish4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"}
]
```
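The lookup-by-ID behavior described at the top of this page can be sketched as follows: parse the model's JSON array and index the records by `id`. This is a minimal illustration using two records from the response above, not the dashboard's actual implementation:

```python
import json

# Two records excerpted from the raw LLM response above.
raw = """[
  {"id":"ytc_UgyVyM2sxpbwudFfblh4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwaH5y5I8JVz2_Uish4AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"liability","emotion":"resignation"}
]"""

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # index once, then O(1) lookups

rec = by_id["ytc_UgwaH5y5I8JVz2_Uish4AaABAg"]
print(rec["emotion"])  # → resignation
```

Keying on the comment ID is what lets the coded dimensions be joined back to the original comment text and metadata, as in the Coding Result table above.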