Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I agree the fact that driverless cars or self driving cars are good in terms of …" — ytc_UgxFxLtLE…
- "Ai is crafted to agree. Ai isn't for those who know nothing about this and can…" — ytc_Ugyr6kAOF…
- "if we're all training the AI's (and we are) then we need to train them to be pol…" — ytc_UgzMu1tug…
- "People who support ai art don't know the point of art. It's not to make somethin…" — ytc_UgzvPs60B…
- "If a human takes twenty years to learn what an AI can learn in minutes, it's sti…" — ytc_UgxrcOIu_…
- "AI in its current form is not sustainable though. It is running at an economic l…" — ytc_UgxuYn2WK…
- "It all looks the same. It all has the same lifeless, soulless feeling. Even the …" — ytc_UgzWQJSIt…
- "@natzbarney4504Not necessarily. You're placing a human lens over a being that is…" — ytr_UgzNC11xF…
Comment
This video is a direct way to understand how language models are flawed. While ChatGPT assumes there is a moral obligation in a situation, it doesn't recognize its own hypocrisy even when it's put directly in front of it. It not only falls for the exact same bias we fall for, but it doesn't comprehend it even when confronted with evidence.
(English is my second language, I'm sorry if there were errors)
youtube
2025-10-24T22:1…
♥ 13
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxv4aE8Ope9ZHiIyq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxPLErCIEKUGChrmu94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxaK3jwaLxTG0T8LbF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrDS68UTnqnIkNtzZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwmH8PeWZ-iineKNHR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrkXL01ahVDkEXWn94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxv5VO3YTfBnSGNAfx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyAORyHaXxqlF-mWnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJTmksDTaMFoS5mWp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYm8RocxVTilulWCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"amusement"}
]
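Since the raw response is a JSON array of per-comment codings, the "look up by comment ID" operation the page offers reduces to parsing the array and indexing it by `id`. A minimal sketch in Python, using two entries copied from the response above (the helper name `lookup` is an illustration, not part of the pipeline):

```python
import json

# Two entries copied from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgxPLErCIEKUGChrmu94AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxv4aE8Ope9ZHiIyq54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
"""

# Index the array by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment ID."""
    return codings[comment_id]

result = lookup("ytc_UgxPLErCIEKUGChrmu94AaABAg")
print(result["emotion"])  # outrage
```

Indexing once up front, rather than scanning the array per query, keeps lookups fast even when a batch response codes many comments.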