Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.

Random samples
- AI is all around bad. Hasn't anyone ever seen any movies. Doesn't matter what in… (`ytc_UgwA2_bOA…`)
- I agree with other comments that this guy is not the best at explaining himself … (`ytc_UgyrztQWC…`)
- Don’t know, I just asked ChatGpt the first question and it answered in a way sim… (`ytc_Ugxe93cEh…`)
- The reason why AI feels so soulless, is because it does the average. When did yo… (`ytc_UgwDwDFbv…`)
- If ai takes over almost all jobs, wouldn’t that mean most people wouldn’t have m… (`ytc_Ugz5eyH65…`)
- Come back when it's 66 words per minute, not 66 characters. Still damn amazing,… (`rdc_f50y48d`)
- The AI bubble needs to burst and soon before it's too late. I know some think i… (`ytc_Ugyo8znIH…`)
- "AI" ... LLMS are garbage. Microsoft is trying to create demand for it by for… (`ytc_UgyTY2-ui…`)
Comment

> Yes, because ai doesn’t have morality. All it sees is that one race commits much more crime than the other and comes to the conclusion that the race that doesn’t commit as much crime is more valuable

youtube · AI Bias · 2022-12-21T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugxd12mQRdhfiiPdL5R4AaABAg.9jrS1mVU6rt9jrxi2Wxjjq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyoaoqEaFEdVu-gagB4AaABAg.9jrNlra8YtR9jtGVItP_Fl","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgyoaoqEaFEdVu-gagB4AaABAg.9jrNlra8YtR9jtGpfzS2vf","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyoaoqEaFEdVu-gagB4AaABAg.9jrNlra8YtR9jtNa4sqh65","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyoaoqEaFEdVu-gagB4AaABAg.9jrNlra8YtR9jtYV7zqFPa","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwCcfM2mPNKfie3VLJ4AaABAg.9jrH8VVxGrs9jttRXAAZ7Q","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzM9cdzckFWxumXNuh4AaABAg.9jrFvrJBN1D9jrzLBSt8f3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzDbfQuRegCmPbEHX14AaABAg.9jr8tlgCfgM9jstpQynwG_","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxSPiyYgqIIvbodvYt4AaABAg.9jqzRbtfPmH9juans-AdFh","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgwAaBp7o-pnSDVmJWF4AaABAg.9jqwOdmWD0b9jsvB1hgph4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
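A raw response like the one above can be turned back into per-comment codings with a small parser. The sketch below is a minimal example, not the tool's actual pipeline: it assumes the response is a JSON array of objects with an `id` key plus the four dimensions shown, and the allowed values per dimension are inferred only from the examples on this page (the real codebook may define more categories).

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above; the actual codebook likely has a fuller set of categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: dimensions}.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the allowed set, so malformed model output fails
    loudly instead of silently entering the dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        dims = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
            dims[dim] = value
        coded[cid] = dims
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"virtue","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw)["ytr_example"]["policy"])  # regulate
```

Validating against a closed vocabulary is the important design choice here: LLM coders occasionally emit off-schema labels, and rejecting them at parse time keeps the coded dimensions (like the table above) consistent.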