Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgztYiLts…: I don't understand the point of this. Obviously the AI model does not have actua…
- ytc_UgxNV9hRp…: "ughhh can't you just do it?" as he's stuck in a self driving car is wild…
- ytr_UgwgknU-J…: 😂😂😂😂 you lost your clients on fiverr for your shitty fanarts to an ai artist ? 🤣…
- ytc_Ugyi3a1z2…: The only reason i want to use ai art programs is to touch up my own concept arts…
- ytc_Ugz2aT7w4…: Ai struggles when it comes to uncommon problems in languages that aren’t in the …
- ytc_Ugx03q8wp…: Im an artist and i sometimes use ai generators to give me an idea for what i wan…
- ytc_Ugxgjwd0T…: If your self driving car is 2 meters behind a loaded truck, you programmed it wr…
- ytc_Ugyvr3tc8…: There is nothing human like with AI. I cannot believe the developers don't under…
Comment

> Some things are worth doing even if they aren't a net positive for the environment. I mean, compared to factory farming does AI even show up on that graph? I highly doubt it. It's kind of like making plastic straws illegal and plastic bags illegal while then also allowing Dupont to dump 400 million gallons of garbage in the river every year.

youtube · AI Responsibility · 2023-11-09T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzrwXdYpo3Uqoamt2l4AaABAg","responsibility":"society","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEn5L_-R1wxvVw7-V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzTdXDGNGrZzSmfD0B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwCHzHKiYDUzVIA8z14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwY16aXC8SUFT9zzbd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw6owrZ55fPHnd7dtJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugz4YfgCE4I3EUjQHL14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxHGVJvGzMZg_MWbyZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwVJFWAMHvXwfW_CYV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgysfQ7k_ZckAFSixjZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"mixed"}
]
```
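A raw response like the one above can be parsed and indexed by comment ID before any per-comment lookup. The sketch below is a minimal example, assuming the allowed values for each dimension are those seen in the coding table and response above; the actual codebook may define more categories, and the function name `parse_codings` is illustrative, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension (assumed from the values observed
# in the results table and raw response; the real codebook may differ).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "society",
                       "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "fear",
                "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID,
    dropping any record with an out-of-schema value."""
    records = json.loads(raw)
    valid = {}
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid[rec["id"]] = rec
    return valid
```

With the response above, `parse_codings(raw)["ytc_Ugz4YfgCE4I3EUjQHL14AaABAg"]` would return the record shown in the coding table (distributed / consequentialist / none / resignation).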