Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Masashi Kishimoto (Maybe I wrote it wrong, the Naruto creator) said in an early …" (ytc_Ugz0Uw4kw…)
- "I did AI companions back in the early 2010s when they were highly unknown of, bu…" (ytc_Ugx4U-oU0…)
- "Ai dont have cracks ai is smart thats why he dont care about weak emotional…" (ytc_UgznjU6nw…)
- "I'm not afraid of it being conscious in philosophical meaning or whatever. I'm c…" (ytr_Ugyu6z4Pp…)
- "If A.I. was doing my job better then me, I'd be worried about it taking my job t…" (ytc_UgwpmDhPr…)
- "However good the tesla self driving has got (and to be fair it's improving all t…" (ytc_UgxH6Ogyc…)
- "Here is the deal we as humans can choose to deal with other humans instead of Bo…" (ytc_UgxjcZn89…)
- "I don't know about manual labor, but food service jobs are already being automat…" (ytr_Ugy48Z2z2…)
Comment
> Not that the AI doesn’t need fixing, but if guns were actually restricted, likelihood is he wouldn’t have been on a ‘heat list’, and probably wouldn’t have been shot 🤷🏻♀️

youtube · AI Bias · 2022-12-24T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyJbqtHIgJSic5xnzx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy6lbrh7qH2HCjVWUZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZz_ZR8rCsqK5f6Q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz5Qd-UeUN8fDjt7094AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy6GZKBUl3fy2nPA514AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzmgVkQUNkOesJndGl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzERBwN11t3TtxptzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDqXLu31WFK5W3gz14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxNald2tW--A9NQrv14AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQgnvuFDqw4AWL-Qh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
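The raw LLM response above is a JSON array of per-comment codings over the four dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated is below; the allowed value sets are inferred only from the values visible in this sample (the real codebook may contain more categories), and `parse_llm_response` is a hypothetical helper name, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample responses above;
# the actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "government", "developer"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "resignation", "fear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting unknown values."""
    coded = {}
    for rec in json.loads(raw):
        coding = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = coding
    return coded

# Example: the coding that produced the result table above.
raw = ('[{"id":"ytc_Ugy6GZKBUl3fy2nPA514AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}]')
coded = parse_llm_response(raw)
print(coded["ytc_Ugy6GZKBUl3fy2nPA514AaABAg"]["policy"])  # regulate
```

Validating against an explicit value set at parse time catches the common failure mode where the model drifts from the codebook labels, rather than letting a stray value silently enter the coded dataset.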