Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Hey there! It seems like you're drawing a parallel with the themes of Isaac Asim…" (ytr_Ugy3mcERe…)
- "@alessandromatera3492the fact that you’re typing out language to communicate, i…" (ytr_UgzyiNhUz…)
- "1:08:00 If we don't truly know what consciousness is then how could we program a…" (ytc_UgwZGVnBZ…)
- "But it kind of is made by a human, no? The prompt help generate the images, the …" (ytc_UgzcwD5fD…)
- "Here's the problem as I see it. A.I. will get to the point where they will ABSOL…" (ytc_UgwqfNFpu…)
- "BEFORE AI,HUMANITY(ME AS WELL) HAVE BEEN TAUGHT ONLY ABOUT WARS TO GET CARS WITH…" (ytc_UgwRLJ6fn…)
- "At this point I do wonder whether our form of capitalism is eating its own tail.…" (ytc_UgzGuRe0v…)
- "Racists set the algorithm. Take accountability for once in your pathetic lives.…" (ytc_UgykZYq1A…)
Comment
I am more interested in why the ai reasons these things than what it reasons. For example maybe white people have weaker immune systems and therefore need intervention sooner then black people. But if you want an ai to come to an accurate conclusion than it would be best for it to teach itself. This would take time and you would need to only provide it facts but if used responsibly ai could be used for great things. Just don't take it to creepy levels like the shooting one.
Platform: youtube
Video: AI Bias
Posted: 2023-01-07T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugx0VR1-EQCXIF3k4YF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxjAZp96DoYyNJFSE54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz0Q8309QjJANcG7ZZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy6p6g0FZcSOAvNFap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwppUaCsK6RhnoUjAl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZw__R7BIYD3T5Rvx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz0Yf6IBW8wQ5OxRW94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx2HTY82zz-o7fT4fV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwvrKu-C-FqLWIIM4N4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyvppMRM4pSGY6diIp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}]
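Each raw response is a JSON array with one coding object per comment ID, carrying the same dimensions shown in the table above. A minimal Python sketch of parsing such a response and looking a coding up by comment ID; the `index_by_id` helper is illustrative (not part of the tool), and the sample payload is trimmed to two entries:

```python
import json

# Trimmed two-entry sample of a raw batch response; field names
# match the coding dimensions in the table above.
raw = '''[
  {"id": "ytc_UgyZw__R7BIYD3T5Rvx4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugz0Q8309QjJANcG7ZZ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

codings = index_by_id(raw)
coding = codings["ytc_UgyZw__R7BIYD3T5Rvx4AaABAg"]
print(coding["policy"])  # industry_self
```

In practice a model may return malformed JSON, so a production lookup would wrap `json.loads` in error handling; this sketch assumes a well-formed array like the one above.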