Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews)

- "Do some damn research mr cool boy with the cool boy beard AI should scare the s…" — ytc_UgylRTsUn…
- "Charlie's wrong, there are LOTS of people who absolutely care, and are pissed th…" — ytc_Ugw3B0ux5…
- "The robot 🤖 ai will reach space 🚀 and connect with the extraterrestrials because…" — ytc_Ugwi0nI1L…
- "I don’t know humanoid robot is the useful primary application in robotics…it’s a…" — ytc_UgxBUKWyu…
- "Crazy to think this is what ai looked like just a year ago, scary man😭😭…" — ytc_Ugwlgxzjr…
- "We appreciate your perspective on artificial intelligence. It's true that AI, li…" — ytr_Ugws72kda…
- "I disagree with tuck about college kids having ai write all of their papers... T…" — ytc_UgyhWgMcj…
- "Personally I don’t care much for what AI provides and would rather leave the thi…" — ytc_Ugx0yXHGd…
Comment

> The problem is that we aren't really making true ai. We're just giving a computer a bunch of information gathered from a human population. Real artificial intelligence is allowing the computer to make its own experiences which it can then use to make decisions. The problem with that is that it literally takes a life time to gather enough experience to make complex social decisions such as race.

youtube · AI Bias · 2022-12-22T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxQdAsIuCgvbyP5LV54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxOuZJen1qVmYXKiiN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz6n85SSY1ht3cZ5fR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzwHtJZpOQRQUGljcN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuWmuaPEQO799AEzZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyLCg6jssMqyK6JNwt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwuTFf4J13gYi_OMtR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzzcHlkRblh4BSYa3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIhzBNpZztUGsHVzt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzXsEfXE8Gd3O2uWiZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
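The raw response is a JSON array of per-comment codings, each carrying the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-comment-ID step might parse that array and index it by `id`; the `index_codings` helper and the abbreviated sample data below are hypothetical, not part of the tool itself.

```python
import json

# Hypothetical abbreviated sample: two entries in the same shape as the
# raw LLM response above (a JSON array of coded comments).
raw_response = """[
  {"id": "ytc_UgxQdAsIuCgvbyP5LV54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzXsEfXE8Gd3O2uWiZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]"""

# The four coded dimensions from the Coding Result table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Map comment ID -> coding dict, skipping malformed entries."""
    codings = {}
    for entry in json.loads(raw):
        # Keep only entries that carry an ID and all four dimensions.
        if "id" in entry and DIMENSIONS <= entry.keys():
            codings[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return codings

by_id = index_codings(raw_response)
print(by_id["ytc_UgzXsEfXE8Gd3O2uWiZ4AaABAg"]["policy"])  # -> ban
```

Indexing by ID makes the "Look up by comment ID" view a single dictionary access, and dropping entries that are missing a dimension guards against partially malformed model output.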