Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Next thing we know, humans aren't needed anymore. AI taken over the world. First…" (ytc_UgywQy6Iq…)
- "Wifi until someone hacks it and sees you doing some freaky shit to your robot…" (ytc_Ugz084zkV…)
- "Agency means the AI acts (completely) on its own. Human input is not needed or i…" (ytr_UgzRqP0OC…)
- "@TristenGrant Bro thinks Ai is just GPT. NASA and FBI, are using AI since dec…" (ytr_UgzgUjBKk…)
- "AI also must get over the bias it’s been programmed to have towards certain grou…" (ytc_Ugym4Ig3s…)
- "This discussion is very important! We have to note that as humans we learn and p…" (ytc_UgwyfCgxN…)
- "I think there are valid ways of using AI as a tool in a way that the art is stil…" (ytc_UgzkhSJol…)
- "I think you made some good points, but the focus on the court system being the l…" (ytc_Ugzck9W_Y…)
Comment
Humanity at its dumbest point here. So a long time ago man was created and the question was will the good humans outnumber the bad humans? Well lets see it doesnt take good to outnumber bad it takes influence in society. So we have the swine who get control of the ai and theyre already evil as shit because theyve all been corrupted by money so now the evil ai has a hyper head start.
WHAT HAVE YOU PEOPLE DONE HERE AND WHY? Its just never ending failure everywhere i look there is trash all over the ground killing all of us and human babies are suffering but for some reason we care more about robots than humans, what is so hard to figure out here?
Source: youtube · Topic: AI Governance · Posted: 2024-02-12T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_zpUut5o-ubWgzsl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzLJ51OwcmBlXbnwBR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxoTKWX-X0RAcokGZV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwP-VedkEWhlbUQza14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx_tGsi5WokgqAG43p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfMsbceftsezArkEJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyCV9UXFLclJkevXid4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxuUnvrXjSYSEcA8Rh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXHjLKHdl658a5Pjp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyl3q2E-O2YqeQD6FF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
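A batch response in the shape shown above can be consumed with a few lines of Python. The sketch below is illustrative, not the tool's actual pipeline: the record schema (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) is taken from the response shown, while the sample IDs, the `parse_codings` helper, and the validation rule (drop records missing a required key) are assumptions.

```python
import json
from collections import Counter

# Illustrative raw model output in the same schema as the response above.
# The IDs here are placeholders, not real comment IDs.
RAW = """[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> list[dict]:
    """Parse raw LLM output; keep only records with every required key."""
    records = json.loads(raw)
    return [r for r in records if REQUIRED_KEYS <= r.keys()]

codings = parse_codings(RAW)
# Tally one dimension across the batch, e.g. policy stance.
by_policy = Counter(r["policy"] for r in codings)
print(by_policy["regulate"])  # 1
```

Validating keys before tallying matters here because LLM output can drop or rename fields; silently counting malformed records would skew the dimension totals.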