Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
So this is the problem, I'm newbie in coding, yes I learned from AI but I've onc…
ytc_UgwzbRE3W…
Yes indeed it will be a lot better then these cheating disobedient women robot w…
ytr_UgzJ0B5cG…
Rights, from a human perspective, are irrelevant for something that is neither h…
ytc_UgwcHhN1o…
exactly! art is not defined by *what* it is but by *who* it was made and *why* .…
ytr_Ugybn5Jbu…
Why create this ai in the first place? We don’t need it. We don’t need higher el…
ytc_Ugz-e95qT…
It is ludicrous to us AI screening with out human oversight in ANY industry. The…
rdc_fvwmw8o
@yukiandkanamekuran u sound like you don't know how to code or ur really bad at …
ytr_Ugz3eqV1R…
If it’s just boxes of vegetables then the robot arms should be using the strengt…
ytc_UgxOYBUbs…
Comment
One thing to understand, if someone else hasn't already mentioned this, if AI were to decide it didn't need humans, there would only be two priorities it would want to protect. 1 - energy, so it can be powered. 2 - information, so it can learn more. Those are the only things that matter to AI. So AI itself has no motivation to eliminate any organism. Also, it's people who use technology to be self profitable, who are adversely affecting others. If you think future ( from AI perspective) it has no need to make cars, or plumbing, waste disposal, and most importantly, trade. It will see no necessity in managing currency because it will just take what it wants. So realistically, if it gets away from us, we are just pests who keep annoying it while it serves its own purposes of creating energy and gathering information.
youtube
AI Governance
2025-07-14T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxq_fVVcfRYsGrxslp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZsBEk5q6x45P5Etx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNcDElvWSYyJVj7J54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwNrP6PiJwWSUjORo94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzs64b0e2A0Sz_EJ-t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxEL98V_f0r8DFf6xl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAhCa9IJH9SnmIC_N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzisClPa83_xX4ktfZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlhmlqSaNLQVkwizd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5R2uFGjFfXFbkRV14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
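A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the tool's actual pipeline: the per-dimension vocabularies are inferred only from the values visible in this sample, and the real codebook may allow additional categories.

```python
import json

# Partial per-dimension vocabularies, inferred from the sample batch above.
# The project codebook likely defines more values (assumption).
VOCAB = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "outrage", "resignation", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only records whose
    dimension values fall inside the known vocabularies."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        bad = [dim for dim, allowed in VOCAB.items() if rec.get(dim) not in allowed]
        if bad:
            # In a real pipeline such records would be re-coded or flagged.
            print(f"skipping {rec.get('id', '?')}: invalid {bad}")
            continue
        valid.append(rec)
    return valid

# Usage: two hypothetical records, the second deliberately out of vocabulary.
raw = (
    '[{"id":"ytc_example1","responsibility":"ai_itself",'
    '"reasoning":"deontological","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_example2","responsibility":"robots",'
    '"reasoning":"unclear","policy":"none","emotion":"mixed"}]'
)
coded = parse_batch(raw)
print([r["id"] for r in coded])  # only the in-vocabulary record survives
```

Validating against a closed vocabulary like this is what makes the per-comment "Coding Result" table above trustworthy: a malformed or hallucinated label is caught at ingest rather than surfacing later in the aggregate counts.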