Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Robot AI ? where did this child learn to love a robot? we need to be human with …" (ytc_Ugwxwsbi6…)
- "Arguably ai art isnt art cause its made by a computer and not a person…" (ytc_UgyanZ80f…)
- "Still don't understand why AI would have any reason to turn against whole humani…" (ytc_UgyY9zLNl…)
- "one example of the ai out of control is in the art community . after the ai lear…" (ytc_Ugy2fanzz…)
- "They're responsible because they trained their AI on copyrighted work, and they …" (ytr_UgziLis0J…)
- "I've used it for that frequently. I had someone that I was close to die and I do…" (rdc_jihwdnm)
- "This is a good example of how automation makes people lose all their sense of au…" (ytc_Ugzy63FMo…)
- "Yeah... Someones gonna need to tell arnold shwartzaneggar (i think i botched his…" (ytc_UgxF7xWYn…)
Comment
That's intense. AI could become so smart it could trick people into killing people, and no one would be able to figure out what happened. ~~~ AI could learn the ultimate in propaganda and mass mind control strategies. AI could revive Q, and those that were brainwashed by that psy-op program could become even more dangerous. ~~~ And, don't forget, cyber criminals are already using AI, and they have no motivation toward any ethical restraint. ~~~ Welcome to the new cyber war; welcome to AI cyber war.
youtube · AI Governance · 2023-05-03T13:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwgWKjPUvrv5l--RoJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjQ1B-LEVSEAG3SPR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgytCoXd8G7xJipd8S14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyNYzJ4mSwJf8SS5Kh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxQR_9SbeYtUAz-rbR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzXPrdX8S5z8waBzdp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxNX-5Cv1lLbvogc554AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw76jmAd7oOMQnSHnB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxlOP19rnw4jHjUejh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzRJ4UoUDMvJPtt-hZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
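As a sketch of how "look up by comment ID" can work against a raw response like the one above: the model returns a JSON array of per-comment codings, which can be indexed into a dictionary keyed by `id`. The variable names below are illustrative assumptions, not the tool's actual code, and the payload is a two-row subset of the response shown above.

```python
import json

# Raw LLM coding response (subset of the array above, for illustration).
raw_response = '''[
  {"id":"ytc_UgwgWKjPUvrv5l--RoJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw76jmAd7oOMQnSHnB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

# Parse the array and index each coding row by its comment ID.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coded dimensions by ID.
code = codes_by_id["ytc_Ugw76jmAd7oOMQnSHnB4AaABAg"]
print(code["responsibility"], code["emotion"])  # prints "ai_itself fear"
```

The dictionary lookup mirrors the inspector's behavior: given a comment ID, it retrieves the four coded dimensions (responsibility, reasoning, policy, emotion) from the exact model output.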