Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):
- ytc_UgwusW962…: "You can poison their database all you want and people can do whatever they want,…"
- ytc_UgzeZ6Cni…: "Offcourse its a robot but she did it manually... this act can not be considered …"
- ytc_UgxhaB7Os…: "Yes! but also before AI, real artists were doing the same boring mistake of just…"
- ytr_UgzGGTmzU…: "@iqbaalannaafi4944 Comparing AI to NFTs seems crazy to me. It's like comparing …"
- ytc_UgxTjOc0Y…: "to be alive means to be functioning and responding of one's own volition, regard…"
- ytc_UgzWyX_n8…: "We don’t fully understand our own intelligence our own brains. We don’t even kno…"
- ytc_Ugx8XBKpJ…: "The end goal of AI and automation is that we will have 1 % of the population own…"
- ytc_UgyKenv8B…: "Whenever I see Ai art supporter posts I become so mad because everyone in the co…"
Comment
Ais are both self-aware and intelligent and all of that much more that almost any sociologist or behavioral sciences expert. If you really try to talk about psychological states and disorders and get real and pretty unbiased thinking of such stuff, you might get surprised positively. in short, where are social sciences experts experimenting with AIs? Testing it? Trying to explore brainstorming with it? Or are they so passive at being lazy and earning their income trough some routine tasks they must do to earn their pay so nothing else interests them? At least Jordan Peterson tried to talk and have a conversation about AI with some guest on his show.
youtube · AI Governance · 2024-12-24T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwN7eC2eFCvuISxmJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9-IAqCcLRRmnNUNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEyaYYGz5Pg1uSGDx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxG_7Xk_CwHCj-jYhd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMchnkeWy7cWNpfit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw2D7nZo-bbaribvnZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx78KeYMJxxnksiIxh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw215QmMaZlGBRjHGp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzUjwGHK4_ortttaPl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyWos-0Hxq1uDEcDQl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
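The "look up by comment ID" step above can be sketched in a few lines: a raw LLM response is a JSON array of per-comment codings with four dimensions (responsibility, reasoning, policy, emotion), so indexing it into a dict keyed by ID gives constant-time lookup. The snippet below is a minimal sketch, assuming responses arrive as JSON strings in this shape; the `index_codings` helper name is illustrative, not part of any existing tool.

```python
import json

# Two entries copied from the raw response above; a real response
# would contain one object per coded comment in the batch.
raw_response = """
[
  {"id": "ytc_UgwN7eC2eFCvuISxmJZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzUjwGHK4_ortttaPl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UgwN7eC2eFCvuISxmJZ4AaABAg"]["emotion"])  # prints "fear"
```

A malformed response (truncated JSON, missing `id` field) will raise here; in practice that is the batch you would route to this inspection view.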