Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI will never have consciousness but programs can be developed that imitate cons…
ytc_UgyGKd4aW…
Surely, AI will gain knowledge of and comprehend radical anti-human and anti-env…
ytc_UgwpoxZbY…
So this might sound controversial for some, but I don’t mind AI art, it’s only a…
ytr_Ugx-I827w…
I think it’s okay to not agree with something, but coming together to bash someo…
ytc_UgwnL9CQ8…
Yeah they went well beyond for that generation. Story-writing, gameplay, graphic…
ytr_Ugx2ootjc…
If we believe AI can have or already have something that resembles consciousness…
ytc_UgxvXCGmi…
I'm a disabled hobby artist. I fell off a horse when I was eleven and hit my hea…
ytc_UgwlTeVeM…
No one also asks will AI help the poor.
Apparently does any human want to for …
ytr_Ugw8zyzq0…
Comment
AI is a robot. Basically robot or computer is a soldier, will always follow orders meaning the program, because of hardware, software and limitations of the one that made it the human. A computer can crash, can be infected can have errors, etc. Because basic nature of this world, rule number maybe saying that there is no perfect human or humans never made a perfect thing. All, shoes,, buildings, cars need repairs at some point are dumped in the bin. At this moment ans 1000 years after AI will learn all human mistakes plus lie, steal, cheat, kill, create tons of fake news. Ai is so smart but nobody gonna see AI in nato, eu, un, who g7 or in human leadership. Why? Because control freaks don't give up control to a soldier. Ai is a trojan horse, a spy, nothing more.
youtube
AI Governance
2024-05-29T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxs36kay4lPyO66ZLR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzOGM05GRjjNumC0u94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgybYizKpnSa0LuxItd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxOcRfG4pR5OOWC5-t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxXxX9nePTwUm4szQh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy_WsPbzJh5owOUSJ14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwa6Q1Zj7sf-BIDFlB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzC5VeJ8QlZERHmzd54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzc5WtTZUj_Iy9tXuF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyAld1yDH2qt6_E2zh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
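The raw response above is a JSON array of coded records, one per comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and looked up by ID (the field names match the Coding Result table; the helper function and variable names here are illustrative assumptions, not the dashboard's actual implementation):

```python
import json

# Illustrative excerpt of a batch response in the format shown above.
raw_response = """
[
  {"id": "ytc_UgxOcRfG4pR5OOWC5-t4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch response and index each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgxOcRfG4pR5OOWC5-t4AaABAg"]["emotion"])  # resignation
```

In practice a real batch would be validated first (e.g. checking that every record carries all four dimensions and that values fall within the codebook's allowed labels) before being indexed.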