Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
A better argument would be, it should have been used simply to make our computers faster, regulated by law so jobs aren't taken, because it could literally do many jobs - though we let the internet destroy small business first and for the same reasons, we never thought to regulate it's usage, plugged everything through it and changed a social structure we were used to for centuries, and that is bad, because society is starting to look shabby. Ai is nothing to fear, be more affraid of this new idea that public servants are royalty with a regime or something - like presidents or governors. They are basically guys hired to do a job looked at like something more.
Source: youtube | Topic: AI Responsibility | Posted: 2025-12-08T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
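The coded dimensions above each take one value from a fixed codebook. As a minimal sketch of how a record could be checked against those codebooks, the category sets below are assumptions inferred only from the values visible in this section; the actual codebook may contain more categories.

```python
# Allowed values per dimension, inferred from the codes visible in this
# section -- the real codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"government", "developer", "company", "user",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "resignation",
                "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if record.get(dim) not in allowed]

# The record shown in the Coding Result table passes cleanly:
print(validate({"responsibility": "government", "reasoning": "contractualist",
                "policy": "regulate", "emotion": "mixed"}))  # → []
```

A non-empty return value flags which dimensions need manual review before the record enters the dataset.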
Raw LLM Response
```json
[{"id":"ytc_UgzJ3gpi9meq_nlte-14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwA9TEYo4yu-TUicpl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQvJuXGqdRiye3Wut4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxrhi_DntOUnp8JMjZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw1nH1S-gANYx3cNRh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzDj5vbTgFM2phbBB94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwwYOgWiGCDgn5nSEF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyIR3o6J4Vi75BsYop4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxEI3fKyCOLXnd-3a14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzfEZPCdHd8ak9Ptrl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```