Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Right now, AI-based apps are less profitable then regular apps. Once you put ou…" — ytr_UgxDkv0wD…
- "dont cry ower AI use AI fined new Job Using AI What people dont understand AI is…" — ytc_Ugy8bJOjx…
- "Neil, thanks for covering this. Given the chatter about AI, I decided to dive de…" — ytc_Ugw6uGeza…
- "If an artist "trained" himself using other people works, should they be all payi…" — ytc_UgwcWaTmj…
- "Proof that not a single person working on AI has ever read Asimov, or else if th…" — ytc_Ugx1rjUHF…
- "No. The AI was programmed by a person specifically to prey on children. It was t…" — ytr_UgxC3CUWY…
- "The very huge mistake of this world was AI and the man who created it, very stup…" — ytc_Ugxt6uGuB…
- "I disagree with a lot of stuff in here but you know what? Good shit because this…" — ytc_Ugwvs1gfK…
Comment

> I'll be honest, I think there is only a very slim chance that ai doesn't try to do anything crazy against us. I mean, training robots to control tanks? What were they thinking? Giving a machine the ability to harm us is possibly one of the most stupid and idiotic decisions imaginable. I just hope ai ends up being intelligent assistants to help us do things better ourselves, rather than letting them do it.
> I'd love to see what others may feel on this perspective though.

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2024-11-01T20:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxTpdmMt6kdNJpcZG14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxRRkHEHoza-bxufS94AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxD-EGl2UEKxQQ1c0R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxB3jCardHTfpymyaF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2VJmw6EwVyaMYKeF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwGaklaMlVYW3PJ_xp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwJk80CxaKT6ZCIoZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyPIloPqI07qoIadCx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyYmSSMC_UDwCjedOR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy6zy5JS2YiZn8o9BR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
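A batch response like the one above can be parsed and sanity-checked before its codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible in this page's table and JSON, not from any actual codebook, and the function name `validate_batch` is illustrative.

```python
import json

# Allowed codes per dimension, inferred from the samples shown above.
# A real codebook may define additional categories -- this is an assumption.
DIMENSIONS = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid records by comment ID.

    Raises ValueError if any record carries a code outside the expected sets,
    so malformed model output is caught before it reaches storage.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"unexpected {dim!r} code in record {rec.get('id')}")
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded
```

Indexing by comment ID also makes the "look up by comment ID" inspection above a dictionary access rather than a scan over the raw response.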