Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- Enjoy those engery bills skyrocketing, ai centre use a shit ton of water and ele… (ytc_UgwDN0G9I…)
- Honestly, I don’t care what VCs say about “opportunity” when it’s clear they onl… (ytc_UgxxjPEtP…)
- AI companies should have the right to buy a copy of each book and do what they w… (ytc_Ugzoj3_A_…)
- I mean, it's obviously something that should be countered. But at the same time,… (ytc_UgykfVrpK…)
- As you spoke about the brain and how it decides whether to strength a neural lin… (ytc_UgxeeJV8m…)
- A mediocre attorney can be replaced by a $20 per month version of chat GPT now.… (ytc_Ugw-_wt3A…)
- More training reduces error, though, and I would expect that a wider and deeper … (rdc_fct6x26)
- I told an AI bro that AI "art" isnt art by definition (because if you google def… (ytc_Ugzg4fcD4…)
Comment
Right now, AI is not sentient, meaning it will not come out and grab us, but more so, it is being invented and harvested by the big corporations. To me, whatever dangers we are predicting, it is the corporations that create this are the ones who will gain and suggesting the proverbial chlorine gas for a party.
6:54 Oh, and you see, it will not be the fault of the corporations, it will be the fault of the AI, who knows what it will produce. So it is a perfect opt out for the creators, charging $20, $200 or $2000 a month for service
youtube · AI Responsibility · 2024-12-24T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyIg1wYSStfyDhxvnx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzd7RdrLeMk6Wppe_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxxGzSFs7dpmgLS-mN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyFW0jQcWqyghK93et4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzQQkHCJak9LzGoWA94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwWDguM5O2Sjv1GRKB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzxXsRxxQyLT6f7rLF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9A9adDeSAFDujNk14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyIdw6DkNbEBt4_p-J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYQEHMGoPKuyLabQl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
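Since the raw LLM response is a JSON array keyed by comment ID, a comment-ID lookup like the one this page offers can be sketched in a few lines. This is a minimal, hypothetical sketch, not the dashboard's actual code; the two rows embedded in `raw_response` are excerpted from the array above.

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
raw_response = """
[
  {"id": "ytc_UgyIg1wYSStfyDhxvnx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyFW0jQcWqyghK93et4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the array by comment ID so each coded comment can be
# retrieved in O(1) instead of scanning the list.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_UgyFW0jQcWqyghK93et4AaABAg"]
print(record["emotion"])  # fear
```

The same dictionary pattern scales to the full response: parse once, index by `id`, then serve lookups like the coded-comment card shown above.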