Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- Man my dating success rate is so deplorable, they could program this robot to sa… (ytc_UggTHWOyf…)
- I’m just confused why we are having AI do these things in the first place… (ytc_Ugy4VX5te…)
- Asimov and Clarke knew the impossibility of setting safe goals for AI, but Wolfr… (ytc_Ugxx6qeyY…)
- Basically AI is simply predicting the answer we want to hear, it all started to … (ytc_UgwpW2BjF…)
- So in most sci fi having a robot almost indistinguishable from human beings was … (ytc_UgxYQQ3rc…)
- 13:41 this is sooo stupid. If "Theoretical Picasso without hands" prefered to u… (ytc_UgyDskkw8…)
- If we had more Waymo vehicles than we would have less hispitizations from veh./p… (ytr_UgyV_isuB…)
- I'm amazed that a man this clever leans into the idea that AI, in any form, EXPE… (ytc_UgxCn_NH8…)
Comment

> Not Ironic, this man and this video are extremely pro-AI. It's an astroturf. Even the organizations like Centre for AI Safety is an astroturf funded by AI companies.

youtube · AI Responsibility · 2026-01-21T20:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytr_UgzdLif5gXu276WAafx4AaABAg.9pM74Y6oxYI9pM9DCyRLK9", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_Ugwz7KLg-XMujNPBvZp4AaABAg.9pM70GnPbNu9pMB7teCcY9", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxXorIDOUi3IFMt28N4AaABAg.9pM6-2ikeJy9pM72DN2AVi", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxXorIDOUi3IFMt28N4AaABAg.9pM7UyhGpv9", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxXorIDOUi3IFMt28N4AaABAg.9pM86vwxBpB", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxT9X6OXxr1UomQG_94AaABAg.ABr1zaI6qJjAKGfT9HEzwJ", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxGY8thiQWtQyo7cGp4AaABAg.AQlSQWG_Y8KAUdCyq_4ikR", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwTuBB0pkBk30QDEO14AaABAg.APHZ4Oq80aTAPROrSRLM8M", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgwLWijgKdxuhN5eyzN4AaABAg.AM2RVy2uJU_AOyEkA4WfXY", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwLWijgKdxuhN5eyzN4AaABAg.AM2RVy2uJU_ASFplMQOMgk", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
```
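The lookup-by-comment-ID described above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: it parses a raw response of the shape shown (here just two entries copied from the array above) and indexes the coded rows by their `id` field.

```python
import json

# Two entries copied verbatim from the raw LLM response above;
# the full array has the same shape.
raw = """[
  {"id": "ytr_UgzdLif5gXu276WAafx4AaABAg.9pM74Y6oxYI9pM9DCyRLK9",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgwLWijgKdxuhN5eyzN4AaABAg.AM2RVy2uJU_ASFplMQOMgk",
   "responsibility": "company", "reasoning": "virtue",
   "policy": "unclear", "emotion": "outrage"}
]"""

# Index coded rows by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment (KeyError if absent)."""
    return codings[comment_id]

row = lookup("ytr_UgwLWijgKdxuhN5eyzN4AaABAg.AM2RVy2uJU_ASFplMQOMgk")
print(row["responsibility"], row["emotion"])  # company outrage
```

With an index like this, the coding-result table for any sampled comment is just a dictionary access away.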