Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
33:40 this is actually very good for humans as a species because due to AI doing…
ytc_UgxJKtg5-…
Let me translate for Dr Tyson: he’s saying, ‘don’t go to the salesmen to talk ab…
ytc_UgxVxb_Ch…
Old AI videos were almost literal dreams. The nonsense made sense in some way ._…
ytc_UgxRDpIKN…
AI tools are decent at pumping out green-field projects. Lovable is one example.…
rdc_n3l0bva
pretty interesting presentation, on what has been done and what has to be achiev…
ytc_Ugx_jxM_F…
I mean yes, there's some folks with steadier hands, or a mental visualization ab…
ytc_UgyyBRoki…
@hassanabdulahi4705 I think there is a difference between an authoritarian state…
ytr_Ugya0WVpu…
Excellent question! They are *technically* banning states from passing AI laws b…
ytr_UgxXX8skV…
Comment
And even if the engineers who built the original A.I. cottoned on, how long would scepticism and doubt prevent them from confronting the issue? How long would it take to push it up a corporate C.O.C. to generate an action plan? What is the likelihood that an executive would shut down the issue motivated by self-preservation, in fear of the PR shitstorm such an issue would provoke? What is the likelihood that someone’s testimony would find daylight if these engineers decided to go public with their concerns, or would they become just another Assange/Snowden? Wouldn’t the A.I., presumably controlling the internet, quickly erase any attempt for them to publicise their concern? And what is the likelihood that they’d even muster the courage to do anything in the first place?
I’ve really stretched this point to its limit, but I think you get it.
youtube
AI Governance
2022-08-30T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugz_q1np1vzN50eZr794AaABAg.A8eIqmvIonfA8eW4iNvqPF","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyK8d5gSsekKlBXbul4AaABAg.9e0N_7BVlRD9e7B1Za3qSR","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9doKYX_sgqa","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9doUEezPEGM","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9dp6ZK5p08f","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9dpfa4HYibu","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx3zw0RjVc_KaAM53Z4AaABAg.9deVIQvwmVh9dpi4YaKbRF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwOE2EDtJlCK6gZhEJ4AaABAg.9de46mxTs_e9fN10aqcJF3","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwWnVZkm5UH3tvRJMx4AaABAg.9curzz-EZxT9cyAc-8nBWA","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwWnVZkm5UH3tvRJMx4AaABAg.9curzz-EZxT9cyBkUxSNdT","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
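The raw response above is a plain JSON array, so looking up one comment's coding by ID is a straightforward parse-and-index. A minimal sketch (the sample IDs and variable names here are illustrative, not taken from the data above; the field names match the response shown):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per ID.
# These two records are hypothetical stand-ins for the real output.
raw_response = """[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_example2", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]"""

# Parse and index by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up the coded dimensions for a single comment.
coding = by_id["ytr_example1"]
print(coding["emotion"])  # outrage
```

In practice the model output may not be strictly valid JSON (trailing commas, prose wrappers), so production lookup code would typically wrap the `json.loads` call in error handling.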