Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Solution: Don't make everything sentient., or even technologically advanced. A t…" (ytc_UggYw13Ys…)
- "AI or not, kid looks like a weakling. Also, F*** AI, I live in the real world!…" (ytc_UgwL1CYZ2…)
- "Some of these digital artists are so increidibly mad about AI that even though t…" (ytc_UgyERu4Tr…)
- "Grab your handbag and take to the street. Yet another one who believes in the myth of the revolu…" (translated from French) (ytr_Ugx4YviYO…)
- "Born with a gift, huh? I'm legally blind and trying to make it as an artist. Th…" (ytc_UgwujgeWI…)
- "I would save the five people — even if that means that “I”, or my entire infrast…" (ytc_UgwefTGuS…)
- "Omg this is why whenever im watching shorts / reels it feels like everything is …" (ytc_UgwIFvzuN…)
- "@8:16 LLM stands for Large Language Model, not Large Learning Model. And that's …" (ytc_UgwQOfc5c…)
Comment
This is the guy that laid the foundation of Ai I believe back in the 70s. He has had over 50 years to stop it from happening and he did not stop it. Now he regrets what he did? He has known for all of these years what AI could do and will do and he remained quiet and let it happen.
Somehow I think he should be made accountable for what we are now seeing and hearing.
There are things that are not being said there are lies that are being told.
youtube · AI Governance · 2026-01-09T01:4… · ♥ 11
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy-uL4FUYpmJHG_OJB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLxgC3UHrVeM5KCIZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzMFoZ4tb-DSNwThs94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxNa56OMptUJ9IT-Dx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxRQ2CnjWHRJK9FNkV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxasmKyc4iSqCAJCGB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzi1OkWs6QKNaQ8tgF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwX3scK61wHDwDs9ZV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyKJdhH7_fXKC_MQbx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyBdoKzAPyTfN5t8FV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
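A response like the one above can be validated and indexed by comment ID before use. The sketch below is a minimal, hypothetical example: the allowed category sets in `SCHEMA` are inferred only from the values visible in this dump, not from the project's actual codebook, and `parse_codings` is an illustrative helper, not part of the tool.

```python
import json

# Allowed values per coding dimension -- inferred from this dump only;
# the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    rejecting any record with an out-of-schema value."""
    index = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        index[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return index

# One record from the response above, used as sample input.
raw = ('[{"id":"ytc_UgwX3scK61wHDwDs9ZV4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_UgwX3scK61wHDwDs9ZV4AaABAg"]["policy"])  # liability
```

Failing loudly on out-of-schema values catches the most common LLM-coding defect (an invented category) before it silently enters the coded dataset.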