Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “Who is buying the goods being delivered, if people’s jobs are disappearing - dri…” (ytc_UgybleOAB…)
- “Tesla auto pilot isn’t meant to be 100% hands off or AI. So yeah the driver is c…” (ytr_UgyROPwKE…)
- “Five hundred years ago, ninety nine percent of the world economy was agriculture…” (ytc_UgyRDsX_l…)
- “gentle reminder that AI is just a reflection of us, they learn from us, and if y…” (ytc_UgynaA2Qy…)
- “I would rather be a "bad" artist and have the drawing level of a chile, rather t…” (ytc_Ugx6c0vDr…)
- “The idea of using ChatGPT to fool AI detectors is intriguing! I've been explorin…” (ytc_UgxXTirVq…)
- “this is so horrible... please continue updating about the nth rooms and the kpop…” (ytc_UgxTe9J6t…)
- “lol. the probem is you think like a human to develop with AI. when you have to t…” (ytc_Ugw3Z5Kc-…)
Comment
@aziz_hassan-h2i Only AI can solve it, feels lazy, why give up responsibility to a machine? or a god or whatever? You can have narrow AI on specific areas of interest. Why would you need an AGI or superintelligence? Is not even in the interest of the rich. If you take away jobs who pays for your corporations/products? You still need people to have some money so your richness has any value. Also you need to keep people busy so no weird revolutions happen. I fail to see what benefits are to be extracted out of pursuing AGI without a better understanding of intelligence and mind. Risking everyone's future and lives and way of life does not seem justified to me.
youtube · AI Governance · 2023-05-10T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgyLz_SYDbl15lKzAGd4AaABAg.9pWfJRfGDQY9pWjUFRq4QI","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz5cGMHLoKEApqHnhl4AaABAg.9pWeKAF6mjO9pWmEXtxl2P","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz5cGMHLoKEApqHnhl4AaABAg.9pWeKAF6mjO9pXV1QgiSqJ","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytr_Ugywi9ivsNsjeRpGG7p4AaABAg.9pW_gx9y5gj9pXhDNJsjgL","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwZ051mj8S6EKP92m54AaABAg.9pWRudbogXj9pXiVrZ2ZJy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwZ051mj8S6EKP92m54AaABAg.9pWRudbogXj9qQiHX7JVWp","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugx0qmuita12sozuOld4AaABAg.9pWAp5ubrGm9pXZlOlq-yS","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx0qmuita12sozuOld4AaABAg.9pWAp5ubrGm9pXlZpMvaJf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyhHQAPj8mqCuF_HG14AaABAg.9pVzSReIfrw9pWp6hZ-Apg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgznmX6mCwSQ2mHFEc54AaABAg.9pVzJQnSe-49qudDmrleLy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
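
The lookup-by-ID view above can be reproduced directly from a raw response payload like the one shown. Below is a minimal sketch, assuming the response is a JSON array of objects with an `id` field plus the four coding dimensions; the variable names (`raw_response`, `codings`) are illustrative, and the sample entry is the one matching the Coding Result table above.

```python
import json

# Assumed shape: a JSON array of coded comments, each carrying an "id"
# and the four coding dimensions shown in the Coding Result table.
raw_response = """[
  {"id": "ytr_Ugz5cGMHLoKEApqHnhl4AaABAg.9pWeKAF6mjO9pXV1QgiSqJ",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "industry_self", "emotion": "outrage"}
]"""

# Index the rows by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

row = codings["ytr_Ugz5cGMHLoKEApqHnhl4AaABAg.9pWeKAF6mjO9pXV1QgiSqJ"]
print(row["responsibility"], row["emotion"])  # user outrage
```

The same indexing works unchanged on the full ten-row array, since every entry carries a unique `id`.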