Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "So this is specifically ai and not general automation, I think a lot of jobs lik…" (ytc_UgyeSzaD6…)
- "Did they ask the world permission to introduce AI this way, while ppl in the wor…" (ytc_Ugy0BEVXi…)
- "Wow this is an extremely sad case to hear, it's given me things to think about t…" (ytc_UgxZ9XKSc…)
- "Must be strange if your life-mission is to ruin man-kind. AI 100% will hurt man…" (ytc_UgzCCp0dc…)
- "Yea we got ourselves in this mess with technology and greed. People can't buy a …" (ytc_UgxzF8a-m…)
- "@dagmaranja888 yes, it's great. I don't know how many languages they talk, but…" (ytr_UgxXAVUhP…)
- "AI companies pls. Have other sideline of your business. Provide the humanity ver…" (ytc_UgyZ1qmSX…)
- "If they wanted to make a robot that makes art / Build a physical one that has a so…" (ytc_UgzuxAqR5…)
Comment
I don't understand the real fear/fascination with AI in general. Computer is a dumb machine. It will only show what the programmer instructs it to do, period.
Also when people sell the idea that AI will do things on its own, I believe the motive behind it to be that they intend to shift the blame to AI when they will program it to do something that's harmful whether deliberate or accidental. Therefore if AI cause anything harmful, just arrest the programmer and deal justice to them and all will be well. Nothing new to worry about.
Source: youtube · Topic: AI Governance · Posted: 2023-05-06T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw9TXVwjjSFNLCG5c94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwsy7OWz9o47GDwaTN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgznXMwzDCQC5nUncth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzXcPZau39GZ3uqJhp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2HtbVJr6LPyqMZ1V4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygcFWmWWHKLPGi96R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwH5BXVyU4_EDmwvWR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzpRb2faheJKIJGxVl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwjhZbRVedHH8KG2X54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugycwnnv2ZzEbVJ8ioZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
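The lookup-by-comment-ID step above can be sketched in a few lines: parse the raw batch response (a JSON array of coded records, shaped like the one shown) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; the variable names and the single-record sample payload are illustrative.

```python
import json

# Sample raw LLM batch response: a JSON array of coded comments,
# one object per comment, shaped like the output above.
raw_response = """
[
  {"id": "ytc_UgwH5BXVyU4_EDmwvWR4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model output and index coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
record = codes["ytc_UgwH5BXVyU4_EDmwvWR4AaABAg"]
print(record["emotion"])  # -> outrage
```

In practice the same dict lets you resolve any of the sample IDs listed above back to its coded dimensions, which is all the "Look up by comment ID" view needs to do.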