Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I used to think just asking ChatGPT to sound more human was enough, but AICarma …" (`ytc_UgxQ9NOwO…`)
- "I've done this as well. Didn't take long to get ChatGPT to admit the truth.…" (`ytc_UgwrSvq18…`)
- "I really believe that AI is ultimately demonic. It is another thing that isn't G…" (`ytc_UgxaQDLRy…`)
- "Stephen Hawking sir prediction that AI will one day take over on human species w…" (`ytc_UgzStKgrW…`)
- "I agree that AI is dangerous and likely not in a way everyone thinks. But I feel…" (`ytc_UgwTiZPMs…`)
- "I understand using Ai as a tool to help you for things like quick sketches of ch…" (`ytc_UgybSvQbN…`)
- "Think of it this way: Elon Musk inspired all these electric car companies, and t…" (`ytc_UgxDQIIhd…`)
- "There may be a problem if ai is putting world population out of work everything …" (`ytc_Ugy9OkmL5…`)
Comment

> The “good scenario” with the human CEO with the AI assistant (which, surprise surprise, is only “good” for CEOs anyway) isn’t realistic. Because another company will come along with an AI CEO and kick their ass, because it doesn’t have the burden of the meddling human.

youtube · Cross-Cultural · 2025-10-26T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxfhmyoGAkysqcZ2x14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyXZ3CrDt7JcbbCr914AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwZxZHt_doU9eGCzUp4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyHgukDKgCaVYHWZJ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwPtBqp7zAEg72vNG54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxE3r-zyq1mV3v9dwV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzY_GteqqcZU6fI5yt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxwK_dpA2dT0RdGhbV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxWF5ve1Z4yJf2xMlF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxPBwIb7vjaW4__2tp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
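A raw response like the one above can be checked before it is stored. Below is a minimal Python sketch, under stated assumptions: the allowed values per dimension are inferred only from the values visible in this dump (the real codebook may define more), and `validate_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from this dump (assumption: the
# actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"company", "government", "ai_itself", "none", "mixed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject rows with unknown values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

sample = (
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]'
)
rows = validate_batch(sample)
print(len(rows))  # 1
```

Validating against a closed value set catches the occasional off-schema label the model emits before it silently corrupts downstream counts.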