Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- I think you might be misunderstanding the Luddite remarks. It's worth rememberin… (ytc_UgxieyF0t…)
- AI is being over hyped just like the last doc said. The future is the use of rob… (ytr_UgwwAcpo5…)
- In my opinion most people wouldn't sacrifice the inspiration and emotion of real… (ytc_UgxxdOQfV…)
- If talking fluently made something conscious, then Google Search would have feel… (ytc_UgzYbddIM…)
- Shad hates artists so much. If he gets what he wants and artists only use AI the… (ytc_UgwB0oLqG…)
- Hahahhaaha it’s funny because the entire car was spoiled but the robot (that is … (ytc_Ugw2Uv26k…)
- Even I’m smart enough to not invent something that can kill me. Turn that crap o… (ytc_UgwdErKH9…)
- I think the male wants to take over the world and is very aggressive. And I beli… (ytc_UgzJSMWqh…)
Comment

> https://youtu.be/giT0ytynSqg 36:40 I'm convinced he is talking about Sam Altman, all his talks are pure marketing, I can't listen to the lies of this guy, he is just glorifying his own product (open AI). Hiding how much power he will gain (shortly), He will be able to displace millions of people out of work, hack every government, win every war, hack every bank, and spread every virus. However, he's presenting it as a victory for humanity, but it's not; it's his personal takeover of the world.

Source: youtube · AI Governance · 2025-06-19T19:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxNCaPt7z11rtKrGDB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxQl3Qd1EWZTlvN9PZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwACsVN_QZ2E5-Fi1F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxIpXMZV3J7grTpo6F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgygCYja-bSu55NHYS94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxFhNMTfW4NxqIZhrd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzWxbyjKtKd7QDT-lh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzCM2_JhqBy08TRMeF4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxGnK6bsfLiNrt4uSJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyriiNfiEYnt3cdWlV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]