Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding directly by comment ID, or inspect one of the random samples below:
- "And the robot is made from the same material as the truck so we basically fu*ked…" (ytc_UgwesyPMs…)
- "The Ai lol, so many greedy rich people falling for this scam. Alot of big money …" (ytc_UgxzmmYqo…)
- "How is AI machine learning different than people using references or learning fr…" (ytc_UgxpleXwR…)
- "People are already stupid. AI will only make the masses more ignorant and the po…" (ytc_UgxNMlQ7s…)
- "For everyone who says "algorithms/math can't be racist", I remind you of the old…" (rdc_fam8qex)
- "Sir, you are building our own demise. There are no regulations set by any countr…" (ytc_UgySE8pcb…)
- "The Old Orange Dude told us the jobs are returning to the US to be done by AI or…" (ytc_UgwSIptWv…)
- "Humans have no moral compass. We are all selfish by nature. We all have desires …" (ytc_UgyNDMH5A…)
Comment

> On the one hand, people are all about convenience, and if there is a way to do something easier, they will. But in doing so, they also give up control. One other hand, there are people who refuse to give up that control and won't use a service/technology because they don't value it. (one example: self driving cars). For me, I have zero interest, even if the vehicle had the capability.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2025-09-05T01:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
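The four coding dimensions above can be checked mechanically. The sketch below validates a coding dict against per-dimension value sets; note these sets are inferred only from the values visible on this page, so the real codebook may define additional categories.

```python
# Allowed values per dimension, inferred from the codings shown in this
# section (assumption: the actual codebook may include more categories).
SCHEMA = {
    "responsibility": {"user", "developer", "government", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding fits the schema."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

example = {"responsibility": "user", "reasoning": "deontological",
           "policy": "none", "emotion": "approval"}
print(validate_coding(example))  # []
```

Running this over every parsed coding surfaces any off-schema value the model invented before it reaches the database.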
Raw LLM Response
```json
[
  {"id":"ytc_Ugy63ZKCbWfbqASu9NR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwy5a5IX2BvVAhIcCl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy5AOyMamVIiVlBSiZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2uGdEZlaXKlRo7GN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugyye3Dp292FpsS63MR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwxWzuaWdA3u_4r1Xx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyJgsfbYOoqOofUIEF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugyj7aGqzAs55OvhKZV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxAlIGkUTxQLZkqfl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyko0x_KV2PnQzeeJ94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
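Because the raw response is a JSON array keyed by comment ID, lookup by ID reduces to parsing the array and building a dict. A minimal sketch, assuming the model returns a bare JSON array as shown above (the `raw_response` string here is a one-entry stand-in for the full output):

```python
import json

# Stand-in for a raw LLM response: a JSON array of per-comment codings,
# one object per comment, as shown in the section above.
raw_response = """
[
  {"id": "ytc_Ugyye3Dp292FpsS63MR4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the codings by comment ID."""
    codings = json.loads(raw)
    return {row["id"]: row for row in codings}

by_id = index_codings(raw_response)
coding = by_id["ytc_Ugyye3Dp292FpsS63MR4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user approval
```

A real pipeline would also need to handle responses where the model wraps the array in markdown fences or prose before `json.loads` can succeed; that cleanup step is omitted here.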