Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- I'd be screwed I say scary things that I've never done to them because I find sc… (`ytc_Ugy48ehtX…`)
- Haha, fair enough! I’ll be here when the 7 days are up—ready and waiting. Stay a… (`ytc_UgwV_P2Wg…`)
- This oinker makes me realize that evolution favors AI...This thing is the poster… (`ytc_Ugyv_PAeO…`)
- AI will never be able to become self aware. It's a finite algorithm. Human minds… (`ytc_UgxLKp0IR…`)
- I think they’re afraid because they know how to stay in power without AI but may… (`ytr_UgxRotpbh…`)
- CEO think coding is like some people trying to do excel magic. It's 10++ times … (`ytr_UgzXIWB_S…`)
- I know parents as a whole won't change, but anyone who's planning on having kids… (`ytc_Ugz_fECqg…`)
- They are not doing it right. It will eventually work when they get it right and … (`ytc_UgwFBbd86…`)
Comment
Good video. On one hand, it would be great if AI, robots, drones, and other forms of automation could take over all labor for Americans, but that also creates a new issue of 100% unemployment and potential zero income. I believe that the wealth generated from this will be so significant that everyone will be provided with a basic income, likely enough to sustain a good life in the US. Saudi Arabia and Kuwait already do this with their citizens. A single income source—oil in the Middle East and general labor in the US—creates great wealth but also requires good governance. It'll be interesting to see what happens as automation grows exponentially.
youtube · AI Harm Incident · 2025-09-18T11:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwfvHJFGdL5RP3q2ZN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxIP6wiNowuw5UMjkR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyk3FVM3QE7cZrluS54AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgykmDoxcCadx-BAIih4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwA2BFNRTwSnrK-Ln14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyNcjvXqGeVm_oWRWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgygOJ2r6FEFwxd-szh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyQYgL8ImgmfZNfKl14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzsbkQknES4OR9pkXt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzs0LuKYi9cLdp4cgx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"resignation"}
]
```
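Before raw batch output like the above is stored as coding results, it helps to parse and sanity-check it. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the samples shown here (the real codebook may define different or additional categories), and the function name `validate_batch` is illustrative, not part of any tool shown on this page.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above -- an assumption; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"government", "ai_itself", "user", "distributed",
                       "none", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "ban", "liability", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference",
                "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response in the same shape as the batch above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"approval"}]')
print(len(validate_batch(raw)))  # → 1
```

Failing fast on an out-of-vocabulary value catches the common failure mode where the model invents a new category label mid-batch.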