Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples

- "I am vehemently against self-driving cars and here's why. 1) Self-Driving Cars…" (ytc_UgxK8dWWD…)
- "I think this all depends on who creates and trains the AI model. A lot of these …" (ytc_Ugy9kpGXd…)
- "The comments section is disappointing. It reminds me of the old times, when peop…" (ytc_Ughae_Q7R…)
- "Can it perform my coding task and collaborate with my colleagues? Maybe even hav…" (ytc_UgwW5I5BY…)
- "How can AI have visceral feelings without viscera? There is no ‘gut reactions’ w…" (ytc_UgwSjDiBo…)
- "No they say stuff like this because they dont want ai to solve world problems. R…" (ytc_UgxnQvUBE…)
- "gemini can do that shit??? like generate images? its still..well..stupid as heck…" (ytc_UgwdYKm2k…)
- "Such a ludicrous statement. Why bother with something that will destroy us? What…" (ytc_UgyeT6_WF…)
Comment
Usual CEO claptrap. They have to keep the hype going to keep getting the investment funds, and keep the AI financial boom going. If you let it slow down, reality sets in and, like every other bubble in the past, it will burst. AI will speed up certain routine software tasks done by jr software programmers. It will assist senior people in doing parts of more complex tasks. There are more substantial system architecture tasks where AI is more hindrance than help. The AI CEOs are the last people to listen to to understand the future of AI. None of these companies are making serious revenue yet. The boom is just market churn and it's going to crash soon.
youtube
2026-02-06T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyto9ptKRbtLb3ilit4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwcj3c6_jV5J1AQvWt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz6okYPmU_Y-mrj5q94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyeQo42mVMzXrGpXaB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwBhZsnzAJ8vVrYq8J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyiWEKdr3i43xgQL4F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy9TixES5hDJvrqDPR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxC5R-43fUn2nYhDoJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzkHLrkFKHNXtDqjbx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwdq_Dnimco9WTi8Z14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
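A raw response like the one above can be turned back into per-comment coding records with a small parser. The sketch below is a minimal example, not the tool's actual implementation: the allowed value sets are inferred from the sample output shown here and the full codebook may define additional values, and the `parse_llm_response` function name is hypothetical.

```python
import json

# Dimension vocabularies inferred from the sample response above;
# the real codebook may include values not seen in this batch.
ALLOWED = {
    "responsibility": {"distributed", "company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "ban", "none", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Rows with an unexpected comment-ID prefix or an out-of-vocabulary
    code value are dropped rather than silently kept.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row.get("id", "")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if comment_id.startswith("ytc_") and all(
            codes[dim] in ALLOWED[dim] for dim in ALLOWED
        ):
            coded[comment_id] = codes
    return coded

# Example with a made-up comment ID:
raw = (
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"industry_self",'
    '"emotion":"indifference"}]'
)
records = parse_llm_response(raw)
```

Validating against a fixed vocabulary is useful here because LLM coders occasionally emit labels outside the codebook; dropping (or flagging) those rows keeps downstream tallies clean.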