Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- 😑 0:35 Had this exact same response from someone who couldn't draw about how gr… (ytc_Ugx8VSvA1…)
- As both a historian and someone who is interested in the business and logistics … (ytc_UgwEfLD86…)
- the robot does have lovely teeth they have a good dental plan in the roboto fist… (ytr_UgwFr6Kbr…)
- Gemini is full of severe bias. Linus and Luke only scratched the surface of the … (ytc_UgwpGAr4i…)
- To be fair, they don’t need to be trained to recognize that the human species is… (rdc_o7pzjo8)
- It sounds like you're feeling pretty strongly about something. If you’re interes… (ytr_UgwWmlmYF…)
- The real threat from AI is to people who decide not to think for themselves. Stu… (ytc_Ugz0K5g_V…)
- Plot twist - we’re watching two AI’s regurgitate a well rehearsed script. Top t… (ytc_UgzVvRqlc…)
Comment

> I think it's sad that humans would not write code. I built electronics and wrote a program by tapping wires for ones and zeros. Then I used assembly. Then a high level language. Then my own language. I like to think I could do it again. Sometimes I look at the assembly. I like hand made stuff, all done by one or a few humans. I liked to grow my own food. It's called resilience, that's being a good human. Even tech guys have a word for it: "vertical integration". They think it is good for their companies. However, they want to destroy our integrity, our resilience, they want to make us lazy and dependent so that they can profit. With luck, we may use AI to make us humans better, and design a system in which everyone becomes more equal and has access to more resources. I would have no worries about AI if it and the profits it generates were in the hands of people, not companies.

Source: youtube, 2026-02-08T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyto9ptKRbtLb3ilit4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwcj3c6_jV5J1AQvWt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz6okYPmU_Y-mrj5q94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyeQo42mVMzXrGpXaB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwBhZsnzAJ8vVrYq8J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyiWEKdr3i43xgQL4F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy9TixES5hDJvrqDPR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxC5R-43fUn2nYhDoJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzkHLrkFKHNXtDqjbx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwdq_Dnimco9WTi8Z14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
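The raw response above is a JSON array mapping each comment ID to its coded dimensions. A minimal sketch of the "look up by comment ID" step, assuming this array shape; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown, while the function and variable names are illustrative, not from the tool itself:

```python
import json

# A one-row excerpt of the raw model output shown above.
# The field names are taken from the response; the rest is a sketch.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugy9TixES5hDJvrqDPR4AaABAg",
   "responsibility": "none", "reasoning": "virtue",
   "policy": "none", "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_comment_id(RAW_RESPONSE)
coding = codings["ytc_Ugy9TixES5hDJvrqDPR4AaABAg"]
print(coding["reasoning"], coding["emotion"])  # virtue resignation
```

With such an index in hand, rendering the "Coding Result" table for any inspected comment is a single dictionary lookup.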