Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples (click to inspect):

- “The argument in “ai is accessible” calls it an “exploration of art” when they ar…” (ytc_UgxJzoMdg…)
- “While it is true AI models infer your input to generate code based on data they …” (ytc_UgxA6jgyB…)
- “Ok first of all it's LLM and not AI but anyways: As a programmer who can't draw …” (ytc_UgwAKoMiZ…)
- “@whitezombie10 it's still very standarized and repetitious, probably it can ben…” (ytr_Ugx17xp6i…)
- “Lets pray for 10 Ai girl in Iran. Hope Iran not kill this Ai (images) girl... ha…” (ytc_UgydFN0AP…)
- “10:57 My Google is broken. I can never find amazing reference images like these.…” (ytc_UgxgMsoo4…)
- “It’s more important that who programmed the AI because everything it says is wri…” (ytc_UgzvjirR0…)
- “problem is .01% of people will use it like you describe, and 99% of people are g…” (ytr_Ugz8F1sy-…)
Comment
Imagine saying on the birth of the automobile industry that this would wreck business, and people will not use horses anymore, and people will not walk again. The creativity driven by human beings will only mean that, like the motor car, we will can use AI to go much further in a quicker amount of time. That doesn't mean with every evolution the car /AI won't get better - but sticking with a house and cart because you are in a fear change moment, then you will not drive improvement, you will drive stagnation.
youtube · Cross-Cultural · 2025-09-28T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
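The coding result above is one record in a fixed four-dimension schema. A minimal sketch of validating such a record in Python follows; note the allowed value sets are only those observed in the responses on this page, and the full codebook may define additional categories:

```python
# Allowed values per dimension, as observed in the responses on this
# page. The actual codebook may define more categories (assumption).
SCHEMA = {
    "responsibility": {"none", "company", "distributed", "developer",
                       "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "fear", "mixed", "indifference", "resignation"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "approval"}
print(validate(record))  # []
```

A check like this catches the occasional off-schema label before the coded records are aggregated.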
Raw LLM Response
```json
[
{"id":"ytc_UgwFDNe-Hxy78XRU3kR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3OPrB1ZJ0slvPBN54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxvnb5yz6n4zQgfBV94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzwC8hc_uoGZk3534x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz04vpRSftOr6Iupl14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxhPHuC4Qh8OYervld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyoICrIfVSyTQ-h1-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0qXpaXSwyPI2cR014AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyCByvY2b77XmMiy9J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxqKRk8sweXtYTCPKJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
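The raw LLM response is a JSON array of coding records, one per comment, which is what makes lookup by comment ID possible. A minimal sketch of that lookup, with the array truncated here to two records copied from the response above:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (truncated here to two records from the response above).
raw = '''
[
{"id":"ytc_UgwFDNe-Hxy78XRU3kR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzwC8hc_uoGZk3534x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
'''

records = json.loads(raw)

# Index the records by comment ID for the "look up by comment ID" workflow.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgzwC8hc_uoGZk3534x4AaABAg"]
print(coding["emotion"])  # fear
```

Building the index once turns every subsequent ID lookup into a constant-time dictionary access rather than a scan of the array.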