Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugxgdvbde…`: "Ok, why the hell do cops need to use AI and can't doctors just look over a patie…"
- `ytc_UgwADAF_i…`: "People need to have their own side businesses and multiple income streams. Also …"
- `ytc_Ugwz-Mcnr…`: "There is no way the robots are that far advanced. This has to be AI at its fines…"
- `ytc_Ugz_02E94…`: "It is boring and predictable. It just past time or just shit and giggle, tinkeri…"
- `ytc_UgySE9ITB…`: "2:22 also the definition of “art” has the word human in it so ai can’t create ar…"
- `ytc_Ugwryk8hd…`: "I don't understand why AI developers feel the need to harass and physically acco…"
- `ytc_Ugwd2KOjs…`: "It eill take decades, after we stop wars, build quantum computers and androids, …"
- `ytr_Ugx7FgHkc…`: "Ai steals everything and you live under a rock or are totally delusional to not …"
Comment
What happens when all the white collared people who are out of jobs, start learning/applying for blue collar work and now that field is saturated? AI will be bad for the white and blue collar workers. It will destroy the economy and lead to unrest if it causes mass unemployment.
youtube
AI Governance
2025-08-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw8MnH6lBQj5yCeJIt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzj-w_uTNnSCERZB0p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz7XAuglxBOe3j-xfN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzFjsWLpFxhdQ5GW3h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwghsx7NXoH1kA7wFt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgyshW-sF7xXVtnzOEh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxzr9gwiwfwBxfYX3d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugwk7drRFAAv4U5EUmN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwiPt6WWthRKTn3DGt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy2JZdm_zmjLFb0POp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
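A raw response like the one above is a JSON array of per-comment codes across the four dimensions shown in the table. A minimal sketch of how such a batch might be parsed and validated before storage; note that the allowed value sets below are inferred only from the codes visible on this page, not from the tool's actual coding schema:

```python
import json

# Allowed values per dimension, inferred from codes visible on this page
# (the real schema may include values not seen here).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed codes."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Skip rows without a recognizable comment-ID prefix.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Keep the row only if every dimension has an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugw8MnH6lBQj5yCeJIt4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"unclear","emotion":"outrage"}]')
print(len(parse_batch(raw)))  # 1 row passes validation
```

Validating against a closed value set like this catches the most common batch-coding failure mode, where the model invents an off-schema label for one dimension; such rows can then be re-queued for recoding instead of silently entering the results.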