Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI takes insane levels of power, ie burned coal to run. It’s a terrible waste of…" (ytc_UgwvDzh0P…)
- "@halycon404 autotune is a slight change, while ai images is taking words and mak…" (ytr_Ugyd1XcSf…)
- "What annoys me are the armchair experts that go around with a 5 minute google, r…" (rdc_j8uu2fc)
- "Im only ok with ai art if you're not using it to make money, using it as a place…" (ytc_Ugx1CDZTt…)
- "AI will make some jobs obsolete while simultaneously create opportunities for ne…" (rdc_ktt2z7j)
- "man leave them alone i know it's ai and all but come on at least say to stop sel…" (ytc_UgyG5Q6T8…)
- "I don’t think doctors will necessarily be automated by AI….but…with the advent o…" (ytc_UgwZgolEa…)
- "Bro: / If it’s a girl, girls be like ai and boys be like human / If it’s a boy, boys…" (ytc_UgybQk5xh…)
Comment
GH doesn’t seem to like Elon and is v soft on Sam Altman!
Elon has repeatedly called out for AI regulation but not Sam who has made OpenAI for profit!
I would like to know what “bad things” Elon has done?
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-06-19T22:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOcOrRWPRTaamoCfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz_6BF-qrF0d3wK-Xd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxEv_dkZdKmKyNQdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxonceApChfecRu5Sx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz-V0oVp9gOBQh-P3h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxHDx1FQ34JIbG_o3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzn9GQJVKYRowjdIwF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8EDh61b0lrGVreKF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyexkVWruXB2a5eo6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxrmB9BMl4FlZiW_Md4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"}
]
```
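The raw response is a JSON array of per-comment records, so looking up a coding by comment ID reduces to parsing the array and building an ID-keyed index. A minimal Python sketch (variable names are illustrative, not from the tool; two records excerpted from the response above):

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_UgyOcOrRWPRTaamoCfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugzn9GQJVKYRowjdIwF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# Parse the batched response into a list of dicts.
records = json.loads(raw_response)

# Index by comment ID so any coded comment can be fetched in O(1).
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugzn9GQJVKYRowjdIwF4AaABAg"]
print(coding["responsibility"], coding["policy"])  # developer regulate
```

Each record carries the four coding dimensions (responsibility, reasoning, policy, emotion) alongside the comment ID, matching the Coding Result table above.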