Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "6:50 The AI is saying „Uhm“ as a filler and stuttering several times. This is co…" (ytc_UgzmVsFtW…)
- "Meanwhile Mexico’s president said it’s too soon to congratulate Biden and he wan…" (rdc_gbj2pf3)
- "Great video. Same with coding, the boilerplate will be provided by the IA, but t…" (ytc_UgzJ9VnyU…)
- "Wow the robot can say human is not concious? They r concious but they ran away a…" (ytc_UgwEdgTlL…)
- "AI is demonic. Its already too late. The devil made it to earth already. And is …" (ytc_Ugx-3aVu7…)
- "I went on vacation there.... Never knew it existed but totally changed my life..…" (ytc_Ugx2LFrkU…)
- "Electronic music is based on previously recorded or created sounds (by someone)," (ytr_UgwdoIsJ4…)
- "yooo I was chatting with this toxic boyfriend ai and by the end he became such a…" (ytc_UgxVqn8_m…)
Comment
Why does Geoffrey Hinton think electric cars are good? Surely he doesn't still believe in anthropogenic climate change?? Since he already told us he's part of the BBC / Guardian / NYT classes, he probably does. Which means he's bright only in one extremely limited area. As for jobs, this morning I went to the supermarket and found they didn't HAVE anything I wanted. Talked to one of the shelf stackers, who said maybe it's out back, they refuse to hire enough people to keep the shelves stacked (and they treat those of us who ARE paid much like koala bears). Oh I said, and what do koala bears eat? Leaves, he said. I'm with the people who think AI is not something to be worried about.
youtube · AI Governance · 2025-06-21T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxHVyeheBhcnEMSjEx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxK8dt5g5CtG9tusMF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyi6DO9Ca6WpfJkq1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyoVn277vDIxMMZxi54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy64kS0BCVecSoiMJN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx13Xgi1vSpytmU8BJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxDi_trf9sYJX-rA914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwJnuF-SYAb0ejmCml4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxQZOtXqDMbp-czDtp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwmFOU8zkOqu3MilRR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
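A raw response like the one above is a JSON array of per-comment codings, one object per comment ID. It can be indexed for lookup with a few lines of Python (a minimal sketch; the field names and ID format match the JSON shown, but the inline sample data is abridged from the response above for illustration):

```python
import json

# Raw LLM response: a JSON array of per-comment codings,
# in the same shape as the response shown above (abridged).
raw_response = """
[
  {"id": "ytc_UgxHVyeheBhcnEMSjEx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxK8dt5g5CtG9tusMF4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
"""

# Index the codings by comment ID so any coded comment can be
# looked up directly, as the page's "Look up by comment ID" does.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxK8dt5g5CtG9tusMF4AaABAg"]
print(coding["emotion"])  # mixed
```

Missing or malformed responses would need handling in practice (e.g. catching `json.JSONDecodeError`), which this sketch omits.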