Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing the random samples below.
- `ytc_UgwzML4Kh…` — "As a person who appreciates technology and is by nature a mechanical engineer I …"
- `ytc_Ugy3pa6AQ…` — "i am not against ai art(not a fan either tho), but i totally NOT buying 'disable…"
- `rdc_fapkgug` — "Interesting how only the kids matter, isn't it? Setting up a shield that nobody …"
- `ytc_UgxuWXSkz…` — "bruh ... its funny ... that AI thinks its human and its making elementary AI and…"
- `ytr_UgzWxyYz0…` — "Wait until most stuff you see online will be AI and you won't be able to tell wh…"
- `ytc_UgwW9NyN-…` — "People are functionally fixated about AGI. You do not need AGI to destroy the pl…"
- `ytc_UgwxSTB24…` — "Oh, I am so surprised, says the retired IT guy who was among the millions of oth…"
- `ytc_UgziJvynv…` — "That robot keeps throwing the devil horns hand sign 🤘, that should be a pardon t…"
Comment
The problem with AI is that humans are training it, which means stupidity and irrationality are options for AI to spit out at you and you cannot necessarily trust the answer it supplies. I suppose where the worry might be is that AI gets to the point of training itself without human corrections and begins to understand that humans are the problem.
Source: youtube · Topic: AI Governance · Posted: 2025-07-22T13:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwCViCZtRvFR_Xna154AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqT4Sp7DeFy7K6PcN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwHx4MIwe3eqH42Wgx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwghoTCalLIceT9Hpl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugw-Uvo-pJF27eJmM-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzmgGI4arBv10qzXVV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwa9QSWH4TsNMzr9wd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxpvSWTL4CJK874eqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzoj8iMsH25FpkCYGl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5NiYQa3RnNfl2rhF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
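A raw batch response like the one above can be turned into per-comment coding results by parsing the JSON array and indexing on the `id` field. The sketch below is a minimal, hypothetical example of that step (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response shown; the function name and everything else are assumptions, not the tool's actual code):

```python
import json

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch-coding response (a JSON array of objects,
    each with an 'id' plus coded dimensions) and index it by comment ID."""
    rows = json.loads(raw_response)
    # Drop the 'id' key from each row; keep only the coded dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

# Example using one entry from the response above.
raw = '[{"id":"ytc_Ugzoj8iMsH25FpkCYGl4AaABAg","responsibility":"developer",' \
      '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
codings = index_codings(raw)
print(codings["ytc_Ugzoj8iMsH25FpkCYGl4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded comment's dimensions can be fetched in one dictionary access.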