Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect

- "AI is just another form of violating people's privacy. And the people who made i…" (ytc_Ugz5r7a0g…)
- "I believe you are correct on this concern. I’m afraid the same people that have …" (ytr_Ugy6QwZWp…)
- "AI itself is not the problem. We are. AI is neither “good” nor “bad.” It’s a too…" (ytc_Ugx5d1E0W…)
- "It’s so cute that there are people that think that this will be a human & AI bud…" (ytc_Ugyp74EnM…)
- "I bet the AI knows more grammar than most humans like you. (My mistake was one l…" (ytr_UgzJpj8wR…)
- "I am no mathematician nor computer programmer or any other super smart person. B…" (ytr_UgxnI_GYz…)
- "id rather deal with a robot than some outsourced indian customer service whose d…" (ytc_UgzQXMTFU…)
- "See, that's not Google AI that has become sentient, AI " Spirit " Pre-Existed co…" (ytc_UgwdLCbBB…)
Comment

> remember the game pong? and the first Atari that used AI? I thought to myself one day playing Missile Command that the world was doomed. The biggest problem is people not figuring out problems and relying on Ai to teach them and do things for themself.

| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2025-08-13T14:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id": "ytc_UgwiiAT3w22tKQxrSWR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw-51nB-WYUhAFEG-F4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzJVv-BDmSBHH-wzcl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugyjxucyba9vV9bIjKB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxYuqjHdXZ46TMRbHB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzZ_JSUhjdREhewDE54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxKpVtuTsFzuvyanvx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzIUsoqmR2G_BbCTIZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxpIf_yr33iQkzEF0l4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugxl3320fGFeLEQICed4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
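The lookup-by-ID view above amounts to parsing a raw response like this one (a JSON array, one object per coded comment) and indexing it by the `id` field. A minimal sketch of that step, assuming the schema shown in the Coding Result table; the `index_by_id` helper and the single-row sample string are illustrative, not part of the actual tool:

```python
import json

# A raw LLM response in the format shown above: a JSON array of coded
# comments. Trimmed to one row here for brevity; the ID is taken from
# the last entry in the response above.
raw_response = """[
  {"id": "ytc_Ugxl3320fGFeLEQICed4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
row = codes["ytc_Ugxl3320fGFeLEQICed4AaABAg"]
print(row["emotion"])  # resignation
```

A real pipeline would also want to guard the `json.loads` call (model output is not guaranteed to be valid JSON) and check that every expected dimension key is present before storing the row.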