Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
After having played with AI for a month, I seriously do not see it replacing art…
ytc_Ugx15RNkV…
*Me in the break room*
Yo robot I know it’s your first day but you gotta be mov…
ytc_Ugyxv4NWl…
OpenAI makes money. and media owners make money. average people can only rely on…
ytc_UgxROZtvx…
We need to stop using AI. It's slowly killing our polar bears, and other animals…
ytc_UgwqkDBfw…
Chat gap slows me down. My assistant has to do the work. Too much irrelevant i…
ytc_UgznGA-u1…
tbh no one cares, AI is fine. It makes existing technologies way, way better. Lo…
ytr_UgyP-hggf…
@user-mm5cd4kv2t Yes, LLMs are machine learning but not all machine learning is …
ytr_Ugz0OcuNg…
1. There is no strawman here. Deal with it.
2. Laziness is one of the main justi…
ytr_UgxmK6zjM…
Comment
□° My opine: it is a tad bit late for mr. sam altman & his likes, to be disIngenuously worried about the extreme already predicted hazard~threats that AI (artificed intrusions~aggressive invasions..alleged innovations..) pose to the betterment of life for the average, global; organic skinned, oxygen breathing human (and nonHuman) earthling!□°○ Private tech ceos (almost 100% nonScientific agenda'd or degree'd) are mainly concerned about: making money luchre; investing in shady dubious crypto currencies...shoring up "free-time"...so that they, private tech ceos; will have more time to play 24/7 mucho video meta verse games...while otherwise doing absolutely "nothing"!□°•deb out!!◇°
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-05-17T18:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
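A coded comment like the one above can be held in a small typed record and checked against the codebook before it is stored. A minimal sketch, assuming the category sets are exactly the values observed elsewhere on this page (the full codebooks may contain values not seen here):

```python
from dataclasses import dataclass

# Category values observed in this page's samples (assumption:
# the real codebooks may define additional categories).
RESPONSIBILITY = {"none", "company", "developer", "distributed", "ai_itself"}
REASONING = {"unclear", "consequentialist", "deontological", "virtue"}
POLICY = {"unclear", "none", "ban", "regulate", "liability"}
EMOTION = {"approval", "outrage", "fear", "resignation", "indifference"}

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        """True if every dimension carries a known category value."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

# The record shown in the Coding Result table above:
result = CodingResult("ytc_UgyGAHe0tauj1OsbyR14AaABAg",
                      "company", "virtue", "regulate", "outrage")
print(result.is_valid())  # True
```

Validating at ingest time catches the occasional off-codebook label the model invents, before it can skew downstream tallies.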
Raw LLM Response
```json
[
  {"id":"ytc_Ugz9V4OHAROGAKiom0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz60Gho3GHoN7idMB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxDBBNy2gR27gsQsH94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyGAHe0tauj1OsbyR14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzKGYlGuJ6okuIeZuJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgytykIe-b8DsmwskP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxydU-TzdSCpt_nDah4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy8Ew39V6gun_D87ep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugyq1KsteNBOpHQiEF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwWQKOrY-3zd9n36KN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
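The raw response is a JSON array of per-comment records, so the "look up by comment ID" view above amounts to parsing the batch and indexing it by `id`. A minimal sketch, assuming the response format shown (the two hard-coded records below are copied from the batch above; a malformed model response is handled by returning an empty index):

```python
import json

# Two records copied from the raw response above, used as sample input.
RAW_RESPONSE = """[
  {"id":"ytc_UgyGAHe0tauj1OsbyR14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwWQKOrY-3zd9n36KN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index its records by comment ID.

    Returns an empty dict if the model output is not valid JSON,
    so a single bad batch does not crash the inspection view.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return {rec["id"]: rec for rec in records if "id" in rec}

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_UgyGAHe0tauj1OsbyR14AaABAg"]["policy"])  # regulate
```

Keeping the raw string alongside the parsed index lets the page show both the exact model output and the per-comment coding it produced.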