Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Would you be ok with ai art if it was made from a model trained from purely cons…" (ytc_UgyDVPhyD…)
- "How is this possible???…I only just subscribed to you today but what you’re talk…" (ytc_UgxnEF1su…)
- "0:04 These damn AI are taking our jobs, back in the day you used to prompt the A…" (ytc_UgwVB0_qf…)
- "First of all, this whole idea of “contributing to society” is a made-up con. It’…" (ytc_UgxctfSLN…)
- "So the self driving cars are going to drive who? The human won't need the car be…" (ytc_UgyD1I488…)
- "Yes but ai gets it wrong A LOT. And to whom do you appeal when your very subject…" (ytc_UgyhzywV0…)
- "And it's only going to get better from here. In 5 or 10 years, all warehouses wi…" (ytc_Ugx_p83Df…)
- "This is completely different way how fundamentally AI is changing or going to ch…" (ytc_UgxoQEw8h…)
Comment
He’s absolutely always acknowledged that AI could obviously cause problems in society. As always CNN is fear mongering as if he’s had a change of heart. Sam Altman now, and has always maintained that AI will probably be the most positive single force ever to impact humanity but that it could potentially be very dangerous. CNN is garbage.
youtube · AI Governance · 2023-05-27T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxIOUhzOvxeMwSmu8p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKcWN23-q6AdJrUvt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyqyreWwpWDi-JyiyR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzDAoiANXXAow-R7VB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyZijRLRZacFDp3X5p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzb7-dPm09rBHTAA4R4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyDjN6JedRBua0VqN94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2H16t3w9-PraEnjB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzymCbIOpEtb8PRloV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpUQGT4TuJnnuCEt54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
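A raw response like the array above can be turned into usable coding records with a small validation pass. The sketch below is a minimal, hypothetical example: the allowed category vocabularies are inferred only from the values visible in this sample, not from the project's actual codebook, which may define more categories.

```python
import json

# Category vocabularies inferred from the sample output above.
# Assumption: the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company",
                       "developer", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments),
    keeping only records with a plausible id and in-vocabulary values."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # YouTube comment IDs in this dataset are prefixed "ytc_".
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every coded dimension must use a known category.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Records that fail validation are silently dropped here; in practice one might instead log them for re-coding.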