Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below (previews truncated, shown with their comment IDs):
- "Bro AI is taking most of the jobs. It’s a universal thing. Become plumber. You a…" (ytc_UgwqPZFHo…)
- "I think you’re underestimating the folks behind you in college. My wife grades p…" (rdc_jvl2604)
- "I have an infa-red camera that can see like daylight .why do they not use them. …" (ytc_UgwxBsy9i…)
- "That is what they're trying to do. Re-opening 3 Mile Island, also I think one in…" (rdc_lp6svvl)
- "So cities should no longer offer tax incentives to Amazon and other businesses l…" (ytc_Ugx8vMr1F…)
- "So... the AI was right about McDaniels. So we should continue to use it. All tha…" (ytc_UgxzltMzc…)
- "current LLM transformer architecture never leads to AGI or ASI. there's no princ…" (ytc_UgyJKc4gA…)
- "If you are reading this, please learn how to use AI to better do your job. You w…" (ytc_UgySmsICL…)
Selected comment
Had Unabomber tried to kill all the leading computer scientists who were working on AI back in his days, which is say 40 years ago, to save all the humanities (according to his manifesto)? Was he crazy? Or was he right? Wondering how people will view Unabomber in 2125? Roubini, a renowned economist, believes that 80% of the US population will be jobless in 30 years thanks to AI.
Source: youtube · AI Governance · 2025-06-25T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxieAyXjJpKA1Lvk-B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSLHwoPLGzsBwBM254AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz7YRLsoRkibqPlxKN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBpFJmWdO9-A2-poJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyDhR7t9dqJu1Mtuc94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgztSgqqDkDX4QMI4td4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxgDGyAc-KhT0sONct4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz6I7ZG3kfd772bK7l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgysNaICClKUlbKPkxR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
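The raw response is a JSON array with one object per comment, carrying the same five fields shown in the Coding Result table (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how a lookup-by-comment-ID could be built from such a payload; the helper name `index_codings` is illustrative, not part of the tool:

```python
import json

# Two rows copied from the raw response above, for brevity.
raw_response = '''
[
  {"id": "ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyBpFJmWdO9-A2-poJ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "liability", "emotion": "fear"}
]
'''

# The five coding dimensions every row is expected to carry.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index rows by comment ID,
    skipping any row that is missing a coding dimension."""
    rows = json.loads(raw)
    return {
        row["id"]: row
        for row in rows
        if EXPECTED_FIELDS <= row.keys()  # set-style subset check on dict keys
    }

codings = index_codings(raw_response)
print(codings["ytc_UgyBpFJmWdO9-A2-poJ4AaABAg"]["emotion"])  # fear
```

Indexing by `id` makes the "look up by comment ID" view a constant-time dictionary access, and the subset check guards against malformed rows in the model output.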