Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "What drives me nuts about all their talking points in favor of using AI, is that…" (ytc_UgwKO7HxX…)
- "50:15 Chuck's question about humanizing AI reminded me of this tragic story in t…" (ytc_UgwQaKgCq…)
- "This ruling will not be static as time goes on AI is going to be in everything. …" (rdc_jwv3e9w)
- "If you areok with ai art, save yourself some money and make it and print it your…" (ytc_Ugy-4hqjK…)
- "Tax companies using AI to abolish human positions in the workforce and give that…" (ytc_Ugx6wjtl5…)
- "What you said is incoherent. First and foremost, talent does not ,and never exi…" (ytr_Ugzpnehym…)
- "Ai will not enhance civilization, it will extinguish civilization. Worldwide pro…" (ytc_Ugxomiv5F…)
- "@HaraRambles i know. this shit's stupid. everyone's shitting themselves and thro…" (ytr_Ugzs-kNTw…)
Comment
Humans never take proper responsibility for the things they create. We figure out how to split the atom and certainly did some good things with that ability, but we also made bombs. We have seen that same kind of thing happen over and over throughout history and this is just another one of those occurrences. If AI is not properly handled, it will put the entire world into recession for multiple years. Companies are way too stupid to make the correct decision here, they will fire employees and replace them thinking they will save money, completely oblivious to the idea that if every company does that they will have nobody to sell to or to provide services for. But that is the level of stupidity that humans have always had. What we need is for scientists to run this planet instead of politicians, they might make the correct choices, but politicians never will.
Platform: youtube · Source: AI Jobs · Posted: 2025-11-09T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[{"id":"ytc_UgwXD0_x9I1SAJzfeIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxF0fjcxn0BD7V0R4J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxqHego48Qd1RhpTr94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugy5Xtg9LCYcaoFwrPZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzIZBLjEM_UJspboft4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyV_TwyZJz2LVrOY_d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzD0P_ZkML8KsOGCV54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgymTlv776WM0wdeEsN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzKcqWquHIGS8S_n154AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzGf6rhjvEZGTqqp_x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
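A "look up by comment ID" over output like this can be sketched as a small parser: load the JSON array the model returned, then scan for the matching `id`. This is a minimal illustration, not the tool's actual implementation; the function name `lookup_coding` is invented, and the sample data below copies just two entries from the response above.

```python
import json

# Two entries copied verbatim from the raw LLM response above,
# as the model returned them: a JSON array of per-comment codings.
raw_response = """[
 {"id":"ytc_UgwXD0_x9I1SAJzfeIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxF0fjcxn0BD7V0R4J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}
]"""

def lookup_coding(raw, comment_id):
    """Parse a raw model response and return the coding dict for one comment ID.

    Returns None if the JSON is malformed or the ID is absent.
    """
    try:
        codings = json.loads(raw)
    except json.JSONDecodeError:
        return None  # model emitted malformed JSON; treat as a failed batch
    for entry in codings:
        if entry.get("id") == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UgxF0fjcxn0BD7V0R4J4AaABAg")
print(coding["emotion"])  # → fear
```

Guarding `json.loads` matters in practice: LLMs occasionally return truncated or non-JSON output, and a lookup that raises on every bad batch would take the whole inspection view down with it.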