Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I think a good comparison to draw from would be alien life traveling great dista… (ytc_UgwT5gqxm…)
- In a sci-fi society, would robot cart pushers really dent ai cars? Also why woul… (ytc_UgyUyPtJV…)
- Stop spreading misinformation, it is not 95% ai, it’s more like 20% as it’s just… (ytc_Ugz38cmdP…)
- /r/im14andthisisdeep Hello, /u/heebro! Thank you for … (rdc_lvao5ko)
- I’d like to know how the ai actually “prefers men over women or whites over blac… (ytc_UgxhtfNAa…)
- ty for saying this!!! i'd even argue the reason why generative AI is being given… (ytr_Ugw1ScEAh…)
- Regulators are for oligarchs to stay in power and allow third parties to get in … (ytc_UgzQcds4o…)
- So cars were meant for luggage, ai is meant for inbreeds. Am i right, you people… (ytc_UgxyQI-JV…)
Comment
This video will come true, except not on the timeline suggested. AI is becoming better and better, but won't replace junior engineers in a year, let alone senior engineers and management. It's full of too many mistakes, can be easily hacked/jailbroken, and at some point the AI bubble will burst. AI companies won't be able to use all the stolen Internet data if even some of the lawsuits are successful, and the cost of massive electricity and water use will catch up, increasing costs astronomically. Give it a few years or less, and AI will slow its ascent.
Still, UBI is an important and necessary future and we should start advocating for it now.
youtube · Viral AI Reaction · 2025-12-31T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
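A coded row like the one above can be sanity-checked before it is stored. A minimal sketch in Python; note the allowed value sets below are assumptions inferred only from the values visible in this dump, not the project's actual codebook:

```python
# Hypothetical validation sketch. ALLOWED is inferred from values observed in
# this dump (responsibility/reasoning/policy/emotion), NOT an official codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"outrage", "indifference", "approval", "fear", "mixed", "resignation"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

# The row shown in the Coding Result table above.
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "fear"}
print(validate(row))  # an empty list means every dimension is in range
```

A row with a missing or out-of-range dimension would come back listed, which makes malformed model output easy to flag in bulk.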
Raw LLM Response
[{"id":"ytc_Ugzf3k6QIbAJ_A7g-1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgxfOfbtDoLg17bbDg14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugy-W6IGbftJDuvuNbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgxoOlCv_acDMjnzTx54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},{"id":"ytc_UgyWw5NuBps1XOGLlc14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},{"id":"ytc_UgwVeZUBBOOuL1AtGyB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgyE_hIwQEWBPDDVF-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgxZVHbvQy-yQSMofYl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},{"id":"ytc_UgwZpYvHgf9DoCHMuqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgzL5rNnCNIt68EKD314AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]