Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This is exactly what I mean when I say that these inventors treat their own crea…" (ytc_Ugw9EqY12…)
- "the ai bro voice is killing me.... Appreciate the humor to help with such a SAD …" (ytc_UgyqWwq2C…)
- "@Maxi_Gamer_2011 A “soul” doesnt exist, thats just what you humans created to ma…" (ytr_UgysZNFQh…)
- "I’ve been thinking about our ai future a lot to and this is more or less where I…" (rdc_jirthxu)
- "If you ever want to rely on the Tesla \"Autopilot\" to drive you completely autono…" (ytc_UgwzeoERG…)
- "It's insane people think that doing art digitally mean you require zero skill. I…" (ytc_UgxubixIo…)
- "😭 why with legal field... I don't think so ai will actually affect it that much.…" (ytc_UgwfbV66H…)
- "This is an easy question to answer. If the work that is being used to train th…" (ytc_UgwG6yqXp…)
Comment
> It's a good thing we're hitting the limits of LLMs, and dumping more compute power isn't yielding significantly better results. He is right about AI as a concept, but this AI ain't it chief. The actual scary thing about what's happening now is because companies wanted to replace workers with AI they made huge investments in it, and once the general population realizes how bad LLMs actually are for everything except research starters for a quick summary on a topic with citable sources, Nvidia will collapse and the stock market will dip/tank with it because of huge failed investments in AI.
>
> Unfortunately, we now know what it's actually good at replacing, and it's actually very effective at it and we should be happy because its great for general knowledge if trained correctly: the encyclopedia.
youtube · AI Responsibility · 2025-07-24T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
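A coded record like the one above can be sanity-checked against the category values that appear across this page's raw responses. The sketch below is illustrative only: the `ALLOWED` sets are inferred from the visible sample, not from an official codebook, and `validate` is a hypothetical helper.

```python
# Dimension values observed in this page's raw responses; the real
# codebook may include additional categories not seen in this sample.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed sets."""
    return [(dim, record.get(dim)) for dim in ALLOWED
            if record.get(dim) not in ALLOWED[dim]]

# The coding shown in the table above passes cleanly.
coding = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "indifference"}
print(validate(coding))  # []
```

An empty list means every dimension carries a value seen elsewhere in the batch; anything returned flags a value the model invented outside the sampled categories.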
Raw LLM Response

```json
[
  {"id": "ytc_UgxJG3T1S4U2_liIfUJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyeZyuya9W4j7GNunJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxVJXuYRAPDwJjxBMh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxlC7Hks6onRot2b9Z4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgyS41CJBdrEDGg6YMZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy3mqOXEyw6qWMqyIV4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxBhaQlcizyw2G-JOl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzgWvS2qzCwX-kNdyx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxfL4NMK2R8k5QJdJ94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxsCYRhp6Ab_lfaTr14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
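The "look up by comment ID" feature above amounts to indexing this batch response by its `id` field. A minimal sketch, assuming the raw response is valid JSON as shown (truncated here to two records for brevity; `index_codings` is a hypothetical helper, not the tool's actual code):

```python
import json

# Two records copied from the raw LLM batch response above.
raw_response = """
[
  {"id": "ytc_UgxlC7Hks6onRot2b9Z4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgxJG3T1S4U2_liIfUJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a batch response and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
print(codings["ytc_UgxlC7Hks6onRot2b9Z4AaABAg"]["policy"])  # regulate
```

Looking up `ytc_UgxlC7Hks6onRot2b9Z4AaABAg` recovers exactly the row rendered in the Coding Result table above (company / consequentialist / regulate / indifference).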