Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
What he meant to say is, yes don't learn to code, in fact don't learn anything, rely on the tools he gives you, so corporations can replace every programmer, engineer, artist, writer, etc, with a machine. Now in a post labor world, things can go mainly 1 of 2 ways, either AI and automation replaces us and we are left to fend for ourselves, to dying homeless, and hungry in the cold, or we go the socialist route and guarantee housing, food, water, healthcare, education and we live a utopic life. What I expect is the former, and that's why I can't support unregulated AI. Capitalism will completely botch any real hope for this technology.
| Platform | Video | Posted |
|---|---|---|
| youtube | Viral AI Reaction | 2024-01-17T09:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyNoEKSdgcplYpOOZx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgypUEk415ASpLbzlTZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx0v4hRBNJmTE9N-rV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzRJMeXeFQ_brKiR6V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz7WClmXj8CUypNbkV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwevkV67URjYQ-djOt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzVclOsXydSueLmSjB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyYOWXJCeDOrKrUhhJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzIKrcCT-aepiFk7IJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyU_QJuleZzqRwQWWp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"disapproval"}
]
```
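The raw response is a JSON array with one object per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of indexing such a response by comment ID so a single coding can be looked up — the `index_codings` helper and the two-entry sample payload are illustrative, not part of the tool:

```python
import json

# Illustrative excerpt of a raw model response: a JSON array of codings,
# one object per comment, keyed by the YouTube comment ID.
raw_response = """[
  {"id": "ytc_Ugz7WClmXj8CUypNbkV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyYOWXJCeDOrKrUhhJ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding dict by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugz7WClmXj8CUypNbkV4AaABAg"]
print(coding["policy"], coding["emotion"])  # → regulate outrage
```

In practice the parse step would also need to tolerate malformed model output (e.g. wrap `json.loads` in a try/except and log the offending response), since the array comes straight from the LLM.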