Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugwkv6WaG…`: Let's hope that AI doesn't get feelings or this man will annoy the AI into a war…
- `ytc_Ugzi5Vas8…`: AI general intelligence is (going to be) so different from human beings, why wou…
- `ytc_Ugwn35CDN…`: To anyone claiming to be an AI "artist": you are not making the art, the AI is. …
- `ytr_UgxDmubMi…`: Yeah, and the title of this video isn't Waymo. It's self-driving cars, so he see…
- `rdc_mul3qkl`: In a different world your friend might have talked with another friend for advic…
- `ytc_Ugzt1tsyB…`: 13:56 What I hate about this topic is that, my answer to that question are artis…
- `ytc_Ughp0m-7O…`: Searle has heard and replied to the system argument that Hank could theoreticall…
- `ytr_Ugzwu6X9v…`: It was obvious from the start. Anything that makes decisions by human coding and…
Comment
A tip for those who want to skip talking to bots on the phone. Don't engage, just start repeating the word "Fuck". One of two things will happen:
One: You'll trigger a routine that makes the system assume you're an angry customer, and it will escalate you to a rep.
Two: Nothing will happen, which will confirm there are no reps to talk to; the company replaced them outright. At least you will know, and you won't waste time trying to reach someone.
This works whether it's an old-school system or a modern AI rep-bot. Cheers!
youtube · AI Responsibility · 2025-10-31T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyYCCkUDoiupQUbFQ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzL0RywhgftlHDxsHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwwobnwNiDZP1ayGAR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwIjhT66IhKp3sBniF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaOIahwIee156lwEl4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJrXvWbFlWGr55cOp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzkZ9Q7yMB9zCr3Ne94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxrIHXPScDTttI7bDx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxTVxpjdNdwL-uqW594AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzS3MgwfxbEvFBG5G94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
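A batch response like the one above can be validated before it is written back to the coding table. The sketch below is a minimal, hypothetical check: the allowed values per dimension are inferred from the entries shown here, not from a published codebook, so the real schema may include more categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed or off-schema rows."""
    rows = json.loads(raw)
    for row in rows:
        # Every row needs an id plus one value per coded dimension.
        missing = {"id", *ALLOWED} - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')}: missing fields {missing}")
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
    return rows

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
rows = validate_batch(raw)
print(rows[0]["responsibility"])  # company
```

Rows that fail the check can then be routed back to the model for re-coding instead of silently landing in the results table with an unknown category.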