Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgzUcqlWI…` — "I'd say take a gander at this: https://www.youtube.com/watch?v=iBouACLc-hw. You …"
- `ytc_UgxD3EdNf…` — "Thank you for this great interview. This is a great no nonsense talk about AI.…"
- `ytc_UgzCu9zCw…` — "I think we need to realize that having a job is simply synonymous with access to…"
- `ytc_Ugxi0dF1n…` — "I tested this out of curiosity, not malice. Using only two of the supposedly “po…"
- `rdc_m1xx5az` — ""In a Nov. 18 letter filed in federal court, attorneys for The New York Times na…"
- `ytc_UgyhYk93Z…` — "You could just use the ai art as a helpful tool and help people who struggle to…"
- `ytc_Ugzi7K2E_…` — "I, for one, welcome our new robot overlords. (If you don't the SuperIntelligence…"
- `ytr_UgzbJMrK3…` — "All it does it add a tiny bit extra work to the generator, when this video was p…"
Comment
I'm pretty sure beaurocratic paperwork was written by AI. Some of it is so oblivious to the human experience that it tries to make us fit more robotically than reality allows. It's hard enough putting up with beaurocracy without this garbage. It might as well add a checkbox to the paper forms for anything that says "I am not a robot."
We run some extra risks to our human experience and ability to live when AI seeps into these realms. For anything humans need to live, things like this should be by humans, for humans. I can't expect such machines to rationalise how to relate enough for our needs to be met.
youtube · AI Responsibility · 2024-06-03T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxF2TeyK8PblkhSJAB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQoI9Xrf6UnyVeQHJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgylcPQm_lUOYN5MqUR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQ9Yun6L8Q4UUjjbV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyvrkGjzDdzOR2hoBV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyRFJot5oaSKu4q29x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzIGopTON7GiiKwp0p4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzjelLRe6ydlyCoDbl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx-CVlaPUS3QoEhOpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxC4q_KgcyrVDe3Qkx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
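A raw response like the one above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed and validated is below; the allowed values are only those observed in this sample (the full codebook may include values not shown here), and the function name is illustrative, not part of the tool.

```python
import json

# Allowed codes per dimension, as observed in this sample response.
# Assumption: the real codebook may contain additional values.
CODEBOOK = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array) into {comment_id: codes},
    rejecting any row whose value falls outside the observed codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded
```

Keying the result by comment ID makes the lookup-by-ID view above a simple dictionary access, and failing loudly on an out-of-codebook value catches malformed model output before it reaches the coding table.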