Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- "I am not worried about ai taking away from us the uniqueness of the creation pro…" (`ytc_UgzTUmaNj…`)
- "How about we just say fuck rich people and AI just works for us and automates ev…" (`ytc_UgyDFPJdN…`)
- "But there are biases built in that are hard to eradicate. Doctors can work on th…" (`ytr_UgzNgc5Jy…`)
- "Self driving car, sure let me look through my social media and not focus on my s…" (`ytc_Ugx4dS5JS…`)
- "I remember in wwz they had a short story fanfiction about someone exploring nort…" (`rdc_cthr69p`)
- "Next time: user: \"Guess what, ChatGPT? My wife and I are having our anniversary…" (`ytc_UgycPf-nf…`)
- "An hour and a half in, so far it looks like a very rudimentary course in game th…" (`ytc_UgxdCmB-P…`)
- "I'm an artist. If thinking the use of AI is unethical and actual theft (because …" (`ytc_UgwURhAs5…`)
Comment

> Humans are better, they made the AI, the AI when creating stuff only makes the stuff from already existing stuff (Yes I said stuff that many times) and because AI isn't as intelligent as us, AI is literally stupider, untill proven otherwise, like if they actually somehow have an uprising that they win, showing to be smarter and stronger, or something, otherwise us humans are better (Mostly)

youtube · AI Responsibility · 2025-08-20T06:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
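The four coded dimensions above map onto a simple record shape. A minimal sketch in Python, where the value sets in the comments are only those observed in this dump, not necessarily the full codebook:

```python
from typing import TypedDict

class Coding(TypedDict):
    """One coded comment, as emitted by the coding model."""
    id: str
    responsibility: str  # observed: "ai_itself", "developer", "none", "unclear"
    reasoning: str       # observed: "consequentialist", "deontological", "unclear"
    policy: str          # observed: "none"
    emotion: str         # observed: "approval", "fear", "indifference", "mixed"

# The coding shown in the table above; the ID is taken from the record in the
# raw response whose values match this table row.
example: Coding = {
    "id": "ytc_Ugz9mJIYWX4syXcV9D14AaABAg",
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "approval",
}
```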
Raw LLM Response
[{"id":"ytc_UgzUZjtJ98thuGZYyct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugz9mJIYWX4syXcV9D14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgzcAUxQSNPZiiP72aR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgwdxDPySc-e3jVuO2F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_Ugx1h75BudZbSgB-P5h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgykuasUYU40h6GaTfN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgxZFpHAdrMZHunTONR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgwkPh_pURx9fmzzbfV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_Ugy0kyMBtpQc-fVrQoJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_UgxqoAHpfg2PbmsmlY94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]