Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I've never used AI directly. I don't like it. I assume if I needed to do routine…" (ytc_UgyIlMKj-…)
- "I'm so sick and tired of having to deal with AI and automated check out. Give …" (ytc_UgxHF-BDy…)
- "Who in their right mind would give a firearm to a robot? Haven't you seen Termin…" (ytc_UgzoXMngl…)
- "I appreciate your feedback! It sounds like you have strong feelings about AI int…" (ytr_Ugw-jI9CD…)
- "This has a few flaws not impossible but rare. You’re using the same complaint we…" (ytr_UgwK3vSTI…)
- "with all due respect, this is the stupidest thing I've read in 2026. To ask if C…" (ytc_UgyUbJ0K5…)
- "it's a disrupting idea of course, but i feel like , it's really great one, i hat…" (ytc_UgzC6EHkn…)
- "We can't accept consciousness robots in the world we conquered. In the future ro…" (ytc_Ugww_b_te…)
Comment
*A PROBLEM NO ONE IS TALKING ABOUT* what happens to your Ai costs when the venture cap subsidy runs out and you have to pay the REAL commercial price - it MAY be cheaper than people now...
But when the cost goes up to ACTUALLY real prices? And you have lost all the staff and the institutional knowledge to train any new staff??? This is like UBER for companies but on a much bigger scale, the price is going to get jacked WAY up.
Source: youtube · AI Responsibility · 2025-09-30T15:0… · ♥ 109
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgwV0YMNfOGmWbygXCx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxUNWCtJfkIfH9xn2V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxDkv0wDN9C_nvSTu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxTqfNVrIAY_VdkYc14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyk3TG4JMxd0jhBc414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxexldGBVljDIFY5mN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwvT36vmXJbMYYjBF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyP8uAcKr9kRIlbKvZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwF_7eUADLKO6jmtaB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzC3LMNI9Q6HHAL_lx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
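The batch format above, one JSON object per comment ID, is what makes the "look up by comment ID" workflow possible. A minimal sketch of that lookup (Python here is an assumption, not the tool's actual stack; two rows are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (excerpt from above).
raw_response = """[
  {"id": "ytc_UgxDkv0wDN9C_nvSTu54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwvT36vmXJbMYYjBF14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]"""

# Index the batch by comment ID so any coded comment can be fetched directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's codes by its ID.
code = codes_by_id["ytc_UgxDkv0wDN9C_nvSTu54AaABAg"]
print(code["responsibility"], code["emotion"])  # company fear
```

Because the model returns one object per input comment, a batch of any size indexes the same way; a missing ID in `codes_by_id` would indicate the model dropped or mangled a row.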