Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_UgwUWzQbG…: AI users when they tweak their prompts to generate a better image: "Wow, i'm im…
- ytc_Ugw9vI-Jh…: When you hear an A.I. company's CEO talk about A.I. eliminating jobs they're adv…
- ytr_UgxPI-TuQ…: I really like this. At first, I thought AI would take over all coding jobs, but …
- ytc_Ugzrd39tE…: This becomes very philosphical. We need moral incentives to be good, clearly sin…
- ytc_Ugx1mq4Pg…: Okay I am afraid of robots I would not want a robot living with me I'm afraid of…
- rdc_n7sux2g: Yeah, I'm not trying to downplay the huge leaps we've had, but until we start br…
- rdc_fapm3a6: We are seeing countries feel the effect of climate change. https://www.busines…
- ytr_UgweFOoQE…: Yeah the main problem is ai is not considered human so the copyright laws are no…
Comment

American women, please get your act together or you'll be replaced by AI robots or your wife role will be outsourced to foreign women. Stop this "I don't need a man bullshit" because this is what your high standards and your woke depopulation agenda have pushed us American men to do.

Platform: youtube
Topic: AI Moral Status
Posted: 2024-01-02T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyV4XGMaLnrMxD7w4J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzb48T89tyMpplVqMl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMGRsNcUfjYQfPEMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"confusion"},
{"id":"ytc_UgwZO0s-RfgD9S-wVsZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwyX7wKu24C5pGw7UB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugyg8VaFyCFXyQW2kVt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyuUlaomMzKTV6GRhB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5mfP_QwccZbg6ODp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzHkIv5AWSQwEtStlB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzZizSB3iMelsEpn9V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
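A raw response like the one above has to be parsed and validated before its codes are stored. Below is a minimal sketch of that step; the allowed category values are inferred from the codes visible in this view (the real codebook may define more), and `parse_response` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the coded rows shown
# in this view (assumption: the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "indifference", "confusion"},
}

def parse_response(raw: str) -> list:
    """Parse a raw LLM response and keep only rows whose codes
    fall inside the codebook and that carry a string comment ID."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        in_codebook = all(row.get(dim) in vals for dim, vals in ALLOWED.items())
        if in_codebook and isinstance(row.get("id"), str):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example1","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(parse_response(raw))
```

Rows with out-of-codebook values are dropped rather than repaired, so a malformed model output surfaces as a missing coding instead of a silently wrong one.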