Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_Ugwp3vy0l… — "No Trump's aides have a point if China accidentally reached that point on their …"
- ytc_Ugz7u-Jnp… — "Hii, disabled and neurodivergent artist here! Almost everyone I've seen in the c…"
- ytc_Ugw7TcCFY… — "2:55 You have made a critical (albeit understandable) error. Musk is being _di…"
- ytr_Ugwbn7s9U… — "Lack of emotional intelligence is a trait of psychopath according to researcher …"
- ytc_UgzdlSRb-… — "Am I the only one that finds it insane that we made the first ai apocalypse movi…"
- ytc_UgyOfz2uk… — "I just want to go on the record to state that I would never do anything to hurt …"
- ytc_UgxjIw0X4… — "I am not afraid of an AI simply because I don't believe Earth has that many year…"
- ytc_UgykVnIkv… — "What if these corporations have found out a way to take the consciousness of peo…"
Comment (youtube, 2025-04-26T04:2…, ♥ 1)

I think the time has come for companies and academia to work tirelessly toward bringing AGI to life as soon as possible. Humans are notoriously inefficient, and no matter how hard they work, productivity often remains abysmal. AI and automation could significantly boost productivity, which in turn would help the nation progress much faster. I believe we must eliminate repetitive tasks as soon as possible and delegate them to AI agents.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgylLAQYPGExeBVEObV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwXa0f8hVACxq_DPi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx_OGsc_OXgsU_VcrZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx9KtkwgLQTisGDdBx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7dfTBPbGCAvB5Imd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugxukg3rTFn2jL2prRR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyOEXCc60mpTmpwHfh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgznK_phH1g46YI4uNV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzPW9-ufMvHQPJwgdx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzzpyYpfO5fBQVIoSh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
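A minimal sketch of how a raw response like this could be parsed and queried, mirroring the "Look up by comment ID" feature. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above; the two-row inline sample is an excerpt used for illustration.

```python
import json

# Excerpt of a raw LLM response: a JSON array with one object per coded
# comment, carrying the coding dimensions shown in the table above.
raw = """
[
  {"id":"ytc_UgylLAQYPGExeBVEObV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx9KtkwgLQTisGDdBx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
"""

codes = json.loads(raw)

# Index the rows by comment ID so any coded comment can be looked up directly.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_Ugx9KtkwgLQTisGDdBx4AaABAg"]
print(row["responsibility"], row["emotion"])  # ai_itself approval
```

In practice the full response body would be loaded from wherever the coding run is stored; the lookup itself is just a dictionary keyed on the `id` field.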