Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> My latest findings: when experimenting with local llm, I'll always ask it to generate a prompt to instruct an llm to perform the task I want it to perform... It just works far better than spending time trying to explain it in god-only-knows how many lines worth of requirements and examples (programming or visualization / tasks) 😏

youtube · AI Moral Status · 2025-04-08T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxgxcOqqML3wzck2HR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwnb7PVgNYsAmQFZrR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxCmj2HiqbnHtKMIDZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxqt57UJ55RgbAVSaJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5YgBBsCjGFiuyPbJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxeRFQXgCDfiF8XR9d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyB0CSrkljlh5-Fvxl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugzz_R2UEJmKoOZqwXF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwvfNbDKWjv4JiVMw14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5_bcHeiVbKklv5bl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
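The raw response is a JSON array with one coding object per comment, keyed by `id`. Looking up a single comment's coding is then a matter of parsing the array and indexing by ID — a minimal sketch (the `codings` variable and the two-row `raw_response` excerpt are illustrative; the IDs and values are taken from the array above):

```python
import json

# Excerpt of the raw model output: a JSON array of coding objects.
raw_response = """
[
  {"id": "ytc_UgyB0CSrkljlh5-Fvxl4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgxgxcOqqML3wzck2HR4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for one comment by its ID.
coding = codings["ytc_UgyB0CSrkljlh5-Fvxl4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer indifference
```

Because the model emits IDs alongside each coding, the same index can be used to verify that every submitted comment actually received a coding row.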