Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples:

- `ytc_UgyUH039-…` — "Tell everyone you know to stop using waymo. They've killed a cat, a dog, they dr…"
- `ytc_UgzFw6BvO…` — "The real question is why the technology of the self-driving car didn't advance e…"
- `ytc_UgyfWuKQp…` — "If this guy believes it is sentient I'm gonna have to agree with him. Obviously …"
- `ytc_UgwWaWePr…` — "sad, we can't even trust if anything is real or not anymore. Is it AI, is it lie…"
- `ytc_UgwPezcxA…` — "I begged my boss not to automate when I was an employee. But now that I own the…"
- `ytc_UgyUFsqed…` — "I tried this and it worked and I tried to take a screenshot but it said chatgpt …"
- `ytc_UgyUS0poG…` — "The thing about AI is they generate fast but they also get stale fast, kinda lik…"
- `ytc_UgylmmzyV…` — "no kidding I went onto character ai, against my personal morals, just last night…"
Comment (source: youtube · topic: AI Governance · posted: 2023-07-30T10:1…)

> Or maybe your idea of coexistence is a human morality and ai wouldn't have such a morality in achieving its goals
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugwa_pEu-NGXMtxlHnV4AaABAg.9v5BalJMwKv9ww9P64yNnh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugx5rp2nNO24Wyhx7Ax4AaABAg.9v1xxXNYY7D9y0yIbtTW9C","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzUKGoRDmhtLT1mobN4AaABAg.9urNVOYwvOP9urbnbkmUu1","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgxSbuKA4gtHjZ8f25x4AaABAg.9upGYb5_YkE9wRXvjqeEe2","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugz6anPoXkBcAS2AAkZ4AaABAg.9uIicPT3p_M9uKuGS--yLQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx89fmJqE0WW-pQROF4AaABAg.9tVokbG-Yo69tVu-u44pw2","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugx89fmJqE0WW-pQROF4AaABAg.9tVokbG-Yo69tVuh2JHdZ5","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugxnh_0ObDi9fP3KkVx4AaABAg.9sqYTNFSDam9vcZfc0WONZ","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugy1GC0FuB6IksRJiAx4AaABAg.9smmNqIQPG89smqX-LAUjk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugyof9gBmUYi_Sin8jh4AaABAg.9shyiTPHsh39shzOyjaSJt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
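The lookup-by-comment-ID flow can be sketched as follows: parse the raw JSON array the model returned and index the per-comment codings by their `id` field. This is a minimal illustration, not the tool's actual implementation; the two example entries are copied from the array above, and all variable names are hypothetical.

```python
import json

# Raw model output: a JSON array of per-comment codings
# (two entries copied from the response shown above).
raw_response = """[
  {"id": "ytr_Ugwa_pEu-NGXMtxlHnV4AaABAg.9v5BalJMwKv9ww9P64yNnh",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugy1GC0FuB6IksRJiAx4AaABAg.9smmNqIQPG89smqX-LAUjk",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "indifference"}
]"""

# Build a dict keyed by comment ID for constant-time lookup.
codings = {item["id"]: item for item in json.loads(raw_response)}

# Look up one coding by its comment ID.
coding = codings["ytr_Ugy1GC0FuB6IksRJiAx4AaABAg.9smmNqIQPG89smqX-LAUjk"]
print(coding["emotion"])  # indifference
```

Keying on `id` also makes it easy to cross-check a displayed coding table against the raw response it came from.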