Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI drawings also can have a racism problem, my friend was playing with it and im…" (ytc_UgxwrxOf6…)
- "Now you won’t be able to get hired by another company because you tried to make …" (ytc_UgyUXE4V-…)
- "AI/ChatGPT is only a reflection of the one using it. In and of itself it is not …" (ytc_UgxwA4eUg…)
- "People are just afraid that they get lose their jobs to automation like it has h…" (ytc_UgxpkVn_r…)
- "Interesting. But, how can we be sure there is no bias built into AI as it is bui…" (ytc_UgwBjweAE…)
- "That’s because goodness requires a source. We’ve been imprinted by it, AI has no…" (ytc_UgxtfPy37…)
- "I think we expect AI to attack us. Because that's what we would do. AI has no re…" (ytc_UgxXHgZwq…)
- "@ uhhh not really, it doesnt think nor it has creativity, it doesnt take a educa…" (ytr_UgyqrB6z1…)
Comment
Elon musk: AI is more dangerous than nukes
Also him: let’s make are car fully ai with its own mind and hope it doesn’t just tern the week so you do fly of the bridge 🔥💀
youtube · AI Governance · 2024-10-21T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw4wCWXzWIPS2Icfdl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx_a4ja6ddmeLZz1GJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx4VIYCHmWiY9O98QV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgztD5McpYouOkwFu5Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxP8f5OkEAScHuIux54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6tTPBk8IGTL3meQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4clMtWzNLgvbVZbV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzqw7C1Gf-rXvVXNB54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyRNr2Ke8K52o_e0pB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKfFvUA7Zs11yZONZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
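The lookup-by-ID feature above can be sketched as a small parser over a batch response like this one: each array element carries the comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) shown in the table. A minimal sketch follows; the function names are hypothetical, but the field names and the two sample records are taken verbatim from the response above, and the lookup accepts the truncated `…` prefixes the sample list displays.

```python
import json

# Two records copied verbatim from the raw batch response above.
RAW_RESPONSE = '''[
  {"id":"ytc_UgztD5McpYouOkwFu5Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx_a4ja6ddmeLZz1GJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse the batch response into an id -> record mapping."""
    return {rec["id"]: rec for rec in json.loads(raw)}

def lookup(index: dict, comment_id: str):
    """Find a record by full ID, or by a truncated prefix as shown
    in the sample list (e.g. 'ytc_UgztD5…'). Returns None unless the
    prefix identifies exactly one record."""
    prefix = comment_id.rstrip("…")
    matches = [rec for cid, rec in index.items() if cid.startswith(prefix)]
    return matches[0] if len(matches) == 1 else None

idx = index_by_id(RAW_RESPONSE)
rec = lookup(idx, "ytc_UgztD5…")
print(rec["responsibility"], rec["policy"])  # → developer liability
```

Prefix matching is what makes the truncated IDs in the sample list usable as lookup keys; requiring a unique match guards against an ambiguous short prefix silently returning the wrong record.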