Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Ahh yes, good ol' bromism. As someone who studied nutrition, he should've known why Chloride is important for the body and that Bromide is poisonous in higher amounts or chronic use.
Also, do we really have to introduce something like an "AI driver's license" before we let people use LLMs?! Because it seems like cognitively, vast parts of the population aren't ready to use such a tool safely and effectively.
youtube
AI Harm Incident
2025-11-25T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxu9LMz7vCXppum27l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFPn8hBVJZRZdXqgF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw4pZtxJ9gguF5jhPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzR9_PXFn9reSjXQL14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzfs8BErOKUt3tVLHN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyL-7MwJki1sBJBOrJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyiOqWqiD9HU9LKrdl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTedqfJsLTHd8Z1md4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwRSfdtBtdNs9tWUn14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxzXmMuLhgXS0v1N7R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
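As a sketch of how a raw response like the one above maps back to a per-comment Coding Result, the JSON array can be parsed and indexed by comment ID. This is an illustrative assumption about the lookup step, not the tool's actual implementation; the field names match the dimensions in the table above, and the ID shown is one that appears in the response.

```python
import json

# A single entry from the raw model output, reproduced verbatim for illustration.
raw = '''[
  {"id": "ytc_UgwTedqfJsLTHd8Z1md4AaABAg",
   "responsibility": "user",
   "reasoning": "consequentialist",
   "policy": "liability",
   "emotion": "fear"}
]'''

# Parse the response and index each coding by its comment ID,
# so a "look up by comment ID" query is a plain dict access.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_UgwTedqfJsLTHd8Z1md4AaABAg"]
print(row["policy"], row["emotion"])  # prints: liability fear
```

The same indexing works unchanged on the full ten-entry array, since every entry carries a unique `id`.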