Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- We think we can bring consciousness but only GOD can, this is not consciousness,… (ytc_Ugzm3vFJ6…)
- Both misalignment threats can be controlled already, but it wont happen. 1) Mak… (ytc_UgyPSz4Ax…)
- What's interesting here is that we can take notes from this. I just tried adding… (rdc_koqdwt4)
- Bunch of morons. I get pissed by people calling them AI even, since there is not… (ytc_UgwC_clv-…)
- I don’t think people get it’s not just drawing that AI is taking. Music, acting,… (ytc_Ugwnfgplb…)
- @travisgoesthere „You likely dont even understand how AI works and” Quote one o… (ytr_UgyPKCGgS…)
- hard for me to come to terms with the reality that people spend hours scrolling … (ytc_UgwHtWe6O…)
- Do I think LaMDA is sentient? Maybe. Do I think we have to worry about it "goi… (ytc_UgzxrNp--…)
Comment
if theres anything i know about America, they'll get a slap on the wrist, a fine that means nothing to them and they'll continue moving forward like nothing happened until the next victim.
I personally think we should just stick to manual driving but with heavy safety assists. Full self driving seems like it would be more reliable on a highway since you just sit in one lane and go straight.
youtube · AI Harm Incident · 2025-08-17T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyUvURAXPpt_LnbY6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx9mnxS1OupQtXq9bF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxtiWOqFpInUS9L3PB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxV9xQvtQpFpyioxOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxt374a4jhPLUocwwp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzOQxNYoBpW0ClLqgF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwocRcvg5U9DkT4FK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgynGXyWljYnCbeX9EN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgybR10bTgx_lzaRWhV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugy7L8oGz1H-x8rtS9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
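The raw response above is a JSON array of per-comment codes, one object per comment ID, with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed into the ID lookup the page offers, assuming the allowed values are those observed in the sample output (the real code frame may include more); `parse_codes` and `ALLOWED` are illustrative names, not part of the tool:

```python
import json

# Abbreviated raw model output in the format shown above (one full row from the sample).
raw_response = '''[
  {"id": "ytc_UgybR10bTgx_lzaRWhV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]'''

# Allowed values per dimension, inferred from the sample output (assumption:
# the full code frame may contain additional values).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into an id -> codes mapping, skipping rows
    that are missing an id or use a value outside the code frame."""
    by_id = {}
    for row in json.loads(raw):
        comment_id = row.get("id")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if comment_id and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            by_id[comment_id] = codes
    return by_id

# Look up by comment ID, as in the panel above.
codes = parse_codes(raw_response)
print(codes["ytc_UgybR10bTgx_lzaRWhV4AaABAg"]["policy"])  # regulate
```

Rows that fail validation are dropped rather than raised on, so one malformed object in a batch response does not discard the other codes.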