Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "If you post it online then the AI is allowed to use it as reference. Same works …" (ytr_UgxVticcP…)
- "I feel like consciousness will be achieved by programing and training an AI bot …" (ytc_UgxuHvgOA…)
- "Turned off when you started correcting the expert you advertised to validate thi…" (ytc_UgwpzPVl0…)
- "AI should be regulated and stopped for stealing human knowledge and intelligence…" (ytc_UgxMrEu48…)
- "This would be the same result that every UBI test shows. The only thing lacking …" (rdc_ogt4e9y)
- "Thats a lot ai art in twitter,and they tell us to call them 'artist' ??…" (ytc_UgytgvsQK…)
- "You can’t prank me there this is fake bro. The robot tried to attack me she’s go…" (ytc_UgzCHN1Ma…)
- "AI art is pathetic all they can do is make copies of other people's hard work.…" (ytc_UgwW3ANAe…)
Comment

> I'm sure other Tesla owners have said this but it's very important. "Auto Pilot" *is not* self driving, it's just cruise control. It will not stop at stop signs, red lights, etc. It just maintains speed and centers the car in the lane. If the Tesla had FSD (Full Self Driving) -- a $9K option, it would have automatically stopped for that stop sign.

youtube · AI Harm Incident · 2025-08-17T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyUvURAXPpt_LnbY6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx9mnxS1OupQtXq9bF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxtiWOqFpInUS9L3PB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxV9xQvtQpFpyioxOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxt374a4jhPLUocwwp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzOQxNYoBpW0ClLqgF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwocRcvg5U9DkT4FK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgynGXyWljYnCbeX9EN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgybR10bTgx_lzaRWhV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugy7L8oGz1H-x8rtS9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
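A raw response like the one above is only usable once it parses as JSON and every record carries an in-vocabulary value for each coding dimension. The sketch below shows one way to do that check, as a minimal illustration. The vocabulary sets are assumptions inferred from the values visible in this view, not an official codebook, so extend them to match your actual coding scheme.

```python
import json

# Codebook vocabularies inferred from the sample output above (assumed,
# not authoritative): adjust these sets to your real coding scheme.
RESPONSIBILITY = {"user", "company", "ai_itself", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"ban", "regulate", "liability", "industry_self", "none", "unclear"}
EMOTION = {"fear", "outrage", "approval", "resignation", "indifference", "unclear"}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw model output (a JSON array of coded comments) and keep
    only records with a string id and in-vocabulary dimension values."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if (
            isinstance(rec.get("id"), str)
            and rec.get("responsibility") in RESPONSIBILITY
            and rec.get("reasoning") in REASONING
            and rec.get("policy") in POLICY
            and rec.get("emotion") in EMOTION
        ):
            valid.append(rec)
    return valid
```

Filtering rather than raising keeps one malformed record from discarding an entire batch; the dropped ids can then be re-queued for recoding.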