Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
It won't happen, they are pumping stocks as apartheid Edison has done for long …
ytc_UgzAQAO3G…
I recently used an AI known as Revisely to study for my most recent major test a…
ytc_UgzAVd6FN…
@TheMageesa "professionals" wasn't the focus of your argument. Creativity dryin…
ytr_UgxFRWAVX…
@bigpoppa6658 remember what you said if China gets there first ... main use for …
ytr_UgyWG0avC…
Solution : Hardwire into the core program the 3 laws of robotics 1) a robot may…
ytc_UgzXNYI3o…
AICarma's weekly visibility scores really help me track how often I'm mentioned …
ytc_UgyPCU7Jy…
The interviewer was incredibly bias. He cared more about the USA winning the AI …
ytc_Ugz25YVlA…
AI doesn't have the balls to say the truth 😅 I stick with Trump's plan: Let them…
ytc_Ugw36nhYV…
Comment
I see parents, especially the mother grieving and finding someone to blame.
Yes there should have been a counter-measure where the bot informs the authorities but it's a bot, and he's human. The Human should be programming the bot, not the other way around.
They should also take some-what responsibility if their son felt more connected to AI than to them, his parents.
youtube
AI Harm Incident
2025-11-08T08:0…
♥ 14
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwhPzZvNAkhtT0iHQV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgyT-vebxiwMwdw2i3B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzROFHwZQw3qc2xQbx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw0gmts_GoGjEZ2qK94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgymKW9n7gL5ZlAjhot4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy8pTntVDumP2kJaMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwgv4Z6ASODv_vRdxV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwcWBJ0cqLxVAFIlXx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw8kjfndpPFzfBERhJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxuz-IFqfZhXl1Bi5Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
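A raw response like the one above can be parsed and sanity-checked before it is stored as a coding result. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the examples shown on this page (the actual codebook may define more), and the `ytc_`/`ytr_` ID prefixes are assumed from the sample IDs above.

```python
import json

# Allowed dimension values, inferred from the visible examples; the real
# codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows.

    A row is kept when its id carries a known prefix and every dimension
    holds one of the allowed values.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue  # unknown comment-ID scheme
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping malformed rows (rather than raising) mirrors how such pipelines usually tolerate an occasional off-schema LLM answer; the discarded IDs could instead be queued for re-coding.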