Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- "that does happen but its not the original purpose of this technology. The CIA he…" (`ytc_Ugw_rtH5X…`)
- "Waymo is safer than going with a human driver, however how manh people will lose…" (`ytc_UgxLuTj2R…`)
- "Interesting that @TheDiaryOfACEO never asked his friend who was dramatically red…" (`ytc_Ugw-7zyub…`)
- "I think the word she meant was 'thanking' AI. Tech companies are cutting tens of…" (`ytc_UgytKHciY…`)
- "The inteligence of the human is so small lol she talking with an intelligent lif…" (`ytc_UgzTScKAa…`)
- "Making ChatGPT do your ad read makes me question how genuine the rest of the vid…" (`ytc_UgxewImG4…`)
- "For me I would have a board certified radiologist read the cxr. Then put them th…" (`ytc_UgzVfj2K_…`)
- "LOL, this was what I was saying all along. If people think that their AI is goin…" (`ytc_UgwGeei1M…`)
Comment
> How can we get them to act better? When all they're doing is simulating us, they learn from us. How can we expect them to be better than us? When we would do the same thing. It learned is all. can you teach it morality? Can you frame the good that ppl do and see what ai does with that? When most of us can't even manage common sence yet, these ppl are tasked with building something that has our well being in its hands I wounder what will happen if we move to quickly

youtube · AI Harm Incident · 2025-09-11T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwK-au70F1BsVfTM3J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugys0vulou7oAA5j9K54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWJPRYcImshKRiEdJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxewr6eIj6pAjSWEap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzNoSTdFq4Qe6_bLJN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgweWrY_0dBqkjl_m7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz3SMmYhiCsNIxCbLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzFf0Foa5oQhCutxOZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw1blCldt9jCKuAe0V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"approval"},
  {"id":"ytc_UgzwQ_jmfd3U0csOGvJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
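A coded batch like the one above is worth sanity-checking before it is stored, since the model can emit off-schema labels. The sketch below is a minimal validator; the allowed value sets are inferred only from the samples shown on this page (the real codebook may define more categories), and `validate_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that fit the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # YouTube comment IDs in this dataset all carry the ytc_ prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Rows that fail validation can then be queued for re-coding rather than silently written to the results table.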