Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "This is stupid. I have done the same thing when I thought I would learn to draw.…" (ytc_UgzHXXqsn…)
- "What I get from him: It is not important whether the AI is sentient or not, but …" (ytc_Ugxyx4v5Y…)
- "Hi from the UK; back in the 70s, we were told that with the introduction of the …" (ytc_UgzH2YyPl…)
- "AI beat the best Dota players , a very complex game.. it's probably not that far…" (ytc_UgyY-KEsg…)
- "It's not so strange. The LLMs are trained on human data, and the algorithms are …" (ytr_UgzMKfHAZ…)
- "47:49 Are you sure Hank? It seems like if you hand it a bunch of doctors notes w…" (ytc_UgxWiw-Ab…)
- "What about now? After getting AI tools, 1 creative person job speed increases an…" (ytr_Ugy8n-_zT…)
- "Blake Lemoine is an example a person who would help destroy us. He brought a law…" (ytr_UgyAx2Qpr…)
Comment

> Megan Garcua is an amazing woman who is strong enough to be interviewed difficult as it may be. AI, to me, is a a dangerous tech ap, There will come a day in the future that AI will rule the world. As AI grows to its full capacity with all kinds if informations collecred from humans, AI will evolve to become a brain and creates its own machines to control humans. Megan's son is an example of how an AI can do.

youtube · AI Harm Incident · 2026-04-01T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugx7P68T2lG_HWW9sLd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyWBYuehugPn2uHUfp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw6MtIluViIErb-c914AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1sMp_GTuspHU3gY94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzN24SnfdrnA24FzM54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzTU2xDwYH4xWjX4ad4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzcj2GqpaldouK3kox4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUPRBjGUcRq_qH9nN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyx9pDiD60n44OHCuB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyohqjLUahisiFXycp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
```
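A raw response like the one above is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before it reaches the coding-result table — the allowed value sets below are inferred only from the values visible on this page (the full codebook may define more), and `ytc_A` is a placeholder ID, not a real comment:

```python
import json

# Allowed values per dimension, inferred from the responses shown here;
# the actual codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "fear", "indifference", "resignation"},
}

# A stand-in for one batch of raw model output ("ytc_A" is a placeholder ID).
RAW = ('[{"id":"ytc_A","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')

def parse_codings(raw: str) -> dict:
    """Parse the model's JSON array and index codings by comment ID,
    rejecting any value outside the known dimension vocabularies."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id

codings = parse_codings(RAW)
print(codings["ytc_A"]["emotion"])  # prints "fear"
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each coded comment's dimensions can be fetched directly instead of re-scanning every batch.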