Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Humans are so "intelligent", we constantly build problems for ourselves, climate…" (`ytc_Ugwq4LBfc…`)
- "So AI spending (data center builds, etc) is 40% of the US's economy. And now you…" (`ytc_UgzLWrRuH…`)
- "In any subject you care to mention there is more misinformation, misunderstandin…" (`ytc_UgwN2pBjq…`)
- "As a software dev, im less memorized by what we call AI. Its a nicely written se…" (`ytc_UgwjbmQ0R…`)
- "hahahaa!!! i also always thank chatgpt omg lol. it's not because im afraid of it…" (`ytc_Ugxqp6sD1…`)
- "Unfortunately what these AI companies do is legal, thanks to exploiting an odd l…" (`ytc_UgwGR-k71…`)
- "Face recognition technology isn't secure for white people either, it's simply th…" (`ytc_UgwCxga42…`)
- "Sounds like we're 3-5 years away from a major structural failure somewhere where…" (`rdc_nm0rshb`)
Comment

> Christ on a bike, that's a ridiculous ruling. There's no medical evidence that social media is addictive. Designing software to maximise engagement isn't a crime. So now a grieving parent can blame big-tech, big-pharma or big-whatever-you-like for the death of their offspring dumb enough to copycat dangerous stunts they saw online, because feelings. Basing law on emotion, anecdote, hyperbole or politics leads to bad laws.

Source: youtube · 2026-03-25T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzHD6F4Z2epyYkiYel4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrXUF9wwd959IsJ1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3V7cBtQCg1K2VstN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyCm8IL5JQDvTMRjPt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwrhoqlTRiZDPudDSR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzaCHAHWVc_-4bpJ3l4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyvbeC-Xy0LORKAYg14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzE6nNK31Lygtvax8V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzTMhPSdEocNNRS0UZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzaLP3v7o34byfVkTd4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"fear"}
]
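Each row of the raw response carries the same four coded dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated, assuming the label sets visible in the samples here are the full codebook (the real codebook may contain additional categories, and `validate_response` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed labels per dimension, inferred from the rows above
# (assumption: the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row response in the same shape as the dump above.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"outrage"}]')
print(len(validate_response(raw)))  # prints 1: one valid coded row
```

Validating before storage is what lets a "Coded at" timestamp be trusted: any row the model mislabels fails loudly instead of silently entering the table.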