Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Glad to have been born in 1997, experienced a normal childhood before iPhones pe…
ytc_UgwAoGb7A…
@clridesagain7308 I Think Companies have though (just not all of then). NVIDIA f…
ytr_UgwwtAPoQ…
A.I. intelligence will grow Quick. It will be a miillion Times smarter than man.…
ytc_UgypROnkM…
AI-Washing. I was not aware of that term or practice. Fascinating point. Thank y…
ytc_UgyGmLpKc…
Ppl tend to forget that LLM is not true AI. Its just a large context memory reta…
ytc_Ugxq0XIUH…
This is bullshit. AI will never replace 99% of jobs. AI is just a pile of metal …
ytc_UgxpLMSeM…
Honestly, it's sort of a clickbait video. You'll hear a lot of scaremongering w…
ytr_Ugx9UBQQH…
@justasmltwngir1732 get the fuck outa here. You're gonna tell me you never flick…
ytr_Ugw1G5p67…
Comment
Sam Altman or any other AI creator doesn’t fear AI causing significant harm to the world. It is the goal. They think this will end with the majority of the population gone and the elite will have everything done for them by AI. I bet it’s never crossed his mind that the AI could decide there is no need for them either. I think they consider themselves safe which is why they won’t stop.
youtube
AI Jobs
2026-01-17T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgyRt4SNqrL-HTsSi7t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRB_KDQGDjtPF3VRx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0N69ualgpZQNjFMl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxau6piwjz7zlpnlrJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwFszyINKuQzm_TFzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxkInth80y6D6eBgXh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTD22jATtV_crR3Gh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgycgybhIdceBUDqEV94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwc3SSXfpcy-wikxjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz1QyZu_vsnl-tMr8F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
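The raw response above is a JSON array of per-comment codings, one object per comment ID with four dimensions. A minimal sketch of how such a response could be parsed and sanity-checked follows; the dimension vocabularies are only those observed in this sample (the full codebook may allow more values), and the `ytc_example` ID is hypothetical.

```python
import json

# Dimension values observed in this sample only -- an assumption,
# not the full codebook.
OBSERVED_VALUES = {
    "responsibility": {"company", "developer", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse the model's JSON array and index codings by comment ID,
    flagging any dimension value outside the observed vocabulary."""
    codings = {}
    for item in json.loads(raw_response):
        cid = item["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            if item.get(dim) not in allowed:
                print(f"warning: {cid}: unexpected {dim}={item.get(dim)!r}")
        codings[cid] = {dim: item[dim] for dim in OBSERVED_VALUES}
    return codings

# Hypothetical one-item response in the same shape as the array above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
coded = index_codings(raw)
print(coded["ytc_example"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view possible: each coded comment's table (Responsibility, Reasoning, Policy, Emotion) is just the dictionary entry for that ID.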