## Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- "3:54 the only reason to own a self driving car and not share it is if you get mo…" (`ytc_Ugyziw86m…`)
- "I have a team of AI assistants. Stupid as hell. I am firing them immediately if …" (`ytc_Ugxs420fz…`)
- "I wonder when people will realize the fact that it is simply mimicking the most …" (`ytc_UgxffxAI7…`)
- "Companies and people who blindly lease out all of their intelligence and relianc…" (`ytc_UgyKe-IQf…`)
- "Oh common! The logic presented here is just a plain misinformation. How can AI r…" (`ytc_Ugwq4S4kS…`)
- "Real AI will never exist. With that said, I loved this episode and all of your e…" (`ytc_UgxkxsqeK…`)
- "@AG-ng1ml you're assuming I don't draw because I use AI art, I use both, I use A…" (`ytr_UgzEcEJxZ…`)
- "I've found that being polite to AI is good for me. I just feel good doing it. Po…" (`ytc_UgzpBycUV…`)
### Comment

Tags: Authority · Institution · AI

> Every single time technology makes a new world dynamic, many people die. Why? Because sociopaths seize the tech and use it for murdering for money.
>
> AI is different to any other technological advancement as we are giving decision making power to something that can only look from a non-biological base of perspective. A perfect tool and excuse for the sociopaths in power.
>
> Chances are though; you're not in a simulation and Dr has spent too much time not noticing things outside the research lab.
>
> This topic and discussion are very dangerous, as you are talking about it not mattering if you are in a simulation. Empathy is a feeling that doesn't need reality to be empathy! It's actually a response to imagining from outside one's own perspective to begin with.

youtube · AI Governance · 2025-09-06T02:0…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id": "ytc_UgzDuQlA6Q2uF8t2xox4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxdRpzEx84ouFicFep4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwiG5XWlZFeLBJqoBV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxft9LqCh3BwvBUuD54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwESo-QedIpVgCJKv94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzedWS9mi840Ddzjcd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxrLoDNXMlwfLjaIRB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzck8VixyLG-FM8aod4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyJXJ0IA5LHiy90hJF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz8PRKtJAvpaUNbk2Z4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
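A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is illustrative only: the `validate_batch` helper and the `ALLOWED` codebook are assumptions, with the allowed values taken solely from the categories observed in this sample (the real codebook may define more).

```python
import json

# Assumed codebook: only the category values observed in this sample.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "developer",
                       "company", "unclear", "distributed"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dataset start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# One record from the response above, as a raw JSON string.
raw = ('[{"id":"ytc_Ugz8PRKtJAvpaUNbk2Z4AaABAg",'
       '"responsibility":"distributed","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
records = validate_batch(raw)
print(len(records))  # 1
```

Failing loudly on an out-of-codebook value (rather than silently storing it) makes hallucinated categories in the model output easy to catch during a coding run.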