Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- ytr_Ugx7HbBXe…: OpenWorm shows that we can simulate neuron connection maps. Brain simulation is…
- ytc_Ugzhyly-B…: What is intelligence? Emotional intelligence is not possible for AI. Human is un…
- ytc_Ugzqb5q-H…: I think that before A.I. destroys us, it will become the bridge between what is …
- rdc_czls05p: It's true. I work at a reasonably sized private company, and we haven't laid any…
- ytc_UgxKGYU2S…: Wake up! This is apart of WEF agenda. The "predictive policing" its to create fe…
- ytc_UgxHGxDnI…: ChatGPT are bots not advanced AI. Remember pre generative. They mix and match di…
- rdc_mubt6br: It still looks ai. Skin has this weird plastic texture to it, ans lighting looks…
- ytc_Ugzeffh_m…: Current "AI" is a piece of code that learns from predetermined patterns. Those p…
Comment
OK, I was going to give this a Thumbs Up, until we get to the part (at about 1:01:00 time) where the discussion goes off the rails by discussing the concept that we are living in a simulation. This is a ridiculous concept. If it were true, then nothing matters. AI is not a threat, because we are all simply a simulation. Living forever is not a real concept because we are, after all, only a simulation.
I can't believe you actually gave this concept any credence by discussing it for such a long time. It's one thing for philosophical nerds to talk about this, and if you're one of them, then I'm out.
So actually, this video gets a Thumbs Down.
youtube · AI Governance · 2025-10-14T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyFF4K4bFmtCDWcY3x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7OOh006PH0JX3JN54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx_GkakDDfwKgUPosZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwSIu2RtBytZaBXDQ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzKDyyJvBGoUqBvqud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxry-2LrB9hJhy6Nmd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyAnAyRCEWwewKL8qR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzi2CWVhwSMw5hnk3Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz08X1uvlvCTaV6oVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx9VX5XDystnFRTgP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
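The per-comment values shown in the Coding Result table are pulled from raw model output shaped like the array above: one JSON object per comment, keyed by comment ID, with the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and looking up a coding by comment ID follows; the `raw_response` literal is a shortened, hypothetical two-entry excerpt, and `index_by_id` is an illustrative helper, not part of any pipeline shown here:

```python
import json

# Hypothetical two-entry excerpt in the same shape as the raw LLM
# response above: a JSON array of per-comment codings.
raw_response = """
[
  {"id": "ytc_UgyFF4K4bFmtCDWcY3x4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx_GkakDDfwKgUPosZ4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw response and index each coding object by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(raw_response)
print(codings["ytc_UgyFF4K4bFmtCDWcY3x4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the lookup step shown in this view a constant-time dictionary access rather than a scan over the array.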