Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “Who’s seen the new Wednesday series - check the first 2 episodes. FILTER city 👀👀…” (ytc_UgxNBno7C…)
- “Such a negative outlook on AI. Humans shouldn't work the amount of time that we …” (ytc_UgzdBZLTz…)
- “on some video i saw this and said yeah this is AI cause watch there mouth while …” (ytc_Ugw9VM3iw…)
- “some people tend to think people with artistic talent does art so easily,and tha…” (ytc_UgwOZJWgf…)
- “AI is gonna do the same thing all other things have done: Come along, cause majo…” (ytc_UgysAI4NX…)
- “You will never catch me inside a autonomous vehicle. I actually think people who…” (ytc_UgxsWR0l0…)
- “I just signed up to PRO a few days ago specifically because I want to use 4.5. N…” (rdc_n7ke749)
- “Thank you for your comment, @davidmuhammad844! Well, if the robot can't be hurt,…” (ytr_UgwqO29vd…)
Comment
As a big AI skeptic who works in big tech, this conversation left me really wanting more substance... Most of what this guy said was just some version "well I'VE been scared of superintelligence for 22 years, so listen to me!" without really making many real solid arguments as to why. The segment with him "roleplaying" as natural selection was baffling, I'm glad Ezra kept pushing back. Haven't read his book so maybe there's more solid arguments and he's just not a great verbal communicator, but he did not really come prepped for this conversation imo
youtube · AI Governance · 2025-10-15T11:3… · ♥ 68
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwgVNJgSLMJDLUBU8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgynkSjGpEQy8-Kc8zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-4krbJQUYK77HCYJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVP508yV27MiLU79V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugya6dVxuaLTosbbcnN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyx5P5xveaUApfBTcp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxSFDbALHW82C1XitF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwj_ej0JjPm54ddYm14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugxcc7er26T-uw7x5YJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYuB6uVGN6VsEjMuF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
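A raw LLM response like the one above is a JSON array with one object per comment, carrying the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and then looked up by comment ID, as the panel header suggests (the `index_by_id` helper and the two-row sample here are illustrative, not part of the tool):

```python
import json

# Illustrative two-row sample in the same shape as the raw response above.
raw = """[
{"id":"ytc_UgwgVNJgSLMJDLUBU8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgynkSjGpEQy8-Kc8zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# The four coding dimensions from the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a raw response and index each coding by its comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codings = index_by_id(raw)
print(codings["ytc_UgwgVNJgSLMJDLUBU8R4AaABAg"]["emotion"])  # fear
```

Indexing by ID keeps the lookup O(1) per comment, which matters if one response batch codes many comments at once.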