Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I don't believe the ai steals and reuses images. It learns from it, but yeah, pr…" (ytc_Ugyf7hV1V…)
- "Lmao what a waste of money. Rich kids suck. Not to mention ChatGPT still sounds …" (ytc_UgzGgWVQf…)
- "I have a different take on this. I have 22 patterns in physics and mechanical en…" (ytc_UgwRmJaJ5…)
- "She is right / But they are already dangereus. Can be used to create a false reali…" (ytc_UgwngdTy3…)
- "With Artificial Intelligence taking millions of jobs. The last thing any countr…" (ytc_UgxLxXmpd…)
- "AI can be so helpful in so many ways. I want it to stay that way ideally, that i…" (ytc_Ugxn0gnFr…)
- "I think Tesla's biggest fault was that they continued to let a user with many au…" (ytr_UgxZ2O-k-…)
- "Looks like most of them are covered in soot or have been badly burned. Well the …" (ytc_UgzhffSj5…)
Comment
Some consumers might choose to avoid AI businesses and stick with human ones. We’ll see how it plays out. I think more surveys of public opinion are needed. Maybe AI can also help calculate a better path for humans and AI to exist together in a way that benefits both sides. It’s a really insightful interview and I think everyone should watch it. The part where he said we can’t consent to what we don’t understand makes a lot of sense, because many people aren’t aware enough of the risks to make informed decisions or push back against possible misuse.
youtube · AI Governance · 2025-09-04T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwLck2PpIh1dV3ppFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgytpBC4CfMcwPh2qaB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxrD4sp9g1NeOpDa8t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxRpm9K7Fv4qSVoJzt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxWrfvc2j8ZQOmC6i14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwT-2ngswaVQIsrFy94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyEqljw_Ob7dsRQHVJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwX8HN1ZtNXkV4U9K54AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgwbhYkmlP8VhkB8Uvl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwXntOW3bQ7LkP-Hzt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}]
```
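The raw response above is a JSON array with one object per comment, so recovering the coding for a single comment is a parse-and-index step. A minimal sketch in Python, assuming only the field names visible in the response shown (the `index_by_id` helper and the truncated sample data are illustrative, not part of the tool):

```python
import json

# Two entries copied from the raw response above, abbreviated for the example.
raw_response = '''[
  {"id": "ytc_UgwX8HN1ZtNXkV4U9K54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwLck2PpIh1dV3ppFF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# The four coded dimensions, matching the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a batch response and key each coding by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {d: row.get(d) for d in DIMENSIONS} for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_UgwX8HN1ZtNXkV4U9K54AaABAg"]["policy"])  # industry_self
```

Keying by ID this way makes the per-comment lookup shown in the Coding Result table a single dictionary access, and `row.get(d)` degrades gracefully if the model omits a dimension.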