Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A robot should not have fear. That’s just weird. I think a robot should be pro…" (`ytc_UgypXUXnx…`)
- "This is the problem with AI cars. Some who drive them shouldn't be allowed to dr…" (`ytc_Ugx6BQdjh…`)
- "I would prefer government to run by AI rather than human. As long as its purpose…" (`ytc_Ugws7xUDx…`)
- "Damn people are getting so tired of AI pictures that people are now poisoning th…" (`ytc_UgzAZXu25…`)
- "“Ai Arti—“ That motherfucker is the FURTHEST thing from an artist. Try getting s…" (`ytc_UgyJnVmRj…`)
- "Google. Charles Rankin Fine Art, this is my one-of-a-kind not using Midjourney o…" (`ytr_UgzVHSHzu…`)
- "@laurentiuvladutmanea I know that but let's be honest the ai images are very dif…" (`ytr_Ugyu0hLwT…`)
- "56:00 What do you think about the rate of information flow today and the scale o…" (`ytc_UgwAL8mA9…`)
Comment
@ Eliezer spent 4 hours trying to argue about AI safety, and Stephen reverted to philosophy every single time.
Stephen is smart, and philosophy is fun, but his claim “why should we worry about intelligence when ‘a babbling brook’ is technically intelligent?” is not a serious argument (nevermind that he’s conflating intelligence and computational complexity)
even if you don’t like Eliezer’s position (he could be wrong), but he’s an unbelievably thorough and careful thinker. Stephen is smart but did actually say anything (it was 100% semantics)
youtube · AI Governance · 2024-11-14T23:0… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAimhHmYdLd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjRPne2vJB","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxWbWU8HnwXTGqsyxd4AaABAg.AAiYeR1UICtAAjgxLAap2x","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAmmuu4wRqq","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAokZRx4eik","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw65QRNZqXybNcskx94AaABAg.AAiXbYlEStfAAqA9Jy7zvA","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxJUObSq16yHsAotv54AaABAg.AAiV57RtZaaAAiWFsL8hXk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz_o_4UNQo_GhkZI294AaABAg.AAiUrQt1I1zAAidX4Bf-1N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxWZtuw753_fj8wEad4AaABAg.AAiRc3k91HjAAjBSmlsX0M","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwqfDGCzxp8g181J_54AaABAg.AAiNXtAPvrsAArCz1Ewg7J","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
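A response like the one above is a JSON array of per-comment records, each keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The sketch below shows one way such a response could be parsed and sanity-checked before the values reach the result table. The allowed-value sets are only those observed in this sample response, not a full codebook, and `parse_coding_response` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Values observed in the sample response above. The actual codebook
# may define more values, so unseen values are flagged, not rejected.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and validate its structure.

    Raises ValueError if the JSON is not an array or a record is
    missing a required field; prints a note for unseen values.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in OBSERVED_VALUES.items():
            value = rec.get(dim)
            if value is None:
                raise ValueError(f"{rec['id']}: missing dimension '{dim}'")
            if value not in allowed:
                # Could be a legitimate codebook value we simply
                # have not seen in this sample.
                print(f"{rec['id']}: unseen {dim} value {value!r}")
    return records

# Minimal usage example with a made-up comment ID.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_coding_response(raw)
```

Validating the structure up front is useful here because a single malformed record would otherwise surface only later, when a lookup by comment ID fails.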