Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I forget, how strong of an insult is cunt for you guys? On a scale from Australi…
rdc_js0q7n7
It won’t be a robot holding a gun.
It’ll be a gun with AI, a bunch of sensors a…
ytc_UgwmXzqr9…
Wow even though this is just a You Tube video, I found myself with a strong desi…
ytc_UggM17JJ0…
Do we ever allow AI to improve itself? Or looking at it from another angle, once…
ytc_Ugx_RCWe0…
You can't regulate AI like that. If the US tries, companies will just offshore i…
rdc_k0bqb00
17:31 People talk about AI taking all the jobs, but they skip the basic math: y…
ytc_UgwOoJdBH…
Tip from another youtuber: when you see an ai artist tell them to "draw them pre…
ytc_UgzCayyLA…
This is using AI for good. I love taking advantage of technology for good instea…
ytc_Ugxx0uX6J…
Comment
Hahaha ... the fact that we are created is a logical imperative. Our children are procreated in our (physical) image, and this humanist genius has been attempting to create a digital image of the previously created human brain ...which lacks the character of mind.
And the primary concern he keeps coming back to is the criticality of a moral code ...which is a lot more complex and necessarily sacrificial than he expects (and fears) from a self-preserving super intelligence.
And to solve the latter problem, he recognizes the importance of a "good" human establishing the rules for AI ...which runs counter to his expectations that super intelligence will make obsolete the reality of metaphysical / spiritual emotion and mind vs a digital machine simulating the biological brain.
youtube
AI Governance
2025-06-17T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwAiGC1TXKVxyNvxGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxbcyXm0zYItadye_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwHwjplJBxe6H_PSwN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugxox7PzZIeaD6kmIRl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugy1Ha24gD6NZVGjrOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwBWSDQMfUL7Ckyb9p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxBWnjvOeu5pxzdg894AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx43vFV8_0-3AfjulF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugwe9SEfMvZAxxSVDxx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
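The lookup-by-comment-ID view above amounts to parsing this JSON array and indexing the entries by their `id` field. A minimal sketch (the `raw_response` literal is truncated to two entries taken from the response above; the field names match the JSON, but the surrounding code is an illustrative assumption, not the tool's actual implementation):

```python
import json

# Two entries copied from the raw LLM response above; the real
# response is a JSON array with one object per coded comment.
raw_response = """
[{"id":"ytc_UgwHwjplJBxe6H_PSwN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyRI6Y6WkAi_70Q1nh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
"""

# Index codings by comment ID so a single comment can be looked up.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgwHwjplJBxe6H_PSwN4AaABAg"]
print(coding["responsibility"])  # developer
print(coding["reasoning"])       # deontological
```

Each coding carries the four dimensions shown in the result table (responsibility, reasoning, policy, emotion), so the table for any comment can be rendered directly from its indexed entry.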