# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Comment
what about John Lennox , he is a brilliant christian and also expert on AI who wrote 2084 , how come you never interview him. is it because you only interview atheists and not a brilliant mathematician who believes in god and has a lot to say about AI ? seems you have a bias against religious geniuses. i even googled to doublecheck - afraid of god are you ???
Source: youtube · Collection: AI Governance · Posted: 2025-09-05T01:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[{"id":"ytc_Ugx0ZsdHPDXJxIShiWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzvjbMxd81f2M3MUSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwxCGOJjkng6dDbDht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxWWrDfErp2Nd9WpQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwBGjTJIBYttPQ_KfV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwlD30wXrc8WzIbVmZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugw5zftANSmRpq7xXyd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_Ugxy5fOl0byMBtz004t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwJfySQEIn2bf8nK1l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugw8r0Xx0xO8tVvq9B94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
```
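A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the tool's actual pipeline; the allowed value sets are inferred only from the labels visible on this page and are almost certainly incomplete.

```python
import json

# Allowed values per dimension, inferred from the examples shown above.
# This is an assumption: the real coding scheme likely defines more labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "unclear", "regulate", "ban"},
    "emotion": {"fear", "approval", "indifference", "outrage", "mixed"},
}


def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return only the rows whose every dimension has an allowed value."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows with an unknown or missing label are dropped rather than repaired, which keeps the downstream table from ever showing a value outside the scheme; a stricter pipeline might instead flag such rows for re-coding.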