Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by opening one of the random samples listed below.
- "Will we be able to vote for deepfake candidates? They'll probably serve constit…" (ytc_UgxhINaTI…)
- "Full price for soggy food shoved into styrofoam containers is kind of a tough se…" (rdc_gkqmioh)
- "love to see them try to replace truckers lol funny how people talk about it but …" (ytc_UgynHzOPe…)
- "Amazon's already delivering items using drones. So delivery to doorstep using dr…" (ytc_Ugh56y7Nc…)
- "This is what Google Gemini 3 says: _The scientific consensus in 2026 remains tha…" (ytc_UgyUu-ccC…)
- "Y Is there a need for AI..what is the need to give super intelligence to somethi…" (ytc_Ugz7hT0hC…)
- "First things first, love your videos. You are a great inspiration for us medical…" (ytc_UgyNHEBER…)
- "The first part is not exaggerated. My colleague had to mark an essay which had …" (ytc_UgwRYWuBD…)
Comment
I agree with him or we adapt somehow our brain and thoughts to level up in some ways, maybe with organic computers that can be adapted to our brain but also to our psyche or we definitely must not make AGI. I rather would go with specialized AI development at first and put them to work help develop our selves, our mind, our thoughts and then maybe develop AGI. But maybe at one point if we focus on our own minds development we might not need AGI at all. At his point our brain psyche is very vulnerable to AI. It is like slowly letting in an Alien into our own life made by us but not comprehended by us. Our own programmers will be overwhelmed at one point and wont understand what is happening.
youtube · AI Governance · 2025-09-04T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyn7k5h5p5TlWN2crx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwAtQSAdClBCT1_2tB4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx33zjruhzxftogvbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxOp9KQk5iOWWUx0Vp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyo2aAKXEQHJLi40E14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwGQ0ERxd6NZ-AjVMt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxqFHqxgiZRrPtUyJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx5DhW4OeZYSczdPlh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwEA6qfAGkPvAaOxEp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxKJbwVwUG6fPLT22t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```