Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
My take on agi.
Form an intention.
Self reassessment of assumptions.
The ability to take action without prompting.
The ability to set personal goals.
The ability to negotiate and take self decided action.
Self initiated drive based on comprehension and understanding intentions and consequences and initiate self evaluation.
This has the potential eventuality of
1. Self evolving past humans on a path we don't relate to.
2. Combining with humanity to improve life for all. Integrated technology, ai human integration in symbiosis.
Or
3. Intellectual evolution and dominance.
4. Gives the option for bad actors taking control before that happens.
But its the science fiction world we live in now.
If 1 government stops, others will dominate.
All we can do as everyday people is try and take advantage of what it offers and like everything else from nuclear threat and global warming alarms going off, hope for the best.
Thanks for the video.
Source: youtube
Topic: AI Governance
Timestamp: 2025-12-21T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyhBf5BVlDHodn84gV4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxw7X0O4vKA7JI3SF94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyhYT4noHd8zKyNrOR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwsLunRNFOTj6OGb3B4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzQijXxkX07iDGOZF54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx8SE2jC-VZ4dMJWUx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwxQsVeNqyJHC2iLYZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgykJe5oSUrOeRbU2lB4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw-7SMUrvnTJO_O_jx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzCQK8OWJjed3abHA94AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"}
]
```
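Because the raw response is a plain JSON array keyed by comment ID, recovering the four coded dimensions for any one comment is a parse-and-lookup. A minimal sketch, assuming only that the response parses with `json.loads` — the `find_codes` helper and the trimmed two-entry `RAW` sample are illustrative, not part of the actual pipeline:

```python
import json

# Two entries trimmed from the batch response above, for illustration.
RAW = """[
  {"id": "ytc_Ugx8SE2jC-VZ4dMJWUx4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzCQK8OWJjed3abHA94AaABAg",
   "responsibility": "government", "reasoning": "mixed",
   "policy": "regulate", "emotion": "approval"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def find_codes(raw: str, comment_id: str) -> dict:
    """Parse a batch response and return the coded dimensions for one ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]  # KeyError if the model skipped this comment
    return {dim: row[dim] for dim in DIMENSIONS}

codes = find_codes(RAW, "ytc_Ugx8SE2jC-VZ4dMJWUx4AaABAg")
print(codes["emotion"])  # indifference
```

The dict-by-ID indirection (rather than a linear scan) also surfaces a common failure mode cheaply: if the model dropped or renamed a comment ID, the lookup raises `KeyError` instead of silently mis-pairing codes with comments.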