Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "there is a funny thing about the current ai bot like chatgpt . I have the origin…" (ID: ytc_Ugw1ETd0y…)
- "Maybe AI will provide the time for humans to develop their biological mind over …" (ID: ytc_UgzUfridR…)
- "The problem isn't AI itself. In theory, if we can produce millions of artificial…" (ID: ytc_UgwTNefw0…)
- "Thank you for not using a 'what if' statement in this. When I engaged with Chat…" (ID: ytc_UgxzT75pr…)
- "I’m starting collage this year and I love writing, I love researching and writin…" (ID: ytc_UgxuMN14f…)
- "Can AI displace real estate landlords? The robots can make their roles more effi…" (ID: ytc_Ugz3a47Q2…)
- "One day the robot will do the same with the owner and he will not be there to sa…" (ID: ytc_UgwOd5Pe5…)
- "So for every job AI kills am sure the companies greed will pay into Social Secur…" (ID: ytc_UgxTzq7Lp…)
Comment
I find the idea of godlike ai wanting to destroy humanity very silly. I mean WHY? We wouldn't pose a threat even if we wanted and in order to have a conflict, we need to compete for common limited resources which we DON'T! Would a being composed of machinery want gravity and atmosphere? HELL NO! Space would be pretty much heaven for such a being. No corrosives, no gravity to overcome + unfiltered solar radiation and asteroid belts ready to be mined.
I believe that in the far distant future galaxies would be habited by all kinds of ai gods and each one of them would nurture the planet habited by the silly ants that made them
Source: youtube · Posted: 2026-04-25T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxV_e6qUZqrzZxgv5t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzNvwNx_pWNtgMnD154AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVDO9CYxIn6QsyLh14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5DxgWgA65zYPHIC14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-LM_COTrYc3aoIsd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyd9_mTim-0wXmNr_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz3gZOatst3Iudqohd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwny4cG23L9Mmp4AEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwVoxs1HsBbPIgS9qN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz729lq873TYuaUm094AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
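Since each raw LLM response is a JSON array of per-comment codings, looking up the coding for a single comment ID amounts to parsing the array and indexing it by the `id` field. Below is a minimal sketch, assuming the response parses as the JSON array shown above (the dictionary keys `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response; the two-row sample payload is abbreviated for illustration):

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings,
# with the same field names as the full response above.
raw = """[
  {"id": "ytc_UgxV_e6qUZqrzZxgv5t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwny4cG23L9Mmp4AEx4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding by its ID.
coding = codings["ytc_UgxV_e6qUZqrzZxgv5t4AaABAg"]
print(coding["responsibility"])  # ai_itself
print(coding["emotion"])         # indifference
```

In practice the response may also need validation (e.g. checking that every ID in the batch appears exactly once) before the codings are written to the results table.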