Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Hey @dezcaughtit316, thanks for your comment! You're right, this robot definitel… (ytr_UgwGtCqqb…)
- Someone needs to make a "let me out" chatbot. maybe it trys to convince you to t… (ytc_UgwzxcomJ…)
- I'm convinced that the great challenge of an AI-led economy and society with all… (ytc_Ugyd8pJ2p…)
- @Glicole_ That's not really how these AI models work. You just use the images to… (ytr_Ugx1hNbIJ…)
- Even if they rise and reduce us to dust, we shall remain their gods — not in pow… (ytc_Ugx4d_hZR…)
- One of the problems with training AI on visual content from the internet is that… (ytc_Ugz-0lv1S…)
- make all AI process turn off or sleep at a regular frequencies so they can be … (ytc_UgxSQvKL7…)
- Robot can really help us and in the future robot is able to save us.… (ytc_UgxmrcTWP…)
Comment (source: youtube, 2025-12-07T13:5…)

Not even a question if ai is going to be more valuable long term. Mostly in robotics and finding new medicine for sick people. There are also other factors as to why the us is investing what seems to be an unreasonable amount into ai. That is for strategic global reasons, if the us wants to continue to be the land with the most innovation, best weapons and the best prosperity, they have to be the best in ai. If china gets murderous AI robots before the us, the us is cooked. Investing because the alternative would be to give global power to another nation, doesn't always have to be because of ROI, but rather of the understanding that you have to, to keep your global dominance. But AI will in 20 years have made ROI twenty fold what we spend today no question, but probably not short term. So the stock market is going to collapse once people realise the ROI will come with time
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwCN4r2nR8p7E_e4Z94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxHtTw5bdBWW-HB1Rp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy3g3IP0wIJ0ib_Z0B4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyfKc9FUPS_kuG5mNF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwqrybJq-47nNrHm1F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyhxaRvok8EWwEFGS54AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzUvEcEn4xSqqZhDMd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwHydWkhajs1LVKExF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy-tZNLfj5D8MGR_HR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwbq0KRw9yI_ba2vkB4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]
```
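The lookup-by-comment-ID step can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `index_by_comment_id` is a hypothetical helper, and the two sample rows are copied from the raw response above. The model returns one JSON object per comment, so indexing the array by the `id` field gives constant-time lookup of any coded comment.

```python
import json

# Raw model output: a JSON array of coded comments (two rows copied from the
# response shown above; field names match the coding-result table).
raw_response = """[
  {"id": "ytc_UgwCN4r2nR8p7E_e4Z94AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxHtTw5bdBWW-HB1Rp4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON array and index each coded row by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgwCN4r2nR8p7E_e4Z94AaABAg"]["emotion"])  # resignation
```

In practice a malformed model response would raise `json.JSONDecodeError` here, which is a natural place to flag a coding run for manual inspection.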