Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "What's the point of 16 year school if you're gonna make chatgpt do every assignm…" (ytc_Ugw7haeue…)
- "They should of asked him to show the code and asked why ChatGPT leans way more l…" (ytc_UgwNJgSX0…)
- "Women are already marginalised by AI. Just listen to the AI narrators for a sta…" (ytc_Ugy35GzvJ…)
- "Ask yourself this question. Would YOU get on a "Self-Driving" Aircraft running "…" (ytc_UgwGKyUFF…)
- "If AI takeover 40% of the jobs in this short span, please expect a revolution fr…" (ytc_UgxiSeder…)
- "Human strength is no longer required. Now human intellect isn't required anymore…" (ytc_UgxSwdIO_…)
- "With so many talented people, the same people who created the giant profitable c…" (rdc_oaecwag)
- "AI and automation will be short lived for one simple reason. Products require up…" (ytc_UgwDH4ZCO…)
Comment
> One of the biggest problems in AGI is that it requires far more computational resources than LLMs which are more or less super-charged search engines that can talk. But as the video says, AGI is not necessary, we already (for decades) have AI systems that can out perform human experts. The biggest issue with these systems is that they need to be designed by scientists and engineers. You also cannot talk to them, requiring some programmer to create an interface.
youtube · AI Moral Status · 2025-04-26T19:1… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugxo2GjUyERtyIV9xA14AaABAg.AHORAZC0DOLAHQQtQNdBpB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHPT2Ppj-BH","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHQZVsu6sZu","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwmrb9Qi9ECXEUn8sJ4AaABAg.AHOOeUxaGMIAHUZ97MRZWl","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOR6mKPKOh","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOTM_eqCs7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHOzGIUf6B2","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzqvP_89QFiSZeh0NN4AaABAg.AHOLyeu0h1lAHP-2c4TwGz","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQB9eajQfs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxj5C5mPY4raUGzyW94AaABAg.AHOKYrbTpAmAHQFM9hvrDb","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
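A raw response like the one above can be parsed and indexed for the "look up by comment ID" workflow. The sketch below is illustrative only: the IDs are placeholders (the real ones are long comment IDs like those shown), and the allowed values per dimension are inferred from the values visible on this page; the actual codebook may define more.

```python
import json

# Hypothetical raw LLM response in the same shape as the one shown above.
# IDs here are placeholders, not real comment IDs.
raw_response = """
[
  {"id": "ytr_abc", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_def", "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]
"""

# Allowed values per coding dimension, inferred from the values seen on this
# page; the real coding scheme may include additional labels.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"none", "indifference", "fear", "outrage", "approval"},
}

def index_by_id(raw: str) -> dict:
    """Parse the JSON array, validate each coding, and index rows by comment ID."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim!r}: {row.get(dim)!r}")
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
print(codings["ytr_def"]["emotion"])  # fear
```

Validating against an allowed-value set at parse time catches malformed or hallucinated labels in the model output before they reach the coding-result display.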