Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- `ytc_UgyZlVkuX…` — "If AI is supposed to replace workers, who then don't have money to fund the econ…"
- `ytc_UgyEJaDMJ…` — "One thing I don't get: for the AI to be dangerous it has to become first self co…"
- `ytc_Ugw72s7vI…` — "Problem is, money is involved. Whenever money changes hands, there is the poten…"
- `ytc_UgymzqZwA…` — "Human programmers are forced by their employers to use AI tools. Then AI tools w…"
- `ytc_UgwkfU5lF…` — "Many of today's drivers are part of the problem forcing the hand of companies to…"
- `ytc_UgwqoMyY8…` — "Automated driverless cars with complacent passengers who don't keep their eyes o…"
- `ytc_UgwFYZ1-j…` — "maybe someone is going to have to invent a 'doomsday' code for AI......to be add…"
- `ytc_UgwYcIjtM…` — "Cybersecurity companies have recently realized computer viruses are using A.I to…"
Comment
A.I. will always be limited by its storage space and it should become a computational equation for a collapse in a wave function so that the a.i. can only predict its own future and thus the reverberation for Moore's law and the a.i. will repair itself. I imagine a walking-talking robot which has a metallic but human-like skeleton as lab grown organs fuels the system therefore its skin are tiny nananorobitcs that fit together like puzzle pieces and just one robotic skin flake for a nanobyte can fly, maneuver, and track any lifeform from a point of view such as outerspace whilst every flake of the robtic epidermis equals to the amount of the current human population of Earth. A.I. must always find child kidnappings.
youtube · AI Governance · 2026-03-27T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugw2Hm1dbiDAfheGYwF4AaABAg.AVpOUC5oZIyAVqmw9fN1WA","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgwZPomTG_RHLiPkXlx4AaABAg.AVKZhl3GCoDAVKaMx6lBa5","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugwi_oUaa1CyKC2SdIV4AaABAg.AVHaoc2FtT0AVHmettAP-I","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugwi_oUaa1CyKC2SdIV4AaABAg.AVHaoc2FtT0AVHn1Dfkizq","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugwi_oUaa1CyKC2SdIV4AaABAg.AVHaoc2FtT0AVHoQEXxJv-","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgwAeorGgJ67NXPviat4AaABAg.AUrAbO_0gD_AUrFJkS1OfP","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwAeorGgJ67NXPviat4AaABAg.AUrAbO_0gD_AUrG7TXbYiX","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_Ugz4jQBOIBsXS44-kC54AaABAg.AUmNWp7c7izAUsMqUGSdwq","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytr_UgwPYFl00Se7DLgI8yZ4AaABAg.AUmItF01DJzAUvLOYVUkSN","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwPYFl00Se7DLgI8yZ4AaABAg.AUmItF01DJzAUwcpATurWZ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
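A raw response like the one above can be turned into the per-comment "Coding Result" view by indexing the JSON array by comment ID. The sketch below is a minimal illustration, not the tool's actual implementation; the `RAW_RESPONSE` sample and the `parse_codings` helper are hypothetical, and the dimension names are taken from the table above.

```python
import json

# Hypothetical sample in the same shape as the raw LLM response above
# (abbreviated IDs; not real records from the dataset).
RAW_RESPONSE = """
[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_example2", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Index codings by comment ID, defaulting any missing dimension to 'unclear'."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codings = parse_codings(RAW_RESPONSE)
print(codings["ytr_example2"]["emotion"])  # approval
```

Keying on the `id` field is what makes the comment-ID lookup above possible: once the array is indexed, retrieving any comment's coded dimensions is a single dictionary access.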