Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
These AI bots are pretty shit. Supposedly my IQ is only like 110, but these aI b…
ytc_UgyxKMJeD…
the situation isnt even moving it was stopped in its tracks the moment it happen…
ytc_UgwL_58rZ…
We should ask AI artists would they go to real doctors who studied medical for m…
ytc_UgzWITfG3…
The problem is these ai machines are language models trained from real world dat…
ytc_UgycixXcr…
Technology didn't kill your job in any time. I just shifted the skills you need …
ytc_UgwXXb_ZV…
On the corn art part, I'd say that AI corn art will not survive. Not only are mo…
ytc_Ugw7xuVNr…
Seeing a machine gun on a robot hand is the most scariest thing I have ever seen…
ytc_UgwLmoBxQ…
Train em on CCP PROPAGANDA, get one result or FAR RIGHT PROPAGANDA, you get anot…
ytr_UgyS1ahr3…
Comment
The key is energy. Human are far more energy-efficient than computers. The brain uses several watts to function. An electronic AI with the same intelligence requires thousands of times more power. We simply will not have the energy resources to replace 8 billion humans, even if we continue to improve integrated circuits and reduce their power consumption.
youtube
AI Governance
2025-09-05T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyOrZ_M_PMVAo0JyKN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyTuGSBUl5h69IAHJZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugypk2pN6fTst_VSLDp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw0P5hzgah8tf99fgV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx10RkPeGg9cWqBpkV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwlRdhppltx_IqKU4p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzkTDkQ-awHVmJ7hP54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugy3T7MnAgI9agOfqtB4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxj2WNCaKkfb6AyEEN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxcpfWkD8iyRWA3rW54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
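A raw response like this can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, not the tool's actual pipeline; the allowed category values are inferred from the records visible on this page alone and are likely incomplete, so treat `ALLOWED` as an assumption to be replaced with the real codebook.

```python
import json

# Allowed values per coding dimension, inferred from the sample records
# above (assumption: the real codebook may contain more categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record's fields."""
    records = json.loads(raw)
    for rec in records:
        # IDs on this page start with "ytc_" (and once "ytr_", perhaps a
        # reply); we only require that some id is present.
        if not rec.get("id"):
            raise ValueError("record is missing an id")
        for field, allowed in ALLOWED.items():
            if rec.get(field) not in allowed:
                raise ValueError(f"{rec['id']}: bad {field}={rec.get(field)!r}")
    return records

# Example with the first record from the response above.
raw = ('[{"id":"ytc_UgyOrZ_M_PMVAo0JyKN4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
records = validate_coding(raw)
print(len(records))  # → 1
```

Failing loudly on an unknown category is deliberate: a silently accepted typo in one dimension would skew any downstream tally of the codes.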