Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- ytc_UgxPB1fYp… — "There is no way that this algorithm has the sophistication to be effective and j…"
- ytc_UgyxH2YxH… — "DISCLAIMER TO ALL GOOGLE AI MODE USERS!!!! As a user myself, & an expert in na…"
- ytc_UgzyDRvXw… — "Dude me and my friend did something so vile, so horrific, that the ai GAINED SEN…"
- ytc_UgyJDe-KB… — "so AI will be better and better, in that way we need to know from IA fwhat are t…"
- ytc_Ugxd3Asmv… — "This AI control idea is pecfect to have human control in almost every possible w…"
- ytc_Ugzk5HRND… — "Louis, love the channel. But I object to reading from ChatGPT AI responses for n…"
- ytr_UgyDIEKM3… — "That's why I use it. It's a win button that I can use to guarantee Victory. Btw…"
- ytc_UgyYw4-ek… — "“Ai is bad” owns both owns AI companies and heavily uses AI across several of hi…"
Comment
> Your idea that the more code, the more bugs, it makes sense when thinking about an OS, but in A.I it's quite the opposite. Big data is what makes it work better. Look at Watsons A.I. it had petabytes of data to work with to help it makes its decisions and that will probably be true of any really smart A.I going forward. You're never going to get 100% reliability but we seem to have accepted that with passenger aircraft etc. if its more reliable than a human and cheaper, it'll happen one day.
youtube
2012-12-22T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxEPQrtR4rSbKJSb5R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgxCTtakWe3ZGRBZBX94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJXBBabHvGPfyXTfZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw3E1AD0qGeUT3oNMJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwemOLdW8DwRpNO3FJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugg3vF-RyR6SoXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx10RCQggDAQSdPC-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwr2MXAOyAYMukRRlV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz-bmHnqDnTSCOJV0J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwb7_3pnCP2KW9a0jV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
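A raw response in this shape is straightforward to parse and index for lookup by comment ID. The sketch below is a minimal illustration, not part of the tool itself: the `DIMENSIONS` vocabulary is inferred only from the values visible on this page, and `index_codings` is a hypothetical helper name.

```python
import json

# Two-record sample in the same shape as the raw LLM response above
# (IDs copied from the page; this is illustrative data, not the full batch).
RAW = """[
  {"id":"ytc_UgxEPQrtR4rSbKJSb5R4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugg3vF-RyR6SoXgCoAEC","responsibility":"none",
   "reasoning":"unclear","policy":"none","emotion":"approval"}
]"""

# Allowed values per dimension, inferred from the samples on this page --
# the real coding scheme may define more categories.
DIMENSIONS = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "approval", "resignation", "indifference"},
}

def index_codings(raw: str) -> dict:
    """Parse the model output and index records by comment ID,
    skipping any record with an out-of-vocabulary value."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            by_id[rec["id"]] = rec
    return by_id

codings = index_codings(RAW)
print(codings["ytc_UgxEPQrtR4rSbKJSb5R4AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per batch, then constant-time retrieval of any coded comment.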