Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
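The ID lookup can be sketched as a simple index over coded records. A minimal sketch, assuming records are stored as the list of JSON objects seen in a raw LLM response (the record below is copied from the response on this page; the storage format is an assumption):

```python
import json

# Hypothetical raw response holding one coded record (taken from the
# raw LLM response shown on this page).
raw_response = """
[
  {"id": "ytc_Ugzg4qQVdpJfqxqiQmt4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "resignation"}
]
"""

def build_index(raw: str) -> dict:
    """Parse a raw response and index its records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

index = build_index(raw_response)
print(index["ytc_Ugzg4qQVdpJfqxqiQmt4AaABAg"]["emotion"])  # resignation
```

The comment-ID prefixes seen here (`ytc_`, `ytr_`, `rdc_`) appear to distinguish the source platform, so a plain dict keyed on the full ID is enough for exact lookup.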
Random samples — click to inspect

- `ytc_Ugzb684aI…`: "Let’s just pay the 10% more then the 90% aren’t needed and can be sent to be rec…"
- `ytc_UgwkpEEV5…`: "Is anybody else but me considering that each mass manufactured electronic device…"
- `ytr_UgyqKgGvB…`: "If speaking bluntly means prioritizing \"progress\" over people's well-being and p…"
- `ytc_UgwO6Mt1f…`: "AI or what these technocrats are calling AI! is a COMPLETE FUCKING LIE! there is…"
- `rdc_mxyjsba`: "It hit a bottleneck years ago. The first chatgpt models vs the latest aren't tha…"
- `ytc_Ugyp5zQ_E…`: "This is why people have conspiracy theories like the dead internet theory. You k…"
- `ytc_UgwwzCScL…`: "Ai ain’t going on strike… people could learn from ai. The irony is tangible lol…"
- `ytc_UgxSL1hVP…`: "No wonder they thought protecting predators wouldn't get them in jail they're li…"
Comment

> Humans are more destructive than AI would be. Every problem with AI there is, is connected to us. AI has no consciousnes, is not able to create its own prompt to reflect and to create something new the way conscious person can.
>
> AI on earth is limited. Energy limited. Heat limited. I would say, the step to real artificial intelligence is really long one. I'm not sure that step will be done the way we are use right now. AI as we have are powerful tools, but to get them conscious... that would require something else. We are creating models of trillions bytes, which lack what small child already has. few megabytes large AI should be able to have consciousnes, but yet it somehow doesn't have. Architecture problem probable.
youtube · AI Governance · 2026-04-20T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxiAwhlcQZnsOPt0VR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxNfc7tSRlM_bjev5F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhugrdmD7qhFjsL8p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxkXEo-Jv_X8LUY9nx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqERxsbB4oBuznCvN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzg4qQVdpJfqxqiQmt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnRqZWljYuEjVMpph4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFxUOPnwd0gEllI0Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxjArRbMV5UycOufCl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMS5bWnZrxRBH95DZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
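A raw response like this can be checked before its codes are accepted. A minimal validation sketch, assuming each record must carry exactly the four dimensions from the coding-result table; the allowed value sets below are inferred from this one sample and are not the full codebook:

```python
import json

# Dimensions from the coding-result table. Value sets are inferred from
# this sample only -- the real codebook may allow additional values.
ALLOWED = {
    "responsibility": {"user", "company", "government", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        missing = {"id", *ALLOWED} - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {missing}")
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return records

# Usage on a single well-formed record (a hypothetical ID).
records = validate('[{"id": "rdc_example", "responsibility": "user", '
                   '"reasoning": "mixed", "policy": "none", '
                   '"emotion": "fear"}]')
print(len(records))  # 1
```

Failing loudly here is deliberate: a model that drifts off the codebook (an unknown emotion label, a dropped field) is caught before the bad code lands in the dataset.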