Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "In America--you remove the teachers. Amongst the smartest countries all over the…" (ytc_Ugx3vGMyI…)
- "System of equality, egalitarian system, free system. Freedom system. A system wh…" (ytc_UgwuxxGq1…)
- "As a person who wants to get a job in 3D animation, now I’m not too sure if ai’s…" (ytc_UgxItT5YV…)
- "one thing about ai is that it doesn't have passion, it doesn't feel, art is insp…" (ytc_UgycNgKn1…)
- "For AI to turn on humans, there will have to be humans who make it do so. That i…" (ytc_Ugy9PD8bz…)
- "The good news is, if AI reaches ultimate, Super-Intelligence, it will understand…" (ytc_UgxNa21kS…)
- "The monsters are product of their creators. Each AI just represent the monstrosi…" (ytc_Ugxz7h6As…)
- "Why DOES Google not care about AI ethics? Is it because AI is being built as a…" (ytc_UgxIRRrrq…)
Comment
One of the many problems AI will create is not just how do humans create value in an AI world, but where will humans get the truth from? Do you trust a programmable computer to tell you the truth? Just because a bot can scan The Internet in seconds, does that mean the data on Google's AI is the truth? If humans aren't producing information anymore, only AI, will there be any nuance, experience, perception of what could be truth, hidden behind the hard data?
youtube · AI Jobs · 2025-10-29T15:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxAFCsU8PSKuHIAj0N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgysBttXjWDcOwZl7CN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFO--eCLgnlz1ht754AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz6RFhFzR-XTJwDThF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw8LRrsOOw0Ne55yfV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6lkT1EBLuykOmjKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"approval"},
{"id":"ytc_UgxDBynqOZSt-u206AJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxqhB59g1VpyvOjoGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzcOXDGJQhb8otrgqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwxsXInslmLwYE7Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
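The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a response could be parsed and looked up by comment ID, validating each record against the category values that appear on this page (the real codebook may include values not shown here):

```python
import json

# Allowed values per coding dimension, inferred from this page's output;
# this is an assumption, not the tool's actual codebook.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "resignation",
                "indifference", "approval", "mixed"},
}

# A one-record excerpt of the raw LLM response shown above.
raw = """[
  {"id": "ytc_Ugz6RFhFzR-XTJwDThF4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]"""

def index_codings(raw_json: str) -> dict:
    """Parse a raw response and index codings by comment ID,
    dropping any record whose values fall outside the codebook."""
    by_id = {}
    for rec in json.loads(raw_json):
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw)
print(codings["ytc_Ugz6RFhFzR-XTJwDThF4AaABAg"]["emotion"])  # fear
```

Indexing by ID keeps the lookup O(1) per comment, which matters when a batch response covers thousands of coded comments.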