Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or click one of the random samples below to inspect it.
| Comment preview | Comment ID |
|---|---|
| I know it's just an endless loop circle of code that ''if this, then respond wit… | ytc_Ugx8t-5Bv… |
| In a world where youtubers are building AI turrets to point lasers in their eyes… | ytc_Ugxheg-m3… |
| Media commercials TV shows all of that have a negative effect on kids, but no on… | ytc_UgyBLETvw… |
| I'm driven to think of ourselves and most animals as the super intelligences of … | ytc_UgwyCl8PY… |
| What you need to understand is that the 'Godfather of AI' was working on Bard te… | ytc_UgxPxfMHw… |
| I have been communicating with my meta AI free on my Facebook messenger app. I c… | ytc_UgxW0tOqw… |
| Human driver would have at least slowed in this situation. The technology failed… | ytr_Ugw8kR5pj… |
| First and foremost my sympathies and condolences to Nyel Benovitas' family - suc… | ytc_Ugw5fRClR… |
Comment

> Humans driving everything everywhere against humans. Musk was worried about AI threats and yet today he is supercharged. Speaking of laws is just another affront to human needs. Will laws anywhere ensure AI will reach out to those who are economically deprived because of various kinds of displacement.
> Therefore, these debates are great but I doubt there will be a meaningful solution.
> Thank you for organising . 👌

Source: youtube, posted 2026-02-07T05:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwoXsJ8CyjpeEBxVzx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw2BeeWtYDTXDgD6jl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzJy525o4uk1w82cuN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgznHUSheQH6F3n7Ax14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxFFabcKC_5Z6HbKD94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz8hfacgTX1MD5xG-J4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsEEpJ8IufH9nmqW94AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzlzf_pNsr_xM91t7d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzsxJyXfmmOllUhnDB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw2D9w2kKvEuv8D39p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
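The "look up by comment ID" step above can be sketched in a few lines. This is a minimal illustration, assuming the raw LLM response is a JSON array of per-comment coding objects shaped like the one shown; the helper name `index_by_id` and the two sample records are hypothetical, not part of the tool itself.

```python
import json

# Hypothetical sample: two coding records in the same shape as the raw response above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwoXsJ8CyjpeEBxVzx4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw2BeeWtYDTXDgD6jl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_UgwoXsJ8CyjpeEBxVzx4AaABAg"]["emotion"])  # resignation
```

Indexing by ID also makes it easy to spot duplicates or comments the model skipped, by comparing the dictionary's keys against the batch of IDs that was sent.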