Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `rdc_o62covo`: Wow new low, meta never ceases to amaze me. Not having humanity nor respect for…
- `ytc_UgwfP-sb-…`: It’s frankly not incorrect for ChatGPT to consider a earlier version of itself t…
- `ytc_UgzgWhHzI…`: I'm a positive thinker. But more so I'm a realist. The existential danger for…
- `ytc_UgwxhsVJq…`: I only got hesitant with the lady and her dog. The rest i got right. I guess ill…
- `ytc_UgwzazSrG…`: I wish people would explain the "black box" concept around AI more clearly, inst…
- `ytc_UgxTOtvVF…`: The ability for an app to access other devices on a network is a thing. The cong…
- `ytc_Ugx1tbRel…`: I have been saying for years that humans are the most damaging thing on the plan…
- `ytc_UgwSSrY08…`: I fed rules to an AI so it could replace words it wasn’t allowed to say with oth…
Comment

> Instead of relying on humans to do the right thing (they won't do the right thing), what this guy and others like him should be working on is an AI destroyer. AI designed specifically to think of all the ways to destroy another AI system. A safeguard.

youtube · AI Governance · 2026-02-14T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyE3XIxgPwLKbVrnrt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxIoUxXtXJi8z7_wNh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5lGfkZon0AcqMP-94AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwcnhxTssor6ZhcfyJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwyLYjFe8fQUOiOxPp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw3VoemOFRHY-NbqO14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzuVPJdhFD-bynTts94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz4T3654Vjd56IvrQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwDrdi9O5fR58iNvQl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwfSuJHZso0ywbVrot4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"fear"}
]
```
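A batch response like the one above can be checked before the codes are stored. The sketch below parses the JSON array and rejects any row whose value falls outside the dimension vocabularies; note the vocabularies here are inferred only from the values visible in this output, and the actual codebook may define more categories.

```python
import json

# Allowed values per coding dimension -- inferred from the outputs shown
# above; the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {value!r}")
    return rows

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"contractualist","policy":"liability","emotion":"fear"}]')
print(validate_batch(raw)[0]["emotion"])  # fear
```

Failing closed here means one hallucinated label (or a truncated JSON array) surfaces immediately instead of silently skewing the coded dataset.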