Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "What if AI decides to make us all wealthy. Well, let's bet it's 50 / 50 ods. Thi…" (ytc_UgzU0S7hr…)
- "If what he say happens anti AI political parties will win elections throughout t…" (ytc_UgyVx2l3O…)
- "Yes. The huge AI investment by the billionaires is very troubling. But there is…" (ytc_Ugx4RvFAr…)
- "@freerangesimp Writers aren't creative? Or just journalists? Because that's the …" (ytr_UgxFRWAVX…)
- "Could you build a narrow AI whose only job would be to keep a super intelligence…" (ytc_UgxWHOxZ6…)
- "This is so sad to me. I get work is getting worse hours are demanding and life o…" (rdc_mvjek86)
- "The first one has clearly ripped off the design from the Spanish animated movie …" (ytc_UgzGU3i8S…)
- "THINK IN TERMS of ai human vs human a.i. however, to benefit humanity and A.I.…" (ytc_Ugze3e9pd…)
Comment

"I think AI could be great with airplanes to reduce accidents and software problems and eliminate issues as seen with MCAS on Boeing 737 MAX 8. Of course also have it obey pilot instruction/pilot override and auto correction with it's learning under certain scenarios to allow eliminate false positive from AA angle of attack even if it is redundant."

Source: youtube · AI Moral Status · 2021-12-13T23:4… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFnIFVj3wnqQIEsiB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx_bp_1_5C46PBRPQJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwDoUj7V7a651bl_EJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxbZGiVjrID0Pp3MQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy-AFqz_CI8Vvz3Kyx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyuYuX9uJCB3u0xFpV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxKfQFAiPc7e6M-Df14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyuo4hN97IYT-20DXR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwOvmyGnuRyaQsXLOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGsXsaJrWABArjzjp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
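A raw response like the one above can be parsed into a lookup table keyed by comment ID. The sketch below is a minimal example of that step, assuming the label vocabularies are exactly those visible in this response and in the coding-result table; the project's actual codebook may define more values, and the function and variable names here are hypothetical, not taken from the project's code.

```python
import json

# Allowed labels per dimension, inferred only from the response shown above
# (assumption: the real codebook may contain additional labels).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping rows with a missing ID or an out-of-vocabulary label."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows the model emitted without an ID
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single (hypothetical) row:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
print(parse_coding_response(raw))
```

Validating against a fixed vocabulary before storage is what makes a "look up by comment ID" view reliable: malformed rows are dropped at ingest rather than surfacing later in the inspector.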