# Raw LLM Responses

Inspect the exact model output for any coded comment.

**Look up by comment ID**

**Random samples — click to inspect**
- "I get where you're coming from! The distinction between AI and human consciousne…" (ytr_UgxjdYZ71…)
- "@williambell356 you got it wrong. AI is not "meant to be" an assistant. it liter…" (ytr_Ugzc_HNPm…)
- "Is this really ChatGPT? The robot voice sounds Way too ´young` to Sound like Cha…" (ytc_UgyywR_yQ…)
- "Mankind's greatest fall will be prioritizing efficiency and profit over human va…" (ytc_Ugx-2XqlF…)
- "Unlike previous technology revolutions, with AI all those "next" jobs will also …" (ytc_Ugw1bHqQr…)
- "This is simply not a complete true, AI do the job. In fact my productivity has b…" (ytc_UgybbEbTm…)
- [translated from Italian] "Employment and unemployment have absolutely nothing to do with it. AI generates…" (ytc_Ugxcua7Zs…)
- "That's cool but why does this need to be a full on conversation when this intera…" (ytc_Ugzq6MLb_…)
## Comment

> Ellon Musk said before AI is far more Dangerous from Nuclear Bomb
> Y'all still in awe abot AI, that's so Sad
> We need to comeback to Victoria era, where there is no Computer Entertainment
> Living in the present without Ameica problem

Source: youtube · 2024-08-05T17:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
## Raw LLM Response

```json
[
{"id":"ytc_UgyEQAieK0QJ9z9-H7N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugys44KbT-KHVNofgqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwZQ2S2JbDMXYgfmJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_aseTYtF2yIqQxnl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzj0vq5QwBSpTjiCtl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwZnMj2txeZFC66nOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwM1tBxZXR4U5EJ14V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyMAZlavB8l5_04mjJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxG1HyJ0FnU4HEClyF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxn_lCQNSfyY_-Y65J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
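The raw response is a plain JSON array, so it can be parsed and sanity-checked by ID with a few lines. The sketch below is illustrative, not the tool's actual code; the allowed-value sets are only those observed in this one sample and are likely incomplete.

```python
import json

# Dimension values observed in the sample response above -- an assumption,
# since the full codebook may define more categories than appear here.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "none", "company", "developer"},
    "reasoning": {"consequentialist", "unclear", "deontological"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID,
    flagging any dimension value outside the observed set."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Look up the coding for the comment inspected above.
raw = ('[{"id":"ytc_UgyEQAieK0QJ9z9-H7N4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgyEQAieK0QJ9z9-H7N4AaABAg"]["emotion"])  # fear
```

This mirrors the "look up by comment ID" flow: the batch response is keyed by `id`, so a single comment's coding (the table above) is one dictionary lookup.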