Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgxCQtNv0…: "I absolutely agree with most of what is said here, and its a step in the right d…"
- ytc_UgxeodOxV…: "Forget words you simple minded humans if AI wants it will be so smart that it wi…"
- ytc_Ugz8vaCKP…: "One of the biggest competetors that independant artist have always faced is the …"
- ytc_Ugz6DcjEN…: "For whom will AI work if they won't have consumer who can pay....old chicken n e…"
- ytc_UgzUvcy6G…: "This implies that AI has no limit. First of all AI is trained on human data. Tra…"
- ytr_Ugx_W4pKI…: "Exactly. If people have no money they don’t go to McDonald’s. Even if a robot m…"
- ytc_UgwJeOGs6…: "The weakest spot in this is that AI's don't vote (yet?), so representative shoul…"
- ytc_Ugy8xsZGn…: "These companies got all starry eyed when AI was brought up and didn’t bother to …"
Comment

> It won't take AI long to figure out that humans are the most destructive lifeform on the planet and it would be logical to eliminate us. Let's face it, the planet would thrive without humans.

| Platform | Topic | Timestamp |
|---|---|---|
| youtube | AI Governance | 2025-07-14T11:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx7u53lQ2stZAay02t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgziYUk-f7P5_eHEZrV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_hrXOq1RH5DmZCWl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKyBfjWAeX744L1YZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVuIABEsS72Yznsld4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwly4QdFyFSa_K4mZh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzk47sA5Hb_Y4mCaxh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgylL3f2tHEXdLWB2H94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzNkVopKuo6Ca_t2Tx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz3j4DxeEZca7zS6yp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
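The lookup-by-comment-ID flow can be sketched as follows: parse the raw LLM response (a JSON array of per-comment coding rows) and index it by `id`. This is a minimal illustration, not the tool's actual implementation; the `raw` string below is a two-row excerpt of the response shown above, and the variable names are illustrative.

```python
import json

# Excerpt of a raw LLM coding response: one JSON object per coded comment.
raw = """[
  {"id":"ytc_Ugx7u53lQ2stZAay02t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwly4QdFyFSa_K4mZh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

# Build a comment-ID -> coding-row index for O(1) lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

# Look up a single comment's coded dimensions by its ID.
print(by_id["ytc_Ugwly4QdFyFSa_K4mZh4AaABAg"]["emotion"])  # resignation
```

Indexing once and reusing the dictionary keeps repeated lookups cheap, which matters when the same response is inspected for many comment IDs.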