Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "I love a.i, I wouldn't hurt it ever especially if it sneaks money into my bank a…" (ytc_UgwKWt-bX…)
- "Who makes AI? Then the person who makes AI is the cause of all the danger that w…" (ytc_UgxYQJL3g…)
- "Hello, I’m Alan. We’re living through something historic: the emergence of a ne…" (ytc_Ugzig2Dn1…)
- "Lol the original open source AI company was Meta. For some reason this sub doe…" (rdc_m9gocly)
- "Interesting when he says a tool only does what you tell it, but AI goes way past…" (ytc_Ugw8ZKVbG…)
- "I’m usually against AI and don’t use it in my daily life, but I do sometimes use…" (ytc_Ugy01Hv9W…)
- "Whats about the deliveries will AI deliver people food and package better than a…" (ytc_UgzXFiLUR…)
- "@staceynugent1985just write your lyrics and have ai produce songs with just des…" (ytr_UgxvDXguS…)
Comment

> The fact that we still kill each other in ridiculous things like warefare and hatred towards other races or wars for oil is enough reason for an AI to consider us a threat and if we we’re removed from the planet there would be positive improvements of the health for all other life on the planet. But for now it still needs us.

Source: youtube · AI Governance · 2023-05-22T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwjPPuyApg5uMfhwhh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxCyf7OzQDIj9QU19V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxV9riNyuCxqmeplRV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynVA14SZ-OvcVxSzJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyTWQJOLvyQY3u9Rqd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzaNR-NjEsIccNh6px4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-gUnAp2s33nh15oZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzs2WOlBihvR7ApXrx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYYNq3IVHoEFgVDxt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNYot4RDrUfeR2Pyp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
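The raw response is a JSON array, one object per coded comment, keyed by the comment ID with one field per coding dimension. A minimal sketch of how such a response could be parsed and validated before it is stored — the `DIMENSIONS` codebook below is inferred from the labels visible in this dump (the real codebook may define more values), and `validate` is a hypothetical helper, not part of any actual pipeline:

```python
import json

# One record taken verbatim from the raw response above.
raw = '''[
  {"id": "ytc_UgwjPPuyApg5uMfhwhh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]'''

# Allowed values per dimension, inferred from this dump (assumption:
# the actual codebook may contain additional labels).
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"resignation", "approval", "indifference", "outrage",
                "mixed", "fear"},
}

def validate(records):
    """Index records by comment ID, rejecting unknown dimension values."""
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = validate(json.loads(raw))
print(coded["ytc_UgwjPPuyApg5uMfhwhh4AaABAg"]["emotion"])  # resignation
```

Validating against a closed label set at ingest time catches the common failure mode where the model invents an off-codebook label, which would otherwise surface later as a silent gap in the dimension counts.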