Raw LLM Responses
Inspect the exact model output behind any coded comment. Look up a comment by its ID, or pick one of the random samples below to inspect.
- `ytr_Ugh5t2Nei…`: atomic jacob Hawking's been at it for a while now. I've been very interested in …
- `ytr_UgxSk2boe…`: @GrgAProduction Apparently our government recently created an AI program that i…
- `ytc_Ugw4fb5gu…`: It's not art when It's made with emotion less ai , it's just a image…
- `ytc_UgxlKfVJ6…`: If you’re here right after Google fired one of their employees working with LaMD…
- `ytc_UgyXTKgLY…`: I follow AI technology very closely because I'm both fascinated by it and scared…
- `ytr_Ugz4ikggu…`: You make a good point! Sophia has been designed to learn and grow over time, but…
- `ytc_UgzawXdj5…`: It will be great help to humanity if AI can do laundry, House cleaning, cooking,…
- `ytc_UgwDj1Sr3…`: AI needs to read all of Ayn Rand's books. That will make it understand morality.…
Comment
I talked about this to my friends months ago yet they still think that this is farfetched. Not only this, when the government no longer caters the interests of the majority(government posts will likely be taken by ai too probably), by looking at the environmental impacts and resource burdens the majority bring to the globe, now that those on the top of the game don't need us for innovation for life quality improvement and production anymore, guess what will happen... Not to mention that ai is already showing signs of out of control (eg violating the owner to avoid shutdown and now tech experts don't fully understand the ai algorithms anymore) if no precautions are made guess what will happen... Humanity will not stand a chance. Frankly now that we still hold a market share, we should spread the awareness of this and act quickly before nothing could be done by us commoners
Platform: youtube · Video: Viral AI Reaction · Posted: 2025-12-01T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwPQIKkc56C4R8Dj194AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz24TNp5gMX2COa1JZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwuB1exhKde1sAtakd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwp2kISgMrqndD6e2Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPF79qTvujtBLUwL54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgymmLiD-qg7daVOpBN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyTkpJG3rXpSVMXjuJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyWtdOj9gNwYbCXIk54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy5TsUeDgs6Wfb3nat4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw_VeuQxTzo6dX_zM94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
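A batch response like the one above can be parsed and sanity-checked before it is written to the coding table. A minimal sketch: the allowed value sets are inferred only from the labels visible on this page, so the real codebook may contain more categories, and `parse_batch` is a hypothetical helper rather than the tool's actual pipeline.

```python
import json

# Allowed values inferred from the labels visible on this page;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "industry_self", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate every coded row."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded rows")
    for row in rows:
        # IDs on this page use ytc_ (comment) or ytr_ (reply) prefixes.
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_UgzPF79qTvujtBLUwL54AaABAg","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
rows = parse_batch(raw)
print(rows[0]["emotion"])  # fear
```

Rejecting a whole batch on one bad row is deliberate here: LLM outputs drift, and a loud failure is easier to audit than a silently dropped dimension.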