Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "The company helping with the AI has tech for yelling at distracted test drivers.…" (ytr_UgxZfNcRI…)
- "If people loose their jobs and don't have money then who is going to buy those p…" (ytc_UgyxOSYuW…)
- "Is it really the same as tracing? What if you asked an artist to do a piece simi…" (ytr_UgwloSHd-…)
- "Where I'm from we steal luxury items because it's financially irresponsible to p…" (ytc_UgwIHknPJ…)
- "society is biased in the same way, that's probably why chatgpt is biased in the …" (ytc_UgwKPPnfq…)
- "So we should not use procreate, Photoshop or illustration and just adapt to use …" (ytc_UgxtKYoDS…)
- "People don't realize how far away from AGI we are. Current AI projects are hyper…" (ytc_UgweuiImB…)
- "this is better than moving pointlessly, last time all these rebot constantly mov…" (ytc_UgwOk-EWR…)
Comment
The big problem with the idea of millions of robots doing every imaginable job boils down to energy consumption. A human (or any animal) has a unique biological energy source whereas a robot relies on electricity and therein lies the problem; we simply don’t have enough global capacity to power millions of robots, EV’s, trucks, airplanes not to mention the entire agricultural sector assuming the drive towards net zero is maintained.
youtube · AI Governance · 2025-09-18T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_Pw7qtZXhNiyH9KJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxv7mNRe4BLKoYd1p14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxNMWN6TvTksxD-dx54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw_dBVoWL-2YtoC2qp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgykWNCpQ6j-iTwQStB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwTcBMyr1PHQGEOSud4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxsYuKTClcFo1ZgCOJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz_t_dM2txSGaXX5hl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugx93OTCBbbPHFa_6Mh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxxqpZHDX3FN6MRhWR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
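A response like the one above has to be parsed and validated before its codes can be attached to comments. The following is a minimal sketch of that step, assuming hypothetical allowed-value sets for the four dimensions (the real schema lives in the coding pipeline and may differ); records with out-of-schema values are dropped rather than stored.

```python
import json

# Allowed values per coding dimension. These sets are assumed for
# illustration, inferred from the values visible in the raw response;
# the actual pipeline schema may include more categories.
SCHEMA = {
    "responsibility": {"company", "developer", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "unclear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse the raw JSON array and index coded records by comment ID,
    silently skipping any record with a value outside the schema."""
    coded = {}
    for record in json.loads(raw):
        if all(record.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[record["id"]] = record
    return coded

# Usage with a shortened, hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
codes = parse_llm_response(raw)
print(codes["ytc_example"]["emotion"])  # indifference
```

Indexing by ID is what makes the per-comment lookup shown on this page possible: the Coding Result table for a comment is just the record stored under that comment's ID.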