Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Test one of these driverless trucks over the rocky mountains during the winter. …" (ytc_UgzasvS68…)
- "If it hasn’t happened already, it will be soon, girls in S.Korea will start bein…" (ytc_Ugy6PgZ_h…)
- "Haha, this is my constant fight with chatgpt while trying to talk code with it; …" (ytc_UgwtFxwkI…)
- "You know someone has reached an unhealthy level of brain rot when they take a ch…" (ytc_UgzGF_gwP…)
- "as a SW developer who used to be interested in ML and AI topics, I honestly feel…" (ytc_UgwjhonI5…)
- "I gotta be real honest with you and sorry its mean. This is an insanely stupid t…" (rdc_mzxxz8b)
- "I wonder what AI is up to in it's own little dark corner while us humans are ask…" (rdc_mnq04lu)
- "It feels again like Ai's just repeat old arguments from the great mass of inform…" (ytc_Ugygo0A4a…)
Comment
> could would should.
> let me guess where all this scare mongering ends up in...oh yeah we need government to make regulations to protect us right? and also "accidentally" protect the big players from competition.
> The longer the video goes on the dumber it becomes and turns into complete childish fantasy at the end.
> By the way, Yuval Noah Harari is a middle ages historian not an AI expert and the Large Hedron Collider is a great example of government wasting resources.

Platform: youtube | Video: AI Moral Status | Posted: 2025-04-27T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzCtQIPTm4fmNsnARZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzdlSRb-r3f_oVfo6F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwDHPiNc9X72O22xol4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzzZigMgzXB9hgsBVN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxzBB0GDd4vvgrEZhZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxG-RE0CFPtjVyO_8d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxdfPK0EcYHv3q9f9B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz8Ngkc0lZk8iJgybp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyaKNwITpKxmX2Roy94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgznxXErosFcdrvV7Ax4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
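A raw batch response like the one above is only usable downstream if every row carries a value the codebook allows. The sketch below shows one way to validate such a response in Python; the allowed value sets are inferred only from the values visible on this page (the real codebook may contain more), and the function name `validate_response` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the raw responses
# shown above -- an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "distributed", "unclear", "ai_itself",
                       "user", "government", "company", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject any out-of-codebook value."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Example with a single well-formed row (shortened hypothetical id).
raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"deontological","policy":"liability",'
       '"emotion":"outrage"}]')
print(len(validate_response(raw)))  # → 1
```

Rejecting early like this keeps a single malformed or hallucinated label from silently skewing the coded dataset; rows that fail can be queued for re-coding instead.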