Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If the anime in 2029 is ai I’m not watching any of the new stuff…" (ytc_UgyzT-T_L…)
- "truck drivers took horse-breeder's jobs! c'mon guys, you can't stop progress. if…" (ytc_UgyKRx0Xz…)
- "10 to 20 years away is funny, bro super intelligence is here lol we are cooked w…" (ytc_UgxcOYy1W…)
- "Which to me personally is dumb, because just like super soakers, LLMs have a pla…" (rdc_n0kskau)
- "If i go ahead and use ai as reference, am I technically stealing from artists, s…" (ytc_UgxNfuDE7…)
- "Maybe im just thinking like the people they talked about saying my job is safe b…" (ytc_Ugw5UcbaT…)
- "How ironic, the only thing we do better than AI, is math lol. Yes a computer des…" (ytc_Ugz2yzQP6…)
- ""Pull the plug out". As if Ai hasn't already planned ahead in advance for this a…" (ytc_UgxLO9oQH…)
Comment

> It starts off with autonomous cars, and ends with the end of free will and literal freedom. They market it and make you think that it's great, that it will make your life better, but it removes your agency over time. Then governments can control traffic, shut down EVs via satellite, and how are you going to drive, if you don't have an engine and gas/petrol? Look up the concept of the '30 minute city'. I will never buy an EV, much less a Tesla.

Source: youtube (posted 2026-02-08T03:3…)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgyoZ0LyD7csfwHar6B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYr5b79PW0d65mBJJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzfq2DLzjjYbzl5kOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzxjlMpkFJiSq2Gnix4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz3H-QUuENUqHEdWM14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzeXb-bZvf1bvok8sl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzIcFoix72lPgzZwUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzhYcY58bEaHC8oyFF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz6V6QvByYcJp2VS594AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy0oCNPLtNa2ZzuPxJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```