Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "AI isn't the problem, the problems are the people in the comment section making …" (ytc_UgybX6J4c…)
- "@joeblow2286 yeah she's definitely whining because she doesn't want her work get…" (ytr_UgwZNIviA…)
- "They should have to make those AI robots gree or purple or something like that.…" (ytc_UgzG_pOro…)
- "I know AI & automation is inevitable. And they are going to take far more jobs t…" (ytr_Ugz7kPHT0…)
- "We are screwed. This is going to happen because as long as billions of dollars k…" (ytc_UgwHGJkjB…)
- "Dear senator, you must initiate an international treaty with other nations to im…" (ytc_UgwvfkBaH…)
- "Hahahaha he first left open AI and now wants to buy it😂 grok isn’t doing well…😅…" (ytc_UgwqSGLEV…)
- "If AI works in humans' best interests, the question is whose best interests AI w…" (ytc_UgwJU4nFR…)
Comment

> They can't even get adaptive cruise control, collision warning sensors and lane departure to work 100% of the time.. especially in bad weather. Autonomous trucks won't be around any time soon if at all.. Even on set lanes, with a driver on board in the driver seat, they are still crashing.. sensors don't even know the difference between skid marks and black tar on the road and actual lane stripes..

Source: youtube · AI Jobs · 2022-10-06T16:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz64BcS0clq8B7E7O14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWTa0zF79AvLBq9Ux4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx4ju9cVWYs9XKazAF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzIaqltBgdA4bt87Kx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz4eXdQjS7pWVn4YfR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwgH4GGlm3LN3hAmzJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxGnDfFCOLV6-CfLeF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxv9PygJ_SVRja83pp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyjN7leuTuHuVL7rYp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxEcM_97CEsyQ5fi094AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
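A batch response like the one above can be checked before it is loaded into the dashboard. The sketch below validates each row against the dimension values that appear in this sample; the real coding scheme may allow additional categories, so `ALLOWED` is an assumption inferred from the output shown here, not the definitive codebook:

```python
import json

# Allowed values inferred from the sample response above (assumption,
# not the full codebook).
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "mixed", "fear"},
}

def validate(raw_json: str) -> list[str]:
    """Return human-readable problems found in one raw LLM batch response."""
    problems = []
    for i, row in enumerate(json.loads(raw_json)):
        if "id" not in row:
            problems.append(f"row {i}: missing id")
            continue
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"{row['id']}: bad {dim}={value!r}")
    return problems

good = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(validate(good))  # -> []
```

An empty list means every row carries a valid value on all four dimensions; anything else names the offending comment ID and dimension so the coding can be re-run or inspected by hand.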