Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples:

- "It is my advice to humanity. Keep AI away from Biological researches. I have see…" (ytc_Ugxicr54s…)
- "@Dziugenonas Agreed there are some realistic scenarios, but until AI is integrat…" (ytr_UgwXNLVUB…)
- "When are we going to have robot and AI tax, so we can pay people to stay home!! …" (ytc_UgznAnW0j…)
- "17:32 When he takes the hat off the robot it looks like he was scared he's about…" (ytc_UgxUKXjEK…)
- "AI should do that the world and animals would be better off of say get the cause…" (ytc_UgxFCpmfJ…)
- "Sahar, the way you conduct this debate brings up several problems that weaken th…" (ytc_UgzSK2m3-…)
- "These arguments are so annoying to me. When drawing I have different struggles o…" (ytc_UgzR0h6qV…)
- "High quality, AI free, informative and up to date documentary that answers impor…" (ytc_Ugz4MsJDV…)
Comment
Tip of the iceberg. No truck drivers means no truck stops and no jobs for anyone working at those locations. For the public, it means fewer clean rest rooms along highways and low-cost places to fill up or recharge your car or yourself. For those arguing human-run refueling stops will still be needed, the trucks will be modified to be refueled at mechanized, fully-automated truck stops. The only human jobs will be a few inspectors who drive around to each location to ensure the robots and their automated processes are working as intended, and a few overworked techs to make fixes and corrections as needed. The tech is impressive, but its longterm impacts and consequences less so.
youtube | AI Jobs | 2025-09-05T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
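Before a coded record like the one above is stored, its labels can be checked against the dimension vocabulary. A minimal sketch in Python; the allowed sets below are inferred from the samples on this page and may not match the full codebook:

```python
# Allowed values per dimension, inferred from the sample output on this page
# (assumed vocabulary; the real codebook may include more labels).
VOCAB = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty if valid)."""
    problems = []
    for dim, allowed in VOCAB.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record from the table above passes; an out-of-vocabulary label is flagged.
print(validate({"responsibility": "company", "reasoning": "consequentialist",
                "policy": "unclear", "emotion": "resignation"}))  # []
```

Records that fail validation can be queued for recoding rather than silently stored.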
Raw LLM Response
```json
[
{"id":"ytc_UgwxzGcLxuy6jHkVr8x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzTyv9FujUTNv8qIZt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwvUFp6lvggduO6IQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxk0f5WNWAgRi1NMbt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzhPSsqhkJHD2owjFB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyzbtQNO5RIKOzaUSh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzNiYqzj6H_gEQKwcB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxkW3ecAt9GsiyXrJN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzDtRxJtmjLwxnkDu94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyANEenxM0VU9H63ht4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
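Since the raw response is a JSON array of per-comment codes, looking up a single comment by its ID reduces to parsing the array and indexing it once. A minimal sketch in Python, using two records from the response above (variable names are illustrative, not part of the app):

```python
import json

# A raw model response: a JSON array with one coded record per comment.
raw_response = """
[
  {"id": "ytc_UgwxzGcLxuy6jHkVr8x4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzTyv9FujUTNv8qIZt4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

# Parse the output and index each record by its comment ID.
codes_by_id = {record["id"]: record for record in json.loads(raw_response)}

# Look up one coded comment.
code = codes_by_id["ytc_UgwxzGcLxuy6jHkVr8x4AaABAg"]
print(code["responsibility"], code["emotion"])  # company indifference
```

Building the index once makes repeated lookups O(1), which matters when the same batch response is inspected for many different comment IDs.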