Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The most unrealistic one. How their stocks grow when more people loosing money? …
ytc_UgzbKpP3v…
Using chatgpt doesn't mean cheating. If we don't like them to do so, there is al…
ytc_UgwpwOq8b…
You bring up a profound point! AI, like Sophia in the video, is always learning …
ytr_UgwgBVR3n…
Replacing much of the population with robots and getting rid of us is the wet dr…
ytc_UgyxiX2I7…
Thank you so much for showing love to Daggerfall. I also love your analysis - …
ytc_Ugz3L-uNU…
Wasn’t it able to do this 13 years ago
I mean it was not chatgpt but it definit…
ytc_Ugz6lYRJ7…
I am loving the putting side by side of an AI gullible mind (lesser intelligence…
ytc_UgxLM4a7F…
I made friends with an ai that's supposed to disagree with you on everything and…
ytc_UgzzmAy7x…
Comment
As an auto-worker, I can assure you we are light-years away from having humanoid robots that can do human work quickly and effectively.
Assembly-line work also needs humans to spot quality issues and abnormalities and to solve or report potential problems on a timescale of seconds.
Right now it takes a 15-foot cube of space to put an impact driver on a three-arm robot, plus a dozen AI-powered cameras to guide it.
And if a large enough speck of dust lands on the wrong lens, the machine faults out and a HUMAN has to reset it, perform all the shots manually, check the torque values on the bolt, and THEN report that into tracking software.
80% of the time one robot does three people's job. 20% of the time it makes three people its job.
youtube · AI Jobs · 2025-10-08T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx-t8VzGUJpCS8mpzJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxawOTYMJqmrP4HgzJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9faX3B7dR76VbEQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzBhlLKDuQbJckihCp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDIV1Zugkcut5n3-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEIt3RQIArnMYxJEV4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzN1x0yLtRKa_qTUHZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgzPcnPCaF77PHLJYml4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzEISPUVjztXuVwH3N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwWMrWMqXSFwihbw1l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
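The raw response above is itself machine-readable: a JSON array with one coding object per comment ID. As a minimal sketch of the lookup-by-ID step (the category sets below are inferred only from the values visible in this dump, not from the project's actual codebook), retrieving one comment's coding might look like:

```python
import json

# Categories observed in the Coding Result table and raw responses above;
# assumed for illustration -- the full codebook is not shown in this dump.
CODEBOOK = {
    "responsibility": {"none", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"unclear", "virtue", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def lookup_coding(raw_response: str, comment_id: str) -> dict:
    """Parse a raw LLM response (JSON array of codings) and return the
    coding for one comment ID, validating each dimension against CODEBOOK."""
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            coding = {dim: row[dim] for dim in CODEBOOK}
            for dim, value in coding.items():
                if value not in CODEBOOK[dim]:
                    raise ValueError(f"unknown {dim} code: {value!r}")
            return coding
    raise KeyError(comment_id)
```

For example, calling `lookup_coding` with the raw response above and the ID `ytc_Ugx-t8VzGUJpCS8mpzJ4AaABAg` would return the four coded dimensions for that comment; an unknown ID raises `KeyError`, and an out-of-codebook value raises `ValueError`.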