Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Robot: 'Human cant create smarter robot then human brain" Human: "Why" Robot: "I… (ytc_Ugweb0APd…)
- As is sadly evidenced by South Korea's skyrocketing suicide rate, South Korean s… (rdc_lj9vb52)
- That's why Tesla drivers buy Tesla because they don't know how to drive a car an… (ytc_Ugy0kzjaJ…)
- I do use Gemini sometimes, and indeed it is helpful. But sometimes, as a program… (ytc_UgyobSBcP…)
- I dont see a robot or ai doing plumbing , fixing cars , roads , filling shops , … (ytc_Ugyr_CVzD…)
- As of early 2026, the most current data indicates significant safety concerns re… (ytr_UgwoqrQfS…)
- I'm really not worried about automation. I'll just go find some nice, undevelope… (ytc_UgyepbART…)
- @dr.move37 well, you can make a robot that will speak, clean, cook, drive a car,… (ytr_UgxwiZzxk…)
Comment

> There is something that can be done through law that can slow down so take over by forcing the rich creators of these machines to be responsible for the loss of jobs. Make any AI Robot creator pay tax for doing work. If they replace one job they have to pay that one employees tax to government for a life of the employees and life of the job. Workers pay tax so should these intelligent machines. If a machine replaced 100000 jobs creator/companies selling/renting those machines should pay tax for those 100000 jobs lost.

youtube · AI Governance · 2025-09-18T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyiN45K8yeuj-vTDfJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyUpC2_K7Ux9s-OWJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzec5iOWG4ziutVoMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwXdsUkV6SAYblS0LV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFBHe4uZN3g-_phgR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwWRsW2OUw1W9NaNkN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw6XVvyA29cMK3uhIt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTRawcPxaoxhKau-V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyJUt8fntcih5PGYQV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx5FwHrelD-VgLI2yp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
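A batch response like this is only usable downstream if every record parses and every dimension carries a value from the codebook. Below is a minimal validation sketch in Python; the allowed values are assumptions inferred from the outputs visible on this page (the actual codebook may include other labels), and `parse_coding_response` is a hypothetical helper name, not part of the tool shown here.

```python
import json

# Assumed label sets, inferred from the visible output -- not a confirmed codebook.
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw batch-coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with an id and a known value
        # for every coding dimension; anything else is dropped.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
print(len(parse_coding_response(raw)))  # → 1
```

Dropping malformed records rather than raising keeps one bad line in a batch from discarding the other nine codings; a stricter pipeline could instead log and re-query the failed IDs.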