Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_o8gno10` — I care about AI because as a technologist, I love learning about and applying em…
- `ytc_UgwSfK2xp…` — Replace it all, then. Humans have choice. Companies with "no AI" will be the pre…
- `ytc_UgyXzoK_J…` — Here is the real problem for AI. If you replace all manufacturing and service jo…
- `ytc_UgxRV6RhM…` — Autopilot isn't full self driving mode its an assistant to lessen the stress for…
- `ytc_UgwRRH12i…` — 10:15 what youre saying here also factors into the difference between someone dr…
- `ytc_Ugzs7bxBK…` — What we do know is that the current system is broken and should be discarded…
- `ytc_UgyE5DFh7…` — There are invisible forces that keep us alive which are certainly not exclusive…
- `ytc_UgwpX0pcu…` — Yeah, don't worry, AI is hype and a bubble- the UBI dystopia isnt coming anytime…
Comment
See you all here in 2030 saying the same shit 😂 unless they are going to laser us all.down, it won't happen, complete collapse would happen! So do the robot's and AI pay tax? Do they give a damn? No! Will they consume products also no! Are the big companies going to pay more tax to soak up the gap from nobody working? Also no! It is possible to replace humanity with robots, yes, will benefit the world in the long term, absolutely not! Killing your tax payers and consumers is suicide. Trying to predict the future is the same as telling someone where the stock market will do this year, they are lying nobody knows just guess work.
But if im wrong im going to purchase magnets, stick them on all robots and watch them freak out and lose data 😂😂
youtube · 2026-02-09T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwilHFLtyP555PM-ux4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwIPj32Xu22O5lIhKd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6Klcr99V1oMnpM6R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz8qcq5k9dzE-4Xbzd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0yECUaBdPa6d3YJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIwM8PPCx4kiTRFT14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwjiXp3xnQXI3AXDzt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziwyL6EuRvDSCMj6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwvw2mB9lo9Tk2gOhR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxsPNHqcSKTF-57PWd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
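Since the raw model output is a plain JSON array of per-comment codings, looking up a coding by comment ID reduces to parsing the array and indexing it by `id`. A minimal sketch, using two entries taken from the sample response above; the `index_codings` helper is hypothetical, not part of the tool itself:

```python
import json

# Two rows copied from the sample raw LLM response above.
raw_response = """
[
 {"id":"ytc_UgwilHFLtyP555PM-ux4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxIwM8PPCx4kiTRFT14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the codings by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_UgxIwM8PPCx4kiTRFT14AaABAg"]["emotion"])  # fear
```

In practice the parse step may also need to tolerate malformed model output (e.g. trailing text around the array), so a real pipeline would likely wrap `json.loads` in error handling.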