Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The best outcome is that we are domesticated pets.
"Colossus: This is the voice…
ytc_Ugz2SR22F…
Well I do not know about other kind of people I just known for myself that I hav…
ytc_UgxIedwSa…
I am not as concerned about surveillance as I am about lack of transparency. If…
ytc_Ugw49ktAi…
Which is likely since they don’t age like humans and don’t have that sense of ur…
ytr_UgxPQTWkw…
I hear what you're saying but could I suggest that you not look at where AI is n…
ytr_Ugw-xt2js…
Read this simulation lead by some of the top ai safety regulators and engineers …
ytr_UgzeTtmM3…
Google search as an example, which didn't need regulation because it was "just m…
ytc_UgxCZrTS8…
I feel so much better seeing this. I just got back from a lecture about how an a…
ytc_Ugze9A2ce…
Comment
AI and robotics are designed to benefit only one class of people.
When they say it's like the industrial revolution : no. Because the industrial revolution wasn't about replacing every worker unlike AI and robotics.
If it was up to me they would be taxed a percentage of what these technologies are producing to finance a universal salary. People still working would get that salary, maybe a little less on top of their salary.
Anyway if they take all our jobs and we don't make money it's not their pracious technology that will buy the stuff they won't be able to sell us.
youtube · AI Jobs · 2025-08-29T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2dYvgGswbT1wfeA94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5Lnup1HbOZVOrsH54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx5T4VCdFTyezKM5SZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwvkBMxNRXh0Kyzpzd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwyc3HnjsEYnY0UDHh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwZx43WoBHYhiwYlxt4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzwwjIEbNMLcucT8KN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzM_8_0V-AQ_uC5gTh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJGc_MnmlRWyYVEKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxHiK5erURZoatGT014AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
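The raw response above is a JSON array of one record per coded comment, each with an `id` and four coding dimensions. A minimal sketch of parsing and sanity-checking such a response is shown below; the allowed value sets are only those observed in this sample (the real codebook may define more), and `parse_coding_response` is a hypothetical helper, not part of the pipeline itself.

```python
import json

# Dimension values observed in this sample; assumed, not an exhaustive codebook.
ALLOWED = {
    "responsibility": {"company", "government", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dump start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugz2dYvgGswbT1wfeA94AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # fear
```

Validating every record before loading it into the results table catches the common failure modes of LLM coders, such as invented category labels or IDs that don't match any source comment.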