Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):
- "Just the start of AI, give it till the end of 2024 and you will see how life is …" (ytc_UgwW1yZUb…)
- "Hopefully robot drsx are smarter than real drs, because real drs are dangerous, …" (ytc_UgzyiuH83…)
- "You realize Elon is being sued right now because the self driving on his cars ar…" (ytc_UgzPYT920…)
- "Just remember. AI cannot replace every human worker. There is a physical limit…" (ytc_UgxOx3TJk…)
- "Re protesting, climate change protests are for all intents and purposes in effec…" (ytc_UgyBftMZW…)
- "AI is not art and shall never be art! It's not even poetic! I am a poetic writer…" (ytc_Ugw79ydUV…)
- "I know people want leaders but AI is not a leader. It's a Yes Man and usually wo…" (ytc_UgymwhrnS…)
- "82 likes rose colored eyeglasses wearing not very smart people jeeez so obliviou…" (ytr_UgxbPcOP1…)
Comment
I’m going to be honest—if jobs disappear because of AI and automation, governments will have to figure out a new system. Why? Because without people earning money and purchasing goods, most economies would grind to a halt. If people don’t have jobs, who is doing the buying?
People also underestimate how desperate and even violent things can become when basic needs aren’t met. Without a system in place to ensure access to food, clothing, and housing, things could collapse on a global scale.
The real concern isn’t whether we’ll have these necessities—it’s whether we’ll have any control over them. Will we have a say in what food we eat, where we live, or what we wear? The question isn’t if these needs will be met, but whether we’ll still have freedom and choice in how they are provided.
youtube · AI Jobs · 2026-03-18T14:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
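The four coding dimensions in the table can be checked mechanically before a result is stored. A minimal sketch of such a validator, assuming the categorical value sets observed in this export (the real codebook may permit other values):

```python
# Hypothetical validator for one coded record. The allowed value sets are
# assumptions inferred from this export, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate(code: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = [f"missing {dim}" for dim in ALLOWED if dim not in code]
    problems += [
        f"bad {dim}={code[dim]!r}"
        for dim, ok in ALLOWED.items()
        if dim in code and code[dim] not in ok
    ]
    return problems

print(validate({"responsibility": "government", "reasoning": "consequentialist",
                "policy": "regulate", "emotion": "fear"}))  # []
```

A record that passes validation can then be timestamped and saved as a Coding Result like the one above.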
Raw LLM Response
[
{"id":"ytc_Ugz4w2QCFTfabkUFOWJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQDz7DcjDQsOBfFnZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwPHODEC8ZUXIjeOud4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzsyQF4Q5FCOb09XF54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyshTSDcR4ta7vjO7B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxNpfilvh0oDK7HegV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzOpsRvJz2mr75CMI54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwNVY8cW-CNJ_S_z9V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJSgjIVA3ojO-s2dV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRzD-0V9NDAUTWJp94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
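The raw response is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of parsing such a response and looking up a code by comment ID (two records from the array above are used as sample data):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, one object per comment.
raw = '''[
  {"id": "ytc_UgyJSgjIVA3ojO-s2dV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzOpsRvJz2mr75CMI54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

# Index the codes by comment ID so a single comment can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

code = codes["ytc_UgyJSgjIVA3ojO-s2dV4AaABAg"]
print(code["responsibility"], code["emotion"])  # government fear
```

This indexing step is what makes the "Look up by comment ID" view possible: each displayed Coding Result is just the dictionary entry for that comment's ID.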