Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Sample comment (truncated) | Comment ID |
|---|---|
| Waymo had an operating loss of roughly $1.23 billion in Q1 2025. They belong to … | ytc_Ugwc8cfdg… |
| Great information thank you for uploading this video. There has to be a way to p… | ytc_UgzLx8NNv… |
| I'm less concerned about AI taking over like the matrix or terminator....my prim… | ytc_Ugw9PE4CZ… |
| As a non ai user that has idolized your art as one of my bigger insperations, I … | ytc_UgxHUvEU8… |
| All my kids are Gen Z, but they run the whole spectrum in terms of AI familiarit… | ytc_UgyQgtSAC… |
| Would it be more ethical to apply AI as an exclusive “work amplification” tool i… | ytc_Ugyu5C8Cb… |
| There are plenty of modern things that have ate into their real world counterpar… | rdc_n7txhcm |
| Then, what is Artificial intelligence? This is not absolutely right answer, howe… | ytc_UgxOTQifu… |
Comment

> the solution is quite simple. tax the robots. for every robot at Amazon or Tesla, the company must assign a basic income to each robot eg 40k/yr, apply 10% tax to said income and remit 4k/yr to the Fed. The company would still have a cost benefit of each robot on the floor as the yearly cost would only be the 10% tax. Fed then redistributes as UBI or apprentice training or negative tax. Let Andrew decide how the funds are distributed.

Source: youtube · Video: Viral AI Reaction · Posted: 2026-04-26T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzeXbUL950H6V5GK2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxR4eosZAnd9da5cJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6wxgMZ9daub2A7HR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy_t5pevmQVikD5RvJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxLBKlZi_lNvjRKnTJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzFjAkmqxk_AWvQBTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzknjjqyve8hfyXC-N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZM9V6wgmcWz4W-f14AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxdlhLl0yWDzo-oxI94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyYp_oGm8IXsGmMMvh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
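A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical example: the allowed value sets for each dimension are inferred only from the values visible on this page, and the real coding schema may include more categories.

```python
import json

# Allowed values per coding dimension, inferred from the examples
# shown on this page (assumption: the real schema may be larger).
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "outrage", "indifference",
                "approval", "fear", "mixed"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"contractualist","policy":"regulate",'
       '"emotion":"approval"}]')
records = parse_response(raw)
print(records[0]["emotion"])  # approval
```

Validating against a closed vocabulary catches the most common failure mode here: the model inventing an off-schema label that would silently corrupt downstream counts.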