Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "What can we expect when these things we're calling "A.I." are trained on and fed…" (ytc_Ugzo1e_d1…)
- "Problem with AIis it has no soul. Tou don't know who you're talking to. Intellig…" (ytc_Ugy6ynzsx…)
- "I thought the “autopilot” for Tesla also meant it was self driving, that’s how I…" (ytc_UgxKsyVmv…)
- "An “AI” could never hope to capture the intentionally unhinged nature of true ar…" (ytc_UgyVYX38E…)
- "I’m scared the world is going to turn out like the movie Wall-e… the robot movie…" (ytc_UgyLOv1ev…)
- "@AIX_Art who's harassing anyone? is anyone shoving an llm don't your throat? it'…" (ytr_UgwbgoeeJ…)
- "15:48– UBI: Universal BASIC Income— not enough to live lavishly, but enough to n…" (ytc_Ugz1wsqj7…)
- "Let your insurance company handle it. It is what you pay them for. It sounds l…" (rdc_kt3ka9q)
Comment

> Yes someone will develop something AI. And there's nothing that will stop anybody else from copying it and using it. How many copies of "illegal" software is out there? So the argument, the US "needs" to develop it first is nonsense because whatever advantage you can get will be marginal and fleeting. Being first isn't going to instantly and permanently close anyone else off.

youtube · AI Moral Status · 2025-06-04T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwP6fNqRF1CYpl0INl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw2nXgIRLIchxdpgTV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw3RmhzqW2TBAS8xgl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzh6n3yDQ3Tkj41_It4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzMcDe-JfzcU87XE-V4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzYl0uYW-ob_2Cx8_B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwcp2r1s_c6sxXprPd4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzLB4LiAuDgad0K6it4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugyx0NaUs88NWEMAIB54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxZg9XhrhfwPnrRJrh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
```
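A minimal sketch of how the "look up by comment ID" step might work over a raw response like the one above, assuming the model returns a JSON array of per-comment codings keyed by `id` (the field names match the sample output; only two entries are reproduced here for brevity):

```python
import json

# Raw model output: a JSON array of per-comment codings, two entries
# copied from the sample response above for illustration.
raw = '''[
  {"id": "ytc_UgwP6fNqRF1CYpl0INl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzYl0uYW-ob_2Cx8_B4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]'''

# Index the codings by comment ID so any coded comment can be
# inspected directly by its ID.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_UgzYl0uYW-ob_2Cx8_B4AaABAg"]
print(row["responsibility"], row["emotion"])  # distributed mixed
```

Indexing into a dict once, rather than scanning the array on every lookup, keeps repeated ID lookups O(1).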