Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_Ugz2YXQhj…: The fact is if Ai can be installed into a machine that can be used for mundane i…
- ytc_UgxPjEgRN…: someone calling themselves an "ai artist" is basically saying that the person is…
- ytc_UgwRViyy9…: I hope John remembers this as an AI-controlled robot is crushing his skull. AI d…
- ytc_UgxWiw-Ab…: 47:49 Are you sure Hank? It seems like if you hand it a bunch of doctors notes w…
- ytc_UgwlQeMe_…: With AGI, somebody soon will be tired of putting up with its garbage. "Blow is a…
- ytc_UgzmCezwN…: The A.I. made up faces but what was the number of men to women and what was the …
- ytc_UgwgVqEco…: From where do I download this chatgpt my ai just said that she can't show any im…
- ytc_Ugy_IJt2M…: 3:20 Brandon touches upon Alphabet nerfing Google Search results to pump up Goog…
Comment
I own several 3D printers--machines that are much smarter than your average toaster. Mine can even send me text messages if there are problems or when a print finishes. They're still drooling idiots, and there's no reason to bestow them with consciousness.
I actually see little reason for machine tools to imitate humans such that artificial consciousness is required, other than to fulfill social needs. And isn't it sad that we live in a society where 7 billion people are collectively lonely enough to want robot companions?
Source: youtube · Video: "AI Moral Status" · Posted: 2017-03-12T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg5T1x8wX517ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Uggl45nMs8MlTHgCoAEC","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugj-3QUzcmWzS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghdcyUz8iBopHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgjASefucwldGHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjU66U6Snjyy3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugg3cKdgztUVnXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UghWZYUYGXURZXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjANcp3q9CJu3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgikXoM6UZZV3ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
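A batch response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those observed on this page; the actual codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred only from codes visible on this
# page -- an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval",
                "resignation", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs on this page all carry the ytc_ prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present with an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgjANcp3q9CJu3gCoAEC","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
print(parse_batch(raw)[0]["emotion"])  # indifference
```

Dropping malformed rows rather than raising keeps a long coding run alive when the model occasionally emits an off-codebook value; rejected IDs can then be re-queued for another pass.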