Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Okay but it objectively isn't here to stay. The amount of electricity and other …
ytr_Ugzqb9lt3…
How could people live before A.I. was invented ???
In fact they are all dead...
…
ytc_UgzUZJp7K…
This Discussion is also going on in Europe,Tech is going so fast that we cant ke…
ytc_Ugz8rblHZ…
Dont know why i watched this..... i will never be able go affort this car😂😂😂 an…
ytc_Ugy-f007X…
So who do we hold responsible WHEN (not if) a driverless truck is involved in a …
ytc_UgzyO1qDP…
The safest way to go about this that I can think of is for the employees of the …
ytc_UgyhCEJeK…
I asked Google AI how many children books on LGBTQ are available to buy it said …
ytc_UgyPS2ivR…
Vision 1: Continuum
A superintelligence serves global corporations, creating a t…
ytc_Ugz8pN2mP…
Comment
One limitation of AI they haven't discussed: it requires resources. A lot of resources. Hardware, power, and data. It can't grow infinitely fast like the singularity people predict, because it will always remain limited by the growth of these resources. Arguably be its access to these resources too; if we want to stop it, we can deny it those resources. At least until we put it in control of those resources.
youtube
AI Moral Status
2026-03-17T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyqxt5IJdXB9eelCf94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5yPGi8gAxbasB_-F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz2ErNQXQH8qkVLRQB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlWgW6XRvwqNIr4Ud4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwiEmsfSvp8hElW1Vl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRLP7ffkwSQwPfo7d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCa9XeQ6kg8kKmZcF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRBfcWInC8fby0QUx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwzMmUTWxajU8mFlpN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzv7FBttz_OetAoEgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
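The raw response above is a flat JSON array with one object per comment, keyed by `id` and carrying one value for each coding dimension. A minimal sketch of how such a response could be parsed and checked before the codes are stored, assuming the allowed value sets are those observed in the sample above (the real codebook may define more), and where `parse_coding_response` is a hypothetical helper, not part of any published API:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse the model's JSON array and index the codes by comment ID,
    rejecting rows with missing fields or out-of-codebook values."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r}: {row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a one-row example (hypothetical comment ID):
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
codes = parse_coding_response(raw)
print(codes["ytc_example"]["emotion"])  # outrage
```

Validating against a fixed value set at ingest time is what makes the "look up by comment ID" view trustworthy: a malformed or hallucinated code fails loudly here rather than appearing silently in the coding-result table.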