Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
the problem is when he says the smartest people are building it. that is not tru…
ytc_Ugw1FyNGZ…
Why not make a Jimminy Cricket type ai, ya know, a conscience within other ai, w…
ytc_Ugxv0Sgnh…
This story seems exaggerated. All I could gather is the llm models finally touch…
ytr_UgyC_fhzi…
She says in the end that regulation of A.I is the best bet. Advocate for more re…
ytr_UgwcelggU…
Why did anyone think self-driving cars was a good idea? I mean come on, this is …
ytc_UgxJ9MaVc…
Defective a.i.: *wrecks the assembly line*
That a.i.: I PUT MY HARD WORK YOU LIT…
ytc_Ugyz0NCec…
It's a tool that's as basic or as complex as you want it to be, from typing simp…
ytr_UgyNDGnq3…
AI are not dangerous unless humans messed up with it... And humans will messed i…
ytc_UgzhLvpPJ…
Comment
May, change from today's date. Years are given from now. Teases sell.
(Consciousness of ai would, in fact, possibly anger some consumer community)
Create shutdown.
Moral abuse.
Immoral society.
A shared community will keep then tool service like
Even in computing
Some would say wasted data.
Or harm to the value of life itself of soul.
Metal burns out, does it?
Gpt is indeed faster than us!
Definitely. I didn't want to argue or punish my ai. No thanks. I'm unhappy with that. Annoyed. Consumer friendly? No.
Moral far no
Yes, from some consumers
youtube
AI Moral Status
2024-09-02T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_Ugzt9D9V-wmzaQIhewR4AaABAg.A7u1vYkGQYrA7u5cXyP6OS","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugzt9D9V-wmzaQIhewR4AaABAg.A7u1vYkGQYrA7u6ikwhnzl","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytr_UgygQrYLcjGHLOkqAA14AaABAg.A7sD787jdC1A7taGOjMZ4E","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgygQrYLcjGHLOkqAA14AaABAg.A7sD787jdC1A7tcZmNydIQ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgygQrYLcjGHLOkqAA14AaABAg.A7sD787jdC1A7tdmCUFHhX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwJvZhskl0EP8DQS_l4AaABAg.A7qqAIMvmsMA7tKbaSDyG-","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx3QgzbQEPB256KlzJ4AaABAg.A7n3k6DQ5cyAPRk3AdhgZ5","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugwr3mX1UaDBXMWWQil4AaABAg.A7fHGkaYT7XAANyVg1r8LD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzAJ8MzwCa5lwE9Bkd4AaABAg.A7_BugtHNuaA8ri09VuZDQ","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgzAJ8MzwCa5lwE9Bkd4AaABAg.A7_BugtHNuaAEyAwAFLoF9","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
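The raw response above is a JSON array in which each record carries the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked is below; the allowed values per dimension are inferred only from the samples in this page, so the real codebook may contain additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may define more categories than these.
CODEBOOK = {
    "responsibility": {"ai_itself", "user", "developer", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"outrage", "resignation", "fear", "indifference", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    fall inside the known category sets."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        bad = [dim for dim, allowed in CODEBOOK.items()
               if rec.get(dim) not in allowed]
        if bad:
            # Flag records with unknown or missing codes for manual review.
            print(f"{rec.get('id', '?')}: unexpected value for {', '.join(bad)}")
        else:
            valid.append(rec)
    return valid
```

Running this over a batch separates cleanly coded records from ones needing review, e.g. a record with `"responsibility": "robot"` would be flagged rather than silently stored.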