Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples
- "Of course, The minute little robot gets scared and runs away. It’s because the l…" (ytc_Ugx5PXIZ6…)
- "People forget we still have ape emotions. The depravity of humans is a result of…" (ytc_Ugxwdv9Yo…)
- "they need to add an AI voice that tells people what each car is trying to do.…" (ytc_Ugxkf5_0T…)
- "Why do you compare ai to microwave? You may think that ai is like using other pe…" (ytc_UgwO0Jdcq…)
- "I hear that they are planning a fantastic summit at The Hague, he should conside…" (rdc_js2bqdh)
- "Do not use this product. They are giving the Pentagon the access to their AI. To…" (ytc_UgxzSxj-Z…)
- "The most ai art use I see is in R34 games and art. No serious artist or game, us…" (ytc_Ugz2NnKvy…)
- "Trust a computer program only when you’re willing to accept the consequences of …" (ytc_UgyoRXHhl…)
Comment

> If there doing this enough is enough, obviously they are not completely control able so just stop but no you can't to many engineers and scientists have to find the edge of out of control push push push, turn that magnet up too much and zip the earth is gone we have just eliminated ourselves, but it happens in a blink of the eye, whether iits a company or a government, just work on the quantum PC without AI, no chance, but honestly quit letting Siri and Gemini to work together maybe not help them learn any faster I wonder what the apes thought when the first humans walked with them, do think damn one day these people will drive cars and they use a toilet

youtube · AI Moral Status · 2025-07-03T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugxlrfo7rEUKfem1lot4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwSY71IHPfJGmo10k14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgcztCvQ9INI3ZOrF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyaP-gE-mQjlwCE1ZN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUfz2kh8iolXfetax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz--WcI57oTw1QW2UJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgySZvUV5rvYopwSqpF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwVE4P-mOPfK6Z85Xl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw3bd7NK_M37eDBjO94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxCoXPbftu6kwAtDjN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"})