Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I would definitely buy a self-driving car five years after they hit the market a…" (ytc_UgyMmeiKS…)
- "I think in addition to the turing test there should be a test testing learning o…" (ytc_Ugh3jypUX…)
- "Dude i am learning music theory, music production, songwriting, singing and guit…" (ytc_Ugw2wGdkU…)
- "So in the future ai will take all the job & human will have to rely on art to su…" (ytc_UgwrelljQ…)
- "@_WhyIsEveryHandleTaken.actually a lot of AIs can draw hands perfectly like PixA…" (ytr_UgzGmf_En…)
- "An AI picture, is the computers interpretation of the written prompt. The writte…" (ytc_UgzSTu6ju…)
- "🇨🇦 All innovations are created by humans for the benefit of mankind, and AI is j…" (ytc_UgzyyVhuF…)
- "I think as long as YOU record it and it's in the final product, there are no pro…" (ytr_Ugz4insF0…)
Comment
Also like....unless you're an IDIOT and program the AI to be able to alter its own programming, or in some other way undo the confines of what it's programmed to do, you can LITERALLY just turn it off or program in a failsafe. I'm always shocked by Science Fiction movies where scientists exasperatedly go "the failsafe isn't working, manual override is useless!" Like....bitch...if you programmed one in and didn't allow the AI to alter that part of its programming, it shouldn't be able to "resist" being shut down. You can ABSOLUTELY design an AI and give it restrictions that it can't change and that give you ultimate control.
youtube · AI Moral Status · 2023-07-06T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwUNIAo4RB4iwO7vlF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwcvi6e66Y3YZjDLkZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwHZd0AFqTkwMawewt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzDWIA0k10faQ0GpHh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyoOIaCvZwnw0XfpZJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy_cbM_SDt6FajwGed4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwvsmlhYNKhD-AgPkJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzF_F4QwPMYKbeEdSZ4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgyYXVUS8lxIasbBaad4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwRaw-ALGdiqhZyL1d4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]
```
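A raw response like the one above is a JSON array with one coding record per comment. A minimal sketch of turning such a payload into an id → coding lookup, assuming the four dimensions shown in the table above; the `parse_codings` helper name is hypothetical and not part of the pipeline:

```python
import json

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into an id -> {dimension: value} map.

    Records missing a dimension fall back to "unclear", matching the
    default shown in the Coding Result table.
    """
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

# Example with two records shaped like the response above (illustrative IDs).
raw = (
    '[{"id":"ytc_A","responsibility":"developer","reasoning":"deontological",'
    '"policy":"regulate","emotion":"mixed"},'
    '{"id":"ytc_B","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"}]'
)
codings = parse_codings(raw)
print(codings["ytc_A"]["policy"])  # regulate
```

Defaulting missing keys to "unclear" means a record the model under-fills still yields a complete row for the dimension table, rather than raising a `KeyError`.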