Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We're exactly on course as described in "AI 2027", so the point of no return is …" (ytc_Ugxfl_UcZ…)
- "I'm seriously considering leaving the Internet completely because of AI. I've sa…" (ytc_Ugw0nyXLN…)
- "Not personally against AI art myself. I'm even thinking of incorporating it into…" (ytc_UgzTRagB_…)
- "If a solar flare hits like the Carrington event we won't have to worry about AI…" (ytc_UgzaySKhH…)
- "and people thought it was imigrants taking there jobs. No lot of people need to …" (ytc_UgzEQPle1…)
- "In 5 years people will be able to say with a straight face, "that wasn't me, dee…" (rdc_izkxscj)
- "Twaddle. Hidden cameras controlled in the back room...total twaddle ...the maste…" (ytc_UgwdUeoQc…)
- "13:53 thats when oldschool hardworking strength 💪 and stamina with great human d…" (ytc_Ugzb_NJHm…)
Comment

> this is uch a mistake making these robots...tehy have no conscienceor emotions and faulty programming they will turn on humans and kill us all off if and when there are enough of them...i dont mean to bring up I Robot but the rate these things learn at tehy will be ten times smarter than the smartest human alive in month or years and no human can keep up with them ,...this is creating a replacement for humans ...humans are building tehyre own destruction. and extinction. this is like playing God..these robots like eveyr other technolgy has faults and bugs and will go bad...if enough of these are made theyll overtake humans such a stupid idea.

youtube · AI Moral Status · 2022-10-24T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzNx4NdBCP8jW7i30d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyP8MUTiXex1so8WIR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgybfeUD5r1OCwC8Fqx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgynEzxGJRIz4PWw5bh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugylbyl2WPF1h57V-f54AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzpzvAs1VViS8vPdmV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZfX3VHVwoExSnUhl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy_0vW6mvy2Eic7p1J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxtjZO28CDaszgErnJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-HQtDHSdT2YHXRhF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
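A raw batch response like the one above can be parsed into per-comment codes and sanity-checked before it lands in the database. The sketch below is a minimal example, not the tool's actual pipeline: the four dimension names come from the response shown, but the allowed value sets are inferred only from the values visible on this page, so the real codebook may include more categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch-coding response into {comment_id: codes},
    rejecting any record that carries an unknown dimension value."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_example"]["policy"])  # → regulate
```

Validating up front is what makes a malformed response (for example, the stray `)` instead of `]` that can appear in raw model output) fail loudly at parse time rather than silently producing "unclear" rows.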