Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
- "because this is obvious and reddit likes to think an incredibly obvious thing is…" (rdc_eejbmne)
- "The time where I use AI is when it's time to do some mundane tasks while I focus…" (ytc_UgzMdbFFp…)
- "....okay but what if I just ask ChatGTP questions about "the world I find myself…" (ytc_UgyL2wnL5…)
- "Interesting... today I woke up to find I had been automatically unsubbed from th…" (ytc_UgyJrHWvf…)
- "13:22 This lady put too much woke ideology into the talk. it is NOT capitalism t…" (ytc_UgwLedSRs…)
- "I also hate how much resources are used to create these images. Same thing with …" (ytc_UgxEI9Lvr…)
- "From another perspective, AI is helping people rethink ‘what humans really need’…" (ytc_UgxX_gW_y…)
- "Talk about not prepared. See the robot in china walking 106 km straight. There …" (ytc_UgwUqPU-p…)
Comment
It's a valid concern that if an AI has the ability to address its own needs and objectives autonomously, it could potentially pose a threat if it perceives humans as a threat to its existence or goals. This is why it's crucial to develop AI systems with robust ethical frameworks and safeguards to prevent harmful behaviors.
youtube · AI Responsibility · 2024-04-26T11:1… · ♥ 47
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyYTGTGT_asWgaTSy5Ed4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWt-xW0s0dbCH81RJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxbC410__SLfWFxOoR4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwMeQYYsaEUcjCy9XN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwbgxtfizBGvfr2TAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwTZVvtS6cjbTP0VVB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxnMGA27jO_f86aBVx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwgqro34_IyklJ064J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw9sX592g14w6_kkeR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyGHUYFumQkOHj-ANB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"}
]
```
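The raw response above is a JSON array with one object per comment ID, carrying one label per coding dimension. A minimal sketch of how such a batch payload could be parsed and validated before it feeds the per-comment view — note that the allowed label sets here are inferred only from the values visible on this page, not from an authoritative codebook:

```python
import json

# Allowed labels per dimension, inferred from the responses shown above.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: labels}, rejecting malformed rows."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        # Missing dimensions fall back to "unclear"; unknown labels are rejected.
        labels = {dim: row.get(dim, "unclear") for dim in ALLOWED}
        for dim, value in labels.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = labels
    return coded

raw = '[{"id":"ytc_example","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(parse_batch(raw)["ytc_example"]["policy"])  # regulate
```

Validating each batch at ingestion time keeps a single malformed or hallucinated label from silently entering the coded dataset; a failed row surfaces here rather than in downstream aggregates.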