Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@RunaroundAtNight As someone who has a tesla, 100% drivers need to pay attention…
ytr_Ugxznbzo-…
Who cares about AI. Go read revelations, we have been told by our creator exactl…
ytc_UgyRGgshG…
Agreed. In fact: Actually believing our sped up algorithms that learn from us an…
ytc_UgyK_3URO…
Fun fact, you cannot copyright ai so please do go ahead and steal from these laz…
ytc_UgxoOObi3…
I start to think that AI Stan is here to destroy humanity. If AI can take art aw…
ytc_UgyjOzBeR…
What if you are wrong and AI really does eliminate jobs and there are no new job…
ytc_UgwqcLsgy…
I'm no longer surprised at the delusional revelations I hear from people enthral…
ytc_UgydW9zaw…
this is exactly where AI shines tbh, not just speed, but helping you actually do…
rdc_ohv6u54
Comment
They're just playing off of the a I fear and all that stuff trying to talk about how dangerous it is but it actually it's going to be helpful and it's going to be less dangerous because of its smart it's going to not going to do anything to hurt the planet or destroy the planet or develop biological weapons or anything like that I'm looking forward to everything being automated and not having a job and then getting money from some source maybe my AI will make the money for me so I won't even have to worry about that I'll just relax and let the AI do all the work
youtube
AI Governance
2024-10-12T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxYsoAlDaADw1VcBIF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6epbQwoKbExutoNN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxIHbyxsoU9tF3LlMh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyVRugzJrwHXYK8LOF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-0nXrsyCEzfWQ0Kx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxAZ8Quj46DuIzS-Yt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzX_OZ5B0GSoboV1Ep4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxG5XW0p-3iIZEB7114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxsffKjnYTIkODacEl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxx2Zse9a5qLe4xd5R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
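The raw model response above is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of turning such a response into a lookup-by-comment-ID structure, using only Python's standard `json` module (the function and variable names here are illustrative, not part of the tool itself):

```python
import json

# Excerpt of a raw LLM response in the format shown above:
# a JSON array of per-comment coding records.
raw_response = '''
[
  {"id": "ytc_UgxYsoAlDaADw1VcBIF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz-0nXrsyCEzfWQ0Kx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
'''

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw JSON coding response and index records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        # Keep only the expected dimensions; treat missing keys as "unclear".
        coded[cid] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

codes = index_codes(raw_response)
print(codes["ytc_Ugz-0nXrsyCEzfWQ0Kx4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each inspection is a single dictionary access rather than a scan of the full response array.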