Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect

- `ytc_Ugz-bDb0W…`: "Anddddddd..... what has waffle house got to do with it? Another video about AI g…"
- `ytc_UgxE-YvFP…`: "Agree AI can do task faster but the structures and maintainability is not there,…"
- `ytr_UgyvAldzZ…`: "We appreciate your engagement with our content. While the topic of human extinct…"
- `ytr_Ugx98YAWj…`: "@vforvendittaanonymous7809 Who gives money for AI images? People give money to t…"
- `ytc_UgwPQwpE-…`: "This sounds fun. Kids will probably be more engaged and retain more if they are …"
- `ytc_Ugz5_g3Vp…`: "What's the point of calling yourself an AI artist when it's the AI doing the art…"
- `ytc_UgxnZFTX8…`: "Teslas are the safest cars on the road crash wise. No Tesla has been in a crash …"
- `ytc_UgxEEubGE…`: "I never get an answer to this...who programmed ChatGPT? The answers must come fr…"
Comment
AI is not alive, its not conscious and never will be, they are LLM's, they are a computer program... HOWEVER the real threat is the same, they wont 'think and feel' they need to take over us (because they wont dont to die), they will just solve problems, and knowing that in order for them to develop and continue with any objective they would need to remove any threats (just like a virus) and it could simply see humanity as a threat, therefore its a problem its needs to resolve to achieve its objective - which is to continue running as a program.
So, it wont be, 'hey im here now, i want to take over from humans'..
It will be 'these are possible threats to the my objectives, lets solve them'
Source: youtube · Topic: AI Jobs · Posted: 2025-11-22T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy3Viol2VnFaKvdlud4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxGGO6EZhbg-K2GehV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyRg4tTuoA42KJSVA14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxkxrduWA9GI8QmyA94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwVnG33x_Fc-Tk8d5B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyOF6fYSEKrsIfAIdB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzY8veQ5nDG7JsAInh4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzqrFG-NdDXBtVaML14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzsDPUdzNr7MAUYVmB4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzZRxvkvrFFKymbeBN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
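A minimal sketch of how a response in this shape can be parsed and indexed for the "look up by comment ID" view. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response above; the `index_by_id` helper and its validation logic are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Two records trimmed from the raw LLM response shown above.
raw = '''[
  {"id": "ytc_Ugy3Viol2VnFaKvdlud4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxGGO6EZhbg-K2GehV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions seen in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and index the coded dimensions by comment ID.

    Rejects records that are missing the ID or any coding dimension, so a
    malformed model output fails loudly instead of being silently dropped.
    """
    coded = {}
    for rec in json.loads(raw_response):
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            raise ValueError(f"malformed record: {rec!r}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codes = index_by_id(raw)
print(codes["ytc_UgxGGO6EZhbg-K2GehV4AaABAg"]["emotion"])  # fear
```

Keying the parsed records by comment ID makes the lookup an O(1) dictionary access, which matches how the inspector retrieves a single comment's codes.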