Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgzE4kJZO…: and they're saying they won't steal us of our work? i hoped they don't plan on m…
- ytc_UgxbTgdKH…: Tucker seems scared af, as he should be. We should all be scared of AI.…
- ytc_Ugznw2dJV…: When we as a society get past the "the first hit is free" phase of AI this vtube…
- ytc_Ugwsy7OWz…: AI still not a human. It became dangerous if the people not responsible and not …
- ytc_UgwntFMDC…: So if I have an office job making $200k per year, lose my job to AI, I’m suppose…
- ytc_Ugxz_Li1A…: no more ai videos they make me feel inhuman and ruins the platform. I got it - …
- ytr_UgwERrXPB…: Hahaha teachers have already replaced themselves with AI. Literally at college I…
- ytc_UgxrRlNdN…: The godfather? So if the parents of AI pass away its godfather will take care of…
Comment
I'm still struggling with understanding the difference between my wet ware, and a llm that's incredibly large. When, would enough transistors (with a current passing through) spark Consciousness, with [lstm, long short term memory] and the use of a few good quality gpu's, once you enable a model to self update. As long as you're running the model on your own computer that is Not running on someone else's device you have ?, more control over the model. Also, how often do you restart the computer you have. If the llm forgets the conversation what does that mean, it means that the cache is limited on the topic you have been engaged with. Just using the command prompt, is not good enough even using voice is not good enough, video with voice is better, as long as you can see in the command line interface what's being processed. Even, changing or asking a off the wall question would seem to ground a llm. Most people can handle a change of subject and not go off the rails. Food for thought everyone just saying
Source: youtube · Topic: AI Moral Status · Posted: 2025-07-15T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWdKgLvWQ2km4odPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgziHaqM4TpITFcKDLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwyrpos0HQO_K4a2j14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzRLb5tARcevARp3YZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwhteYELldGDSiSLZp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyM5mG-IO0JeX6KJnR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyNaUMOf-4oJZCp8sd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzPggOaZ0R_5x_7v-B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"amusement"},
  {"id":"ytc_UgxT_A1HNSH0nlBjT0Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxIPxzINynWbjSh3s14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
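The raw response above is a JSON array of per-comment codings, one object per comment ID, with four categorical dimensions. A minimal sketch of how such a response might be parsed and sanity-checked before use (the allowed value sets and the `validate_codings` helper are assumptions inferred from the records shown, not part of any actual pipeline):

```python
import json

# Allowed values per coding dimension, inferred from the sample records.
# The real codebook is not shown here, so these sets are assumptions.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "amusement", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with "ytc_" (comments)
        # or "ytr_" (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Drop any record whose dimension value is outside the codebook.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgxWdKgLvWQ2km4odPN4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"deontological",'
       '"policy":"none","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1
```

Filtering at parse time like this keeps a malformed or hallucinated record from silently entering the coded dataset; rejected records can be logged and re-sent for coding.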