Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Man didn't not create AI or quantum computing. Man created the tools to access …" (ytc_UgwaHr7Ct…)
- "In character ai i do \"what if _____(prob p##### or w##### im sorry if any people…" (ytc_UgxCvXnk7…)
- "It sounds like you're reflecting on the deeper implications of AI and its relati…" (ytr_UgyYwpezZ…)
- "AI make me and other Artist lost client... I feel like I wanna cry... I haven't …" (ytc_UgywB33N9…)
- "Isn't it funny the subject is about AI taking over (human convenience and comfor…" (ytc_UgwYtxs9W…)
- "Have you read Hyperion??? This is the direction i think AI goes maybe? Im not su…" (ytc_UgyQhKze8…)
- "Thank you. I'm a long-time listener and really value your curiosity, passion, an…" (ytc_Ugz_SVHba…)
- "All for automation and advancement. This dead empire can't do anything right any…" (ytc_UgykD5D39…)
Comment

> AI will be the death of innovation and human progress. Anything AI can do reasonably well cause humans to stop learning how to do it. The problem is that AI depends entirely on the data used to train it. Without new evolving training content, AI will be stuck in 2019 forever. When a person sees a machine doing in five minutes what takes ten years to learn and hours or days to produce, he or she is more likely to just give up.

Source: youtube, 2025-09-26T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkS3inLp0ptcl81Fd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz14-vslDhFDxnxhQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx2Q0tzUAkCxTINGth4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyRlxGZ-u8vnb2HaYZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzeawe3QgEw2cqM5bV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"disapproval"},
{"id":"ytc_UgzsUtSgOnPdYxOOkod4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6QO792ygaK_4cckN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9tFv1SqaXIh6qohh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyZj5vo95cgAYeCBeF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgycwkLK6PiVOpr0xh14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
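A batch response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is illustrative only: the allowed category sets are inferred from the values that happen to appear in this sample, not from the full codebook, and the `emotion` dimension is left unvalidated because its label set looks open-ended here.

```python
import json

# Category sets inferred from this sample batch; the real codebook
# (not shown in this page) may define additional categories.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "developer", "company", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Reject any coding whose value falls outside the known categories.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

# Hypothetical one-row batch, for demonstration.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
batch = parse_batch(raw)
```

Indexing by ID mirrors the page's "look up by comment ID" behavior: once parsed, `batch["ytc_x"]` returns the full coding row for that comment.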