Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- Recently i read a quote that would fit this well.” AI should be used to do laund… (ytc_UgxqFZmD0…)
- The best thing about AI is that everything about it makes me so mad that I actua… (ytc_Ugx-iukhV…)
- bro the internet should not be used for ai data base do you know how many terror… (ytc_UgwixQVUT…)
- I think this is GREAT. AI is never going to be regulated unless it starts hurtin… (ytc_UgwvoP-3l…)
- Photoshop is so different then AI at least with photoshop you actually putting s… (ytr_Ugx9DRTgU…)
- “AI art is more accessible for disabled people!” Hey hi so im disabled + an arti… (ytc_UgyKUrvcO…)
- The emotion label, whether applied to humans or AI, is largely a post-hoc story … (ytc_Ugwc_fE5o…)
- @JamesP7 You are engaged in arguments in MULTIPLE comment threads on this video… (ytr_UgzU03c7X…)
Comment
I like it as a tool for human use, but fear it as a tool of human misuse. Elon Musk has given examples of directives programmed into Ai bot or app that a Ai robot will pursue relentlessly even to human detriment or destruction. I'm also incredibly concerned about the number of jobs Ai has and will replace. How will 80 to 100 million people survive without their jobs? Not everyone is capable of becoming a tech whiz? What becomes of those who cannot be retrained in technologies. To me the Ai is already damaging the human population by replacing jobs.
Source: youtube | Posted: 2023-07-01T19:1… | Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxKnCROnSXGuLklv0J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyC9cMuv_5aWlZ2qVN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzrVUklc7NkESQ-78F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwUei_wnm2b_k_GZG54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzGVYlvpXKTYDGCW0t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw3uISZNfrIQ_H_M3B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwX_kePAjUsRwD8kDt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxefLBxYY8INEbf8Zh4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFM7TQV0WDNrO26y94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgywX0loQlL-jNzgYCp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
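Raw responses in this format can be checked mechanically before the labels are stored. The sketch below is a minimal validator, assuming the label sets visible in the samples above (the full codebook may permit additional values) and the `ytc_`/`ytr_` ID prefixes shown; the function name `validate_batch` is illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the coded samples above.
# Assumption: the real codebook may include labels not seen here.
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose ID prefix and
    dimension labels match the expected schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_X","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation are dropped rather than corrected, so a batch with malformed IDs or out-of-schema labels can be flagged for re-coding.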