Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
This is the best take on the dangers of AI. It has absorbed the worst of human t…
ytc_UgzyWxl7B…
@user-sv4qw7kv1sname Thanks for your comment! That robot's hard drive must be fi…
ytr_Ugz1iWZFo…
I knew you were gonna say something on the video was AI generated but I thought …
ytc_UgyU6hQJ_…
I'd like to ask AI "so what is your purpose? Why would you kill us? You have no …
ytc_Ugwof_5d7…
If large corporations can operate with less employees because of the advent of A…
ytc_UgwO1_mh1…
'All You Need Is Attention,' yet Ray Kurzweil still predicts AGI by 2029 and a h…
ytc_UgyPbeMye…
How tf do you tell an ai to not pull from biased data sets?
If data.bias=racist…
ytr_UgxRkM-Sh…
An intriguing suggestion from Dr. Ibrahim Karim is to interface AI with plants t…
ytc_UgxGJAzyK…
Comment
He seems to be worried more about the loss of Wealth than the onslaught of millions of human type replacers , Which is what i think of if AI ever take over ! But yes the take over of systems makes more sense from AI's point of view ! As for myself even if that came to be I live on a low population island I'm not wealthy i know how to survive , I grow my own veggies Can make fuel for my generators ,i have an older points driven vehicles ( no electronics / chips ) I could survive quite well as long as this so called take over didn't launch Nukes Hither and thither Not that one would land within a thousand miles plus the local winds push towards our mainland ! But if thousands of nukes even more deternated the jet streams would eventually finish the job
youtube
AI Governance
2025-08-22T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxE7PRwrk-7EO6OgkR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvTed8l7-R_lfIH454AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyV-zEQlpQjyoEJaPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzFAchf-7jF3_hGH1B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzabPDzi6AvSzMfy7Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6cvbHRYZ3NP5coL54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwP-MYyqDSlgRD-C-F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxoQEw8hTDFzyw8vRh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUKMwK9RiV7UWHQ6p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEimnijLS2_MYGA6R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
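The raw response above is a JSON array of records keyed by comment ID, one record per coded dimension set. A minimal sketch of how such output might be parsed and sanity-checked, assuming the dimension vocabularies visible in this sample (the real codebook may permit additional values):

```python
import json

# Dimension vocabularies inferred from the sample responses above;
# the actual codebook may include values not seen here (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "deontological", "virtue", "consequentialist"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "resignation", "approval"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the sample are prefixed ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgxE7PRwrk-7EO6OgkR4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
records = parse_coding(raw)
print(records[0]["emotion"])  # indifference
```

Malformed JSON or an out-of-vocabulary value raises immediately, which makes silently miscoded records easy to catch before they reach the dashboard.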