Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "If you can find some, look at China video of self-driven cars and you will see t…" (ytc_Ugwed3My1…)
- "Several art teachers told me my art wasn't art. I drew cartoons, and cartoons ar…" (ytr_Ugye6I2f9…)
- "The sheer amount of defending you’re doing for AI is ridiculous. We all know tha…" (ytc_UgzdsKrPH…)
- "I do agree that there needs to be tags added so that AI won't train itself on an…" (ytc_UgzhGsD9s…)
- "I’d distance actual art that expresses an artists thoughts & emotions from marke…" (ytc_UgwnGITR6…)
- "Bro is calling for levies on ai companies. Love this. Policy makers need straigh…" (ytc_Ugz6j06C4…)
- "Studying ai outside of how it actually works isn’t studying anything at all lol …" (ytr_Ugyz4W4he…)
- "If AI takes all the jobs it will have to pay all the taxes and it will choose to…" (ytc_UgxvOEY8B…)
Comment
Irrespective of whether one believes AI should be regulated, this regulation would be practically impossible to achieve. As intimated in the video, any successful regulation would require global consensus, which is simply unachievable in the current climate, or within the foreseeable future. Regulation simply won't happen. Both companies and nations are highly motivated to maintain progress and that's unlikely to change either. So, for better or worse, it's really something that can't be controlled and, no doubt, the coming years will be interesting.
Source: youtube, 2025-06-27T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw2sht9SQ9I_KgNdVl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQ9fChpgNuKUGifYh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwNc6hIA0ui7hKEBox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXpqO7_CK9-EXtI0p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwbpR1qEB53zBFDOR54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxkOXcwNfDJ3v7NScB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyov_7PLUc3PGLfPsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxZUBP1TYnYhuqfzR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzh5OW7WwggWKfvxfV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyNSovBUeDRJOUp9HN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
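A raw response like the one above is a JSON array in which each object carries a comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). Matching a coded row back to its comment is a parse-and-filter step. A minimal sketch in Python — the `lookup_coding` helper is illustrative, not part of the actual pipeline, and only one sample row from the response above is inlined here:

```python
import json
from typing import Optional

# One row taken from the raw LLM response shown above (the selected comment's coding).
raw_response = """
[
  {"id": "ytc_UgyQ9fChpgNuKUGifYh4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def lookup_coding(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response and return the coding row for one comment ID."""
    rows = json.loads(raw)
    # Return the first row whose "id" matches, or None if the ID is absent.
    return next((row for row in rows if row["id"] == comment_id), None)

row = lookup_coding(raw_response, "ytc_UgyQ9fChpgNuKUGifYh4AaABAg")
print(row["emotion"])  # resignation
```

Returning `None` (rather than raising) for an unknown ID lets a caller distinguish "comment not coded in this batch" from a malformed response, which `json.loads` would reject outright.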