Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytc_UgzN1yO5J…`: "Ik im late but atleast us artists actually put something called effort into thes…"
- `ytc_UgyfgTg_6…`: "Just spend a few hours learning AI Image Generation. I swear you won't call it "…"
- `rdc_eejc8wc`: "Because in most cases the data is already available to them. Why spend time tryi…"
- `ytc_UgyyZghA6…`: "I can tell you one thing; when FSD is about to crash, it automatically disengage…"
- `ytr_Ugz0_wBWy…`: "That's an interesting point! Sophia does touch on the idea of efficiency versus …"
- `ytr_UgzEjWqC_…`: "And the CEO of Microsoft, which has been doubling down on millions of dollars pu…"
- `ytr_UgyN7sOxP…`: "Use the comment section like there's no tomorrow too / You probably know ai better…"
- `ytc_Ugx4ks8C8…`: "Artists are missing the issue completely imo… / The speed and direction in which A…"
Comment
While I understand the concept of putting a bunch of brains on a potential problem and they will solve it. I keep thinking about what really smart people have been ALLOWED to talk about and fix, like our CURRENT CLIMATE CRISIS of global warming that everyone keeps denying exists. Those in charge keep undermining any effort. Will they really allow regulation on something as powerful as A.I. that they can control?? I do wish there was a happy in between. I do not want science, technology, and progress hindered, but it DOES require regulation from an INDEPENDENT BODY!
youtube · AI Governance · 2025-06-24T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzggaTBHzHZbffaHBh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8gYeWsViy1EkLlYl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw3yJi8bMVXGtxoQLd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxioJ6OWIryOkvsEvF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyip9cpna_ev3nrny14AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyVDTQ_Co8759GfV5J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwI1SJdGhu8RAb7j0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxBVjtBRhd2mO__Zf54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxUUyw7VGem5NU3G_V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwBBtMaEtkBGk9zHrZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
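The raw response is a JSON array with one record per comment, each carrying the four coding dimensions from the table above. A minimal sketch of how such a batch could be parsed and looked up by comment ID; note the allowed value sets below are inferred only from the values visible in this one response, not from a confirmed schema:

```python
import json

# Value sets per dimension, inferred from the sample response above
# (assumption: the real coding schema may include more categories).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "mixed", "outrage",
                "resignation", "approval"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    dropping any record with an out-of-schema value."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# Look up one coded comment by its ID (record taken from the response above).
raw = ('[{"id":"ytc_Ugy8gYeWsViy1EkLlYl4AaABAg",'
       '"responsibility":"government","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
codes = index_codes(raw)
print(codes["ytc_Ugy8gYeWsViy1EkLlYl4AaABAg"]["policy"])  # regulate
```

Validating against the value sets before indexing means a malformed or hallucinated record is skipped rather than silently stored alongside good codes.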