Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If people think that healthcare is gonna be cheaper because the AI you’re a frea…
ytc_Ugzdb2o1n…
The thing is, when you post a certain artwork on the internet, you are consentin…
ytc_UgzIEE3k-…
Timestamps (Powered by Merlin AI)
00:00 - Sorry, we don't have enough informatio…
ytc_UgwrnbmTN…
@11:20 Why would you use an AI slop interpretation of St George and the Dragon i…
ytc_UgzboNL0T…
Planning is enough for smaller stuff. Even smaller stuff (but well documented co…
ytr_Ugw5ztn8d…
Imagine your life work and your art style that you work hard on it posted by ran…
ytc_Ugw_b4vaC…
They do not want to regulate it, they seek to train it, and have already begun d…
ytc_UgykYaq9H…
what im hearing is just "the ai will do literally anything to give short term sa…
ytc_Ugz12nIqB…
Comment
maybe not integrate AI in every field or work rather just some fields where it is needed most like medical field ... I know it will cause trouble for the medical profession people but it's is still manageable atleast I guess more then integrating ai in every field ...
and maybe if the ai is capable of solving some of the biggest problems like poverty or something....
But still I guess if ai of the level can be made it would be tempting to use it in every field
Buttt I still refuse to drop the whole thing ..rather first prioritize the safety then continue... Coz it's too much to loose .. imagining the boost it can give to the humanity in every field...
youtube
AI Governance
2025-12-04T16:2…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyYYLacM0YRJHRXXe54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSvM64Yp2FM_0zRHF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKqfrV16YItYINF_l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGIzChvluB3KdjLI14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxECJw8Eem0RNQOV9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzp0GOiiZaJmCgNpOl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyd2brReOaKLgZBrxN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx6d3Ih_7GYFiZTq2Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzV2wY2YZMkVFXZHgt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyOnzx7t5JwmBiDvq14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
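The "look up by comment ID" view above can be backed by a small amount of parsing: the model returns a JSON array of coding rows, each keyed by a comment `id`, which is then indexed for lookup. A minimal sketch, assuming the response is valid JSON as shown (the `index_by_id` helper name is hypothetical; only a subset of the rows is reproduced here):

```python
import json

# Raw LLM response as emitted by the coder (two rows from the array above).
raw_response = """[
  {"id":"ytc_UgwSvM64Yp2FM_0zRHF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwKqfrV16YItYINF_l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_by_id(response_text):
    """Parse the model output and index coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)

# Lookup by comment ID, as in the inspector view.
coding = codings["ytc_UgwSvM64Yp2FM_0zRHF4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

The second row in this subset corresponds to the Coding Result table shown above (distributed / consequentialist / regulate / fear for the selected comment); a production version would also need to handle malformed model output, e.g. by catching `json.JSONDecodeError`.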