Raw LLM Responses
Inspect the exact model output for any coded comment; look up by comment ID.
Random samples
@drluswala Yes but AI is also in it's infancy. Once strong AI emerge then all ma…
ytr_UgxFEGUku…
ai definitely helps with school stuff, but i always run my work through Winston …
ytc_UgxA8DacS…
That's an interesting perspective! The interaction in the video really highlight…
ytr_UgxKjANSG…
you can see they are robots BC they have a robot thing battary attached to ther…
ytc_UgzOGBgq2…
So the kid had just got home from a trip to Catalina Island with his friends—tha…
ytc_UgxVnaCMh…
Great... More or less a how to for budding psychopaths'- I hope the cybermen ar…
ytc_UgyFhCBSe…
As a Multimedia Journalism student, I have experienced situations where my profe…
ytc_UgwD_MhGl…
Eueopeans "omg we cant even over regulate with DEI and then theyre going to ask …
ytc_UgxuYJ6-N…
Comment

> You told almost exactly what I was thinking, but gave no solution, limiting AI to one particular task is knowlingly limiting the power to live in a world where humans can keep their superiority, If man is capable of creating super intelligence there should be some better way to use it. Unfortunately no one knows or can give a proper answer to it. Super and Hybrid Intelligence (fusion of best of human minds with AI) will outperform most if not all humans and no one can fathom what this mean for ordinary humans on earth, what will be left for them to do.

Platform: youtube
Video: Viral AI Reaction
Posted: 2025-11-28T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy9FptBAcsuXIWhn414AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzcPRDJhs_YjyXLIh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyT2RGBEsI34HwtVJN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgypFyeRmBTkqEe3Nld4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxr1mU295SYBM_DMsx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugzkcwb7GTBY7sH_jIR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwh3LpVOjrPtgGN8454AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy2DGSWIirOqs2V2Vd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVTc3R85mmsEK5crl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyrjmj_k4_SaxPLi794AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
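The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of the lookup-by-ID step, assuming the response arrives as a plain JSON string (the parsing code below is illustrative, not the tool's actual implementation; the two IDs are copied from the response above):

```python
import json

# A trimmed copy of the raw LLM response shown above: a JSON array
# of per-comment coding objects across the four dimensions.
raw_response = """
[
 {"id":"ytc_Ugy9FptBAcsuXIWhn414AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugwh3LpVOjrPtgGN8454AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
"""

# Index the coding objects by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codings["ytc_Ugwh3LpVOjrPtgGN8454AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # mixed
```

Indexing by ID this way is what lets a coded comment (like the one shown in the Coding Result table) be traced back to the exact model output that produced its dimension values.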