Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by its comment ID, or pick one of the random samples below.
Random samples — click to inspect

| ID | Comment preview |
|---|---|
| ytc_UgzH7e-SG… | Finally, I can love a person for his imperfections and evaluate a robot by the d… |
| rdc_n59qn25 | does it actually matter? with the sheer amount of funds being pulled from resear… |
| ytc_Ugz-VmhhP… | I thought this was going to be an actual consideration of AI consciousness. I w… |
| ytc_Ugyu_szrV… | consider the fact that AI is not intended to replace people, but to eliminate th… |
| ytc_Ugy_0KnCC… | Nah, I'm not humanizing Ai. It does better when I actually cuss it out lol.… |
| ytc_Ugx4eCox9… | I won’t lie, I used to draw on pencil and paper cuz that’s all I had, back in my… |
| ytc_UgzBo3n48… | Poor artist they are fighting an already lost fight, in a decade AI art will be … |
| ytr_Ugw5My7Yc… | @tasa4904 I'm not saying it can't eventually, but it does need a tech breakthrou… |
Comment
I’m really not losing sleep over the AI-pocalypse. Worst-case scenario, we just… unplug it. The so-called “superintelligence” still needs an extension cord and a data center cooled by overworked interns in Nevada. Until an LLM learns how to pay its own power bill, I think humanity’s safe.
Yes, the tech’s impressive — in the same way your Roomba is impressive when it doesn’t eat a sock. Is it transformative? Sure. “Sentient”? Please. You can’t fit an apocalypse on a laptop, and ChatGPT isn’t sneaking out of its server rack to harvest uranium.
So, until AI learns to move, power, and fund itself, I’ll keep worrying about real existential threats — like billionaires, elections, and my Wi-Fi bill. 🤷🏾‍♂️
Platform: youtube
Video: Viral AI Reaction
Posted: 2025-11-04T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxmVgqmftxYwVUry7R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwA3q-hZ5nI-KcDvVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-eGcQg27YCqC0UwF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxIOjKGx2pb4xm_83N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUH5KCHYN-MrMFFV14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy-vUYaR9JTeUeZFI14AaABAg","responsibility":"elite","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3Jr8Pb5EDrzAZx1p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyHrGWURXq7CHN1pmd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKLsdkltquEGgSWc94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz_2V54eLV4fK5-Snd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
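The look-up-by-comment-ID flow can be sketched as follows: parse the raw batch response, validate that every record carries all four coding dimensions, and index the codings by comment ID. This is a minimal illustration, not the dashboard's actual code; `index_by_id` is a hypothetical helper, and `raw` is a two-record excerpt of the response shown above.

```python
import json

# Two records excerpted verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_UgxmVgqmftxYwVUry7R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz_2V54eLV4fK5-Snd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index codings by comment ID.

    Raises ValueError if a record is missing any dimension, so malformed
    model output is caught before it reaches the table view.
    """
    indexed = {}
    for rec in json.loads(raw_response):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {missing}")
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

codings = index_by_id(raw)
print(codings["ytc_Ugz_2V54eLV4fK5-Snd4AaABAg"]["emotion"])  # indifference
```

Looking up the last record reproduces the "Coding Result" row above (distributed / consequentialist / none / indifference).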