Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgzoauG_A…: thanks, i really appreciate it! let me know if you have any suggestions or quest…
- ytc_UgwymVpmn…: I use AI all the time for the concept phase. It really speeds up the process. …
- rdc_jwv1egp: > Why would using a word processor program as my tool be different than usin…
- ytc_UgzeVGYZM…: Short sighted. We know humans make errors, doctors and lawyers included. We also…
- ytc_Ugxk6ORE8…: It's too late. Pandora's Box has been opened. A.I. is alive and well. You can tr…
- ytc_Ugzrcwcf9…: my grandama was a the head of the chatgpt cant tell me how to destroy it…
- ytc_UgwhJcmJ4…: That will happen if we let it happen IA is just ai we smarter than that 💩 😅…
- ytc_Ugw6cU7Lz…: AI won't be "conscious" until it can say "uh no dude, that's stupid." PS: For …
Comment
Never understood why it doesn't seem to concern anyone that if you're creating AI that ultimately will be far superior to humans, why that same AI would think it wise to follow human rules, values etc. After all it knows best.
What is the point of humans if they're not needed for anything?
When they talk about people not needing to work I've always thought, ok but in that case some people in control at the top of the food chain, would be thinking what is the purpose of keeping these people, or at least why are we allowing them to procreate?
Ultimately AI may think the same about them too ...
Why wouldnt it?
Glad I'm older 😬
youtube | AI Governance | 2026-03-02T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
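Every dimension above came back "unclear", meaning the model declined to commit to a label for this comment. Each dimension draws from a small label set; a minimal validator for one coded record might look like the sketch below. The allowed label sets are inferred solely from the values visible on this page and are assumptions, not the tool's documented schema:

```python
# Minimal validator for one coded record. The allowed label sets are
# inferred from values observed in the samples on this page (an
# assumption, not the coding tool's actual schema).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"approval", "indifference", "fear", "outrage", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

print(validate({"responsibility": "ai_itself", "reasoning": "mixed",
                "policy": "none", "emotion": "approval"}))  # []
```

A record coded entirely "unclear", like the one above, passes validation; the labels are valid even if uninformative.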
Raw LLM Response
[{"id":"ytc_UgwGW3rxL5fJTQu0XwF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwbBGvY4BQSAFB7drt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQjJ0TFTl52oGYtr94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzloh2Kecu-zzBhtjZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx4uSSsIIsSz1MFUAZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwWHqU11slpWMnx-AN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgNn4_zmcBFEwb6o94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRMsT-6UcZkHZdURB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxKyz4ateVSAqQcBnV4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgywEn_rVuT-2MyKLoB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"})
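Raw responses like the one above are not always valid JSON: note the stray `)` where the closing `]` should be. A tolerant parser can still recover the records and index them by comment ID, which is what the "look up by comment ID" view needs. This is a sketch, assuming the payload is meant to be a JSON array of flat objects each carrying an `id` field:

```python
import json

def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response into {comment_id: codes}.

    Tolerates a common failure mode where the model closes the JSON
    array with the wrong character (e.g. ')' instead of ']').
    """
    raw = raw.strip()
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # Assumption: the payload is an array whose final bracket is
        # mangled; retry with the last character replaced by ']'.
        records = json.loads(raw[:-1].rstrip() + "]")
    return {rec.pop("id"): rec for rec in records}

# Hypothetical comment ID, for illustration only.
codes = parse_raw_response(
    '[{"id":"ytc_abc","responsibility":"user","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"})'
)
print(codes["ytc_abc"]["emotion"])  # approval
```

The fallback deliberately handles only the single-character case seen here; more aggressive repair (e.g. stripping trailing prose) would risk silently accepting a truncated response.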