Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
IMO, anything other than "some AI supporters are reacting badly, therefore some …
ytr_Ugz_vEeF5…
Technology in Cuba is such a weird experience. Going through customs I watched s…
rdc_f9cgto8
Hey, congrats on landing that remote Data Analyst job! 🎉 That's some solid advic…
rdc_lo1mkro
I don't know how you'd prevent this kind of thing though - we already opened the…
rdc_lu61jli
Thank you for your comment! It's interesting to consider how AI like Sophia can …
ytr_UgxMbEGqy…
Lot of faith ....not possible ....how can you have a peace of mind ........drive…
ytc_UgzL3XOWg…
Theres a ceiling to the robot powered work. Yeah, robots dont cost as much, but …
ytc_Ugz45RCA2…
Typing “Draw (_) in (___ )style“ into a computer, is not the same as taking the …
ytc_UgwdxkXIO…
Comment
"Fixing the hallucinations" is not going to happen. The problem is context. Case in point you have a programming language such as python. In one package all the keywords have return values on principle. In the other package you have mixed case where some keywords return a value and some do not. AI has a lot of trouble with this because contextually it can go both ways because the standards are so open and human software developers do random things. So the LLM fails to account for it. A prompt for one package or the other may return the wrong result. Aka it hallucinates a solution that doesn't exist.
youtube
AI Responsibility
2025-10-01T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxabuYtvz5MNnv0PU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw6Sb-pOHkdfR564mN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxfy47jDn32oxNSP814AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYcasGmq0ClU3f-X94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZ2-qplMVQO86bX-54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4SHM57xiAMGh8tpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXaC9rwHrhMk6L_u54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw55Y91z7iARWjpbtV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhRAtRuqQmphrBokZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyIIi7dNM4EdTU197p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
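A raw response like the one above can be parsed and indexed for lookup by comment ID. The sketch below is a minimal illustration, not the tool's actual implementation: the allowed value sets are inferred from this one sample batch rather than from a documented codebook, and the record shown is a hypothetical example.

```python
import json

# Allowed dimension values, inferred from the sample response above
# (a real codebook likely includes more categories than this batch shows).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "indifference", "resignation", "approval"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) and
    return {comment_id: record}, skipping records whose dimension
    values fall outside the expected sets."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# Hypothetical record for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = index_response(raw)
print(coded["ytc_example"]["emotion"])  # -> fear
```

Keying the parsed records by `id` makes the "look up by comment ID" view a single dictionary access, and the value check drops malformed records instead of letting them corrupt downstream counts.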