Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyY1XgFv…`: Simple rule: if an AI tool needs humans to correct ~5–10% of outputs, track “net…
- `ytc_Ugxf2RaJf…`: if you are a chatgpt user there's a free Chrome extension called chatgpt booster…
- `ytc_UgyB3AZYh…`: I think the big part being left out of the discussion about AI is CONSENT. Did, …
- `ytc_UgzGjG2cY…`: robots have sensors, if he hold the box and the sensor kept 'reading' the box o…
- `ytc_UgxVTaBSG…`: Well at this stage AI still doesnt understand context. I reponded to a person on…
- `ytc_Ugw0Yk9fM…`: Let AI figure out how to routinely give hands 5 fingers, then we'll talk about t…
- `ytc_UgxZzqZBg…`: Wow - it’s clear from watching this video that you have no AI expertise whatsoev…
- `ytr_UgxPLmrJW…`: It would benefit them tho, unless you want to post all the math you did that pro…
Comment
AI will never be solved because of hallucinations. Eventually the AI will produce so much data that it will shadow human made data. It will have to train it self off its own corrupted data and hallucinate more on top of that. Eventually it will be like dead internet theory and there will be more hallucinations then real facts
youtube · AI Moral Status · 2025-06-11T04:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw4_0jrPEWzN0wLAtx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCTYcsa2_iQfw_xW54AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwL0LD5TH5V4HV3bDJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxL5zzp4z6TC3YoLrZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_5atK7A8m67tymTZ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJLdqC8G7J4boEzZt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwoK-5RApA_9dHm4Nh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyQfHtmGnw9kL0sbsp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzhm7qrR6zhgk5EWDd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx0eTvpA1ZU2D1fc6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
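A minimal sketch of how a raw response like the one above could be parsed and indexed for by-ID lookup. This assumes the model returns a JSON array of flat records with the five fields shown; the two sample records are copied from the response above, and the field set is inferred from this sample rather than a full codebook.

```python
import json

# Two records copied from the raw response above; in practice `RAW`
# would be the full model output string.
RAW = """[
  {"id":"ytc_UgxL5zzp4z6TC3YoLrZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwoK-5RApA_9dHm4Nh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

# Expected fields per record, inferred from the sample output.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a JSON array of coding records and index them by comment ID,
    rejecting any record that is missing an expected field."""
    by_id = {}
    for rec in json.loads(raw):
        missing = REQUIRED - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {sorted(missing)}")
        by_id[rec["id"]] = rec
    return by_id

codings = index_codings(RAW)
print(codings["ytc_UgwoK-5RApA_9dHm4Nh4AaABAg"]["emotion"])  # fear
```

Validating every record before indexing makes malformed model output fail loudly at parse time instead of surfacing later as a missing dimension in the coding table.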