Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I keep offering those who think AI will replace all software engineers a challen… (ytc_UgymqUTTU…)
- the dudes arguing that AI is already conscious fail at the first hurdle of stati… (ytc_UgwooZhiV…)
- A crash will be imminent no matter what. Either AI crash. Or our planet, civiliz… (ytc_UgxEvE1wk…)
- @FrostyIFrost he had been talked to in the past, but people didn't believe the e… (ytr_Ugx-OnQ2v…)
- I seriously doubt that AI is going to replace all human labor. There is no histo… (ytc_Ugy9htr6L…)
- @RewindOGTeeHee ChatGBT was already shown to give advices that it mixes together… (ytr_UgzoJ9wx5…)
- Years ago I thought that potentially an intelligent fridge might order 500 bottl… (ytc_UgxFLt8fV…)
- LLMs plateaued years ago. You can look up the data yourself. Tech Bros are selli… (ytc_Ugwmn6Qfs…)
Comment
1:42 The AI WASN'T WRONG. It's technically right, it's dealing in minutes with the terminology being used. So it gave the correct answer based on when the missiles were a minute and a # of seconds apart and not two full minutes. Try asking it the distance when the two missiles are EXACTLY one minute apart. I bet it'll give you the correct answer. Almost everything comes down to human input. Like a bad teacher molding young minds lol
youtube · AI Governance · 2024-03-22T14:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxQFqCUYkSSwUnCf6x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8mc8lUy8dBWDNfaF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyyE6e9kz8_JwZjceZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNa32gZdrpue0NwC54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyZm4PD1zqQR8R4MGB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw5nuo8zQ-DMwgHoDB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBhIovgvn7zIlb11J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwE-RswdhXOuU7jxZt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwIwhme3ym4GbBICrt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwPdI7SVoE5DwIBvXB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
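
The raw response is a JSON array with one object per comment ID, carrying the same four dimensions shown in the coding table above (responsibility, reasoning, policy, emotion). Below is a minimal sketch of the "look up by comment ID" step, assuming the raw responses have been saved to a local JSON file; the file name and helper function are hypothetical and not part of this tool.

```python
import json

# Hypothetical path; the tool's actual storage location is not shown here.
RAW_RESPONSES_PATH = "raw_llm_responses.json"


def lookup_coding(comment_id: str, path: str = RAW_RESPONSES_PATH) -> dict | None:
    """Return the coded record for one comment ID, or None if it is absent.

    Each record is expected to carry the four dimensions shown above:
    responsibility, reasoning, policy, emotion.
    """
    with open(path, encoding="utf-8") as f:
        records = json.load(f)  # a JSON array of per-comment objects
    return next((r for r in records if r.get("id") == comment_id), None)


if __name__ == "__main__":
    # The ID below appears in the raw response shown above.
    record = lookup_coding("ytc_Ugw5nuo8zQ-DMwgHoDB4AaABAg")
    if record:
        # Prints e.g. responsibility=user, reasoning=deontological, ...
        print(", ".join(f"{k}={v}" for k, v in record.items() if k != "id"))
```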