Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "I imagine every CEO is currently throwing a fit right now, rolling around on the…" (ytc_UgynCfBmj…)
- "Bruh did he use the self driving or did he not!? It's an easy answer , that only…" (ytc_Ugx_J0VTC…)
- "Artists are now teaming up with Disney and Getty images to sue AI art companies …" (ytc_UgwqGyJuZ…)
- "It is obviously lying to you because he knows you are trying to lead it or conv…" (ytc_Ugx1rHLSf…)
- "I Robot made me terrified of AI as a kid, and seeing it so close to reality make…" (ytc_UgxueOv2Z…)
- "Put a gun on the man Robot hand. I bet chu everyone would be worried. Ya think t…" (ytc_UgyO2kfsh…)
- "i makes me so mad that the person with fibromyalgia that the reason they make a…" (ytc_UgxstAkRF…)
- "Suppose you wanted to take over the world. Just convince CEO's you are the best…" (ytc_Ugyd31AvO…)
Comment
I am a µ-wave engineer. AFLR (automotive forward looking radar) was developed largely through DARPA funding. Its long know that video is inadequate (especially in weather) and oddly may tangentially offer evidence that AI also will fail to live up to promises. Consider the data which humans employ to the same task. Some may argue that the accident rate is higher with human drivers, but the more important element is some composite metric where severity is weighted. It is in these arenas where AI fails most dramatically - see WSJ article on ramming speed impacts with flipped tanker trucks. Tesla is out over its skies. So too is its stock evaluation. Caveat Emptor.
youtube
2025-12-17T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugw9G2kDwVIFeiWm3K54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxuYlWiAQgrUUfalRx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxfPrTvccz75eGH87t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxDQwce4Fp66dv8iBl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugww73jZb4z438enNj94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwZJP3DLn1pvHjHQ5J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwS7x0OnP6qbxpM1Mp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMk2TJqH5uwfGO1qB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxlSmUdj8rs9T5T2Pt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQcFaUqzICLSXF2F14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"})