Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "This, this is one of the reasons we dont want generative AI, not only artists, b…" (ytc_UgyknPmTh…)
- "I recently went back on Deviantart and the whole thing has AI, EVERYWHERE. I cou…" (ytc_Ugyjv6qKC…)
- "I was about to say "his voice sound like AI" when i realized he is a robot😭…" (ytc_UgyA2Wy07…)
- "Dude dropped like he was a cartoon character. LOL 😂 Also, a robot that can fig…" (ytc_UgzmzitaW…)
- "The concept of forcing artists to remake ai art just for copyright seems deeply …" (ytr_Ugz5g_x-q…)
- "So, it says agencies are watching us, but it can’t tell us it can actively see w…" (ytc_UgyJdrDbn…)
- "its... May be destroy the world. Next time robot will alive...... 😢😢😢😢😢 i am afr…" (ytc_Ugxzheo2E…)
- "_★_ I believe we are meant to be like Jesus in our hearts and not in our flesh. …" (ytc_Ugytfenvk…)
Comment
I do validation testing of autonomous cars. In most cases I find the software, with its better-than-human sensors and 3D modeling of its surroundings, to be a better driver than I could be on my best day. It never gets emotional about its encounters, it just keeps doing the job. But it can also get confused over simple things we would think nothing of. Ideally it should be supervised by an attentive, technically competent, well paid human in a large vehicle like a big rig. It's a great tool for humans to use to make driving safer and more efficient. It's also very expensive. I think it would be foolish of regulators to approve fully-driverless large vehicles. Even though autopilot could fly an aircraft flawlessly most of the time, I still want an experienced human pilot in the cockpit supervising the technology.
youtube | AI Jobs | 2025-10-19T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwcqwWgqeFGdHrwYfV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxl03K-ty9rc85LDa14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDK4X2xLNWeRQnCH14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzUkg-KlXWRzCu9c3V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwP5aVgCm8rHt82qVd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyqnbWBl5JESw3srDt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyF3qCdjVzO4XdsaQl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwC-lycXEmBc_jpoN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwlQFvRczgiZKCMHYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxriczs2NTQcszeXPZ4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
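The raw response above is a JSON array of per-comment codes across the four dimensions in the result table. A minimal sketch of how such a batch might be parsed and validated is below; the allowed values are inferred only from the samples shown on this page, so the actual codebook may define additional categories, and the function name is a hypothetical illustration, not part of the tool.

```python
import json

# Allowed values inferred from the samples above (assumption: the real
# codebook may include categories not seen here).
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record's fields."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for field, allowed in SCHEMA.items():
            value = rec.get(field)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {field!r} value {value!r}")
    return records

raw = (
    '[{"id":"ytc_UgwP5aVgCm8rHt82qVd4AaABAg",'
    '"responsibility":"distributed","reasoning":"mixed",'
    '"policy":"industry_self","emotion":"indifference"}]'
)
records = validate_batch(raw)
print(len(records), records[0]["policy"])
```

A check like this catches the common failure mode where the model invents a label outside the codebook, before the codes are stored against their comment IDs.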