Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI doesn't have to physically kill us. The internet and social media has already…" (ytc_UgyquY6oP…)
- "This guy didn't do an adequate job of articulating the risks of AI; it will soon…" (ytc_UgwBN2XHD…)
- "AI is NOT intelligence, it is a stollen information processing engine posing as …" (ytc_UgwDbG--2…)
- "I had an argument with one of them, I stated the definition of "art" refers to "…" (ytr_UgzUKKZHZ…)
- "This...........I want to be violently murdered by my AI "companion" robot,all wh…" (ytc_UgwhAdJpc…)
- "This is inevitable. One day everyone would be able to build their own terminator…" (ytc_UggYWv6ER…)
- "One thing not mentioned here is the environmental impact AI data centres will ha…" (ytc_UgxVmmTkP…)
- "Excellent material right here. For a broader perspective, turn to the book. "Fro…" (ytc_Ugx3AMVzM…)
Comment
> I mean, AI is what made these things possible at all, but I understand you, probably you are saying 'AI" meaning to say LLM(Language Model), of course LLM wont make the car drive better, because its a different kind of AI, but it can make the user experience better.
>
> What will make the car drives better is: a better software; better hardwares, mostly GPU's for training the AI; More training data, provided daily by the users.
>
> So in the long term I think this is the future, GPU's will continue to improve because it's not only used for that, people are driving Tesla's and other EV's and providing training data, and there will be people developing a software because the potential financial gain is huge
>
> How fast will we get there? I have no idea, but I think in 15 year maximum
youtube
2025-06-27T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugh3FzoJfzQ_AngCoAEC.8FscLZYdIxJ8FveOC3HoGR","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgjJoH8IWktnQHgCoAEC.8FnjUL6dBut8Frn_W6sr9d","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgjS9qdGSMCpJXgCoAEC.8Fml_Qkf09K8FmoH9gIC3e","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxUk_n_t9NOdtvNmp14AaABAg.ASoh2Wy9fwPATeKeA88838","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzEZKiaNlygPhAUhdt4AaABAg.AL7JQ1B81BYATHsaoqs7b0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwMUsTZFlwbfFAuvP54AaABAg.AJcmpwxLdIFAOwx-8gprNY","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgwPpfyV3WeYUoFX1Jd4AaABAg.AJU9m920xQSAJstodmPgG2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgwPpfyV3WeYUoFX1Jd4AaABAg.AJU9m920xQSAP7JUj9lc8P","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwPpfyV3WeYUoFX1d4AaABAg.AJU9m920xQSAP7KMRjckxh","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugx4VXMB2aN3VFUxROZ4AaABAg.AJEM5IQckWQAJMZ3nYVpoK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
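The raw response is a JSON array of coding records, one per comment, each carrying the four dimensions shown in the table (responsibility, reasoning, policy, emotion) plus the comment ID. Looking up a coding by comment ID is then a matter of parsing the array and indexing it. Below is a minimal sketch of that lookup; the record contents are hypothetical placeholders (the IDs `ytr_example1`/`ytr_example2` and the helper name `index_by_id` are not from the source), only the field names are taken from the sample above.

```python
import json

# Placeholder records mimicking the raw LLM response format above.
# The IDs and values here are invented for illustration only.
raw = '''[
  {"id": "ytr_example1", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a raw response and index its coding records by comment ID."""
    return {record["id"]: record for record in json.loads(raw_json)}

codings = index_by_id(raw)
print(codings["ytr_example2"]["emotion"])  # fear
```

Indexing once into a dict makes repeated ID lookups O(1), which matters if the same response is inspected for many comments.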