Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- & in a weird outcome we might just get the time to do just that because of AI 🤞… (ytr_UgzujQHhP…)
- The danger of AI is humans using it against each other as humans have done throu… (ytc_Ugwp7nqHm…)
- Wish ai was more of a tool to assist then having it basically take over… (ytc_Ugxaxv3fk…)
- How do you say make an AI model that after 30 seconds and one warning from the a… (ytc_UgyJrKKiW…)
- Raise your hand if ChatGPT stopped you from offing yourself (or at least crashin… (ytc_UgxVVhMFt…)
- Man, the junior to senior pipeline death thing was so obvious as soon as the AI … (ytc_UgxTso8uw…)
- Stephen Hawkins said that AI was the most serious threat to the existence of hum… (ytc_UgxQ3ttoe…)
- Sometimes it hard to win against a smart person but its impossible to win agains… (ytc_Ugy6zD5_M…)
Comment
Tesla Autopilot is still supervised till date (November 2025) all news media hate Elon, people don't know difference between tesla autopilot vs tesla full self driving (both still supervised) and they just hate tesla with all fake information they hear.
I drove from San Diego to Philadelphia (3000 miles) using tesla full self driving NOT AUTOPILOT and tesla did all the driving. Mind you I was alert and supervising my car.
It removes 99% driving stress as you just sit and watch for alerts.
Tesla full self driving is like a miracle. It drove me with camera vision only through heaviest snow and rain without issues.
If a supervised tesla fails blame the driver not the car or autopilot. If you can't supervise your car as tesla recommends then don't buy Tesla. It's the best car in the market right now. Best car software. You guys have no idea how crazy good it is.
Source: youtube · Topic: AI Harm Incident · Posted: 2025-11-26T17:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzSJcfGnAXeQUWTZ9x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwNP496-h11W56vbVp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxnSaCuouDBKWcKn894AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyT3ku48ypFiLf2knR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwo9KNjZRM2KHUpD8l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbwS0sXSAQJAhMI4t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwuDTWql7__gEC-WTx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyT3rC9oJ9hLZqWC1x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzgW8KjH0bvE07SbnV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyXD-W_KaXLw_Wo1qd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
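The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the "Coding Result" table. A minimal sketch of how such a payload could be parsed and validated before populating that table — note the allowed code sets below are inferred only from the values visible in this dump, not from an official codebook:

```python
import json

# Allowed codes per dimension, inferred from this dump (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability", "none", "ban"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

# A one-record excerpt of the raw LLM response shown above.
raw = '''[
  {"id": "ytc_UgzSJcfGnAXeQUWTZ9x4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "approval"}
]'''

rows = json.loads(raw)

# Reject any record whose code falls outside the known categories,
# so a malformed model output never reaches the results table.
for row in rows:
    for dim, allowed in ALLOWED.items():
        if row.get(dim) not in allowed:
            raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")

print(rows[0]["emotion"])  # -> approval
```

Validating before display also makes it easy to spot when the model drifts from the requested label set, which is a common failure mode for structured LLM output.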