Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Please don’t stop informing us of vital information! The more come together on s…
ytc_Ugz5jfU1e…
Just wonder, halfway in vid you mentioned AI researching AI; you'd be amiss if y…
ytc_Ugwr0iQXf…
they forgot to tell everyone that no matter what you're soul is better than AI…
ytc_UgwfqWmwC…
Why would i want a robot to help me that’s what my kids are for ?!…
ytc_UgxsyRD1v…
But the issue of token economics - just using cost per token and number of token…
ytc_Ugy5jXlP2…
florianschneider3982 because AI art isn't art made by someone who put in effort …
ytr_Ugy-81kVJ…
The problem.With robots, they're going to be like electric cars. Can you find th…
ytc_UgwebSOIC…
@@vectorhooves7970 "Ignoring the larger issues", my guy, I am not here to addres…
ytr_UgybHLAUc…
Comment
NonverbalShoe noo... how dare you :O
hahaha... then lets try a safer approach... i have 2 ideas :D
1. What if the self driving car is smart enough to assess the dangers around it... for example... if the car detected that the car on the front on it holds loose containers like the one on the video... the car should be smart enough to change lane or at least give some space that give it a time to stop safely when those containers falls...
i know that it will takes a lot of effort to code this kind of intelligence but its possible...
2. Another serious suggestion is to have a separate lane for self driving car and a normal car... in case if the 2 lane should intersect, there should be a speed limit that is enough to avoid accident :D
youtube
AI Harm Incident
2015-12-10T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
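The coding result above can be represented as a small typed record. This is a minimal sketch, not the pipeline's actual data model: the field names follow the table's dimensions, and the example value vocabularies in the comments are assumptions inferred from the values visible on this page.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodingResult:
    """Hypothetical record mirroring one row set of the Coding Result table."""
    responsibility: str  # observed values include "ai_itself", "company", "none", "unclear"
    reasoning: str       # observed values include "consequentialist", "deontological", "unclear"
    policy: str          # observed values include "none", "regulate", "unclear"
    emotion: str         # observed values include "approval", "fear", "mixed", "resignation"
    coded_at: str        # ISO 8601 timestamp of when the coding was produced

# The record shown in the table above:
result = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="approval",
    coded_at="2026-04-27T06:24:59.937377",
)
```

A frozen dataclass keeps each coding immutable once stored, which makes downstream aggregation safer.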
Raw LLM Response
[
{"id":"ytr_Ugi-ra97OFAYf3gCoAEC.8A2x-6Y9iR39_jA1MigRw-","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UggcjG7wPcXM-ngCoAEC.87ksLSYwmAW87lRqc_5nOt","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgjbGUooE19fn3gCoAEC.87ae9OwYcWP87aeSIvS21k","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgibjtNUDEehjngCoAEC.87_AnhDBK0Q87_DW-CiC2P","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Uggm5BdzwhyWVngCoAEC.87ZJkl4btdC87ZRqdCDAIY","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugj-Xh3Fxwz1RXgCoAEC.87YwkNlHcCU87Zv0jNj-Ag","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugj-Xh3Fxwz1RXgCoAEC.87YwkNlHcCU87Zx3NNhY6U","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_Ugj-Xh3Fxwz1RXgCoAEC.87YwkNlHcCU87ZxquYYaiQ","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UggP-iFt14eaaHgCoAEC.87YkvCWMel-87Zi3ixQABR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UghURWjOQRHtGHgCoAEC.87XLJSTRT9v87clu7Fezdn","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
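Post-processing a raw batch response like the one above can be sketched as follows: parse the JSON array and index each coded comment by its ID, so that a single comment's coding can be looked up directly (as the "Look up by comment ID" view does). The dimension keys come from the JSON shown; the comment IDs and the helper name `index_by_id` are hypothetical.

```python
import json

# Stand-in for a raw LLM batch response; the IDs here are made up.
raw_response = """
[
  {"id": "ytr_abc", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_def", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and map comment ID -> coding dict."""
    records = json.loads(raw)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

codings = index_by_id(raw_response)
print(codings["ytr_def"]["emotion"])  # -> fear
```

Indexing by ID also makes it easy to detect duplicate or missing codings when the model's response is shorter than the batch it was given.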