Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "$175,000 USD for that ? No way I'd pay that. And the other robot is a mostly re…" (ytc_UgxJ6K3c5…)
- "Bittensor $TAO solves all issues of centralized Ai. Study it and thank me later …" (ytr_Ugw7NGojg…)
- "Imagine self driving cars all over the place. Imagine robot waiters in a restaur…" (ytc_UgzEdFUvR…)
- "As a former truck driver, I see this as an unavoidable future. Trying to fight a…" (ytc_UgwAayo4d…)
- "Maybe we should just stop pursuing this line of research. Maybe we can find oth…" (rdc_n0gpywn)
- "If this happens will all the people you're talking about losing their jobs, bull…" (ytc_UgwD7B_03…)
- "AI is a satanic innovation, destroys human identity and trust gradually. The la…" (ytc_UgyOluyen…)
- "You know as well as anyone that when it says the legal field it means paralegal,…" (ytc_Ugx8c1Dvg…)
Comment
Funny when the video has data from 2016 which is light years ago when it comes to self driving tech. The guy who crashed into the semi truck laying on it's side would have done the same in regular cruise control or most likely even driving without as it was dark and very hard to even see in the video. Watching the latest self driving tech that came out end of 2024 blew my mind and is much better than the average driver. Driving around my City is a gong show watching humans try to drive or change lanes usually driving way under the speed limit which is even more dangerous.
youtube · AI Harm Incident · 2024-12-23T07:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxBnqmODJDM_HO51dd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzb0GUhZozuty3TW4R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzvKvgCCoyMUT5qfUl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxT-IblPzm2C7ZsrYN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwNDKH4XNH72Yl9xkV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfyvRKOxtRY8LXVbN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyiMbC89ies_uKvVv14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwg32k70hzI12t-NEV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxoZfYo1M6AJ0n4r2l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzfXFI3xNL5rsOUPFR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
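The raw response is a JSON array of per-comment codes, and the "look up by comment ID" feature implies the array is indexed by `id`. A minimal sketch of that parsing step, assuming the four dimensions shown above; `index_codes` is a hypothetical helper, and the allowed value sets below are only the ones observed in this one response (the real codebook may define more):

```python
import json

# Values observed in the sample response above.
# ASSUMPTION: the actual codebook may allow additional values.
OBSERVED = {
    "responsibility": {"none", "company", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"approval", "mixed", "fear", "indifference"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of per-comment codes)
    into a dict keyed by comment ID, rejecting unexpected values."""
    by_id = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        # Keep every dimension except the ID itself.
        by_id[cid] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

# Hypothetical comment ID, for illustration only.
raw = ('[{"id":"ytc_abc","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
codes = index_codes(raw)
print(codes["ytc_abc"]["emotion"])  # approval
```

Validating each dimension at parse time means a malformed or hallucinated code fails loudly with the offending comment ID, rather than surfacing later as a bad row in the coding-result table.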