Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Well let’s go down the rabbit hole. Just so that you might understand what I’m a…" (ytc_Ugzw1bvh5…)
- "if ai art makes him feel artistically satisfied why is he nearly crying when arg…" (ytc_Ugz4bBWOf…)
- "On one hand, I think that the progress doesn’t stand on place and some jobs can …" (ytc_UgxQ64PoJ…)
- "I fear the people that use anti AI images as an overlay are the same people that…" (ytc_UgzjQC45L…)
- "The only way to get law makers act on deep fake issues is to make it personal. t…" (ytc_Ugx7ukWwg…)
- "AI is going to be good at scouring information, taking all we created and filter…" (ytc_UgyLmeXw7…)
- "empowering copyright is i think a dangerous route to go as it has already been m…" (ytr_Ugz4fB_3d…)
- "I was all in on this guy untill he started spewing woke dogshit like uwu putin b…" (ytc_Ugy3KoKFT…)
Comment
You can't have self driving cars along with human drivers. People do soo many things that are considered "illegal" but makes sense to others and are rarely enforced. An example if a car is blocking your path you can pass a double solid line and re-enter your line but would have to make sure that traffic isn't coming in, from my understanding self driving cars would just stop moving.
Hence the only way driving is safe and efficient for everyone is when every car on the road is self driving or is not. You shouldn't combine the two together as it might make for some bad results.
Im hopeful we'll get to a solution of fully self driving cars in the near future
Source: youtube · 2024-01-09T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlDauPln9US6EIuPt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw8kPSMHARD2HOqVQx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz6qNzsGNahxUUVU0B4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzH-mPq63xDRrjDfol4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyRUb5KCizEJ1Bs43F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzX94dOZIwiEyySP6p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxEGOtPXLipjXDA70F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwg4ZpyNbyPhjFhRx94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx2VwkTSMvkRCsZ2OR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5dqAPvaPbYc3bItV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```