Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Self driving cars are fucking stupid if you ask me, stop inventing problems to s…" (ytc_Ugwi2Z3NO…)
- "I know and I don’t believe everything I see and watch but AI has to be regulated…" (ytr_UgxXdcU7a…)
- "That's what Jesus told us in the book of Revelation if it wasn't for God shortin…" (ytc_Ugz_5jv0H…)
- "AI could become a super virus that is unstoppable and spread to every device and…" (ytc_UgzEqTtZ6…)
- "Honestly, the whole shift from free AI to paid services is tough, but Rumora has…" (ytc_UgxiaqTjt…)
- "basically it makes small artifacts on the artwork that are hard for people to se…" (ytr_UgxoLGcig…)
- "Seems more like people are scared of a new thing that isn't quite perfected yet,…" (ytc_Ugy13_P-c…)
- "If humans can evolve a conciose over time, why cant robots create one by harvest…" (ytc_UgyogwI3h…)
Comment
> You just made the case why robotaxis are not a good investment on the short term. It requires to educate the public first, and you can’t even get the public to agree on vaccines; even if experts say it is. (Even if Tesla expert say otherwise)So it will require 10-20 yrs to educate. 🤞and the investors don’t have 20 yrs to see returns.

Source: youtube · 2025-06-03T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyavX6Egk-rS_jacl14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxtV3hpG-wjCaYofSh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyvDgzX8rmfrFKW6114AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugybfoa_WO7lTOUiWBh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx8f_LaJ0YYqr_DrJZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugz8l773WT9wDfrBuLR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxID07JbIE3t7bK5Ql4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyIESFDiXfLMKPRh894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKaGEFLWJUP7DRQkx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzykjV9gVzkg5rGoG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
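The coding-result table above is recovered from the raw response by parsing the JSON array and matching on the comment's `id` field. A minimal sketch of that lookup, assuming the field names shown in the response (the `lookup_coding` helper itself is illustrative, not the tool's actual code):

```python
import json

# One record excerpted verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_Ugx8f_LaJ0YYqr_DrJZ4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding record for one comment ID."""
    records = json.loads(raw)
    # Each record carries its own id, so order in the batch does not matter.
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugx8f_LaJ0YYqr_DrJZ4AaABAg")
print(coding["emotion"])  # → resignation
```

Matching by embedded `id` rather than by position makes the lookup robust when the model drops or reorders items in a batch.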