Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Another odd thing about Tesla's self driving is that it doesnt include a map... … (ytc_UgxhIr507…)
- Too few companies rely on social responsibility mantras. They could have impleme… (rdc_czls8r6)
- i really hate the argument ai bros give when you try to explain that ai is theft… (ytc_UgxDCSJPO…)
- Man i need cheap ram and ssd for my pc build which I have am saving for 3 years … (ytc_UgxPqlDEk…)
- Why did people think we want this. We want robots that don’t look like humans. R… (ytc_UgyaYgkRB…)
- play chicken with a self driving Tesla and a self driving Tesla. See what would … (ytc_Ugy5eBqpo…)
- @Neocool014 You can provide a memory scaffolding protocol that gives them someth… (ytr_Ugw_WsuXF…)
- Retards here thinking they are specifically coded when in reality more black pe… (ytc_UgxwM17xZ…)
Comment
It is NEVER the fault of the tool, the morality does not not lay in the tool, it lays in the user. We aren't children. It needs to come with a disclaimer, and that's about it...just like cigarettes, weed, and booze. It needs a cautionary statement, it's not for the faint of heart, it's literally AI, it's an extremely powerful tool...shiiiiiiiiiiiiiiiiii
Source: youtube
Timestamp: 2025-10-30T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
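
For downstream analysis it can help to treat one coded comment as a typed record with the four dimensions shown above. The sketch below is illustrative only: the class name and field names are assumptions, and the value sets are just the categories visible in this page's raw response, not the project's full codebook.

```python
from dataclasses import dataclass

# Assumed vocabularies: only the values observed in the raw response on
# this page; the real codebook may define additional categories.
RESPONSIBILITY = {"none", "user", "company", "government", "ai_itself"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed"}
POLICY = {"none", "regulate", "liability", "industry_self"}
EMOTION = {"indifference", "resignation", "outrage", "fear", "approval", "mixed"}


@dataclass
class CodingResult:
    """One coded comment, mirroring the Coding Result table above (hypothetical)."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp, e.g. "2026-04-27T06:24:59.937377"

    def is_valid(self) -> bool:
        """Check each dimension against the observed vocabulary."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```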
Raw LLM Response
[{"id":"ytc_UgwJP-_W1ASZ7ml2iU54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_UgwjpGiKyrz-pkaj7tN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgzdErGIXC35udbMOf94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgyZ2AXH6ZfwIIdSyJV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},{"id":"ytc_Ugzo0AJ3I-oZmdo7BdZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgyUTuCzndKeHHa3oH54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgzOJOkDUiPNn85RQEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},{"id":"ytc_UgwwL05qqBEE-El0O0V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},{"id":"ytc_UgyzfJSXAkisrnPbmT54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgwxWiE7DObqEMprQmN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]