Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "And there's so much talk of "AI" "destroying" us in all the various negative way…" (ytr_UgyQ6cX3v…)
- "Have the robot shoot up in the air like a mad man, oh and people worried about t…" (ytc_UgwQ3MHk4…)
- "either ai is still stupid or its actively lying. i constantly get it to admit it…" (ytc_Ugyj-X_JP…)
- "Out aging parents are in trouble. If yours are alive, please have a sit down and…" (ytc_UgwasKXnt…)
- "Yeah the market will definitely react negatively with a lot of these industries …" (ytr_UgwiOBJIG…)
- "ChatGPT may have been created with "parameters", which means a version can be cr…" (ytc_UgxSigzBB…)
- "I'm not too worried about AI just yet. I am more concern for when man starts to …" (ytc_UgzHN4ioj…)
- "There are tons of fake videos out there impersonating well-known people. Watch o…" (ytc_UgwlyVl3Z…)
Comment
I didn't say Tesla autopilot is safer today, I don't have that data, nor was it relevant to my point. A quick google search makes me believe you also do not have data to support 10x worse. The problem I am highlighting is this piece is doing nothing more than fostering fear of technology, distrust of corporations, and at least on an emotional level seems to even connect the two, asif Tesla is making autopilot solely as a marketing ploy. I would love a WSJ piece investigating the truth of deaths per mile. I would also love a piece talking about how we might tackle the remaining challenges. I would even welcome a piece if it concludes the improvement rates are reaching an asymptote that suggests Tesla will not be safer than humans until we have X breakthrough. I hear none of that here, just fear, sadness, distrust.
youtube · AI Harm Incident · 2024-12-22T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
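The four coding dimensions in the table above each take values from a small closed set. A minimal validation sketch for one coded row, with the category sets inferred only from the labels visible on this page (the full codebook may define more values), might look like:

```python
# Allowed values per coding dimension. These sets are inferred from the
# labels visible on this page, not from an authoritative codebook.
CODEBOOK = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "resignation"},
}

def validate_coding(row: dict) -> list:
    """Return a list of problems with one coded row (empty list = valid)."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = row.get(dim)
        if value is None:
            problems.append("missing dimension: " + dim)
        elif value not in allowed:
            problems.append("unknown %s value: %r" % (dim, value))
    return problems

# The coding shown in the table above passes validation.
assert validate_coding({
    "responsibility": "none", "reasoning": "mixed",
    "policy": "none", "emotion": "resignation",
}) == []
```

Rejecting out-of-codebook values early catches the common failure mode where the model invents a new label instead of choosing from the allowed set.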
Raw LLM Response
```json
[
  {"id":"ytr_UgwH9rBqCu6WYkd_xWN4AaABAg.ACMiscRkwcjACMswH6XvGq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyV0asjD2kHvRIlRHJ4AaABAg.ACMiMgOYGLmACMtKS_Fl-X","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugweu51Acy7m0rcPPLh4AaABAg.ACMgvI90ZlCACMty_k6Mvv","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwqiD-5JDC-1-1X6kd4AaABAg.ACMdWe6ZplkACOj7gdTheS","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgyRt0HR1KhZz8EZIFh4AaABAg.ACMWckkaKW-ACQz-ZQ5vBl","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugz7oggRa_XIr75ASat4AaABAg.ACM3TsS9s8vACMzCwr_E4A","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy2yh33ZzDoR1hii-F4AaABAg.ACM-RpWpiZtACM2V0adZq2","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugzr6JIuyym3nDLWXFF4AaABAg.ACLYBmQCCUwACLgoo3T5ji","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugxs4ouTmLOmgIbbyt54AaABAg.ACLSGnj52SBACLl2pgCPZr","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugyg6xFT7Qd_Q1bMRsp4AaABAg.ACKoysi4TbVACLxQuvSoAX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
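A batched response like the one above maps several comment IDs to their coded dimensions in one model call. A minimal parsing sketch, assuming the model reliably returns a well-formed JSON array (production code would also need to handle malformed or truncated output):

```python
import json

def parse_llm_batch(raw: str) -> dict:
    """Map comment ID -> coded dimensions from one batched LLM response.

    `raw` is the model's text output, expected to be a JSON array of
    objects each carrying an "id" plus the four coding dimensions.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row.pop("id")  # remaining keys are the dimensions
        coded[comment_id] = row
    return coded

# Usage with a hypothetical one-element batch (ID invented for illustration):
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
coded = parse_llm_batch(raw)
assert coded["ytr_example"]["emotion"] == "resignation"
```

Keying the result by comment ID makes the "Look up by comment ID" view above cheap: each coded row can be joined back to its original comment regardless of the order the model returned the batch in.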