Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Was it real, and you are seriously near a robot holding a fu&king gun crazy🤯…" (ytc_Ugyv9B_z6…)
- "Thank you for putting a face to AI. I'm old enough to have worked on mechanica…" (ytc_Ugwf3ciJw…)
- "@tanya2660 It's obvious you didn't read all my post. I did say that if you abu…" (ytr_Ugzyj7SNc…)
- "Also, ChatGPT as a therapist runs a high risk of simply validating you without h…" (ytc_UgztKGBWp…)
- "Hello Mam, i m not related to software background.. i want to learn AI what are …" (ytc_UgzymuNII…)
- "1. Shareholders ban govt from buying. 2. Same shareholders set up 3rd party usi…" (rdc_ekus85t)
- "Corn uses 20 trillion gallons of water a year and 40% of it is used to create et…" (ytc_Ugw3hJJrZ…)
- "If we are to eventually create true artificial intelligence and mass produce it,…" (ytc_UgyhDAfx6…)
Comment

> possible solution: if all other cars on the road are a) self driving, b) connected to a single algorithm that allows communication between vehicles and c)can recognize accidents and make the surrounding cars react in a way that would cause 0 accidents (ex. all cars slow dow to allow affected car to break), there would be no dilemma because there would be such a little chance of making a collision in the first place. the only thing that could cause such an incident would be an external and spontaneous natural force (tree falling, lightning,etc), or the system shuts down and all hell breaks loose. but this argument can hardly be realistic in the near future because of many reasons (economic and social acceptance to name a few), but its nice to think about. let me know what you think!

Platform: youtube
Incident: AI Harm Incident
Posted: 2017-07-16T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UggJqTTxAgQpuHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggSR5TBSlFvAngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg3Ooi6amKBaHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiIyXvapa3ghngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg6S0mO_XY-BHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugj2e2pqV-vGsHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghPxCznJQ-7DHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggvYPle2Wkb6XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjlhktnJUrllHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggvUq6OXIbKO3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
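A raw response like the one above is a JSON array of per-comment codes over the four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment ID; `parse_raw_response` and the shortened IDs (`ytc_a`, `ytc_b`) are illustrative assumptions, not part of the tool:

```python
import json
from collections import Counter

# The four coding dimensions, matching the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw):
    """Parse one raw LLM response (a JSON array of per-comment codes)
    and return {comment_id: {dimension: value}} for lookup by ID."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "none") for dim in DIMENSIONS}
        for rec in records
    }

# Two abbreviated records in the same shape as the response above
# (IDs shortened for the example).
raw = (
    '[{"id":"ytc_a","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_b","responsibility":"none","reasoning":"deontological",'
    '"policy":"none","emotion":"indifference"}]'
)

coded = parse_raw_response(raw)
print(coded["ytc_a"]["reasoning"])  # consequentialist

# Tally one dimension across the batch, e.g. to spot-check label balance.
print(Counter(c["emotion"] for c in coded.values()))
```

Indexing by ID mirrors the page's "Look up by comment ID" workflow: any coded comment can be pulled straight from the parsed batch without rescanning the raw output.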