Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "AI could be used to solve big problems and break codes like animal language and …" (ytc_Ugw-BanBg…)
- "AI is just a tool. What you do with that tool is up to you. It's similar to weap…" (ytc_UgwZN2BKe…)
- "This is ridiculous that something like this has been made. It is designed to tak…" (ytc_UgivtIdgV…)
- "It sounds like he got into higher-ups level courses by skipping prerequisites. …" (ytc_UgzrM30mp…)
- "I think that the people who keep parroting the \"ethical issues\" line really have…" (ytr_Ugx-rHtmd…)
- "Ai is NOT to give humans any ideas Nor backing up solutions from Ai perspective.…" (ytc_UgxxyO7q9…)
- "Well, 3d stable diffusion is currently getting developed by NVIDIA. So you might…" (ytr_UgwDl2K4t…)
- "If automation takes peoples means of making a living, then guaranteed income can…" (ytc_UgzsYtIYS…)
Comment
To get a fully self-driving car you need an AGI, since it needs to be able to understand what other drivers are thinking and doing, be able to adapt to new situations, etc. People think of driving as just a single skill, but it's problem solving like most skills, except solving the problem poorly can quickly result in death.
Which sensor is used doesn't really matter; we do fine with just two cameras, so it's only a software issue.
And I hate the "What if we put a large mirror on the road" or an image of the road, etc. That would trick/confuse a majority of people too, and roads are not the wilderness; they are designed to be safe. That's why the police will come and use violence on you if you decide to put up large mirrors on a highway, so demanding that self-driving cars handle that situation is kind of absurd.
youtube
2026-03-27T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T19:39:26.816318 |
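The dimension values visible on this page (in the table above and the raw response below) suggest each dimension draws from a small closed vocabulary. A minimal validation sketch, assuming hypothetical vocabularies inferred only from the labels shown here; the real codebook may contain additional values:

```python
# Hypothetical closed vocabularies, inferred from labels visible on this page.
# The actual codebook may define more values per dimension.
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}

def validate(coding: dict) -> list:
    """Return the dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "none", "emotion": "indifference"}
print(validate(row))  # []
```

A non-empty return value flags a coding that needs manual review rather than silently accepting an out-of-vocabulary label.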
Raw LLM Response
```json
[
  {"id":"ytc_UgyZKxYpKqqI1erbu-B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwqLfzihmWjY05dBc54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyCwGNDSwzMTo4cbzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDc_g1v7rtsVNNYqt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyChxqy8x8d0wt-vJZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
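The lookup-by-comment-ID step described at the top of this page can be sketched as follows, assuming only what the raw response shows: a JSON array with one coding object per comment, each carrying an `id` field (two rows reproduced here for brevity):

```python
import json

# Raw LLM response as displayed above: a JSON array of per-comment codings.
raw_response = """
[
  {"id": "ytc_UgyZKxYpKqqI1erbu-B4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwqLfzihmWjY05dBc54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so a single comment's coding can be looked up
# directly instead of scanning the whole array each time.
codings = {row["id"]: row for row in json.loads(raw_response)}

print(codings["ytc_UgyZKxYpKqqI1erbu-B4AaABAg"]["reasoning"])  # consequentialist
```

Indexing once and looking up by key keeps inspection of any single comment O(1), which matters when a batch response covers many comments.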