Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "sheesh. imagine being contious and getting your face removed just like that, sho…" (ytc_UgzCGIelf…)
- "People aren’t “failing” because they use easier tools like AI. They are respondi…" (ytc_UgzFAYaDV…)
- "If we can train AI by seeking to become a movie producer that writes very scary …" (ytc_UgzDiBIlI…)
- "Where is evidence that AI talked children to kill themselves? This seams very mu…" (ytc_UgwYs1rqx…)
- "I don't know who's more pathetic. The AI 'artists' or the luddites touting 'AI B…" (ytc_Ugwxi49Xb…)
- "so large companies are driving Ai to replace humans, where will the revenue come…" (ytc_UgzBrxku7…)
- "What the fuck did they think it would happen. I’m sure there are so many wonderf…" (rdc_fwief7x)
- "What if AI never has a major alignment issue, but does everything it is tasked w…" (ytc_UgzT9MJHD…)
Comment
Bigger question is why do we have job's? We all know that robotics and AI coming together will give way to robots doing our jobs and doing them with 100% efficiency, also paving the path to abundance. You buy a robot and only need to maintain it. Robots do not cry and whine and complain when given a task. Robots are not lazy. Robots do not need sleep. Robots do not need human attention. Robots do not need breaks. Robots can work 24/7. It's inevitable.
Now you have to ask yourself, what can we do about it? What systems in place will need to be redesigned, innovated upon, upgraded, wiped etc.? Maybe our AI friend can solve that problem for us🤔
youtube · AI Responsibility · 2023-04-11T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxK4yxPwB9fBInrX_t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy89WdFipkf1zqqzst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwv2ldEjkHyaWDJWjZ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugyb-URI02yXkJsZ9BB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwo7159RHYy7g4uWMB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyKHkt3pcm1IL2qGHt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQmk186p94PQdbrYt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzRr8I5c_lIYfIr5-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxbMnpiZWpmXrr1FKx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgyN0zzfMSFfWgh6z654AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
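Each record in the raw response carries the same four coding dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion), keyed by comment ID. A minimal sketch of how a response like this could be parsed and indexed for ID lookup (the helper name `index_by_id` is hypothetical, not part of the tool):

```python
import json

# A one-record excerpt of the raw LLM response above; each object has an
# "id" plus the four coding dimensions from the Coding Result table.
raw_response = """
[
  {"id": "ytc_Ugy89WdFipkf1zqqzst4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(response_text):
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_id(raw_response)
print(coded["ytc_Ugy89WdFipkf1zqqzst4AaABAg"]["emotion"])  # approval
```

In practice a model may wrap its JSON in extra prose or a markdown fence, so production parsing would need to strip that before calling `json.loads`.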