Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "That is not hyper realistic... You can look and see off the muscle .. that it's …" (ytc_UgxBoiMyj…)
- "You cannot implent that in countries like Tunisia for example, the administratio…" (rdc_oht76k9)
- "that ceo is full of sh*t... no the ceos will aim for more profit, not more money…" (ytc_UgyNownxy…)
- "AI and robots will be similar to the replicators in star trek there will be no n…" (ytc_Ugyekwzov…)
- "I for one welcome our robot overlords, because I think we'd make terrific pets. …" (ytc_Ugzszn3mg…)
- "2:55:15 I know that this simple AI is supposed to brute-force logic, but OUCH th…" (ytc_Ugzx08ogP…)
- "If you are an AI, code becomes an ethical question in the same way humans view D…" (rdc_jtrpife)
- "that's wild they're literally ranking you based on how much ai you burn through …" (rdc_oi1to80)
Comment
technically AI now is reflexes.
Static reflex models with brute force learning.
We just in few steps before good AI with realtime logic progression and thinking process with good data linking abilities, not just bunch or flow of reflexes.
it will neded to have a basic will to do something that organises information like this. from that ai can get a life meaning, will to life, and many other logic emotions and feelings, yes, just from will to organise data.
youtube
AI Moral Status
2023-08-22T08:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugwjm3rOMbrXPQ1P6uZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtSpbWda7vO3kSq054AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNUUNXY9qaf1WI-S14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-M1ClfV8QrBmX3Bh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxkEqlMmKsnFUHpx4t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwDVpWWBJeFTeqVO6t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDRMShqHIwwht3h9B4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxRzmaR8uN7KAE1Bup4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx9UQZwB3WfXpLu-cB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_pIEJDXqtkQpcLDV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]
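The lookup-by-comment-ID view above can be sketched programmatically. This is a minimal Python sketch, not the tool's actual implementation: it parses the raw response as a JSON array of coded records and indexes them by `id`. The two records embedded in the string are copied verbatim from the raw response above; the function name `coding_table` is illustrative only.

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment.
# The two records below are copied from the raw response shown above.
raw = '''[
  {"id": "ytc_UgxkEqlMmKsnFUHpx4t4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwDRMShqHIwwht3h9B4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "liability", "emotion": "approval"}
]'''

records = json.loads(raw)

# Index by comment ID, mirroring the "Look up by comment ID" feature.
by_id = {r["id"]: r for r in records}

def coding_table(comment_id):
    """Return the four coded dimensions for a comment, or None if unknown."""
    r = by_id.get(comment_id)
    if r is None:
        return None
    return {k: r[k] for k in ("responsibility", "reasoning", "policy", "emotion")}

print(coding_table("ytc_UgxkEqlMmKsnFUHpx4t4AaABAg"))
```

Indexing into a dict up front keeps each lookup O(1), which matters if the coded corpus is large.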