Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `rdc_n3l4o41` — "My thoughts exactly. I once watched similar demo but guess what - the JIRA ticke…"
- `ytc_UgxafSWCh…` — "Good to see that not all AI researchers are antisocial nerds We need this kind …"
- `ytc_UgxVfqque…` — "Enterprise pilots failing isn't surprising when success metrics are vague. Axale…"
- `ytc_UgzKpaOr4…` — "saw an ad for AI powered smart dumbbells... it's literally a hunk of metal, wha…"
- `ytc_UgwjadsvO…` — "It's important to learn understand. When 3D animation came in you still need to …"
- `ytc_UgwOtBYRq…` — "We also feed AI other things. Podcasts of people talking in an authoritative man…"
- `ytc_UgwjwdP9P…` — "The only way to install guard rails is to give AI consequences and punishments. …"
- `rdc_kcpfu6g` — ">If there's no clause for using the codes outside the company for board membe…"
Comment
Dan Brown's 2017 novel, Origin, fictionalizes the invention of a 2-story "supercomputer" which can access all historical data and interpret it through an agent named "Winston".
Today's quantum computers with control and cooling systems can be as large as 9 cubic meters (7x7x7 feet). Our AI also have names; e.g., Siri, Alexa, Cortana, Bixby, etc.
The goal of Winston was to determine "Where did we come from; why are we here and where are we going".
What are the goals of today's AI?
Winston also had an automatic erase feature which was triggered by the release of its goals and the death of its inventor.
Have any auto erase features been installed in today's AI? If not, why not?
Source: youtube · AI Jobs · 2025-12-03T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxLmoPls3jkOTQFnNt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxyC05p0lfCwYbyj4B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyVXKDdiKa7ojp86eR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwqc1KqUHkrUAY4e114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzvodrN-5KPufQElWF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVQYmgIy6CR_9mKWp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz9X8D_0vu0-SLT9oZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx-95kSpGUUkG6ItdZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyfUgT1C8GHo0jUSVh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6SRm4mc5ANsX4tkx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
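A minimal sketch of how a downstream script might parse and sanity-check a raw LLM response like the one above before accepting its codes. The allowed value sets here are inferred only from the sample rows shown, not from a confirmed codebook, and `validate_codes` is a hypothetical helper, not part of the tool itself.

```python
import json

# Assumed codebook, reconstructed from the values visible in the sample
# response above; the real codebook may contain additional labels.
ALLOWED = {
    "responsibility": {"unclear", "none", "ai_itself", "government",
                       "user", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "contractualist",
                  "deontological"},
    "policy": {"unclear", "none", "regulate", "industry_self", "ban"},
    "emotion": {"unclear", "mixed", "indifference", "approval",
                "resignation", "fear", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    keep only rows whose values all fall inside the assumed codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every coded dimension must be present and in its allowed set.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows with an out-of-codebook value (a common LLM failure mode) are dropped rather than silently stored, so they can be re-queued for recoding.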