Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I worked hauling salt water to the disposal in the oilfield and also hauling flowback to disposal. Is an automated truck going to know which tank to pull, go up the catwalk to gauge the tank, hook up the hoses, turn on the pump and know how many barrels to pull? Will it be able to find a site that was recently occupied, be able to navigate around a rig and know which frack tank to pull from again being able to hook up its own hoses? If a strap is coming loose on a flatbed load inroute will it be able to see that and stop to tighten the strap? How will it check load securements inroute? Will it know if a load has shifted and if so what will it do about it? Then when something does go wrong who are the companies going to blame? Surely you don't expect these companies to assume any liability...

Source: youtube · Posted: 2019-12-09T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy3J3hhHDIlaCrjU7V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwc3QG2UimwS27c1Qt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwM_vXP0P0LinjB9wJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy2-7vc8pXkDSYs4vN4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxWIIrTxeqVijWjBEJ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzw6iM83Xk7zsQRfnN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyaXW87GTJj4F8wO7t4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzsX40jrjqsbfZdLex4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgycewaE1qSqm-L0PBt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwNC5jXpy3Uzm-J4X54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
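The coded-comment lookup shown above amounts to parsing the model's batch response and indexing the records by comment ID. A minimal sketch of that step, assuming the response is a JSON array of per-comment codes exactly as printed (the function name `index_codes` is illustrative, not part of the actual tool):

```python
import json

# Two records excerpted verbatim from the raw LLM response above;
# the real batch contains one record per coded comment.
raw_response = """[
  {"id": "ytc_Ugy3J3hhHDIlaCrjU7V4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzw6iM83Xk7zsQRfnN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

def index_codes(raw: str) -> dict:
    """Parse a batch coding response and index each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
# Looking up a comment ID returns its coded dimensions.
print(codes["ytc_Ugzw6iM83Xk7zsQRfnN4AaABAg"]["emotion"])  # indifference
```

Keying the index on the model-echoed `id` field is what lets the inspector match each code back to the exact source comment, even when the model returns the batch in a different order.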