After coming across an article stating that pilot error caused a deadly military plane crash in Georgia that killed two people, I realized that many crashes come down to pilot error. I have seen quite a few crashes that involved pilots making the wrong decision: trusting the aircraft too much, mishandling a mechanical failure that could have been handled safely, or miscommunicating.
Some further research supported this idea.
Personally, I think that when our technology is not advanced enough to handle a situation completely autonomously, we should focus on building a bond between the machine and the human. Given where our technology stands right now, we humans have the final say, so we need to be familiar with what the machine can do and what it cannot.
Take the Tesla Autopilot system as an example. It is an "advanced driver-assistance" system: it is meant to make certain tasks easier for the driver, not to replace the driver. As a driver-assistance system it cannot drive entirely on its own, but it can handle some things, such as lane keeping and lane changing, by itself. So the driver, who is "in charge" of the system, needs to be familiar with it, its limitations, and where it excels. When the human does not understand the system, safety becomes a huge issue. Suppose, for example, a driver assumes the car can stop at red lights and stop signs when the Autopilot system actually cannot; the car runs a red light and causes a major crash. When using advanced assistance systems, people need to know when to let the machine do the work and when to take over. That knowledge slowly builds a bond between the user and the machine: the user hands the machine the tasks it can do and takes over when it no longer can. This helps us prevent misunderstandings and accidents with a machine while our technology still requires human supervision and input.
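The idea of handing the machine only the tasks it can handle, and taking over otherwise, can be sketched as a tiny decision rule. This is a hypothetical illustration of the principle, not Tesla's actual API or behavior; the capability set and task names are assumptions made up for the example:

```python
# Tasks the assistance system is assumed (for this sketch) to handle well.
ASSIST_CAPABILITIES = {"lane_keeping", "lane_changing", "adaptive_cruise"}

def who_should_act(task: str) -> str:
    """Return 'machine' when the task is within the system's known
    capabilities, otherwise 'human' so the driver takes over."""
    return "machine" if task in ASSIST_CAPABILITIES else "human"

# A driver who knows the system's limits delegates only suitable tasks:
print(who_should_act("lane_keeping"))       # -> machine
print(who_should_act("stop_at_red_light"))  # -> human (not a known capability)
```

The point of the sketch is that safety comes from the human knowing the capability set, not from the machine alone: anything outside that set defaults to the human.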
Coming back to the military airplane crash: if the pilots had known what the aircraft could and could not do and had followed the protocol for that situation (essentially a "what does the machine do and what does it not do" manual, a set of instructions, well known to the humans, covering what the machine can handle), the crash could have been prevented.