Computer Engineering and Computer Science
JB Speed School of Engineering
Artificial Intelligence; AI Safety; Normal Accident Theory; AI Alignment
As AI technologies increase in capability and ubiquity, AI accidents are becoming more common. Drawing on normal accident theory, high reliability theory, and open systems theory, we create a framework for understanding the risks associated with AI applications. In addition, we use AI safety principles to quantify the unique risks posed by increased intelligence and human-like qualities in AI. Together, these two fields give a more complete picture of the risks of contemporary AI. By focusing on system properties near accidents rather than seeking a single root cause, we identify where attention should be paid to safety in current-generation AI systems.
Williams, Robert Max C, "Understanding and Avoiding AI Failures: A Practical Guide." (2021). Electronic Theses and Dissertations. Paper 3442.
Retrieved from https://ir.library.louisville.edu/etd/3442