Implicit computational complexity and compilers
Research output: Book/Report › Ph.D. thesis › Research
Standard
Implicit computational complexity and compilers. / Rubiano, Thomas.
Department of Computer Science, Faculty of Science, University of Copenhagen, 2017. 98 p.
RIS
TY - BOOK
T1 - Implicit computational complexity and compilers
AU - Rubiano, Thomas
PY - 2017
Y1 - 2017
AB - Complexity theory helps us predict and control resources, usually time and space, consumed by programs. Static analysis on specific syntactic criteria allows us to categorize some programs. A common approach is to observe the behavior of a program's data. For instance, the detection of non-size-increasing programs is based on a simple principle: counting memory allocations and deallocations, particularly in loops. This way, we can detect programs which compute within a constant amount of space. This method can easily be expressed as a property on control flow graphs. Because analyses of data behavior are syntactic, they can be done at compile time. Because they are only static, these analyses are not always computable, or not easily computable, and approximations are needed. The “Size-Change Principle” of C. S. Lee, N. D. Jones and A. M. Ben-Amram presented a method to predict termination by observing the evolution of resources, and a great deal of research has grown out of this theory. Until now, these implicit complexity theories were essentially applied to more or less toy languages. This thesis applies implicit computational complexity methods to “real life” programs by manipulating intermediate representation languages in compilers. This gives an accurate idea of the actual expressivity of these analyses and shows that the implicit computational complexity and compiler communities can fuel each other fruitfully. As we show in this thesis, the methods developed are quite general and open the way to several new applications.
UR - https://soeg.kb.dk/permalink/45KBDK_KGL/1f0go08/cdi_abes_theses_2017USPCD076
M3 - Ph.D. thesis
BT - Implicit computational complexity and compilers
PB - Department of Computer Science, Faculty of Science, University of Copenhagen
ER -
ID: 191905210