dc.contributor.advisor | Jacob, T Matthew | |
dc.contributor.author | Joseph, P J | |
dc.date.accessioned | 2009-06-24T06:21:00Z | |
dc.date.accessioned | 2018-07-31T04:39:35Z | |
dc.date.available | 2009-06-24T06:21:00Z | |
dc.date.available | 2018-07-31T04:39:35Z | |
dc.date.issued | 2009-06-24T06:21:00Z | |
dc.date.submitted | 2006 | |
dc.identifier.uri | https://etd.iisc.ac.in/handle/2005/537 | |
dc.description.abstract | Processor architectures are becoming increasingly complex, and architects therefore have to evaluate a large design space consisting of several parameters, each with a number of potential settings. To help guide design decisions, we develop simple and accurate models of the superscalar processor design space using a detailed, validated superscalar processor simulator.
First, we obtain precise estimates of the effects of all significant micro-architectural parameters and their interactions by building linear regression models from simulation-based experiments. We obtain good approximate models at low simulation cost through an iterative process in which Akaike's Information Criterion is used to extract a good linear model from a small set of simulations, and limited further simulation is guided by the model using D-optimal experimental designs. The iteration is repeated until the desired error bounds are achieved. We use this procedure for model construction and show that it provides a cost-effective scheme for experimenting with all relevant parameters (an illustrative sketch of this scheme follows the abstract).
We also obtain accurate predictors of the processor's performance response across the entire design space by constructing radial basis function networks from sampled simulation experiments. We construct these models by simulating a limited set of design points selected by Latin hypercube sampling and then deriving the radial basis function networks from the results. We show that these predictors provide accurate approximations to the simulator's performance response and hence offer a cheap alternative to simulation when searching for optimal processor design points (a second sketch below illustrates this approach). | en
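The following is a minimal illustrative sketch, not the thesis implementation (the subject keywords suggest the original work used MATLAB): one round of the iterative linear-modelling scheme described in the abstract, written in Python with NumPy. The simulate() function, the two coded parameters, and the candidate pool are hypothetical stand-ins; candidate linear models are scored with Akaike's Information Criterion, and the next simulation point is then chosen D-optimally by greedily maximizing det(X'X).

```python
# Illustrative sketch only (not the thesis code): one iteration of
# "fit linear model, select terms by AIC, add a D-optimal simulation point".
# simulate() is a hypothetical stand-in for the detailed processor simulator.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def simulate(x):
    """Hypothetical performance response over two coded parameters in [-1, 1]."""
    return 2.0 + 1.5 * x[0] - 0.8 * x[1] + 0.5 * x[0] * x[1] + rng.normal(0, 0.1)

def design_matrix(points, terms):
    """Intercept column plus the requested main effects / interactions."""
    cols = [np.ones(len(points))] + [np.prod(points[:, t], axis=1) for t in terms]
    return np.column_stack(cols)

def aic(y, X):
    """AIC of an ordinary least-squares fit: n*ln(RSS/n) + 2k."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    n, k = X.shape
    return n * np.log(rss / n) + 2 * k

# Small initial set of simulated design points.
points = rng.uniform(-1, 1, size=(8, 2))
y = np.array([simulate(p) for p in points])

# Candidate models: every non-empty subset of {x0, x1, x0*x1}; keep the lowest AIC.
all_terms = [(0,), (1,), (0, 1)]
candidates = [list(c) for r in (1, 2, 3) for c in itertools.combinations(all_terms, r)]
best = min(candidates, key=lambda t: aic(y, design_matrix(points, t)))
print("terms selected by AIC:", best)

# D-optimal augmentation: from a random pool, greedily pick the point that
# maximizes det(X'X) for the selected model, then run the simulator there.
pool = rng.uniform(-1, 1, size=(200, 2))

def d_criterion(p):
    X = design_matrix(np.vstack([points, p[None, :]]), best)
    return np.linalg.det(X.T @ X)

new_point = pool[int(np.argmax([d_criterion(p) for p in pool]))]
points = np.vstack([points, new_point])
y = np.append(y, simulate(new_point))
print("next simulation point chosen D-optimally:", new_point)
```

In the thesis this loop would continue, alternating AIC-based model selection and D-optimally guided simulation, until the desired error bounds are reached.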
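A second minimal sketch, again illustrative rather than the thesis code: fitting a Gaussian-kernel radial basis function surrogate to simulator responses sampled at Latin hypercube design points, then using it as a cheap predictor across the design space. The simulate() response, the kernel width, and the point counts are hypothetical stand-ins.

```python
# Illustrative sketch only (not the thesis code): a radial basis function
# surrogate of simulator performance, trained on Latin-hypercube-sampled points.
# simulate() is a hypothetical stand-in for the detailed processor simulator.
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n, d):
    """n points in [0, 1]^d with exactly one point per stratum in each dimension."""
    strata = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (strata + rng.uniform(size=(n, d))) / n

def simulate(x):
    """Hypothetical smooth performance response (e.g. IPC) over two parameters."""
    return np.sin(3.0 * x[..., 0]) + 0.5 * x[..., 1] ** 2

def rbf_fit(X, y, sigma=0.3):
    """Gaussian-kernel RBF network with one centre per training point."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Phi = np.exp(-d2 / (2.0 * sigma ** 2))
    w = np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)  # small ridge for stability
    return X, w, sigma

def rbf_predict(model, Xq):
    centres, w, sigma = model
    d2 = np.sum((Xq[:, None, :] - centres[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)) @ w

# Simulate a limited number of sampled design points and fit the surrogate.
X_train = latin_hypercube(40, 2)
model = rbf_fit(X_train, simulate(X_train))

# The cheap surrogate can now replace the simulator when scanning the design space.
X_query = rng.uniform(size=(500, 2))
error = np.abs(rbf_predict(model, X_query) - simulate(X_query))
print(f"mean absolute surrogate error: {error.mean():.4f}")
```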
dc.language.iso | en_US | en |
dc.relation.ispartofseries | G20338 | en |
dc.subject | Supercomputers | en |
dc.subject | Supercomputers - Statistical Methods | en |
dc.subject | MATLAB | en |
dc.subject | Linear Regression Models | en |
dc.subject | Superscalar Processor Architecture | en |
dc.subject | Superscalar Processors - Linear Models | en |
dc.subject | Radial Basis Function Networks | en |
dc.subject | Linear Models | en |
dc.subject | RBF Networks | en |
dc.subject | Processor Performance Analysis | en |
dc.subject | Predictive Performance Model | en |
dc.subject | Predictive Modeling | en |
dc.subject.classification | Computer Science | en |
dc.title | Superscalar Processor Models Using Statistical Learning | en |
dc.type | Thesis | en |
dc.degree.name | PhD | en |
dc.degree.level | Doctoral | en |
dc.degree.discipline | Faculty of Engineering | en |