Deep-learning neural-network architectures and methods: Using component-based models in building-design energy prediction.

Advanced Engineering Informatics (2018)

Citations: 161 | Views: 75
Abstract
Increasing sustainability requirements make evaluating different design options to identify energy-efficient designs ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) can effectively mimic Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) extends the capabilities of the monolithic ML model. Building on the monolithic ML approach, the paper presents deep-learning architectures and component development methods and evaluates their suitability for design space exploration in building design. Results indicate that deep learning improves model performance over simple artificial neural network models. Methods such as transfer learning and Multi-Task Learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases shows that its cooling energy prediction (R²: 0.983) is similar to BPS, while errors for heating energy prediction (R²: 0.848) are higher than BPS. The higher heating energy prediction error can be resolved by collecting heating data with better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed: BPS required 1145 s to simulate the 201 design cases, whereas the deep-learning model produces similar results in 0.9 s. This high computation speed makes deep-learning models suitable for design space exploration.
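To illustrate the multi-task learning idea mentioned in the abstract, the sketch below shows a shared-trunk regression network with separate heating-energy and cooling-energy heads. This is a minimal, hypothetical example in PyTorch; the feature count, layer sizes, class name MultiTaskEnergyNet, and synthetic training data are assumptions for illustration only, not the architecture or dataset reported in the paper.

```python
# Hypothetical sketch of multi-task learning for building energy prediction.
# All sizes, names, and data below are illustrative assumptions, not the
# authors' reported model.
import torch
import torch.nn as nn

class MultiTaskEnergyNet(nn.Module):
    def __init__(self, n_features: int = 12, hidden: int = 64):
        super().__init__()
        # Shared trunk: a representation learned jointly for both tasks
        self.trunk = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Task-specific heads: one scalar output each
        self.heating_head = nn.Linear(hidden, 1)
        self.cooling_head = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor):
        z = self.trunk(x)
        return self.heating_head(z), self.cooling_head(z)

# Toy training loop on random data, only to show the joint objective.
model = MultiTaskEnergyNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(256, 12)       # 256 synthetic design cases (placeholder features)
y_heat = torch.randn(256, 1)   # placeholder heating-energy targets
y_cool = torch.randn(256, 1)   # placeholder cooling-energy targets

for _ in range(100):
    optimizer.zero_grad()
    pred_heat, pred_cool = model(x)
    # Multi-task objective: sum of per-task regression losses
    loss = loss_fn(pred_heat, y_heat) + loss_fn(pred_cool, y_cool)
    loss.backward()
    optimizer.step()
```

Sharing the trunk lets the two energy targets regularize each other, which is the efficiency argument behind multi-task component development; in the same spirit, transfer learning would reuse a trained trunk for a new component and fine-tune only the task heads.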
Keywords
Performance gap, Sustainability, Building performance simulation, Transfer learning, Multi-task learning, LSTM