Search results for: Maik Moser
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16

16 The German Air Passenger Tax: An Empirical Analysis of Tourism Outflows

Authors: Paul Gurr, Maik Moser

Abstract:

In Europe, some countries have recently abolished air passenger taxes (APT), while others have introduced or are considering introducing one. From a fiscal perspective, an APT can benefit the environment while generating substantial tax revenue at relatively low administration cost. However, it may have significant negative effects on the economy. Focusing on the German air passenger tax introduced in 2011, this work estimates the elasticity of tourism outflows using data on passenger departures from German airports between 2010 and 2016, aggregated by destination country. The results are obtained by estimating a model of the demand for outbound tourism. In line with theory, the regression results indicate a negative relationship between taxes and departures from Germany. Furthermore, on average, an increase in the air passenger tax rate results in a proportionally larger decrease in passenger departures. The elasticity of tourism outflows can be used to estimate changes in tax revenue and hence to evaluate possible policy actions. Setting aside environmental considerations, the results suggest that tax revenue might be maximized by reducing the air passenger tax rate. Beyond Germany, this work is also relevant for other countries that levy or are considering an APT.
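
The exact regression specification is not given in the abstract; the following is a minimal, hypothetical sketch of the kind of log-log panel model it describes (departures regressed on the APT rate with destination-country and year fixed effects). The column names and the input file are assumptions.

```python
# Hypothetical sketch of the outbound-tourism demand model described above.
# Data layout (assumed): one row per destination country and year.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("departures_by_destination.csv")  # country, year, departures, tax_rate

model = smf.ols(
    "np.log(departures) ~ np.log1p(tax_rate) + C(country) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})

# The coefficient on the tax term is read as the tax elasticity of tourism
# outflows; a negative estimate corresponds to the relationship reported above.
print(model.params["np.log1p(tax_rate)"])
```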

Keywords: air passenger tax, Germany, outbound tourism, panel data

Procedia PDF Downloads 269
15 An Improved Lower Bound for Minimal-Area Convex Cover for Closed Unit Curves

Authors: S. Som-Am, B. Grechuk

Abstract:

Moser’s worm problem is an unsolved problem in geometry that asks for the minimal area of a convex region in the plane that can cover every curve of unit length, assuming that curves may be rotated and translated to fit inside the region. We study a version of this problem that asks for a minimal convex cover for closed unit curves. By combining geometric methods with a numerical box-search algorithm, we show that any such cover must have an area of at least 0.0975. This improves the best previous lower bound of 0.096694. In fact, we show that the minimal area of the convex hull of a circle, an equilateral triangle, and a rectangle of perimeter 1 is between 0.0975 and 0.09763.
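
In symbols, the reported estimate can be restated as a two-sided bound on the area of the smallest convex hull containing congruent copies of the three curves. This is only a rewriting of the numbers quoted above, with A denoting area and the minimum taken over rigid motions of the triangle and rectangle relative to the circle:

```latex
% C, T, R: circle, equilateral triangle and rectangle of perimeter 1;
% \tau, \rho range over rigid motions (rotations and translations).
\[
  0.0975 \;\le\; \min_{\tau,\rho}
  A\bigl(\operatorname{conv}(C \cup \tau T \cup \rho R)\bigr)
  \;\le\; 0.09763,
\]
% so any convex cover of all closed unit curves has area at least 0.0975.
```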

Keywords: Moser’s worm problem, closed arcs, convex cover, minimal-area cover

Procedia PDF Downloads 179
14 Comparative Studies of Moser’s Light and Conventional Lights

Authors: Carlos Tadeu Santana Tatum, Suzana Leitão Russo

Abstract:

This paper aims to present comparative studies of different types of innovation applied to lighting, together with a theoretical review carried out by means of a bibliographic method. We demonstrate that it is possible to understand the impacts of industries that rely on conventional innovation, which uses natural resources to manufacture lights, and of the opposite case, in which a frugal innovation solves the problems of a society at the bottom of the pyramid by helping people without access to electricity to obtain home lighting. The frugal innovation is simply the use of recycled PET bottles. We achieved the objective of the study by gathering data from the fields of environment, electrical engineering, international regulations, and innovation. With all these variables, the work can be characterized as an interdisciplinary study.

Keywords: frugal, innovation, PET bottle, light

Procedia PDF Downloads 251
13 Response Solutions of 2-Dimensional Elliptic Degenerate Quasi-Periodic Systems with Small Parameters

Authors: Song Ni, Junxiang Xu

Abstract:

This paper concerns quasi-periodic perturbations, depending on parameters, of 2-dimensional degenerate systems in which the equilibrium point of the unperturbed system is degenerate of elliptic type. Assume that the perturbation is real analytic and quasi-periodic with Diophantine frequency. Without imposing any further assumption on the perturbation, we use a path of equilibrium points to deal with the Melnikov non-resonance condition; then, by the Leray-Schauder continuation theorem and the Kolmogorov-Arnold-Moser technique, it is proved that the equation has a small response solution for many sufficiently small parameters.
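
The abstract does not state the equations explicitly; a generic form of the setting it describes (a small quasi-periodic perturbation of a degenerate elliptic equilibrium, with a Diophantine forcing frequency) might be written as follows, where the symbols are illustrative rather than taken from the paper:

```latex
% x \in \mathbb{R}^2, \xi a small parameter, \omega \in \mathbb{R}^d the
% forcing frequency, f real analytic and quasi-periodic in t.
\[
  \dot{x} = A(\xi)\,x + f(\omega t, x, \xi), \qquad
  |\langle k, \omega \rangle| \;\ge\; \frac{\gamma}{|k|^{\tau}}
  \quad \text{for all } k \in \mathbb{Z}^{d}\setminus\{0\}.
\]
% A response solution is a quasi-periodic solution x(t) = u(\omega t)
% sharing the frequency vector \omega of the forcing.
```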

Keywords: quasi-periodic systems, KAM-iteration, degenerate equilibrium point, response solution

Procedia PDF Downloads 51
12 Environmental Parameters Influence on Chronic Obstructive Pulmonary Disease (COPD) Patients’ Quality of Life

Authors: Kwok W. Mui, Ling T. Wong, Nai K. K. Fong

Abstract:

Chronic obstructive pulmonary disease (COPD) is the fifth leading cause of death in Hong Kong. Investigators are eager to explore environmental risk factors for COPD such as air pollution and occupational exposure. Through a cross-sectional survey, this study investigates the impact of air quality on the quality of life of patients with COPD, in terms of the scores of the (Chinese) chronic respiratory questionnaire (CCRQ), measurements of indoor air quality (IAQ), and Moser’s activities of daily living (ADL). Strong relationships were found between a number of indoor/outdoor environmental parameters and CRQ sub-scores for patients with COPD; indoor air pollutants should therefore be monitored in future studies on quality of life for patients with COPD.

Keywords: chronic obstructive pulmonary disease (COPD), indoor air pollutants, quality of life, chronic respiratory questionnaire (CRQ)

Procedia PDF Downloads 393
11 Developing a Multiagent-Based Decision Support System for Real-Time Multi-Risk Disaster Management

Authors: D. Moser, D. Pinto, A. Cipriano

Abstract:

A Disaster Management System (DMS) is very important for countries exposed to different disasters. Around the world, disasters such as earthquakes, tsunamis, volcanic eruptions, fires, and other natural or man-made events occur and affect the population. It is also possible that two or more disasters arise at the same time, which means that multi-risk situations must be handled. For such situations, a Decision Support System (DSS) based on multiple agents is a suitable architecture. Most known DMSs deal with a single disaster (or, in the case of an earthquake-tsunami combination, with two) and often with one particular disaster type. A DSS, however, helps to achieve a better real-time response. The proposal of our work is to analyze the existing systems in the literature and to extend them to multi-risk disasters in order to construct a well-organized system. The work shown here is an approach to a multi-risk system, which requires an architecture and well-defined aims. At this stage, our study is a case study that analyzes the path we have to follow to create the proposed system in the future.

Keywords: decision support system, disaster management system, multi-risk, multiagent system

Procedia PDF Downloads 388
10 Growing Architecture: Technical Product Harvesting of Near Net Shape Building Components

Authors: Franziska Moser, Martin Trautz, Anna-Lena Beger, Manuel Löwer, Jörg Feldhusen, Jürgen Prell, Alexandra Wormit, Björn Usadel, Christoph Kämpfer, Thomas-Benjamin Seiler, Henner Hollert

Abstract:

The demand for bio-based materials and components in architecture has increased in recent years due to society’s heightened environmental awareness. Nowadays, most components are developed via a substitution approach, which aims at replacing conventional components with natural alternatives that are then processed, shaped, and manufactured to fit the desired application. This contribution introduces a novel approach to the development of bio-based products that decreases resource consumption and increases recyclability. In this approach, natural organisms such as plants or trees are not used in processed form but are grown into a near net shape before being harvested and utilized as building components. By minimizing the conventional production steps, the amount of resources used in manufacturing decreases, whereas recyclability increases. This paper presents the approach of technical product harvesting, explains the theoretical basis as well as the matching of product requirements and biological properties, and shows first results of the growth manipulation studies.

Keywords: design with nature, eco manufacturing, sustainable construction materials, technical product harvesting

Procedia PDF Downloads 470
9 Modelling the Effect of Biomass Appropriation for Human Use on Global Biodiversity

Authors: Karina Reiter, Stefan Dullinger, Christoph Plutzar, Dietmar Moser

Abstract:

Due to population growth and changing patterns of production and consumption, the demand for natural resources and, as a result, the pressure on Earth’s ecosystems are growing. Biodiversity mapping can be a useful tool for assessing species endangerment or detecting hotspots of extinction risks. This paper explores the benefits of using the change in trophic energy flows as a consequence of the human alteration of the biosphere in biodiversity mapping. To this end, multiple linear regression models were developed to explain species richness in areas where there is no human influence (i.e. wilderness) for three taxonomic groups (birds, mammals, amphibians). The models were then applied to predict (I) potential global species richness using potential natural vegetation (NPPpot) and (II) global ‘actual’ species richness after biomass appropriation using NPP remaining in ecosystems after harvest (NPPeco). By calculating the difference between predicted potential and predicted actual species numbers, maps of estimated species richness loss were generated. Results show that biomass appropriation for human use can indeed be linked to biodiversity loss. Areas for which the models predicted high species loss coincide with areas where species endangerment and extinctions are recorded to be particularly high by the International Union for Conservation of Nature and Natural Resources (IUCN). Furthermore, the analysis revealed that while the species distribution maps of the IUCN Red List of Threatened Species used for this research can determine hotspots of biodiversity loss in large parts of the world, the classification system for threatened and extinct species needs to be revised to better reflect local risks of extinction.
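
The abstract's mapping step can be illustrated with a minimal, hypothetical sketch: a regression fitted on wilderness cells only is applied twice, once with NPPpot and once with NPPeco, and the difference is taken as the estimated richness loss. File and column names are assumptions, not the study's actual data.

```python
# Hypothetical sketch of the species-richness loss mapping described above.
import pandas as pd
from sklearn.linear_model import LinearRegression

grid = pd.read_csv("global_grid.csv")           # one row per grid cell (assumed)
wild = grid[grid["is_wilderness"]]              # cells without human influence

features = ["npp", "temperature", "precipitation"]       # assumed predictors
model = LinearRegression().fit(wild[features], wild["bird_richness"])

# Predict potential richness (NPPpot) and 'actual' richness (NPPeco).
potential = model.predict(grid[features].assign(npp=grid["npp_pot"]))
actual = model.predict(grid[features].assign(npp=grid["npp_eco"]))

grid["estimated_richness_loss"] = potential - actual     # basis for the loss map
```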

Keywords: biodiversity loss, biomass harvest, human appropriation of net primary production, species richness

Procedia PDF Downloads 98
8 Fires in Historic Buildings: Assessment of Evacuation of People by Computational Simulation

Authors: Ivana R. Moser, Joao C. Souza

Abstract:

Building fires are random phenomena that can be extremely violent, and the safe evacuation of people is the most reliable tactic for saving lives. The correct evacuation of buildings, and of other spaces occupied by people, means leaving the place in a short time and by the appropriate route. It depends on the perception of the spaces by the individual, the architectural layout, and the presence of appropriate routing systems. As historic buildings were constructed in other times, when the current safety requirements generally did not yet exist, it is necessary to adapt these spaces to make them safe. Computer models for evacuation simulation are widely used tools for assessing the safety of people in a building or place of assembly; combined with the analysis of human behaviour, they make the results of emergency evacuation studies more accurate and conclusive. The objective of this research is the performance evaluation of buildings of historical interest, regarding the safe evacuation of people, through computer simulation using the PTV Viswalk software. The building studied is the Colégio Catarinense, a centennial building located in the city of Florianópolis, Santa Catarina, Brazil. The software models variables of human behaviour, such as avoiding collisions with other pedestrians and avoiding obstacles. Scenarios were run on the three-dimensional models, and their contribution to safety in risk situations was verified as an alternative measure, especially where it is impossible to apply the measures foreseen by the current fire safety codes in Brazil. The simulations verified the evacuation times in normal and emergency situations and indicated the bottlenecks and critical points of the studied building, in order to seek solutions to prevent and correct these undesirable events. It is understood that adopting an advanced, computational, performance-based approach promotes greater knowledge of the building and of how people behave in these specific environments in emergency situations.

Keywords: computer simulation, escape routes, fire safety, historic buildings, human behavior

Procedia PDF Downloads 160
7 Free-Standing Pd-Based Metallic Glass Membranes for MEMS Applications

Authors: Wei-Shan Wang, Klaus Vogel, Felix Gabler, Maik Wiemer, Thomas Gessner

Abstract:

Metallic glasses, which are free of grain boundaries, have superior properties, including large elastic limits, high strength, and excellent wear and corrosion resistance. Therefore, bulk metallic glasses (BMG) and thin film metallic glasses (TFMG) have been widely developed and investigated. Among the various kinds of metallic glasses, Pd-Cu-Si TFMG, which has a lower elastic modulus and better resistance to oxidation and corrosion compared to Zr- and Fe-based TFMGs, is a promising candidate for MEMS applications. However, studies of Pd-TFMG membranes are still limited. This paper presents free-standing Pd-based metallic glass membranes with large area fabricated at wafer level for the first time. Properties of Pd-Cu-Si thin film metallic glass (TFMG) deposited with various parameters are investigated first. When deposited at 25°C, compressive stress occurs in the Pd76Cu6Si18 thin film regardless of Ar pressure. When the substrate temperature is increased to 275°C, the stress state changes from compressive to tensile. Thin film stresses decrease slightly when the Ar pressure is higher. To show the influence of temperature on Pd-TFMGs, thin films without and with post annealing below (275°C) and within (370°C) the supercooled liquid region are investigated. Results of XRD and TEM analysis indicate that the Pd-TFMGs retain an amorphous structure with well-controlled parameters. After verification of the amorphous structure of the Pd-TFMGs, free-standing Pd-Cu-Si membranes were fabricated by depositing Pd-Cu-Si thin films directly on 200 nm-thick silicon nitride membranes, followed by post annealing and dry etching of the silicon nitride layer. Post annealing before SiNx removal is used to further release the internal stress of the Pd-TFMGs. The edge length of the square membranes ranges from 5 to 8 mm. The effect of post annealing on the Pd-Cu-Si membranes is discussed as well. With annealing at 370°C for 5 min, the Pd-MG membranes are fully distortion-free after removal of the SiNx layer. The results show that, by introducing the annealing process, stress-relieved, distortion-free Pd-TFMG membranes with large area become a promising candidate for sensing applications such as pressure and gas sensors.

Keywords: amorphous alloy, annealing, metallic glasses, TFMG membrane

Procedia PDF Downloads 329
6 Optimal Pricing Based on Real Estate Demand Data

Authors: Vanessa Kummer, Maik Meusel

Abstract:

Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information: for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data is used. Usually, however, the proportion of complete data is rather small, which leads to most of the information being neglected. Also, the complete data might be strongly distorted. In addition, the reason that data is missing might itself contain information, which is ignored by that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with the imputed missing values compared to using the usually small percentage of complete data (baseline). It is also interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, or neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all the data in the model, the distortion of the first training set (the complete data) vanishes. In a next step, the performances of the algorithms are measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data. After the optimal parameter set for each algorithm has been found, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimate. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
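
As an illustration of the two steps described above (imputing the incomplete search subscriptions and deriving a survival-type demand curve from the imputed price limits), the following hypothetical sketch uses a k-nearest-neighbour imputer, one of several possible unsupervised approaches; the column names and input file are assumptions.

```python
# Hypothetical sketch: impute missing subscription fields, then compute the
# share of searchers willing to pay at least each price (a survival function).
import numpy as np
import pandas as pd
from sklearn.impute import KNNImputer

subs = pd.read_csv("search_subscriptions.csv")          # rooms, max_price, region_id
cols = ["rooms", "max_price", "region_id"]

imputed = pd.DataFrame(
    KNNImputer(n_neighbors=10).fit_transform(subs[cols]), columns=cols
)

def survival(prices: pd.Series, grid: np.ndarray) -> np.ndarray:
    """Share of searchers whose (imputed) price limit is at least each grid value."""
    return np.array([(prices >= p).mean() for p in grid])

price_grid = np.linspace(imputed["max_price"].min(), imputed["max_price"].max(), 50)
three_room = imputed[imputed["rooms"].round() == 3]      # one example segment
demand_curve = survival(three_room["max_price"], price_grid)
```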

Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning

Procedia PDF Downloads 258
5 Structural Invertibility and Optimal Sensor Node Placement for Error and Input Reconstruction in Dynamic Systems

Authors: Maik Kschischo, Dominik Kahl, Philipp Wendland, Andreas Weber

Abstract:

Understanding and modelling real-world complex dynamic systems in biology, engineering, and other fields is often made difficult by incomplete knowledge about the interactions between system states and by unknown disturbances to the system. In fact, most real-world dynamic networks are open systems receiving unknown inputs from their environment. To understand a system and to estimate the state dynamics, these inputs need to be reconstructed from output measurements. Reconstructing the input of a dynamic system from its measured outputs is an ill-posed problem if only a limited number of states is directly measurable. A first requirement for solving this problem is the invertibility of the input-output map. In our work, we exploit the fact that invertibility of a dynamic system is a structural property, which depends only on the network topology. Therefore, it is possible to check for invertibility using a structural invertibility algorithm which counts the number of node-disjoint paths linking inputs and outputs. The algorithm is efficient even for large networks with up to a million nodes. To understand the structural features influencing the invertibility of a complex dynamic network, we analyze synthetic and real networks using the structural invertibility algorithm. We find that invertibility largely depends on the degree distribution and that dense random networks are easier to invert than sparse inhomogeneous networks. We show that real networks are often very difficult to invert unless the sensor nodes are carefully chosen. To overcome this problem, we present a sensor node placement algorithm to achieve invertibility with a minimum set of measured states. This greedy algorithm is very fast and is also guaranteed to find an optimal sensor node set if one exists. Our results provide a practical approach to experimental design for open dynamic systems. Since invertibility is a necessary condition for unknown input observers and data assimilation filters to work, it can be used as a preprocessing step to check whether these input reconstruction algorithms can be successful. If not, we can suggest additional measurements providing sufficient information for input reconstruction. Invertibility is also important for systems design and model building. Dynamic models are always incomplete, and synthetic systems act in an environment where they receive inputs or even attack signals from their exterior. Being able to monitor these inputs is an important design requirement, which can be achieved by our algorithms for invertibility analysis and sensor node placement.
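
As an illustration of the structural check described above, the number of vertex-disjoint paths from the unknown-input nodes to the sensor nodes can be computed by node splitting and maximum flow. The sketch below is hypothetical (a toy graph, not one of the paper's networks) and only reproduces the path-counting idea, not the paper's full algorithm or its sensor placement heuristic.

```python
# Hypothetical sketch: count vertex-disjoint input-output paths via max-flow.
import networkx as nx

def disjoint_path_count(g: nx.DiGraph, inputs, outputs) -> int:
    h = nx.DiGraph()
    for v in g.nodes:                        # split v into v_in -> v_out, capacity 1
        h.add_edge((v, "in"), (v, "out"), capacity=1)
    for u, v in g.edges:                     # original edges, effectively unlimited
        h.add_edge((u, "out"), (v, "in"), capacity=len(g))
    for v in inputs:                         # super-source feeds every input node
        h.add_edge("S", (v, "in"), capacity=1)
    for v in outputs:                        # every sensor node drains to super-sink
        h.add_edge((v, "out"), "T", capacity=1)
    value, _ = nx.maximum_flow(h, "S", "T")
    return value

# Toy network with two unknown inputs and two candidate sensor nodes.
g = nx.DiGraph([(1, 3), (2, 3), (3, 4), (2, 5), (4, 6), (5, 6)])
print(disjoint_path_count(g, inputs=[1, 2], outputs=[4, 5]))  # -> 2 disjoint paths
```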

Keywords: data-driven dynamic systems, inversion of dynamic systems, observability, experimental design, sensor node placement

Procedia PDF Downloads 121
4 An Approach on Intelligent Tolerancing of Car Body Parts Based on Historical Measurement Data

Authors: Kai Warsoenke, Maik Mackiewicz

Abstract:

To achieve a high quality of assembled car body structures, tolerancing is used to ensure the geometric accuracy of the individual car body parts. There are two main techniques to determine the required tolerances. The first is tolerance analysis, which describes the influence of individually toleranced input values on a required target value. The second is tolerance synthesis, which determines the allocation of individual tolerances needed to achieve a target value. Both techniques are based on classical statistical methods, which assume certain probability distributions. To ensure competitiveness in both saturated and dynamic markets, production processes in vehicle manufacturing must be flexible and efficient. The dimensional specifications selected for the individual body components and the resulting assemblies have a major influence on the quality of the process, for example in the manufacture of forming tools as operating equipment or at the higher level of car body assembly. As part of metrological process monitoring, manufactured individual parts and assemblies are recorded, and the measurement results are stored in databases. They serve as information for the temporary adjustment of the production processes and are interpreted by experts in order to derive suitable adjustment measures. In the production of forming tools, this means that time-consuming and costly changes to the tool surface have to be made, while in the body shop, uncertainties that are difficult to control result in cost-intensive rework. The stored measurement results are not used to intelligently design tolerances in future processes or to support temporary decisions based on real-world geometric data. They offer potential to extend tolerancing methods through data analysis and machine learning models. The purpose of this paper is to examine real-world measurement data from individual car body components, as well as assemblies, in order to develop an approach for using the data in short-term actions and future projects. For this reason, the measurement data are first analyzed descriptively in order to characterize their behavior and to determine possible correlations. Subsequently, a database is created that is suitable for developing machine learning models. The objective is to create an intelligent way to determine the position and number of measurement points as well as the local tolerance range. For this purpose, a number of different model types are compared and evaluated. The models with the best results are used to optimize equally distributed measuring points on unknown car body part geometries and to assign tolerance ranges to them. This investigation is still in progress. However, there are areas of the car body parts which behave more sensitively than the overall part and which indicate that intelligent tolerancing is useful here in order to design and control preceding and succeeding processes more efficiently.
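
The model comparison mentioned above could, for instance, be set up as a cross-validated benchmark of several regressor types that predict a local tolerance range from measurement-point features. The sketch below is hypothetical; the feature set, target column, and file name are assumptions rather than the study's actual data.

```python
# Hypothetical sketch: compare regressors that predict a local tolerance range.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

data = pd.read_csv("historical_measurements.csv")
X = data[["x", "y", "z", "curvature", "material_thickness"]]   # assumed features
y = data["observed_deviation_range"]                           # assumed target

models = {
    "ridge": Ridge(alpha=1.0),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingRegressor(random_state=0),
}
for name, model in models.items():
    mae = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error").mean()
    print(name, mae)   # lower mean absolute error -> better tolerance prediction
```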

Keywords: automotive production, machine learning, process optimization, smart tolerancing

Procedia PDF Downloads 82
3 DEKA-1, a Dose-Finding Phase 1 Trial: Observing Safety and Biomarkers Using DK210 (EGFR) for Inoperable Locally Advanced and/or Metastatic EGFR+ Tumors with Progressive Disease Failing Systemic Therapy

Authors: Spira A., Marabelle A., Kientop D., Moser E., Mumm J.

Abstract:

Background: Both interleukin-2 (IL-2) and interleukin-10 (IL-10) have been extensively studied for their stimulatory function on T cells and their potential to obtain sustainable tumor control in RCC, melanoma, lung, and pancreatic cancer as monotherapy, as well as in combination with PD-1 blockers, radiation, and chemotherapy. While approved, IL-2 retains significant toxicity, preventing its widespread use. The significant efforts undertaken to uncouple IL-2 toxicity from its anti-tumor function have been unsuccessful, and the early-phase clinical safety observed with PEGylated IL-10 was not met in a blinded Phase 3 trial. Deka Biosciences has engineered a novel molecule coupling wild-type IL-2 to a high-affinity variant of Epstein-Barr virus (EBV) IL-10 via a scaffold (scFv) that binds to epidermal growth factor receptors (EGFR). This patented molecule, termed DK210 (EGFR), is retained at high levels within the tumor microenvironment for days after dosing. In addition to its overlapping and non-redundant anti-tumor function, IL-10 reduces the risk of IL-2-mediated cytokine release syndrome and inhibits IL-2-mediated T regulatory cell proliferation. Methods: DK210 (EGFR) is being evaluated in an open-label, dose-escalation (Phase 1) study with 5 monotherapy dose levels (0.025-0.3 mg/kg) and expansion cohorts in combination with PD-1 blockers, radiation, or chemotherapy in patients with advanced solid tumors overexpressing EGFR. Key eligibility criteria include 1) confirmed progressive disease on at least one line of systemic treatment, 2) EGFR overexpression or amplification documented in histology reports, 3) a window of at least 4 weeks or 5 half-lives since the last treatment, and 4) the exclusion of subjects with long QT syndrome, multiple myeloma, multiple sclerosis, myasthenia gravis, or uncontrolled infectious, psychiatric, neurologic, or cancer disease. Plasma and tissue samples will be investigated for pharmacodynamic and predictive biomarkers and genetic signatures associated with IFN-gamma secretion, with the aim of selecting subjects for treatment in Phase 2. Conclusion: Through the successful coupling of wild-type IL-2 with a high-affinity IL-10 targeted directly to the tumor microenvironment, DK210 (EGFR) has the potential to harness IL-2 and IL-10’s known anti-cancer promise while reducing immunogenicity and toxicity risks, enabling safe concomitant cytokine treatment with other anti-cancer modalities.

Keywords: cytokine, EGFR overexpression, interleukin-2, interleukin-10, clinical trial

Procedia PDF Downloads 51
2 Adaptability in Older People: A Mixed Methods Approach

Authors: V. Moser-Siegmeth, M. C. Gambal, M. Jelovcak, B. Prytek, I. Swietalsky, D. Würzl, C. Fida, V. Mühlegger

Abstract:

Adaptability is the capacity to adjust without great difficulty to changing circumstances. Within our project, we aimed to detect whether older people living in a long-term care hospital lose the ability to adapt. Theoretical concepts are contradictory in their statements, and there is a lack of evidence in the literature on how the adaptability of older people changes over time. The following research questions were generated: Are older residents of a long-term care facility able to adapt to changes in their daily routine? How long does it take for older people to adapt? The study was designed as a convergent parallel mixed-methods intervention study, carried out over a four-month period in seven wards of a long-term care hospital. As a planned intervention, a change of meal times was established. The residents were surveyed with qualitative interviews, quantitative questionnaires, and diaries before, during, and after the intervention. In addition, a survey of the nursing staff was carried out in order to detect changes in the people they care for and how long it took them to adapt. Quantitative data were analysed with SPSS, qualitative data with a summarizing content analysis. The average age of the participating residents was 82 years, and the average length of stay was 45 months. Adapting to new situations does not cause problems for older residents. 47% of the residents state that their everyday life has not changed as a result of the change in meal times, 24% indicate ‘neither nor’, and only 18% respond that their daily life has changed considerably due to the changeover. The diaries of the residents, which were kept over the entire investigation period, showed no changes with regard to increased or reduced activity. With regard to sleep quality, assessed with the Pittsburgh sleep quality index, there is little change in sleep behaviour between the two survey periods (pre-phase to follow-up phase), and the subjective sleep quality of the residents is not affected. The nursing staff point out that, with good information in advance, changes are not a problem. The ability to adapt to changes does not deteriorate with age or with the move into a long-term care facility; it only takes a few days to get used to new situations. This is confirmed by the nursing staff, although there are determinants, such as health status, that might make adjustment to new situations more difficult. Regarding limitations, the small sample size of the quantitative data collection must be emphasized, as well as the question of how far the quantitative and qualitative samples represent the total population, since only residents of selected units without cognitive impairments participated, while the majority of the residents have cognitive impairments. It is also important to discuss whether and how well the diary method is suitable for examining the daily structure of older people.

Keywords: adaptability, intervention study, mixed methods, nursing home residents

Procedia PDF Downloads 118
1 Development of an Interface between a BIM Model and an AI-Based Control System for Building Facades with Integrated PV Technology

Authors: Moser Stephan, Lukasser Gerald, Weitlaner Robert

Abstract:

Urban structures will be used more intensively in the future through redensification or newly planned districts with high building densities. In particular, to achieve positive energy balances such as those required for Positive Energy Districts (PED), the use of roofs alone is not sufficient in dense urban areas. At the same time, the increasing share of windows significantly reduces the facade area available for PV generation. By using PV technology on other building components, such as external venetian blinds, on-site generation can be maximized and the standard functionalities of this product can be extended. While offering advantages in terms of infrastructure, sustainable use of resources, and efficiency, these systems require increased optimization of the planning and control strategies of buildings. External venetian blinds with PV technology require an intelligent control concept to meet the required demands, such as maximum power generation, glare prevention, high daylight autonomy, and avoidance of summer overheating, but also the use of passive solar gains in wintertime. Today, three-dimensional geometric information on outdoor spaces and at the building level is available for planning with Building Information Modeling (BIM). In a research project, a web application called HELLA DECART was developed to provide this data structure, to extract the data required for the simulation from the BIM models, and to make it usable for the calculations and coupled simulations. The investigated object is uploaded to this web application as an IFC file and includes the object itself as well as the neighboring buildings and possible remote shading. This tool uses a ray tracing method to determine possible glare from solar reflections off a neighboring building as well as near and far shadows per window on the object. Subsequently, an annual estimate of the sunlight per window is calculated by taking weather data into account. This optimized daylight assessment per window makes it possible to estimate the potential power generation of the PV integrated in the venetian blind as well as the daylight and solar entry. As a next step, these calculation results and all parameters necessary for the thermal simulation can be provided. The overall aim of this workflow is to advance the coordination between the BIM model and the coupled building simulation of the resulting shading and daylighting system, the artificial lighting system, and maximum power generation in a control system. In the research project Powershade, an AI-based control concept for PV-integrated façade elements, using coupled simulation results, is investigated. The automated workflow concept developed in this paper is tested using an office living lab at the HELLA company.
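
The per-window shading test that the ray tracing step performs can be illustrated with a minimal, hypothetical sketch: a ray from a window point toward the sun is intersected with the triangles of neighboring buildings (Moller-Trumbore test). The geometry below is a toy example and does not reflect the HELLA DECART implementation.

```python
# Hypothetical sketch: is a window point shaded by a neighboring facade triangle?
import numpy as np

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore test: does the ray origin + t*direction (t > 0) cross the triangle?"""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1.dot(p)
    if abs(det) < eps:                      # ray parallel to the triangle plane
        return False
    inv = 1.0 / det
    s = origin - v0
    u = s.dot(p) * inv
    if u < 0.0 or u > 1.0:
        return False
    q = np.cross(s, e1)
    v = direction.dot(q) * inv
    if v < 0.0 or u + v > 1.0:
        return False
    return e2.dot(q) * inv > eps            # hit lies in front of the window

def window_is_shaded(window_point, sun_direction, obstruction_triangles):
    return any(ray_hits_triangle(window_point, sun_direction, *tri)
               for tri in obstruction_triangles)

# Toy example: one facade triangle of a neighboring building blocks the low sun.
window = np.array([0.0, 0.0, 1.5])
sun = np.array([0.0, -1.0, 0.5])            # direction from the window toward the sun
neighbor = [(np.array([-5.0, -10.0, 0.0]),
             np.array([5.0, -10.0, 0.0]),
             np.array([0.0, -10.0, 20.0]))]
print(window_is_shaded(window, sun, neighbor))   # True -> window is shaded
```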

Keywords: BIPV, building simulation, optimized control strategy, planning tool

Procedia PDF Downloads 78