Search results for: multiscale theory and modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8185

7675 Understanding Personal Well-Being among Entrepreneurial Breadwinners: Bibliographic and Empirical Analyses of Relative Resource Theory

Authors: E. Fredrick Rice

Abstract:

Over the past three decades, a substantial body of academic literature has asserted that the pressure to maintain household income can negatively affect the personal well-being of breadwinners. Given that scholars have failed to thoroughly explore this phenomenon with breadwinners who are also business owners, theory has been underdeveloped in the entrepreneurial context. To identify the most appropriate theories to apply to entrepreneurs, the current paper utilized two approaches. First, a comprehensive bibliographic analysis was conducted focusing on works at the intersection of breadwinner status and well-being. Co-authorship and journal citation patterns highlighted relative resource theory as a boundary spanning approach with promising applications in the entrepreneurial space. To build upon this theory, regression analysis was performed using data from the Panel Study of Entrepreneurial Dynamics (PSED). Empirical results showed evidence for the effects of breadwinner status and household income on entrepreneurial well-being. Further, the findings suggest that it is not merely income or job status that predicts well-being, but one’s relative financial contribution compared to that of one’s non-breadwinning organizationally employed partner. This paper offers insight into how breadwinner status can be studied in relation to the entrepreneurial personality.

Keywords: breadwinner, entrepreneurship, household income, well-being.

Procedia PDF Downloads 151
7674 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference

Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira

Abstract:

Operational risk losses are heavy tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixed distribution (the Lognormal for the body of losses and the Generalized Pareto Distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2 using the procedures PROC SEVERITY and PROC NLMIXED, and this paper focuses on describing this implementation.
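
As an illustration of the body-tail severity split described above (not the authors' SAS code), a minimal Python sketch fits a lognormal body and a Generalized Pareto tail above a threshold; the loss sample and the 95th-percentile threshold choice are assumptions.

```python
import numpy as np
from scipy.stats import lognorm, genpareto

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=10, sigma=2, size=5000)   # placeholder operational losses

u = np.quantile(losses, 0.95)                 # illustrative tail threshold
body, tail = losses[losses <= u], losses[losses > u]

# Lognormal for the body of losses
shape, loc, scale = lognorm.fit(body, floc=0)

# Generalized Pareto Distribution fitted to the excesses over the threshold
xi, _, beta = genpareto.fit(tail - u, floc=0)

def survival(x):
    """P(Loss > x) under the spliced body/tail severity model."""
    p_tail = (losses > u).mean()
    if x <= u:
        return 1.0 - (1.0 - p_tail) * lognorm.cdf(x, shape, loc, scale) / lognorm.cdf(u, shape, loc, scale)
    return p_tail * genpareto.sf(x - u, xi, scale=beta)

print(f"threshold u = {u:,.0f}, GPD shape xi = {xi:.3f}, scale beta = {beta:,.0f}")
```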

Keywords: operational risk, loss distribution approach, extreme value theory, copulas

Procedia PDF Downloads 579
7673 The Simulation and Experimental Investigation to Study the Strain Distribution Pattern during the Closed Die Forging Process

Authors: D. B. Gohil

Abstract:

Closed die forging is a very complex process, and measurement of the actual forces for the real material is difficult and time consuming. Hence, the modelling technique takes advantage of carrying out the experiments with a suitable model material, which requires lower forces and relatively low temperatures. The results of experiments on the model material may then be correlated with the actual material by using the theory of similarity. There are several methods available to resolve the complexity involved in the closed die forging process. The Finite Element Method (FEM) and the Finite Difference Method (FDM) are relatively difficult to apply compared to the slab method. The slab method is very popular and widely used on the shop floor because it is relatively easy to apply and reasonably accurate for most common forging load computations.
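
For orientation only, a short sketch of the kind of slab-method estimate referred to above, using the common plane-strain approximation p(x) = Y'·exp(2μ(a−x)/h) averaged over the half-width; the flow stress, friction coefficient, and geometry below are invented model-material values, not the author's data.

```python
import math

def slab_mean_pressure(flow_stress, mu, half_width, height):
    """Mean die pressure from the plane-strain slab solution
    p(x) = Y' * exp(2*mu*(a - x)/h), averaged over the half-width a."""
    k = 2.0 * mu * half_width / height
    return flow_stress * (math.exp(k) - 1.0) / k

# Illustrative numbers for a soft model material (assumed, not from the paper)
Y_prime = 0.25e6             # plane-strain flow stress, Pa
mu, a, h = 0.2, 0.05, 0.02   # friction coefficient, half-width (m), height (m)
width_out_of_plane = 0.1     # m

p_mean = slab_mean_pressure(Y_prime, mu, a, h)
force = p_mean * (2 * a) * width_out_of_plane
print(f"mean pressure = {p_mean/1e6:.2f} MPa, forging load = {force/1e3:.1f} kN")
```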

Keywords: experimentation, forging, process modeling, strain distribution

Procedia PDF Downloads 185
7672 Extending the Theory of Planned Behaviour to Predict Intention to Commute by Bicycle: Case Study of Mexico City

Authors: Magda Cepeda, Frances Hodgson, Ann Jopson

Abstract:

There are different barriers people face when choosing to cycle for commuting purposes. This study examined the role of psycho-social factors in predicting the intention to cycle to commute in Mexico City. An extended version of the theory of planned behaviour was developed and applied to a simple random sample of 401 road users. We applied exploratory and confirmatory factor analysis and, after identifying five factors, estimated a structural equation model to find the relationships among the variables. The results indicated that cycling attributes, attitudes to cycling, social comparison, and social image and prestige were the most important factors influencing intention to cycle. Although the results from this study are specific to Mexico City, they indicate areas of interest to transportation planners in other regions, especially in cities where the intention to cycle is linked to its perceived image and there is political ambition to instigate positive cycling cultures. Moreover, this study contributes to the current literature on applications of the Theory of Planned Behaviour.
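
A minimal sketch of the exploratory factor-analysis step (in Python, not necessarily the authors' software); the 15 survey items and the responses are placeholders, and the confirmatory and structural steps would follow in a dedicated SEM package.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Placeholder Likert-style responses for 401 road users on 15 hypothetical items
items = [f"item_{i}" for i in range(1, 16)]
data = pd.DataFrame(rng.integers(1, 6, size=(401, 15)), columns=items)

# Exploratory step: extract five latent factors (e.g. attitudes, social image, ...)
X = StandardScaler().fit_transform(data)
fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(X)

loadings = pd.DataFrame(fa.components_.T, index=items,
                        columns=[f"factor_{j}" for j in range(1, 6)])
print(loadings.round(2))
```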

Keywords: cycling, latent variable model, perception, theory of planned behaviour

Procedia PDF Downloads 339
7671 Methods for Solving Identification Problems

Authors: Fadi Awawdeh

Abstract:

In this work, we highlight the key concepts in using semigroup theory as a methodology for constructing efficient formulas for solving inverse problems. The proposed method depends on some results concerning integral equations. The experimental results show the potential and limitations of the method and suggest directions for future work.
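
The abstract gives no formulas, so the sketch below only illustrates the generic setting of such identification problems: a Fredholm integral equation of the first kind discretized by quadrature and stabilized with Tikhonov regularization; the kernel, data, and regularization parameter are assumptions.

```python
import numpy as np

# Discretize g(s) = \int_0^1 K(s, t) f(t) dt on a uniform grid
n = 100
t = np.linspace(0.0, 1.0, n)
h = t[1] - t[0]
K = np.exp(-5.0 * (t[:, None] - t[None, :]) ** 2) * h   # assumed smoothing kernel

f_true = np.sin(2 * np.pi * t)                           # unknown source to recover
g = K @ f_true + 1e-3 * np.random.default_rng(1).standard_normal(n)  # noisy data

# Tikhonov regularization: minimize ||K f - g||^2 + lam * ||f||^2
lam = 1e-4
f_rec = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)

print("relative error:", np.linalg.norm(f_rec - f_true) / np.linalg.norm(f_true))
```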

Keywords: identification problems, semigroup theory, methods for inverse problems, scientific computing

Procedia PDF Downloads 463
7670 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory

Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan

Abstract:

Data fusion technology can be the best way to extract useful information from multiple sources of data, and it has been widely applied in various applications. This paper presents a multimedia data fusion approach for event detection on Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. Two types of data enter the fusion. The first is features extracted from text using the bag-of-words method, weighted by term frequency-inverse document frequency (TF-IDF). The second is visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments indicate that, compared to approaches using an individual data source, the proposed data fusion approach increases the prediction accuracy for event detection: the proposed method achieved an accuracy of 0.97, compared with 0.93 using text only and 0.86 using images only.
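
A minimal sketch of the fusion step itself, Dempster's rule of combination applied to a text-based and an image-based mass assignment over the frame {event, no event}; the mass values are placeholders, not the paper's outputs.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions defined over frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

EVENT, NO_EVENT = frozenset({"event"}), frozenset({"no_event"})
THETA = EVENT | NO_EVENT   # frame of discernment

m_text  = {EVENT: 0.7, NO_EVENT: 0.2, THETA: 0.1}   # placeholder masses from TF-IDF features
m_image = {EVENT: 0.6, NO_EVENT: 0.1, THETA: 0.3}   # placeholder masses from SIFT features

print(dempster_combine(m_text, m_image))
```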

Keywords: data fusion, Dempster-Shafer theory, data mining, event detection

Procedia PDF Downloads 391
7669 Investigation of the Stability and Spintronic Properties of NbRhGeX (X = Cr, Co, Mn, Fe, Ni) Using Density Functional Theory

Authors: Shittu Akinpelu, Issac Popoola

Abstract:

The compound NbRhGe has been predicted to be a semiconductor with excellent mechanical properties; it is an indirect band gap material. The potential of NbRhGe for non-volatile data storage via element addition is being studied using Density Functional Theory (DFT). Preliminary results on the electronic and magnetic properties are suggestive of its application in spintronics.

Keywords: half-metals, Heusler compound, semiconductor, spintronic

Procedia PDF Downloads 150
7668 Mitigating the Unwillingness of e-Forums Members to Engage in Information Exchange

Authors: Dora Triki, Irena Vida, Claude Obadia

Abstract:

Social networks such as e-Forums or dating sites often face the reluctance of key members to participate. Relying on conation theory, this study investigates this phenomenon and proposes solutions to mitigate the issue. We show that highly experienced e-Forum members refuse to share business information in peer-to-peer information exchange forums. However, forum managers can mitigate this behavior by developing a sentiment of belongingness to the network. Furthermore, by selecting only elite forum participants with ample experience, they can reduce the reluctance of key information providers to engage in information exchange. Our hypotheses are tested with PLS structural equation modeling using survey data from members of a French e-Forum dedicated to the exchange of business information about exporting.

Keywords: conation, e-Forum, information exchange, members participation

Procedia PDF Downloads 141
7667 Illuminating Regional Identity: An Interdisciplinary Exploration in Saskatchewan

Authors: Anne Gibbons

Abstract:

Both inside and outside of academia, people have sought to understand the “sense of place” of various regions, many times over and for many different reasons. The concept of regional identity is highly complex and surrounded by considerable contention. There are multiple bodies of research on regional identity theory in many different disciplines and even across sub-disciplinary classifications. Each discipline takes a slightly different angle or perspective on regional identity, resulting in a fragmented body of work on this topic overall. There is a need to consolidate this increasingly fragmented body of theory through interdisciplinary integration. For the purpose of this study, the province of Saskatchewan serves as an exemplar for exploring regional identity in a concrete context. Saskatchewan can be thought of as a ‘functional region,’ with clear boundaries and clear residency, from which regional identity can be studied. This thesis shares the outcomes of a qualitative study grounded in a series of group interviews with Saskatchewan residents, from which it is concluded that the use of interdisciplinary theory is an appropriate approach to the study of regional identity. Regional identity cannot be compartmentalized; it is a web of characteristics, attributes, and feelings that are inextricably linked. The thesis thus concludes by offering lessons learned about how we might better understand regional identity, as illuminated through both interdisciplinary theory and the lived experiences and imaginations of people living in the region of Saskatchewan.

Keywords: interdisciplinary, regional identity, Saskatchewan, tourism studies

Procedia PDF Downloads 511
7666 The Mechanisms of Peer-Effects in Education: A Frame-Factor Analysis of Instruction

Authors: Pontus Backstrom

Abstract:

In the educational literature on peer effects, attention has been drawn to the fact that the mechanisms creating peer effects are still to a large extent hidden in obscurity. The hypothesis in this study is that the Frame Factor Theory can be used to explain these mechanisms. At the heart of the theory is the concept of “time needed” for students to learn a certain curricular unit. The relation between class-aggregated time needed and the actual time available steers and constrains the actions possible for the teacher. Further, the theory predicts that the timing and pacing of the teacher’s instruction is governed by a “criterion steering group” (CSG), namely the pupils in the 10th-25th percentile of the aptitude distribution in class. The class composition thereby sets the possibilities and limitations for instruction, creating peer effects on individual outcomes. To test whether the theory can be applied to the issue of peer effects, the study employs multilevel structural equation modelling (M-SEM) on Swedish TIMSS 2015 data (Trends in International Mathematics and Science Study; students N=4090, teachers N=200). Using confirmatory factor analysis (CFA) in the SEM framework in MPLUS, latent variables such as “limitations of instruction” are specified according to the theory from TIMSS survey items. The results indicate a good fit of the measurement model to the data. Research is still in progress, but preliminary results from initial M-SEM models verify a strong relation between the mean level of the CSG and the latent variable of limitations on instruction, a variable which in turn has a great impact on individual students’ test results. Further analysis is required, but so far the analysis confirms the predictions derived from the frame factor theory and reveals that one of the important mechanisms creating peer effects in student outcomes is the effect the class composition has upon the teacher’s instruction in class.
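
As a small illustration of the theory's key construct (not the authors' M-SEM code), the sketch below computes the criterion steering group mean, i.e. the mean aptitude of pupils between the 10th and 25th percentile, per class; the data frame and column names are assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
# Placeholder data: 200 classes, 20 pupils each, with an aptitude score per pupil
df = pd.DataFrame({
    "class_id": np.repeat(np.arange(200), 20),
    "aptitude": rng.normal(500, 100, size=200 * 20),
})

def csg_mean(scores: pd.Series) -> float:
    """Mean aptitude of the criterion steering group (10th-25th percentile)."""
    lo, hi = scores.quantile([0.10, 0.25])
    return scores[(scores >= lo) & (scores <= hi)].mean()

csg = df.groupby("class_id")["aptitude"].apply(csg_mean).rename("csg_level")
print(csg.head())
```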

Keywords: compositional effects, frame factor theory, peer effects, structural equation modelling

Procedia PDF Downloads 120
7665 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question and answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have sought to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to models’ and features’ complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models’ performance and quality given the type of features and complexity of the models used. Researchers examining classifiers’ performance and quality and features’ complexity may leverage these findings in selecting suitable techniques when developing prediction models.
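
A minimal sketch of one of the simpler set-ups such a study might compare, a random forest over a handful of answer-level features; the feature names, data, and label rule are placeholders rather than the authors' feature set.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
n = 2000
# Placeholder answer-level features; a real study would derive these from Stack Overflow dumps
X = pd.DataFrame({
    "answer_score": rng.integers(-5, 50, n),
    "answer_length": rng.integers(20, 2000, n),
    "code_blocks": rng.integers(0, 6, n),
    "answerer_reputation": rng.integers(1, 100000, n),
    "minutes_to_answer": rng.integers(1, 10000, n),
})
y = (X["answer_score"] + 0.001 * X["answerer_reputation"] > 20).astype(int)  # synthetic label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
print("F1 on held-out answers:", round(f1_score(y_te, clf.predict(X_te)), 3))
```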

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 118
7664 Leadership Process Model: A Way to Provide Guidance in Dealing with the Key Challenges Within the Organisation

Authors: Rawaa El Ayoubi

Abstract:

Many researchers, academics and practitioners have developed leadership theories during the 20th century. This substantial effort has produced further leadership theories and generated considerable organisational research on leadership models in the contemporary literature. This paper explores the stages and drivers of leadership theory evolution based on the researcher’s personal conclusions and review of leadership theories. The purpose of this paper is to create a Leadership Process Model (LPM) that can provide guidance in dealing with the key challenges within the organisation. This integrative model of organisational leadership is based on inner meaning, leader values and vision. It further addresses the relationships between leadership theory, practice and development, exploring why challenges exist within the field of leadership theory and how these challenges can be mitigated.

Keywords: leadership challenges, leadership process model, leadership theories, organisational leadership, paradigm development

Procedia PDF Downloads 62
7663 Empirical Modeling and Spatial Analysis of Heat-Related Morbidity in Maricopa County, Arizona

Authors: Chuyuan Wang, Nayan Khare, Lily Villa, Patricia Solis, Elizabeth A. Wentz

Abstract:

Maricopa County, Arizona, has a semi-arid hot desert climate and is one of the hottest regions in the United States. The exacerbated urban heat island (UHI) effect caused by rapid urbanization has made the urban area even hotter than the rural surroundings. The Phoenix metropolitan area experiences extremely high temperatures in the summer from June to September, with daily highs that can reach 120 °F (48.9 °C). Morbidity and mortality due to environmental heat are, therefore, a significant public health issue in Maricopa County, especially because they are largely preventable. Public records from the Maricopa County Department of Public Health (MCDPH) revealed that between 2012 and 2016 there were 10,825 heat-related morbidity incidents, 267 outdoor environmental heat deaths, and 173 indoor heat-related deaths. A lot of research has examined heat-related death and its contributing factors around the world, but little has been done regarding heat-related morbidity, especially for regions that are naturally hot in the summer. The objective of this study is to examine the demographic, socio-economic, housing, and environmental factors that contribute to heat-related morbidity in Maricopa County. We obtained heat-related morbidity data between 2012 and 2016 at census tract level from MCDPH. Demographic, socio-economic, and housing variables were derived using the 2012-2016 American Community Survey 5-year estimates from the U.S. Census. Remotely sensed Landsat 7 ETM+ and Landsat 8 OLI satellite images and Level-1 products were acquired for all the summer months (June to September) from 2012 to 2016. The National Land Cover Database (NLCD) 2016 percent tree canopy and percent developed imperviousness data were obtained from the U.S. Geological Survey (USGS). We used ordinary least squares (OLS) regression analysis to examine the empirical relationship between all the independent variables and the heat-related morbidity rate. Results showed that higher morbidity rates are found in census tracts with higher values of population aged 65 and older, population under poverty, disability, no vehicle ownership, white non-Hispanic population, population with less than a high school degree, land surface temperature, and surface reflectance, but lower values of normalized difference vegetation index (NDVI) and housing occupancy. The regression model explains up to 59.4% of the total variation of heat-related morbidity in Maricopa County. The multiscale geographically weighted regression (MGWR) technique was then used to examine the spatially varying relationships between heat-related morbidity rate and all the significant independent variables. The R-squared value of the MGWR model increased to 0.691, a significant improvement in goodness-of-fit over the global OLS model, which means that the spatial heterogeneity of some independent variables is another important factor influencing the relationship with heat-related morbidity in Maricopa County. Among these variables, population aged 65 and older, the Hispanic population, disability, vehicle ownership, and housing occupancy have much stronger local effects than other variables.
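
A minimal sketch of the global OLS step with statsmodels (the MGWR step would use a dedicated package such as mgwr); the tract-level data frame is synthetic, with variable names merely echoing the predictors listed above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_tracts = 400  # placeholder census tracts
df = pd.DataFrame({
    "pct_age65_plus": rng.uniform(5, 30, n_tracts),
    "pct_poverty": rng.uniform(2, 40, n_tracts),
    "pct_disability": rng.uniform(2, 25, n_tracts),
    "pct_no_vehicle": rng.uniform(0, 20, n_tracts),
    "land_surface_temp": rng.uniform(35, 55, n_tracts),
    "ndvi": rng.uniform(0.05, 0.5, n_tracts),
})
# Synthetic morbidity rate with the signs reported in the abstract
df["morbidity_rate"] = (
    2 * df["pct_age65_plus"] + 3 * df["pct_poverty"] + 4 * df["land_surface_temp"]
    - 150 * df["ndvi"] + rng.normal(0, 20, n_tracts)
)

X = sm.add_constant(df.drop(columns="morbidity_rate"))
ols = sm.OLS(df["morbidity_rate"], X).fit()
print(ols.summary().tables[0])   # overall fit, including R-squared
```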

Keywords: census, empirical modeling, heat-related morbidity, spatial analysis

Procedia PDF Downloads 108
7662 Modeling Continuous Flow in a Curved Channel Using Smoothed Particle Hydrodynamics

Authors: Indri Mahadiraka Rumamby, R. R. Dwinanti Rika Marthanty, Jessica Sjah

Abstract:

Smoothed particle hydrodynamics (SPH) was originally created to simulate nonaxisymmetric phenomena in astrophysics. However, this method still has several shortcomings, namely the high computational cost required to model values at high resolution and problems with boundary conditions. The difficulty of modeling boundary conditions arises because the SPH method suffers from particle deficiency where the integral of the kernel function is truncated by the boundary. This research aims to determine whether SPH modeling, with a focus on boundary layer interactions and continuous flow, can produce quantifiably accurate values at low computational cost. The research combines algorithms and coding in the main program of the meandering river, a continuous flow algorithm, and a solid-fluid algorithm, with the aim of obtaining quantitatively accurate results on solid-fluid interactions with continuous flow in a meandering channel using the SPH method. This study uses the Fortran programming language for implementing the SPH numerical method; the model is a U-shaped meandering open channel in 3D, where the channel walls are represented by soil particles, and it uses continuous flow with a limited number of particles.
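
A minimal Python sketch (the study itself is coded in Fortran) of the core SPH ingredients mentioned above: the cubic-spline smoothing kernel and the kernel-weighted summation density; the particle cloud and smoothing length are illustrative, not the meandering-channel set-up.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard SPH cubic-spline kernel in 3D (Monaghan), normalization 1/(pi*h^3)."""
    q = r / h
    sigma = 1.0 / (np.pi * h ** 3)
    w = np.where(q <= 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q <= 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return sigma * w

def summation_density(positions, masses, h):
    """rho_i = sum_j m_j W(|r_i - r_j|, h), the basic SPH density estimate."""
    diff = positions[:, None, :] - positions[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

# Illustrative particle cloud (not the channel geometry of the paper)
rng = np.random.default_rng(5)
pos = rng.uniform(0.0, 1.0, size=(500, 3))
m = np.full(500, 1.0 / 500)
rho = summation_density(pos, m, h=0.1)
print("mean SPH density estimate:", rho.mean())
```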

Keywords: smoothed particle hydrodynamics, computational fluid dynamics, numerical simulation, fluid mechanics

Procedia PDF Downloads 109
7661 Handling Complexity of a Complex System Design: Paradigm, Formalism and Transformations

Authors: Hycham Aboutaleb, Bruno Monsuez

Abstract:

Current systems’ complexity has reached a degree that requires addressing conception and design issues while taking into account environmental, operational, social, legal, and financial aspects. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost, and time investment of complex systems in the modeling phase emphasize the need for a paradigm, a framework, and an environment to handle the system model complexity. For that, it is necessary to understand the expectations of the human user of the model and their limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle the model complexity.
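
As a toy illustration of the graph-based formalism and its transformations (the paper's higraph-based formalism is richer), the sketch below stores a system model as a directed graph and applies one abstraction transformation that collapses a subsystem into a single node; the node names are invented.

```python
import networkx as nx

# System model as a directed graph; a mapping records which nodes form a subsystem
g = nx.DiGraph()
g.add_edges_from([
    ("sensor", "controller"), ("controller", "actuator"),
    ("controller", "logger"), ("power", "controller"),
])
subsystem = {"controller": "control_unit", "logger": "control_unit"}

def collapse(graph, members, new_node):
    """Abstraction transformation: replace a set of nodes by one aggregate node,
    rewiring external edges so the rest of the model is unchanged."""
    h = graph.copy()
    h.add_node(new_node)
    for u, v in graph.edges():
        src = new_node if u in members else u
        dst = new_node if v in members else v
        if src != dst:
            h.add_edge(src, dst)
    h.remove_nodes_from(members)
    return h

abstract_view = collapse(g, {n for n, s in subsystem.items() if s == "control_unit"}, "control_unit")
print(sorted(abstract_view.edges()))
```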

Keywords: higraph-based formalism, system engineering paradigm, modeling requirements, graph-based transformations

Procedia PDF Downloads 387
7660 A Novel Algorithm for Parsing IFC Models

Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai

Abstract:

Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided design that architects, engineers and contractors use to create and view two- and three-dimensional models. The AEC industry also uses building information modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open IFC (Industry Foundation Classes) files, which aim for interoperability by exchanging information throughout the project lifecycle among the various disciplines. The methods developed in previous studies require either an IFC schema or MVD and software applications, such as an IFC model server or a BIM authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting a partial or total model from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
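
A minimal sketch of the schema-free parsing idea: reading STEP-style IFC instance lines (#id = IFCTYPE(...);) with a regular expression and extracting the partial model reachable from a chosen root instance; the sample lines are invented and the paper's algorithm is certainly more involved.

```python
import re
from typing import Dict, List, Tuple

LINE_RE = re.compile(r"^#(\d+)\s*=\s*([A-Z0-9_]+)\s*\((.*)\);\s*$")
REF_RE = re.compile(r"#(\d+)")

def parse_ifc_lines(lines) -> Dict[int, Tuple[str, str, List[int]]]:
    """Map instance id -> (entity type, raw attribute string, referenced ids)."""
    model = {}
    for line in lines:
        m = LINE_RE.match(line.strip())
        if m:
            iid, etype, attrs = int(m.group(1)), m.group(2), m.group(3)
            model[iid] = (etype, attrs, [int(r) for r in REF_RE.findall(attrs)])
    return model

def extract_partial(model, root_id):
    """Partial model: the root instance plus everything reachable through its references."""
    keep, stack = set(), [root_id]
    while stack:
        iid = stack.pop()
        if iid in model and iid not in keep:
            keep.add(iid)
            stack.extend(model[iid][2])
    return {i: model[i] for i in keep}

sample = [
    "#1=IFCPROJECT('2x3abc',#2,'Demo',$,$,$,$,(#3),#4);",   # invented lines
    "#2=IFCOWNERHISTORY($,$,$,.ADDED.,$,$,$,0);",
    "#3=IFCGEOMETRICREPRESENTATIONCONTEXT($,'Model',3,1.E-5,#4,$);",
    "#4=IFCAXIS2PLACEMENT3D(#5,$,$);",
    "#5=IFCCARTESIANPOINT((0.,0.,0.));",
]
model = parse_ifc_lines(sample)
print(sorted(extract_partial(model, 1)))
```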

Keywords: BIM, CAD, IFC, MVD

Procedia PDF Downloads 278
7659 The Influence of Theories and Approaches to Educational Policy and Planning in Ghana’s Current Educational Developments

Authors: Ruth Donkoh, Wing On Lee, Solomon A. Boateng, Portia Oware Twerefoo, Josephine Donkor

Abstract:

In this paper we defend the value of theories and approaches to educational policy and planning in enhancing educational developments in Ghana. This mission is achieved by enumerating recent educational developments in Ghana and juxtaposing them with educational theories, approaches to policy making, and policy planning, to see whether the developments conform to the theories' principles as well as to policy-making and planning processes. Data collection for the research was carried out through textual analysis of policy documents as well as a review of the relevant literature. The findings revealed that educational developments in Ghana are unable to attain their objectives because the policies do not conform to policy formation and planning principles. In addition, education planning in Ghana does not follow the principles of the politics-administration dichotomy theory, and likewise the distribution of educational needs runs contrary to equity theory. We recommend that educational policies in Ghana should conform to the principles of these theories as well as to the approaches to educational policy making and planning, to help meet the needs of learners, attain educational quality, and accomplish educational development objectives.

Keywords: Ghana education, equity theories, politics-administration dichotomy theory, educational policies, educational planning

Procedia PDF Downloads 130
7658 A Survey on Taxpayer's Compliance in Prospect Theory Structure Using Hierarchical Bayesian Approach

Authors: Sahar Dehghan, Yeganeh Mousavi Jahromi, Ghahraman Abdoli

Abstract:

Since tax revenues are one of the most important sources of government revenue, it is essential to consider ways of increasing taxpayers' compliance. One of the factors that can affect taxpayers' compliance is the structure of the penalties and incentives envisaged in the tax law. In this research, using prospect theory, the effects of changes in the penalty rates and the tax incentives in the direct tax law on taxpayers' compliance behavior have been investigated. To determine the preferences of taxpayers in the business sector and their degree of sensitivity to fines and incentives, a questionnaire with a mixed-gamble structure was designed. Estimated results using the hierarchical Bayesian method indicate that the taxpayers tested in this study are more sensitive to the incentives in the direct tax law, and the tax administration can use this to increase the amount of tax collected and the level of compliance.
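
As an illustration of the prospect-theory machinery behind the mixed-gamble questionnaire, a short sketch evaluates a gamble mixing a fine (loss) and an incentive (gain) with the Tversky-Kahneman value and probability-weighting functions in their simple separable form; the parameter values and amounts are commonly cited placeholders, not the study's estimates.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

def prospect_value(outcomes):
    """Evaluate a mixed gamble given as [(outcome, probability), ...]."""
    return sum(weight(p) * value(x) for x, p in outcomes)

# Illustrative mixed gamble for a non-compliant taxpayer (amounts are invented):
# 30% chance of a 1,000-unit fine, 70% chance of keeping a 400-unit incentive.
gamble = [(-1000, 0.3), (400, 0.7)]
print("prospect value:", round(prospect_value(gamble), 1))
```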

Keywords: tax compliance, prospect theory, value function, mixed gamble

Procedia PDF Downloads 153
7657 Image Segmentation: New Methods

Authors: Flaurence Benjamain, Michel Casperance

Abstract:

We present in this paper, first, a comparative study of three mathematical theories for achieving the fusion of information sources. This study aims to identify the characteristics inherent in the theory of possibilities, belief functions (DST), and plausible and paradoxical reasoning, in order to establish a strategy of choice that allows us to adopt the most appropriate theory to solve a given fusion problem, taking into account the acquired information and the imperfections that accompany it. Using the new theory of plausible and paradoxical reasoning, also called Dezert-Smarandache Theory (DSmT), to fuse multi-source information requires, as a first step, the generation of composite events, which is, in general, difficult. Thus, we present in this paper a new approach to constructing pertinent paradoxical classes based on gray-level histograms, which also allows the cardinality of the hyper-powerset to be reduced. Secondly, we developed a new technique for ordering and coding generalized focal elements; this method is exploited, in particular, to calculate the Dezert-Smarandache cardinality. Finally, we present an experiment on the classification of a remote sensing image that illustrates the proposed methods, and we compare the result obtained with DSmT to those obtained with DST and possibility theory.

Keywords: segmentation, image, approach, vision computing

Procedia PDF Downloads 257
7656 The Diffusion of Telehealth: System-Level Conditions for Successful Adoption

Authors: Danika Tynes

Abstract:

Telehealth is a promising advancement in health care, though there are certain conditions under which telehealth has a greater chance of success. This research sought to further the understanding of what conditions compel the success of telehealth adoption at the systems level, applying Diffusion of Innovations (DoI) theory (Rogers, 1962). System-level indicators were selected to represent four components of DoI theory (relative advantage, compatibility, complexity, and observability) and regressed on five types of telehealth (teleradiology, teledermatology, telepathology, telepsychology, and remote monitoring) using multiple logistic regression. The analyses supported relative advantage and compatibility as the strongest influencers of telehealth adoption, remote monitoring in particular. These findings help to quantitatively clarify the factors influencing the adoption of innovation and advance the ability to make recommendations on the viability of state telehealth adoption. In addition, the results indicate when DoI theory is most applicable to the understanding of telehealth diffusion. Ultimately, this research may contribute to a more focused allocation of scarce health care resources through consideration of the existing state conditions available to foster innovation.
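
A minimal sketch of the analytical step described, with indicators standing in for the four DoI constructs regressed against adoption of one telehealth type using a logistic model; the indicator names and the data are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 120  # placeholder health systems / states
df = pd.DataFrame({
    "relative_advantage": rng.normal(0, 1, n),   # proxy indicators for the four
    "compatibility": rng.normal(0, 1, n),        # DoI constructs (invented names)
    "complexity": rng.normal(0, 1, n),
    "observability": rng.normal(0, 1, n),
})
logit_p = 1.2 * df["relative_advantage"] + 0.9 * df["compatibility"] - 0.3 * df["complexity"]
df["remote_monitoring_adopted"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["relative_advantage", "compatibility", "complexity", "observability"]])
fit = sm.Logit(df["remote_monitoring_adopted"], X).fit(disp=False)
print(fit.params.round(2))
```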

Keywords: adoption, diffusion of innovation theory, remote monitoring, system-level indicators

Procedia PDF Downloads 115
7655 Hominin Niche in the Times of Climate Change

Authors: Emilia Hunt, Sally C. Reynolds, Fiona Coward, Fabio Parracho Silva, Philip Hopley

Abstract:

Ecological niche modeling is widely used in conservation studies, but its application to extinct hominin species is a relatively new approach. Being able to understand which ecological niches were occupied by the respective hominin species provides a new perspective on the influences on evolutionary processes. Niche separation or overlap can tell us more about the specific requirements of the species within a given timeframe. Many of the ancestral species lived through enormous climate changes, glacial and interglacial periods, and changes in rainfall leading to desertification or flooding of regions, and displayed the impressive levels of adaptation necessary for their survival. This paper reviews niche modeling methodologies and their application to hominin studies. Traditional conservation methods might not be directly applicable to extinct species and are not immediately transferable to hominins. The hominin niche also includes aspects such as technology, the use of fire, and extended communication, which are not traditionally used in building conservation models. Future perspectives on how to improve niche modeling for extinct hominin species will be discussed.

Keywords: hominin niche, climate change, evolution, adaptation, ecological niche modelling

Procedia PDF Downloads 174
7654 Assessing the Impact of Urbanization on Flood Risk: A Case Study

Authors: Talha Ahmed, Ishtiaq Hassan

Abstract:

Urban or metropolitan areas are characterized by very high population density resulting from concentrated economic activity. Critical drivers such as urban expansion and climate change are changing cities' exposure to the incidence and impacts of pluvial floods. Urban areas are increasingly covered by impermeable, man-made permanent surfaces and structures, which prevent infiltration and percolation. Urban sprawl can therefore result in increased run-off volumes, flood stages, and flood extents during heavy rainy seasons. Assessing flood risk requires a thorough examination of all aspects contributing to the severity of an event in order to accurately estimate its impacts and the other risk factors associated with it. To evaluate the risk and the impact of urbanization, an integrated hydrological modeling approach is applied to a study area in Islamabad (Pakistan), focusing on a natural water body. The vulnerability of the physical elements at risk in the research region is analyzed using GIS and SOBEK. A supervised classification of land use based on images from 1980 to 2020 is used. A DEM-based hydrodynamic model is used to simulate flood inundation for the selected return periods of 50, 75, and 100 years. The findings of this study provide useful information on high-risk places and at-risk properties.
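
The hydrodynamic modelling itself is done in SOBEK, but the urbanization effect described above can be illustrated with the simple rational-method estimate Q = C·i·A, comparing pre- and post-urbanization runoff coefficients; all values below are invented.

```python
def rational_peak_discharge(c_runoff, intensity_mm_per_hr, area_km2):
    """Rational method: Q = C * i * A, returned in m^3/s."""
    intensity_m_per_s = intensity_mm_per_hr / 1000.0 / 3600.0
    area_m2 = area_km2 * 1e6
    return c_runoff * intensity_m_per_s * area_m2

# Illustrative catchment around the studied water body (values are assumptions)
area = 12.0             # km^2
storm_intensity = 50.0  # mm/h for the chosen return period

q_1980 = rational_peak_discharge(0.35, storm_intensity, area)  # largely pervious land cover
q_2020 = rational_peak_discharge(0.75, storm_intensity, area)  # heavily urbanized land cover

print(f"peak discharge 1980 ~ {q_1980:.0f} m3/s, 2020 ~ {q_2020:.0f} m3/s "
      f"(+{100 * (q_2020 / q_1980 - 1):.0f}%)")
```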

Keywords: urbanization, flood, flood risk, GIS

Procedia PDF Downloads 157
7653 Analytical Modeling of Equivalent Magnetic Circuit in Multi-segment and Multi-barrier Synchronous Reluctance Motor

Authors: Huai-Cong Liu, Tae Chul Jeong, Ju Lee

Abstract:

This paper describes a characteristic analysis of a synchronous reluctance motor (SynRM) rotor with a multi-segment and multi-layer structure. Magnetic saturation often appears in a SynRM; therefore, the nonlinear magnetic field calculation needs to be considered when modeling the machine. An important factor influencing the convergence process is how the relative permeability is determined. An improved method is presented that ensures convergence of the calculation by a linear iterative method for the saturated magnetic field, and an optimum convergence strategy is provided for cases where there are inflection points on the magnetization curve. The equivalent magnetic circuit is then calculated, and the d- and q-axis inductances are obtained. Finally, this process is applied to design a 7.5 kW SynRM, and its validity is verified by comparison with finite element method (FEM) results and experimental test data.
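
A minimal sketch of the saturation-handling idea: a one-loop equivalent magnetic circuit solved by damped fixed-point iteration, updating the relative permeability from an assumed analytic B-H law until the flux converges; the geometry and saturation law are illustrative, not the 7.5 kW design.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def mu_r_from_B(B, mu_r_init=2000.0, B_sat=1.6):
    """Illustrative saturation law: permeability drops as B approaches B_sat."""
    return 1.0 + (mu_r_init - 1.0) / (1.0 + (B / B_sat) ** 6)

# One-loop circuit: MMF source (N*I) driving flux through an iron path and an air gap
N_I = 800.0                    # ampere-turns
A = 4e-4                       # cross-section, m^2
l_iron, l_gap = 0.25, 0.0008   # path lengths, m

mu_r = 2000.0
for it in range(100):
    R_iron = l_iron / (mu_r * MU0 * A)
    R_gap = l_gap / (MU0 * A)
    flux = N_I / (R_iron + R_gap)
    B = flux / A
    mu_r_new = mu_r_from_B(B)
    if abs(mu_r_new - mu_r) < 1e-6 * mu_r:
        break
    mu_r = 0.5 * mu_r + 0.5 * mu_r_new   # damped update helps convergence near saturation

print(f"converged in {it + 1} iterations: B = {B:.2f} T, mu_r = {mu_r:.0f}")
```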

Keywords: SynRM, magnetic-saturation, magnetic circuit, analytical modeling

Procedia PDF Downloads 490
7652 Modeling Sorption and Permeation in the Separation of Benzene/ Cyclohexane Mixtures through Styrene-Butadiene Rubber Crosslinked Membranes

Authors: Hassiba Benguergoura, Kamal Chanane, Sâad Moulay

Abstract:

Pervaporation (PV), a membrane-based separation technology, has gained much attention because of its energy-saving capability and low cost, especially for the separation of azeotropic or close-boiling liquid mixtures. There are two crucial issues for the industrial application of the pervaporation process. The first is developing membrane materials and tailoring membrane structures to obtain high pervaporation performance. The second is modeling pervaporation transport to better understand the above-mentioned structure-pervaporation relationship. Many models have been proposed to describe mass transfer; among them, the solution-diffusion model is the most widely used for describing pervaporation transport, including the preferential sorption, diffusion, and evaporation steps. For modeling pervaporation transport, the permeation flux, which depends on the solubility and diffusivity of the components in the membrane, should be obtained first. Traditionally, the solubility is calculated according to the Flory-Huggins theory. Separation of the benzene (Bz)/cyclohexane (Cx) mixture is industrially significant, and numerous papers have focused on the Bz/Cx system to assess the PV properties of membrane materials. Membranes with both high permeability and high selectivity are desirable for practical application, and several new polymers have been prepared to achieve both. In this work, dense styrene-butadiene rubber (SBR) membranes cross-linked by chloromethylation were used in the separation of benzene/cyclohexane mixtures, and the impact of the chloromethylation reaction, as a new method of cross-linking SBR, on the pervaporation performance is reported. In contrast to vulcanization with sulfur, the cross-linking takes place on the styrene units of the polymer chains via a methylene bridge. The partial pervaporative fluxes of benzene/cyclohexane mixtures in SBR were predicted using Fick's first law; by integrating Fick's law over the benzene concentration, the predicted partial fluxes and the PV separation factor agreed well with the experimental data. The effects of feed concentration and operating temperature on the permeation flux predicted by this model are investigated. The predicted permeation fluxes are in good agreement with experimental data at lower benzene concentrations in the feed, but at higher benzene concentrations the model overestimates the permeation flux. The predicted and experimental permeation fluxes all increase with increasing operating temperature. Solvent sorption levels for benzene/cyclohexane mixtures in an SBR membrane were determined experimentally; the results showed that the sorption levels are strongly affected by the feed composition, and the Flory-Huggins equation yields a higher R-squared coefficient for the sorption selectivity.
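
A minimal numerical sketch of the flux prediction described above, integrating Fick's first law J = (1/l)∫D(c)dc across the membrane with a concentration-dependent diffusivity; the exponential D(c) form and all parameter values are assumptions for illustration.

```python
import numpy as np

def pervaporation_flux(c_feed, c_perm, thickness, D0, gamma, n=200):
    """J = (1/l) * integral of D(c) dc from c_perm to c_feed, with D(c) = D0 * exp(gamma * c).

    c_feed, c_perm : component concentration in the membrane at the feed and
                     permeate faces (g/cm^3); thickness in cm; D0 in cm^2/s."""
    c = np.linspace(c_perm, c_feed, n)
    D = D0 * np.exp(gamma * c)           # assumed plasticization-type dependence
    return np.trapz(D, c) / thickness    # g / (cm^2 s)

# Illustrative values (not the paper's fitted parameters)
J_bz = pervaporation_flux(c_feed=0.08, c_perm=0.0, thickness=0.01, D0=1e-7, gamma=25.0)
J_cx = pervaporation_flux(c_feed=0.02, c_perm=0.0, thickness=0.01, D0=5e-8, gamma=25.0)

print(f"benzene flux ~ {J_bz:.2e} g/(cm^2 s), cyclohexane flux ~ {J_cx:.2e} g/(cm^2 s)")
print(f"separation factor (flux ratio) ~ {J_bz / J_cx:.1f}")
```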

Keywords: benzene, cyclohexane, pervaporation, permeation, sorption modeling, SBR

Procedia PDF Downloads 310
7651 A Domain Specific Modeling Language Semantic Model for Artefact Orientation

Authors: Bunakiye R. Japheth, Ogude U. Cyril

Abstract:

Since the process of transforming user requirements into modeling constructs is not very well supported by domain-specific frameworks, it becomes necessary to integrate domain requirements with specific architectures to achieve an integrated, customizable solution space via artifact orientation. Domain-specific modeling language specifications in model-driven engineering focus on requirements within a particular domain, which can be tailored to aid the domain expert in expressing domain concepts effectively. Modeling processes based on domain-specific language formalisms are highly volatile due to dependencies on domain concepts or the process models used. A capable solution is given by artifact orientation, which stresses the results rather than a strict dependence on complicated platforms for model creation and development. Based on this premise, domain-specific methods for producing artifacts, without having to take into account the complexity and variability of platforms for model definitions, can be integrated to support customizable development. In this paper, we discuss methods for integrating these capabilities and necessities within a common structure and semantics that contribute a metamodel for artifact orientation, leading to a reusable software layer with a concrete syntax capable of capturing design intent from the domain expert. The concepts forming the language formalism are established from models drawn from the oil and gas pipeline industry.

Keywords: control process, metrics of engineering, structured abstraction, semantic model

Procedia PDF Downloads 124
7650 The Life-Cycle Theory of Dividends: Evidence from Indonesia

Authors: Vashti Carissa

Abstract:

The main objective of this study is to examine whether the life-cycle theory of dividends can explain the determinants of an optimal dividend policy in Indonesia. The sample consists of 1,420 non-financial and non-trade, services, and investment firms listed on the Indonesian Stock Exchange during the period 2005-2014. Using logistic regression, we find that firm life cycle, measured by retained earnings as a proportion of total equity (RETE), has a significant positive effect on the propensity of a firm to pay dividends. A higher earned-surplus portion in a company's capital structure reflects a higher maturity level, which increases the likelihood of dividend payment in mature firms. This result provides additional empirical evidence for the life-cycle theory of dividends in the dividend payout phenomenon in Indonesia: dividends tend to be paid by mature firms, while retention dominates in growth firms. The results also show that the majority of the sample firms are in the growth phase, which is consistent with the infrequent dividend distribution observed in Indonesia during the ten-year observation period.

Keywords: dividend, dividend policy, life-cycle theory of dividends, mix of earned and contributed capital

Procedia PDF Downloads 272
7649 Conceptual Model of a Residential Waste Collection System Using ARENA Software

Authors: Bruce G. Wilson

Abstract:

The collection of municipal solid waste at the curbside is a complex operation that is repeated daily under varying circumstances around the world. There have been several attempts to develop Monte Carlo simulation models of the waste collection process dating back almost 50 years. Despite this long history, the use of simulation modeling as a planning or optimization tool for waste collection is still extremely limited in practice. Historically, simulation modeling of waste collection systems has been hampered by the limitations of computer hardware and software and by the availability of representative input data. This paper outlines the development of a Monte Carlo simulation model that overcomes many of the limitations of previous models. The model uses a general-purpose simulation software program that is easily capable of modeling an entire waste collection network. The model treats the stops on a waste collection route as a queue of work to be processed by a collection vehicle (or server). Input data can be collected from a variety of sources, including municipal geographic information systems, global positioning system recorders on collection vehicles, and weigh scales at transfer stations or treatment facilities. The result is a model sufficiently robust to represent the collection activities of a large municipality, while remaining flexible enough to adapt to changing conditions on the collection route.
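
A minimal Monte Carlo sketch of the queue-of-stops idea (in Python rather than a commercial simulation package): each stop receives a random set-out, service time, and mass, and repeated replications give the distribution of route duration and tonnage; all distributions are placeholders.

```python
import numpy as np

rng = np.random.default_rng(2024)

def simulate_route(n_stops=600, set_out_prob=0.85):
    """One replication: total time (h) and mass (t) for a single collection route."""
    set_out = rng.uniform(size=n_stops) < set_out_prob                     # waste set out?
    service_s = np.where(set_out, rng.gamma(shape=2.0, scale=10.0, size=n_stops), 3.0)
    travel_s = rng.exponential(scale=20.0, size=n_stops)                   # drive time between stops
    mass_kg = np.where(set_out, rng.lognormal(mean=2.5, sigma=0.5, size=n_stops), 0.0)
    return (service_s.sum() + travel_s.sum()) / 3600.0, mass_kg.sum() / 1000.0

reps = np.array([simulate_route() for _ in range(1000)])
hours, tonnes = reps[:, 0], reps[:, 1]
print(f"route duration: mean {hours.mean():.2f} h, 95th pct {np.percentile(hours, 95):.2f} h")
print(f"collected mass: mean {tonnes.mean():.2f} t")
```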

Keywords: modeling, queues, residential waste collection, Monte Carlo simulation

Procedia PDF Downloads 387
7648 Analysis of Simply Supported Beams Using Elastic Beam Theory

Authors: M. K. Dce

Abstract:

The aim of this paper is to investigate the behavior of simply supported beams of rectangular section subjected to a uniformly distributed load (UDL). In this study, beams of span 5 m, 6 m, 7 m, and 8 m have been considered. The width of all the beams is 400 mm, and the span-to-depth ratio has been taken as 12. The superimposed live load has been increased from 10 kN/m to 25 kN/m in increments of 5 kN/m. The analysis of the beams has been carried out using elastic beam theory. On the basis of the present study, it is concluded that the maximum bending moment as well as the maximum deflection occur at mid-span of a simply supported beam, and their magnitudes increase in proportion to the magnitude of the UDL. Moreover, the study shows that the maximum moment is proportional to the square of the span and the maximum deflection is proportional to the fourth power of the span.
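
The closed-form results used here are easy to reproduce; the sketch below evaluates M_max = wL²/8 and δ_max = 5wL⁴/(384EI) for the stated spans and load range, with the elastic modulus and the neglect of self-weight as assumptions.

```python
def beam_response(span_m, w_kn_per_m, width_m=0.4, span_to_depth=12, E_gpa=25.0):
    """Mid-span results for a simply supported beam under UDL (self-weight ignored):
    M_max = w L^2 / 8,  delta_max = 5 w L^4 / (384 E I)."""
    depth = span_m / span_to_depth
    I = width_m * depth ** 3 / 12.0                        # second moment of area, m^4
    w = w_kn_per_m * 1e3                                   # N/m
    E = E_gpa * 1e9                                        # Pa (assumed concrete modulus)
    M_max = w * span_m ** 2 / 8.0                          # N*m
    delta_max = 5.0 * w * span_m ** 4 / (384.0 * E * I)    # m
    return M_max / 1e3, delta_max * 1e3                    # kN*m, mm

for L in (5.0, 6.0, 7.0, 8.0):
    for w in (10.0, 15.0, 20.0, 25.0):
        M, d = beam_response(L, w)
        print(f"L = {L:.0f} m, w = {w:.0f} kN/m -> M_max = {M:6.1f} kN*m, delta_max = {d:5.2f} mm")
```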

Keywords: beam, UDL, bending moment, deflection, elastic beam theory

Procedia PDF Downloads 376
7647 Outdoor Visible Light Communication Channel Modeling under Fog and Smoke Conditions

Authors: Véronique Georlette, Sebastien Bette, Sylvain Brohez, Nicolas Point, Veronique Moeyaert

Abstract:

Visible light communication (VLC) is a communication technology that is part of the optical wireless communication (OWC) family; it uses the visible and infrared spectrums to send data. Until now, this technology has mostly been studied for indoor use cases, but it is sufficiently mature today to consider its outdoor potential. The main outdoor challenges are meteorological conditions and the presence of smoke due to fire or pollutants in urban areas. This paper proposes a methodology to assess the robustness of an outdoor VLC system given the outdoor conditions. The methodology is put into practice in two realistic scenarios: a VLC bus stop and a VLC streetlight. It consists of computing the power margin available in the system, given all the characteristics of the VLC system and its surroundings. This is done with an outdoor VLC communication channel simulator developed in Python. The simulator quantifies the effects of fog and smoke using models taken from the environmental and fire engineering literature and computes the optical power reaching the receiver. These two phenomena impact the communication by increasing the total attenuation of the medium. The main conclusion drawn in this paper is that the levels of attenuation due to fog and smoke are of the same order of magnitude, with the attenuation of fog being the highest for visibilities below 1 km. This gives a promising prospect for the deployment of outdoor VLC use cases in the near future.
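
A sketch of the kind of fog model commonly used for such links, the Kim visibility model from the free-space-optics literature combined with Beer-Lambert extinction over the link length; whether the simulator uses exactly this model is not stated, so treat it as an assumption, as are the link distance and clear-air margin.

```python
import math

def kim_q(visibility_km):
    """Size-distribution exponent q of the Kim visibility model."""
    v = visibility_km
    if v > 50:   return 1.6
    if v > 6:    return 1.3
    if v > 1:    return 0.16 * v + 0.34
    if v > 0.5:  return v - 0.5
    return 0.0

def fog_attenuation_db_per_km(visibility_km, wavelength_nm=550.0):
    """Specific attenuation (dB/km) from the Kim model."""
    q = kim_q(visibility_km)
    beta = (3.91 / visibility_km) * (wavelength_nm / 550.0) ** (-q)   # 1/km
    return 10.0 * math.log10(math.e) * beta

def link_margin_db(tx_margin_db, distance_m, visibility_km):
    """Remaining power margin after fog loss over the link (Beer-Lambert, in dB)."""
    loss = fog_attenuation_db_per_km(visibility_km) * distance_m / 1000.0
    return tx_margin_db - loss

# Illustrative VLC streetlight link: 30 m range, 10 dB clear-air margin (assumed)
for vis in (10.0, 1.0, 0.2):
    print(f"visibility {vis:4.1f} km -> margin {link_margin_db(10.0, 30.0, vis):6.2f} dB")
```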

Keywords: channel modeling, fog modeling, meteorological conditions, optical wireless communication, smoke modeling, visible light communication

Procedia PDF Downloads 133
7646 Modeling of the Heat and Mass Transfer in Fluids through Thermal Pollution in Pipelines

Authors: V. Radulescu, S. Dumitru

Abstract:

Introduction: The determination of the temperature field inside a fluid in motion has many practical applications, especially in the case of turbulent flow. The phenomenon is more pronounced when the solid walls have a different temperature than the fluid. Turbulent heat and mass transfer play an essential role in thermal pollution, as was recorded during the damage at the Oradea thermoelectric power plant (still closed today). Basic Methods: Solving the turbulent thermal pollution problem theoretically is particularly difficult. By using semi-empirical theories, or by simplifying the assumptions made on the basis of experimental measurements, a mathematical model can be elaborated for further numerical simulation. The three zones of the flow are analyzed separately: the vicinity of the solid wall, the turbulent transition zone, and the turbulent core. For each zone, the temperature distribution law is determined. The dependence between the Stanton and Prandtl numbers is determined, with correction factors based on experimental measurements. Major Findings/Results: The limit of the laminar thermal sublayer was determined based on the theory of Landau and Levice, using the assumption that the longitudinal component of the velocity pulsation and the pulsation frequency vary proportionally with the distance to the wall. For the calculation of the average temperature, a solution similar to that for the velocity is used, obtained by an analogous averaging. On these assumptions, numerical modeling with a temperature gradient was performed for turbulent flow in pipes (intact or damaged, with cracks) having four different diameters, between 200 and 500 mm, as in the Oradea thermoelectric power plant. Conclusions: A superposition of the molecular and turbulent viscosities was made, followed by the addition of the molecular and turbulent transfer coefficients, as needed to elaborate the theoretical and numerical models. The laminar boundary layer has a different thickness when flow with heat transfer is compared to flow without a temperature gradient. The results obtained with the developed model are within a 5% margin of error of the classical semi-empirical theories, based on the experimental data. Finally, a general correlation between the Stanton number and the Prandtl number is obtained for a specific flow (with the associated Reynolds number).
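
The authors' Stanton-Prandtl correlation with correction factors is not given in the abstract; as a hedged stand-in of the same kind, the sketch below evaluates the classical Dittus-Boelter correlation Nu = 0.023·Re^0.8·Pr^0.4 and the corresponding Stanton number St = Nu/(Re·Pr) for the stated pipe diameters, with the water properties and flow velocity assumed.

```python
def dittus_boelter_stanton(velocity, diameter, nu=1.0e-6, alpha=1.43e-7):
    """Classical turbulent-pipe correlation (heating): Nu = 0.023 Re^0.8 Pr^0.4,
    St = Nu / (Re * Pr). Fluid properties default to water near 20 C (assumed)."""
    Re = velocity * diameter / nu
    Pr = nu / alpha
    Nu = 0.023 * Re ** 0.8 * Pr ** 0.4
    return Re, Pr, Nu / (Re * Pr)

for d in (0.2, 0.3, 0.4, 0.5):           # the four pipe diameters, m
    Re, Pr, St = dittus_boelter_stanton(velocity=1.5, diameter=d)
    print(f"D = {d:.1f} m: Re = {Re:.2e}, Pr = {Pr:.1f}, St = {St:.2e}")
```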

Keywords: experimental measurements, numerical correlations, thermal pollution through pipelines, turbulent thermal flow

Procedia PDF Downloads 144