Search results for: management models
14933 RAPDAC: Role Centric Attribute Based Policy Driven Access Control Model
Authors: Jamil Ahmed
Abstract:
Access control models aim to decide whether a user should be granted or denied access to the user's requested activity. Various access control models have been established and proposed. The most prominent of these include role-based, attribute-based, and policy-based access control models, as well as the role-centric attribute-based access control model. In this paper, a novel access control model called the "Role-centric Attribute-based Policy-Driven Access Control (RAPDAC) model" is presented. RAPDAC incorporates the concept of "policy" into the role-centric attribute-based access control model. It leverages the concept of policy by precisely combining the evaluation of conditions, attributes, permissions and roles in order to authorize access. This approach allows the access control policy of a real-time application to be captured in a well-defined manner. The RAPDAC model allows access decisions to be made at much finer granularity, as illustrated by the case study of a real-time library information system. Keywords: authorization, access control model, role based access control, attribute based access control
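The abstract does not give a formal specification of RAPDAC, so the short Python sketch below is only a loose illustration of the general idea it describes: access is granted only when some policy's role, permission and attribute-based condition all match the request. All names and the example library-system rule are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Policy:
    """A hypothetical policy: a permission is granted to a role only if the
    attribute-based condition evaluates to True for the request."""
    role: str
    permission: str
    condition: callable  # maps request attributes -> bool

@dataclass
class Request:
    user_roles: set
    permission: str
    attributes: dict = field(default_factory=dict)

def authorize(request: Request, policies: list) -> bool:
    # Grant access only if some policy matches the requested permission,
    # one of the user's roles, and its attribute condition holds.
    return any(
        p.permission == request.permission
        and p.role in request.user_roles
        and p.condition(request.attributes)
        for p in policies
    )

# Hypothetical library-system policy: members may borrow only during opening hours.
policies = [Policy("member", "borrow_book",
                   lambda a: 8 <= a.get("hour", 0) < 20)]
req = Request(user_roles={"member"}, permission="borrow_book",
              attributes={"hour": 14})
print(authorize(req, policies))  # True
```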
Procedia PDF Downloads 159
14932 Predicting Growth of Eucalyptus Marginata in a Mediterranean Climate Using an Individual-Based Modelling Approach
Authors: S.K. Bhandari, E. Veneklaas, L. McCaw, R. Mazanec, K. Whitford, M. Renton
Abstract:
Eucalyptus marginata, E. diversicolor and Corymbia calophylla form widespread forests in south-west Western Australia (SWWA). These forests have economic and ecological importance, and therefore, tree growth and sustainable management are of high priority. This paper aimed to analyse and model the growth of these species at both stand and individual levels, but this presentation will focus on predicting the growth of E. marginata at the individual tree level. More specifically, the study wanted to investigate how well individual E. marginata tree growth could be predicted by considering the diameter and height of the tree at the start of the growth period, and whether this prediction could be improved by also accounting for the competition from neighbouring trees in different ways. The study also wanted to investigate how many neighbouring trees or what neighbourhood distance needed to be considered when accounting for competition. To achieve this aim, Pearson correlation coefficients were examined among competition indices (CIs) and between CIs and dbh growth, and the competition index that best predicts the diameter growth of individual trees was selected for E. marginata forest managed under different thinning regimes at Inglehope in SWWA. Furthermore, individual tree growth models were developed using simple linear regression, multiple linear regression, and linear mixed effect modelling approaches. Individual tree growth models were developed for thinned and unthinned stands separately. The developed models were validated using two approaches. In the first approach, models were validated using a subset of data that was not used in model fitting. In the second approach, the model of one growth period was validated with the data of another growth period. Tree size (diameter and height) was a significant predictor of growth. This prediction was improved when competition was included in the model. The fit statistic (coefficient of determination) of the models ranged from 0.31 to 0.68. Models with spatial competition indices were validated as more accurate than those with non-spatial indices. The model prediction can be optimized if 10 to 15 competitors (by number) or competitors within ~10 m (by distance) from the base of the subject tree are included in the model, which can reduce the time and cost of collecting information about the competitors. As competition from neighbours was a significant predictor with a negative effect on growth, it is recommended to include neighbourhood competition when predicting growth and to consider thinning treatments to minimize the effect of competition on growth. These modelling approaches are likely to be useful tools for the conservation and sustainable management of E. marginata forests in SWWA. As a next step in optimizing the number and distance of competitors, further studies in larger plots and with a larger number of plots than those used in the present study are recommended. Keywords: competition, growth, model, thinning
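The abstract does not reproduce the selected competition index, so the sketch below illustrates one common spatial choice, a Hegyi-type distance-weighted size-ratio index, restricted to competitors within ~10 m of the subject tree as the study recommends; the coordinates and diameters are hypothetical.

```python
import numpy as np

def hegyi_ci(subject_xy, subject_dbh, neigh_xy, neigh_dbh, radius=10.0):
    """Distance-weighted size-ratio competition index (Hegyi-type):
    CI_i = sum_j (dbh_j / dbh_i) / dist_ij over neighbours within `radius` metres."""
    neigh_xy = np.asarray(neigh_xy, dtype=float)
    neigh_dbh = np.asarray(neigh_dbh, dtype=float)
    dist = np.hypot(*(neigh_xy - np.asarray(subject_xy)).T)
    mask = (dist > 0) & (dist <= radius)
    return float(np.sum((neigh_dbh[mask] / subject_dbh) / dist[mask]))

# Hypothetical subject tree at (0, 0) with dbh 30 cm and three neighbours.
ci = hegyi_ci((0.0, 0.0), 30.0,
              [(3.0, 4.0), (8.0, 0.0), (12.0, 5.0)],  # (x, y) in metres
              [25.0, 40.0, 50.0])
print(round(ci, 3))  # the third neighbour lies beyond 10 m and is ignored
```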
Procedia PDF Downloads 128
14931 Gastronomy: The Preferred Digital Business Models and Impacts in Business Economics within Hospitality, Tourism, and Catering Sectors through Online Commerce
Authors: John Oupa Hlatshwayo
Abstract:
Background: There seem to be preferred digital business models, with varying impacts, within the hospitality, tourism and catering sub-sectors explored through online commerce, as all are ingrained in the business economics domain. Aim: The study aims to establish whether such phenomena (digital business models) exist and to what extent, if any, within the hospitality, tourism and catering industries, respectively. Setting: This is a qualitative study conducted by exploring four institutions globally through case studies. Method: This research used explanatory case studies to answer 'how' and 'why' questions where the researcher has little control over the occurrence of events. It is qualitative research combining deductive and inductive methods. Hence, a comprehensive approach to analyzing qualitative data was attainable through immersion, reading to understand the information. Findings: The results corroborated the notion that digital business models are applicable, by and large, in business economics. Thus, three sectors in which enterprises operate in the business economics sphere were narrowed down, i.e. hospitality, tourism and catering; these are also referred to as triangular polygons due to their atypical nature of being 'stand-alone' yet 'sub-sectors', although there are confounding factors to consider. Conclusion: The significance of digital business models and digital transformation shows an inevitable merger between business and technology within hospitality, tourism, and catering. Contribution: This symbiotic relationship between business and technology, the persistent evolution of clients' interfaces with end products, the ever-changing market, and enterprises' ongoing adaptation and adjustment to the 'new world order' must be embraced constantly by business practitioners, academics, business students, organizations and governments. Keywords: digital business models, hospitality, tourism, catering, business economics
Procedia PDF Downloads 17
14930 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity
Authors: Osayande Pascal Omondiagbe, Sherlock a Licorish
Abstract:
Stack Overflow is a popular community question and answer portal which is used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have been developed to aid developers by presenting interfaces for exploring Stack Overflow, developers often face challenges searching through the many possible answers to their questions, and this extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less attention has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to models’ and features’ complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, using answers drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models’ performance and quality given the type of features and complexity of models used. Researchers examining classifiers’ performance and quality and features’ complexity may leverage these findings in selecting suitable techniques when developing prediction models. Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow
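The paper's exact feature set and hyper-parameters are not given in the abstract; the sketch below only shows the general shape of such a prediction pipeline, combining simple feature selection with a random forest on hypothetical, synthetic answer features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Hypothetical features per answer: body length, code blocks, links,
# answerer reputation, answer age (hours), score.
X = rng.normal(size=(1000, 6))
y = (X[:, 3] + 0.5 * X[:, 5] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(SelectKBest(f_classif, k=4),      # drop the weakest features
                      RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(X_tr, y_tr)
print("accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))
```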
Procedia PDF Downloads 132
14929 Early Warning Signals: Role and Status of Risk Management in Small and Medium Enterprises
Authors: Alexander Kelíšek, Denisa Janasová, Veronika Mitašová
Abstract:
The use of weak signals is often associated with early warning. It is possible to find a link between early warning, or early problem detection, and risk management. The idea of early warning is very important in the context of crisis management because of the possibility of risk prevention. Weak signals are likened to risk symptoms. Nowadays, their usefulness as a tool of proactive problem solving is emphasized. Based on this, it is possible to use weak signals not only in strategic planning, project management, or early warning systems, but also as a subsidiary element in risk management. The main question is how to effectively integrate weak signals into risk management. The main aim of the paper is to point out the possibilities of using weak signals in the risk management of small and medium enterprises. Keywords: early warning system, weak signals, risk management, small and medium enterprises (SMEs)
Procedia PDF Downloads 427
14928 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations have structured and unstructured information in different formats, sources, and systems. Part of this comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as the absence of operational capabilities to tackle these kinds of projects. Data warehouses and their applications are considered non-proprietary tools, which are of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. The following paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational models, spatial data models, and the baseline of data modeling under UML and Big Data, and in this way seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for generating patterns and models derived from the structured object facts. Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse
Procedia PDF Downloads 409
14927 Using Traffic Micro-Simulation to Assess the Benefits of Accelerated Pavement Construction for Reducing Traffic Emissions
Authors: Sudipta Ghorai, Ossama Salem
Abstract:
Pavement maintenance, repair, and rehabilitation (MRR) processes may have considerable environmental impacts due to the traffic disruptions associated with work zones. The simulation models used to predict work zone emissions have mostly been static emission factor models (SEFD), which calculate emissions based on average operating conditions, e.g. average speed and vehicle type. Although these models produce accurate results for large-scale planning studies, they are not suitable for analyzing driving conditions at the micro level, such as acceleration, deceleration, idling, cruising, and queuing in a work zone. The purpose of this study is to prepare a comprehensive work zone environmental assessment (WEA) framework to calculate the emissions caused by disrupted traffic by integrating traffic microsimulation tools with emission models. This will help highway officials to assess the benefits of accelerated construction and opt for the most suitable TMP not only economically but also from an environmental point of view. Keywords: accelerated construction, pavement MRR, traffic microsimulation, congestion, emissions
Procedia PDF Downloads 449
14926 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental conditions and aggregates the data to a designated destination, called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption, due to the nodes' limited energy; therefore, the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For the problem, two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR), have been adopted with different power models (uniform power and non-uniform power, with or without power control) and different antenna models (omni-directional and directional). In this survey article, as the problem has proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in the various models, with latency as the performance measure. Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional
Procedia PDF Downloads 229
14925 Performance Evaluation of Using Genetic Programming Based Surrogate Models for Approximating Simulation Complex Geochemical Transport Processes
Authors: Hamed K. Esfahani, Bithin Datta
Abstract:
Transport of reactive chemical contaminant species in groundwater aquifers is a complex and highly non-linear physical and geochemical process, especially for real-life scenarios. Simulating this transport process involves solving complex nonlinear equations and generally requires huge computational time for a given aquifer study area. Development of optimal remediation strategies in aquifers may require repeated solution of such complex numerical simulation models. To overcome this computational limitation and improve the computational feasibility of a large number of repeated simulations, trained Genetic Programming based surrogate models are developed to approximately simulate such complex transport processes. The transport process of acid mine drainage, a hazardous pollutant, is first simulated using a numerical simulation model, HYDROGEOCHEM 5.0, for a contaminated aquifer in a historic mine site. The simulation model solution results for an illustrative contaminated aquifer site are then approximated by training and testing a Genetic Programming (GP) based surrogate model. Performance evaluation of the ensemble GP models as surrogate models for reactive species transport in groundwater demonstrates the feasibility of their use and the associated computational advantages. The results show the efficiency and feasibility of using ensemble GP surrogate models as approximate simulators of complex hydrogeologic and geochemical processes in a contaminated groundwater aquifer, incorporating the uncertainties of a historic mine site. Keywords: geochemical transport simulation, acid mine drainage, surrogate models, ensemble genetic programming, contaminated aquifers, mine sites
Procedia PDF Downloads 276
14924 Discrete Choice Modeling in Education: Evaluating Early Childhood Educators’ Practices
Authors: Michalis Linardakis, Vasilis Grammatikopoulos, Athanasios Gregoriadis, Kalliopi Trouli
Abstract:
Discrete choice models belong to the family of conjoint analysis and are applied to the preferences of respondents towards a set of scenarios that describe alternative choices. The scenarios have been pre-designed to cover all the attributes of the alternatives that may affect the choices. In this study, we examine how preschool educators integrate physical activities into their everyday teaching practices through the use of discrete choice models. One of the advantages of discrete choice models compared to other, more traditional data collection methods (e.g. questionnaires and interviews that use ratings) is that the respondent is asked to select among competing and realistic alternatives, rather than rate each attribute that the alternatives may have. We present the effort to construct and choose representative attributes that would cover all possible choices of the respondents, and the scenarios that have arisen. For the purposes of the study, we used a sample of 50 preschool educators in Greece who responded to 4 scenarios (out of the 16 scenarios that the orthogonal design produced), with each scenario having three alternative teaching practices. Seven attributes of the alternatives were used in the scenarios. For the analysis of the data, we used a multinomial logit model with random effects, a multinomial probit model and a generalized mixed logit model. The conclusions drawn from the estimated parameters of the models are discussed. Keywords: conjoint analysis, discrete choice models, educational data, multivariate statistical analysis
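As a rough sketch of the simplest estimation step mentioned above, the code below fits a plain (fixed-effects) multinomial logit to synthetic choice data with statsmodels; the random-effects and generalized mixed logit specifications used in the study require more specialised estimation that is not shown here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200                                        # hypothetical respondent-scenario rows
X = sm.add_constant(rng.normal(size=(n, 3)))   # three hypothetical scenario attributes

# Simulate a choice among three alternative teaching practices (coded 0, 1, 2).
beta = np.array([[0.0, 0.5, -0.3, 0.8],        # utility weights of alternatives 1 and 2
                 [0.2, -0.4, 0.6, 0.1]]).T
util = np.column_stack([np.zeros(n), X @ beta])
prob = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in prob])

fit = sm.MNLogit(y, X).fit(disp=False)         # plain multinomial logit
print(fit.params.round(2))                     # coefficients relative to alternative 0
```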
Procedia PDF Downloads 465
14923 Changing MBA Identities: Using Critical Reflection inside and out in Finding a New Narrative
Authors: Keith Schofield, Leigh Morland
Abstract:
Storytelling is an established means of leadership and management development and is also considered a form of leadership of self and others in its own right. This study focuses on the utility of storytelling in the development of management narratives in an MBA programme; sources include programme participants as well as international recruiters, whose voices are often only heard in terms of economic contribution and globalisation. For many MBA candidates, the return to study requires the development of a new identity which complements their professional identity; each candidate has their own journey and expectations, the use of story can enable candidates to explore their aspirations and assumptions and give voice to previously unspoken ideas. For international recruitment, the story of market development and change must be captured if MBAs are to remain fit for purpose. If used effectively, story acts as a form of critical reflection that can inform the learning journeys of individuals, emerging identities as well as the ongoing design and development of programmes. The landscape of management education is shifting; the MBA begins to attract a different kind of candidate, some are younger than before, others are seeking validation for their existing work practices, yet more are entrepreneurial and wish to capitalise on an institutional experience to further their career. There is a shift in context, creating uncertainty and ambiguity for programme managers and recruiters, thus requiring institutions to create a new MBA narrative. This study utilises Lego SeriousPlay as the means to engaging programme participants and international agents in telling the story of their MBA. We asked MBA participants to tell the story of their leadership and management aspirations and compare these to stories of their development journeys, allowing for critical reflection of their respective development gaps. We asked international recruiters, who act as university agents and promote courses in the student’s country of origin, to explore their mental models of MBA candidates and their learning agenda. The purpose of this process was to explore the agent’s perception of the MBA programme and to articulate the student journey from a recruitment perspective. The paper’s unique contribution is in combining these stories in order to explore the assumptions that determine programme design. Data drawn from reflective statements together with images of Lego ‘builds’ created the opportunity for reflection between the mental models of these groups. Findings will inform the design of the MBA journey and experience; we review the extent to which the changing identities of learners are congruent with programme design. Data from international recruiters also determines the extent to which marketing and recruitment strategies identify with would be candidates.Keywords: critical reflection, programme management, recruitment, storytelling
Procedia PDF Downloads 226
14922 Forecasting Model for Rainfall in Thailand: Case Study Nakhon Ratchasima Province
Authors: N. Sopipan
Abstract:
In this paper, we study the rainfall time series of weather stations in Nakhon Ratchasima province, Thailand, using various statistical methods to analyse the behaviour of rainfall in the study areas. Time-series analysis is an important tool in modelling and forecasting rainfall. ARIMA models and Holt-Winters models based on exponential smoothing were built. All the models proved to be adequate and can therefore provide information that helps decision makers establish strategies for the proper planning of agriculture, drainage systems and other water resource applications in Nakhon Ratchasima province. We found that the best-performing model for forecasting is ARIMA(1,0,1)(1,0,1)12. Keywords: ARIMA Models, exponential smoothing, Holt-Winter model
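A minimal sketch of fitting the reported ARIMA(1,0,1)(1,0,1)12 specification with statsmodels, using a synthetic monthly rainfall series in place of the station data, which are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly rainfall series (mm) standing in for the station data.
rng = np.random.default_rng(0)
idx = pd.date_range("2000-01", periods=120, freq="MS")
rain = pd.Series(100 + 60 * np.sin(2 * np.pi * idx.month / 12)
                 + rng.normal(scale=20, size=120), index=idx)

# ARIMA(1,0,1)(1,0,1)_12: the specification reported as best-performing.
model = SARIMAX(rain, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12))
fit = model.fit(disp=False)
print(fit.summary().tables[1])   # estimated AR/MA and seasonal terms
print(fit.forecast(steps=12))    # twelve-month-ahead rainfall forecast
```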
Procedia PDF Downloads 300
14921 A Risk Management Approach for Nigeria Manufacturing Industries
Authors: Olaniyi O. Omoyajowo
Abstract:
To be successful in today's competitive global environment, the manufacturing industry must be able to respond quickly to changes in technology. These changes in technology introduce new risks and hazards. The management of risks and hazards in a manufacturing process provides methods through which the success rate of an organization can be increased. Thus, there is a continual need for manufacturing industries to invest a significant amount of resources in risk management, which in turn optimizes the production output and profitability of any manufacturing industry (if implemented properly). To help improve the existing risk prevention and mitigation practices in Small and Medium Enterprises (SMEs) in Nigerian Manufacturing Industries (NMI), the researcher embarks on this research to develop a systematic risk management process. Keywords: manufacturing management, risk, risk management, SMEs
Procedia PDF Downloads 402
14920 Development of Performance Measures for the Implementation of Total Quality Management in Indian Industry
Authors: Perminderjit Singh, Sukhvir Singh
Abstract:
Total Quality Management (TQM) refers to management methods used to enhance quality and productivity in business organizations, and it has become a frequently used term in discussions concerning quality. TQM has raised the demands on organizational policy, and customers have gained more importance in the organization's focus. TQM is considered an important management tool which helps organizations to satisfy their customers. In the present research, critical success factors including management commitment, customer satisfaction, continuous improvement, work culture and environment, supplier quality management, training and development, employee satisfaction and product/process design are studied. A questionnaire is developed to assess these critical success factors in the implementation of total quality management in Indian industry. Questionnaires were completed in consultation with different industrial organizations. The data collected from the questionnaires are analyzed using descriptive and importance indexes. Keywords: total quality management, critical success factor, employee satisfaction, supplier quality management, customer focus, quality information, quality measurement
Procedia PDF Downloads 477
14919 Modeling the Relation between Discretionary Accrual Earnings Management, International Financial Reporting Standards and Corporate Governance
Authors: Ikechukwu Ndu
Abstract:
This study examines the econometric modeling of the relation between discretionary accrual earnings management, International Financial Reporting Standards (IFRS), and certain corporate governance factors with regard to listed Nigerian non-financial firms. Although discretionary accrual earnings management is a well-known and global problem that has an adverse impact on users of the financial statements, its relationship with IFRS and corporate governance is neither adequately researched nor properly systematically investigated in Nigeria. The dearth of research in the relation between discretionary accrual earnings management, IFRS and corporate governance in Nigeria has made it difficult for academics, practitioners, government setting bodies, regulators and international bodies to achieve a clearer understanding of how discretionary accrual earnings management relates to IFRS and certain corporate governance characteristics. This is the first study to the author’s best knowledge to date that makes interesting research contributions that significantly add to the literature of discretionary accrual earnings management and its relation with corporate governance and IFRS pertaining to the Nigerian context. A comprehensive review is undertaken of the literature of discretionary total accrual earnings management, IFRS, and certain corporate governance characteristics as well as the data, models, methodologies, and different estimators used in the study. Secondary financial statement, IFRS, and corporate governance data are sourced from Bloomberg database and published financial statements of Nigerian non-financial firms for the period 2004 to 2016. The methodology uses both the total and working capital accrual basis. This study has a number of interesting preliminary findings. First, there is a negative relationship between the level of discretionary accrual earnings management and the adoption of IFRS. However, this relationship does not appear to be statistically significant. Second, there is a significant negative relationship between the size of the board of directors and discretionary accrual earnings management. Third, CEO Separation of roles does not constrain earnings management, indicating the need to preserve relationships, personal connections, and maintain bonded friendships between the CEO, Chairman, and executive directors. Fourth, there is a significant negative relationship between discretionary accrual earnings management and the use of a Big Four firm as an auditor. Fifth, including shareholders in the audit committee, leads to a reduction in discretionary accrual earnings management. Sixth, the debt and return on assets (ROA) variables are significant and positively related to discretionary accrual earnings management. Finally, the company size variable indicated by the log of assets is surprisingly not found to be statistically significant and indicates that all Nigerian companies irrespective of size engage in discretionary accrual management. In conclusion, this study provides key insights that enable a better understanding of the relationship between discretionary accrual earnings management, IFRS, and corporate governance in the Nigerian context. 
It is expected that the results of this study will be of interest to academics, practitioners, regulators, governments, international bodies and other parties involved in policy setting and economic development in areas of financial reporting, securities regulation, accounting harmonization, and corporate governance. Keywords: discretionary accrual earnings management, earnings manipulation, IFRS, corporate governance
Procedia PDF Downloads 144
14918 Adjusting Electricity Demand Data to Account for the Impact of Loadshedding in Forecasting Models
Authors: Migael van Zyl, Stefanie Visser, Awelani Phaswana
Abstract:
The electricity landscape in South Africa is characterized by frequent occurrences of loadshedding, a measure implemented by Eskom to manage electricity generation shortages by curtailing demand. Loadshedding, classified into stages ranging from 1 to 8 based on severity, involves the systematic rotation of power cuts across municipalities according to predefined schedules. However, this practice introduces distortions in recorded electricity demand, posing challenges to the accurate forecasting essential for budgeting, network planning, and generation scheduling. Addressing this challenge requires the development of a methodology to quantify the impact of loadshedding and integrate it back into metered electricity demand data. Fortunately, comprehensive records of loadshedding impacts are maintained in a database, enabling the alignment of loadshedding effects with hourly demand data. This adjustment ensures that forecasts accurately reflect true demand patterns, independent of loadshedding's influence, thereby enhancing the reliability of electricity supply management in South Africa. This paper presents a methodology for determining the hourly impact of loadshedding and subsequently adjusting historical demand data to account for it. Furthermore, two forecasting models are developed: one utilizing the original dataset and the other using the adjusted data. A comparative analysis is conducted to evaluate the forecast accuracy improvements resulting from the adjustment process. By implementing this methodology, stakeholders can make more informed decisions regarding electricity infrastructure investments, resource allocation, and operational planning, contributing to the overall stability and efficiency of South Africa's electricity supply system. Keywords: electricity demand forecasting, load shedding, demand side management, data science
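The adjustment rules themselves are not detailed in the abstract; the pandas sketch below only illustrates the general idea of adding an estimated curtailed load back onto metered hourly demand, with the column names, timestamps and curtailment figures purely hypothetical.

```python
import pandas as pd

# Hypothetical metered hourly demand (MW) and a loadshedding log giving the
# estimated load curtailed in each affected hour.
demand = pd.DataFrame({
    "hour": pd.date_range("2023-06-01", periods=6, freq="h"),
    "metered_mw": [950, 940, 700, 720, 930, 960],
})
shedding = pd.DataFrame({
    "hour": pd.to_datetime(["2023-06-01 02:00", "2023-06-01 03:00"]),
    "stage": [4, 4],
    "curtailed_mw": [240, 230],   # estimated demand removed by loadshedding
})

adjusted = demand.merge(shedding, on="hour", how="left").fillna({"curtailed_mw": 0})
# Estimated true demand = metered demand + estimated curtailed load.
adjusted["adjusted_mw"] = adjusted["metered_mw"] + adjusted["curtailed_mw"]
print(adjusted[["hour", "metered_mw", "curtailed_mw", "adjusted_mw"]])
```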
Procedia PDF Downloads 61
14917 Expert Based System Design for Integrated Waste Management
Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy
Abstract:
Recently, an increasing number of researchers have been focusing on working out realistic solutions to sustainability problems. As sustainability issues gain higher importance for organisations, the management of such decisions becomes critical. Knowledge representation is a fundamental issue of complex knowledge based systems. Many types of sustainability problems would benefit from models based on experts' knowledge. Cognitive maps have been used for analyzing and aiding decision making. A cognitive map can be made of almost any system or problem. A fuzzy cognitive map (FCM) can successfully represent knowledge and human experience, introducing concepts to represent the essential elements and the cause and effect relationships among the concepts to model the behavior of any system. Integrated waste management systems (IWMS) are complex systems that can be decomposed into related and non-related subsystems and elements, where many factors have to be taken into consideration that may be complementary, contradictory, and competitive; these factors influence each other and determine the overall decision process of the system. The goal of the present paper is to construct an efficient IWMS which considers various factors. The authors' intention is to propose an expert based system design approach for implementing expert decision support in the area of IWMSs and to introduce an appropriate methodology for the development and analysis of group FCMs. A framework for such a methodology, consisting of development and application phases, is presented. Keywords: factors, fuzzy cognitive map, group decision, integrated waste management system
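The concepts and weights of the group FCM are not reproduced in the abstract; the sketch below shows only the standard FCM inference rule, A(t+1) = f(A(t) + A(t)·W) with a sigmoid squashing function, applied to a small hypothetical weight matrix.

```python
import numpy as np

def fcm_simulate(weights, initial_state, steps=30, lam=1.0):
    """Iterate a fuzzy cognitive map: A(t+1) = sigmoid(A(t) + A(t) @ W)."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-lam * x))
    state = np.asarray(initial_state, dtype=float)
    for _ in range(steps):
        state = sigmoid(state + state @ weights)
    return state

# Hypothetical 4-concept map, e.g. waste volume, recycling rate,
# treatment cost, environmental impact (weights are illustrative only).
W = np.array([[0.0,  0.4,  0.6,  0.7],
              [-0.3, 0.0, -0.2, -0.5],
              [0.0,  0.3,  0.0,  0.0],
              [0.0,  0.5,  0.2,  0.0]])
print(fcm_simulate(W, [0.8, 0.2, 0.5, 0.6]).round(3))   # steady-state activations
```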
Procedia PDF Downloads 276
14916 Concurrent Engineering Challenges and Resolution Mechanisms from Quality Perspectives
Authors: Grmanesh Gidey Kahsay
Abstract:
In modern technical engineering applications, quality is defined in two ways. The first is that quality is the parameter that measures how well a product's or service's characteristics meet and satisfy the pre-stated or fundamental needs (reliability, durability, serviceability). The second is that quality means a product or service free of any defects or deficiencies. The American Society for Quality (ASQ) describes quality as the pursuit of optimal solutions to confirm success and fulfillment and to be accountable for the product's or service's requirements and expectations. This article focuses on quality engineering tools in modern industrial applications. Quality engineering is a field of engineering that deals with the principles, techniques, models, and applications that guarantee the quality of a product or service. Covering the entire set of activities needed to analyze a product's design and development, quality engineering emphasizes how to make sure that products and services are designed and developed to meet consumers' requirements. This article introduces quality tools such as quality systems, auditing, product design, and process control. The findings present ideas that aim to improve quality engineering proficiency and effectiveness by introducing essential quality techniques and tools in selected industries. Keywords: essential quality tools, quality systems and models, quality management systems, and quality assurance
Procedia PDF Downloads 152
14915 Comparative Analysis of Effecting Factors on Fertility by Birth Order: A Hierarchical Approach
Authors: Ali Hesari, Arezoo Esmaeeli
Abstract:
Given the dramatic changes in fertility and higher-order births during recent decades in Iran, knowledge about the factors affecting different birth orders is of crucial importance. In this study, in keeping with the hierarchical structure of much social science data, the effects of variables at different levels of social phenomena that determine different birth orders in the 365 days ending at the 1390 census have been explored using a multilevel approach. In this paper, the 2% individual record data of the 1390 census are analyzed with HLM software. Three different hierarchical linear regression models are estimated: for first and second births, for third births, and for fourth and higher-order births. The results display different outcomes for the three models. The individual-level variables entered in the equations are region of residence (rural/urban), age, educational level and labor participation status, and the province-level variable is GDP per capita. Results show that the individual-level variables have different effects in these three models, and at the second level there are different random and fixed effects across the models. Keywords: fertility, birth order, hierarchical approach, fixed effects, random effects
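The models were estimated in HLM; as a rough open-source equivalent of one such two-level specification (individual-level predictors with a random intercept per province), the sketch below uses statsmodels on synthetic data, with all variable names hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, provinces = 2000, 30
df = pd.DataFrame({
    "province": rng.integers(provinces, size=n),
    "urban": rng.integers(2, size=n),          # 1 = urban, 0 = rural
    "age": rng.integers(15, 50, size=n),
    "education": rng.integers(0, 4, size=n),   # ordinal education level
})
prov_effect = rng.normal(scale=0.3, size=provinces)
df["births"] = (1.5 + 0.02 * df["age"] - 0.2 * df["education"]
                - 0.1 * df["urban"] + prov_effect[df["province"]]
                + rng.normal(scale=0.5, size=n))

# Two-level model: individual-level fixed effects, random intercept by province.
model = smf.mixedlm("births ~ age + education + urban", df, groups=df["province"])
print(model.fit().summary())
```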
Procedia PDF Downloads 339
14914 Ground State Phases in Two-Mode Quantum Rabi Models
Authors: Suren Chilingaryan
Abstract:
We study two models describing a single two-level system coupled to two boson field modes in either a parallel or orthogonal setup. Both models may be feasible for experimental realization through Raman adiabatic driving in cavity QED. We study their ground state configurations; that is, we find the quantum precursors of the corresponding semi-classical phase transitions. We found that the ground state configurations of both models present the same critical coupling as the quantum Rabi model. Around this critical coupling, the ground state goes from the so-called normal configuration with no excitation, the qubit in the ground state and the fields in the quantum vacuum state, to a ground state with excitations, the qubit in a superposition of ground and excited state, while the fields are not in the vacuum anymore, for the first model. The second model shows a more complex ground state configuration landscape where we find the normal configuration mentioned above, two single-mode configurations, where just one of the fields and the qubit are excited, and a dual-mode configuration, where both fields and the qubit are excited.Keywords: quantum optics, quantum phase transition, cavity QED, circuit QED
Procedia PDF Downloads 367
14913 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models using efficient time-dependent numerical methods. Several techniques are integrated to reduce the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method. Keywords: integral differential equations, jump–diffusion model, American options, rational approximation
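The discretised jump-diffusion operator is not reproduced in the abstract; the sketch below only demonstrates the core FFT trick described above, multiplying a Toeplitz (constant-diagonal) matrix by a vector in O(M log M) by embedding it in a circulant matrix, assuming the dense jump matrix has Toeplitz structure; the kernel used is hypothetical.

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_matvec_fft(c, r, x):
    """Multiply the Toeplitz matrix with first column c and first row r by x
    in O(M log M), via embedding into a circulant matrix and using the FFT."""
    m = len(x)
    circ_col = np.concatenate([c, [0.0], r[1:][::-1]])   # first column of the circulant
    y = np.fft.ifft(np.fft.fft(circ_col) *
                    np.fft.fft(np.concatenate([x, np.zeros(m)])))
    return y[:m].real

# Check against the dense O(M^2) product on a hypothetical decaying jump kernel.
rng = np.random.default_rng(0)
m = 512
c = np.exp(-0.1 * np.arange(m))          # first column
r = np.exp(-0.2 * np.arange(m)); r[0] = c[0]   # first row
x = rng.normal(size=m)
print(np.allclose(toeplitz_matvec_fft(c, r, x), toeplitz(c, r) @ x))   # True
```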
Procedia PDF Downloads 119
14912 Modelling and Optimization of Laser Cutting Operations
Authors: Hany Mohamed Abdu, Mohamed Hassan Gadallah, El-Giushi Mokhtar, Yehia Mahmoud Ismail
Abstract:
Laser beam cutting is a nontraditional machining process. This paper optimizes the laser beam cutting parameters for stainless steel (316L) by considering the effect of the input parameters, viz. power, oxygen pressure, frequency and cutting speed. A statistical design of experiments is carried out at three different levels, and process responses such as average kerf taper (Ta) and surface roughness (Ra) are measured accordingly. A quadratic mathematical model (RSM) for each of the responses is developed as a function of the process parameters. Responses predicted by the models (as per Taguchi's L27 orthogonal array) are employed to search for an optimal parametric combination to achieve the desired yield of the process. RSM models are developed for the mean responses, the S/N ratio, and the standard deviation of the responses. Optimization models are formulated as single-objective problems subject to process constraints. Models are formulated based on Analysis of Variance (ANOVA) in the MATLAB environment. The optimum solutions are compared with the Taguchi methodology results. Keywords: optimization, laser cutting, robust design, kerf width, Taguchi method, RSM and DOE
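The fitted coefficients and the actual L27 runs are not reproduced in the abstract; the sketch below only shows the mechanics of fitting a second-order (quadratic) response surface to design-of-experiments data with scikit-learn, using hypothetical factor levels and a hypothetical roughness response.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Hypothetical L27-style design: 27 runs, four factors at three coded levels (-1, 0, 1).
X = rng.choice([-1.0, 0.0, 1.0], size=(27, 4))    # power, pressure, frequency, speed
ra = (1.8 + 0.4 * X[:, 0] - 0.3 * X[:, 3] + 0.2 * X[:, 0] * X[:, 3]
      + 0.15 * X[:, 1] ** 2 + rng.normal(scale=0.05, size=27))  # surface roughness

# Quadratic RSM: linear, interaction and squared terms in one model.
rsm = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                    LinearRegression())
rsm.fit(X, ra)
print("R^2 on the design points:", round(rsm.score(X, ra), 3))
print("predicted Ra at power=+1, speed=-1:",
      round(float(rsm.predict([[1.0, 0.0, 0.0, -1.0]])[0]), 3))
```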
Procedia PDF Downloads 620
14911 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis
Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior
Abstract:
Mathematical models of drying are used to understand the drying process in order to determine important parameters for the design and operation of the dryer. The jackfruit is a highly consumed and perishable fruit in the Northeast. It is necessary to apply techniques that preserve it for longer in order to distribute it to regions with low consumption. This study aimed to analyse several mathematical models (Page, Lewis, and Midilli) and indicate the one that best fits the convective drying process, using performance indicators associated with each model: accuracy (Af) and bias (Bf) factors, root mean square error (RMSE) and standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50°C for 9 hours. It is observed that the Midilli model was more accurate, with Af: 1.39, Bf: 1.33, RMSE: 0.01%, and SEP: 5.34. However, the use of the Midilli model is not appropriate for process control purposes due to the need for four tuning parameters. With the performance indicators used in this paper, the Page model showed similar results with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by the Page model. Keywords: drying, models, jackfruit, biotechnology
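A minimal sketch, assuming a synthetic moisture-ratio series in place of the experimental jackfruit data, of how the three thin-layer models and the four performance indicators could be computed with SciPy; the Af and Bf expressions follow the usual accuracy-factor and bias-factor definitions, which the abstract does not spell out.

```python
import numpy as np
from scipy.optimize import curve_fit

lewis   = lambda t, k: np.exp(-k * t)                       # MR = exp(-k t)
page    = lambda t, k, n: np.exp(-k * t ** n)               # MR = exp(-k t^n)
midilli = lambda t, a, k, n, b: a * np.exp(-k * t ** n) + b * t

def indicators(obs, pred):
    ratio = np.log10(pred / obs)
    rmse = float(np.sqrt(np.mean((obs - pred) ** 2)))
    return {"Af": 10 ** np.mean(np.abs(ratio)),     # accuracy factor
            "Bf": 10 ** np.mean(ratio),             # bias factor
            "RMSE": rmse,
            "%SEP": 100 * rmse / float(np.mean(obs))}

# Hypothetical moisture-ratio observations over a 9 h convective drying run.
rng = np.random.default_rng(0)
t = np.linspace(0.25, 9.0, 20)
obs = np.exp(-0.3 * t ** 1.1) * (1 + rng.normal(scale=0.02, size=20))

for name, model, p0 in [("Lewis", lewis, [0.3]),
                        ("Page", page, [0.3, 1.0]),
                        ("Midilli", midilli, [1.0, 0.3, 1.0, 0.0])]:
    popt, _ = curve_fit(model, t, obs, p0=p0, maxfev=10000)
    print(name, {k: round(v, 4) for k, v in indicators(obs, model(t, *popt)).items()})
```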
Procedia PDF Downloads 379
14910 Investigations of Flow Field with Different Turbulence Models on NREL Phase VI Blade
Authors: T. Y. Liu, C. H. Lin, Y. M. Ferng
Abstract:
Wind energy is one of the clean renewable energy sources. However, the low-frequency (20-200 Hz) noise generated by the wind turbine blades, which bothers nearby residents, has become a major problem to be addressed. Analysis of the flow field and pressure distribution on the wind turbine blades is useful for predicting this aerodynamic noise. Therefore, the main objective of this study is to use different turbulence models to analyse the flow field and pressure distributions of the blades. Three-dimensional Computational Fluid Dynamics (CFD) simulation of the flow field was used to calculate the flow phenomena for the National Renewable Energy Laboratory (NREL) Phase VI horizontal axis wind turbine rotor. Two flow cases with different wind speeds were investigated: 7 m/s at 72 rpm and 15 m/s at 72 rpm. Four kinds of RANS-based turbulence models, standard k-ε, realizable k-ε, SST k-ω, and v2f, were used to predict and analyse the results in the present work. The results show that the pressure distributions predicted with the SST k-ω and v2f turbulence models are in good agreement with experimental data. Keywords: horizontal axis wind turbine, turbulence model, noise, fluid dynamics
Procedia PDF Downloads 265
14909 Climate Change Effects on Agriculture
Authors: Abdellatif Chebboub
Abstract:
Agricultural production is sensitive to weather and thus directly affected by climate change. Plausible estimates of these climate change impacts require combined use of climate, crop, and economic models. Results from previous studies vary substantially due to differences in models, scenarios, and data. This paper is part of a collective effort to systematically integrate these three types of models. We focus on the economic component of the assessment, investigating how nine global economic models of agriculture represent endogenous responses to seven standardized climate change scenarios produced by two climate and five crop models. These responses include adjustments in yields, area, consumption, and international trade. We apply biophysical shocks derived from the Intergovernmental Panel on Climate Change’s representative concentration pathway with end-of-century radiative forcing of 8.5 W/m2. The mean biophysical yield effect with no incremental CO2 fertilization is a 17% reduction globally by 2050 relative to a scenario with unchanging climate. Endogenous economic responses reduce yield loss to 11%, increase area of major crops by 11%, and reduce consumption by 3%. Agricultural production, cropland area, trade, and prices show the greatest degree of variability in response to climate change, and consumption the lowest. The sources of these differences include model structure and specification; in particular, model assumptions about ease of land use conversion, intensification, and trade. This study identifies where models disagree on the relative responses to climate shocks and highlights research activities needed to improve the representation of agricultural adaptation responses to climate change.Keywords: climate change, agriculture, weather change, danger of climate change
Procedia PDF Downloads 316
14908 Correlation between Speech Emotion Recognition Deep Learning Models and Noises
Authors: Leah Lee
Abstract:
This paper examines how noise interacts with deep learning models for speech emotion recognition, to see whether or not noise masks emotions. The deep learning models used are plain convolutional neural networks (CNN), auto-encoder, long short-term memory (LSTM), and Visual Geometry Group-16 (VGG-16). The emotion datasets used are the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS), the Crowd-sourced Emotional Multimodal Actors Dataset (CREMA-D), the Toronto Emotional Speech Set (TESS), and Surrey Audio-Visual Expressed Emotion (SAVEE). To make the data four times bigger, stretch and pitch augmentations are applied to the audio files. From the augmented datasets, five different features are extracted as inputs to the models. There are eight different emotions to be classified. The noise variations are white noise, dog barking, and cough sounds, and the signal-to-noise ratio (SNR) is varied over 0, 20, and 40. In summary, per deep learning model, nine different sets with noise and SNR variations, plus the augmented audio files without any noise, are used in the experiment. To compare the results of the deep learning models, the accuracy and receiver operating characteristic (ROC) are checked. Keywords: auto-encoder, convolutional neural networks, long short-term memory, speech emotion recognition, visual geometry group-16
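The feature-extraction and model-training code is not reproduced here; the sketch below only shows the noise-mixing step, scaling a noise clip so that it is added to a speech clip at a chosen signal-to-noise ratio (assumed to be expressed in dB), with synthetic signals standing in for the corpus audio.

```python
import numpy as np

def add_noise_at_snr(speech, noise, snr_db):
    """Mix `noise` into `speech` so the resulting signal-to-noise ratio is snr_db."""
    noise = np.resize(noise, speech.shape)       # loop/trim noise to the clip length
    p_speech = np.mean(speech ** 2)
    p_noise = np.mean(noise ** 2)
    scale = np.sqrt(p_speech / (p_noise * 10 ** (snr_db / 10)))
    return speech + scale * noise

# Synthetic stand-ins for a speech clip and a noise clip at 16 kHz.
rng = np.random.default_rng(0)
t = np.arange(16000) / 16000
speech = 0.5 * np.sin(2 * np.pi * 220 * t)
noise = rng.normal(scale=0.1, size=8000)

for snr in (0, 20, 40):
    mixed = add_noise_at_snr(speech, noise, snr)
    achieved = 10 * np.log10(np.mean(speech ** 2) /
                             np.mean((mixed - speech) ** 2))
    print(f"target {snr} dB -> achieved {achieved:.1f} dB")
```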
Procedia PDF Downloads 75
14907 New Public Management: Step towards Democratization
Authors: Aneri Mehta, Krunal Mehta
Abstract:
Administration is largely based on two sciences: ‘management science’ and ‘political science’. The approach of new public management is more inclined towards management science. The era of ‘New Public Management’ has affected developing countries immensely. Public management reforms are needed to enhance the development of these countries. Such reform mainly includes capacity building, control of corruption, political decentralization, debureaucratization and public empowerment. This gives the opportunity to create self-sustaining change in governance. This paper examines the link between the new public management approach and its effect on building effective democratization in the country. This approach mainly focuses on the rationality and effectiveness of the governance system, which requires deep efforts in the technological, organizational, social and cultural fields. Bringing citizen participation into governance is a main objective of NPM. The shift from traditional public management to new public management has had a low success rate of reforms. This research includes a case study of RTI, which is a big step by government towards a citizen-centric approach to governance. The aspect of ‘publicness’ in democratic policy implementation is important for good governance in India. Keywords: public management, development, public empowerment, governance
Procedia PDF Downloads 505
14906 Potential Impact of Climate Change on Suspended Sediment Changes in Mekong River Basin
Authors: Zuliziana Suif, Nordila Ahmad, Sengheng Hul
Abstract:
This paper evaluates the impact of climate change on suspended sediment changes in the Mekong River Basin. In this study, a distributed process-based sediment transport model is used to examine the potential impact of future climate on suspended sediment dynamics in the Mekong River Basin. To this end, climate scenarios from two General Circulation Models (GCMs) were considered in the scenario analysis. The simulation results show that the sediment load and concentration increase by 0.64% to 69% in the near future (2041-2050) and by 2.5% to 95% in the far future (2090-2099). As the projected climate change impact on sediment varies remarkably between the different climate models, this uncertainty should be taken into account in sediment management. Overall, the changes in sediment load and concentration can have great implications for related sediment management. Keywords: climate change, suspended sediment, Mekong River Basin, GCMs
Procedia PDF Downloads 443
14905 Operating System Based Virtualization Models in Cloud Computing
Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi
Abstract:
Cloud computing is ready to transform the structure of businesses and learning by supplying real-time applications and providing immediate help for small to medium-sized businesses. The ability to run a hypervisor inside a virtual machine is an important feature of virtualization and is called nested virtualization. In today's growing field of information technology, many virtualization models are available that provide a convenient approach to implementation, but deciding on a single model is difficult. This paper explains the applications of operating system based virtualization in cloud computing and identifies a suitable model given the models' different specifications and users' requirements. In the present paper, the most popular models are selected, with the selection based on container-based and hypervisor-based virtualization. The selected models were compared against a wide range of user requirements, such as number of CPUs, memory size, nested virtualization support, live migration and commercial support, and we identified the most suitable virtualization model. Keywords: virtualization, OS based virtualization, container based virtualization, hypervisor based virtualization
Procedia PDF Downloads 329
14904 A Predictive MOC Solver for Water Hammer Waves Distribution in Network
Authors: A. Bayle, F. Plouraboué
Abstract:
Water distribution networks (WDN) still suffer from a lack of knowledge about the prediction of fast pressure transient events, although the latter may considerably impact their durability. Accidental or planned operating activities indeed give rise to complex pressure interactions and may drastically modify the local pressure value, generating leaks and, in rare cases, pipe breaks. In this context, a numerical predictive analysis is conducted to prevent such events and optimize network management. A pair of home-made Python/FORTRAN 90 codes has been developed that uses the Method of Characteristics (MOC) to solve the water hammer equations. The solver is validated by direct comparison with theoretical results and experimental measurements in simple configurations and is afterwards extended to network analysis. The algorithm's most costly steps are designed for parallel computation. A varied set of boundary conditions and energy loss models is considered for the network simulations. The results are analyzed in both the real and frequency domains and provide crucial information on the pressure distribution behavior within the network. Keywords: energetic losses models, method of characteristic, numerical predictive analysis, water distribution network, water hammer
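The network solver itself is not reproduced in the abstract; the sketch below only shows the classic MOC update at an interior section of a single pipe, intersecting the C+ and C- compatibility equations with a steady friction term; all pipe parameters are hypothetical.

```python
import numpy as np

# Hypothetical pipe parameters.
a, g = 1000.0, 9.81          # wave speed (m/s), gravity (m/s^2)
D, f = 0.5, 0.02             # diameter (m), Darcy friction factor
A = np.pi * D ** 2 / 4       # cross-sectional area (m^2)
dx = 100.0                   # reach length between computational sections (m)
B = a / (g * A)              # characteristic impedance
R = f * dx / (2 * g * D * A ** 2)   # steady friction coefficient

def interior_node(H_left, Q_left, H_right, Q_right):
    """One MOC time step at an interior section: intersect the C+ characteristic
    coming from the left neighbour with the C- characteristic from the right."""
    Cp = H_left + B * Q_left - R * Q_left * abs(Q_left)       # C+ compatibility
    Cm = H_right - B * Q_right + R * Q_right * abs(Q_right)   # C- compatibility
    H_new = 0.5 * (Cp + Cm)
    Q_new = (Cp - Cm) / (2 * B)
    return H_new, Q_new

# Example: a small head difference between neighbouring sections.
print(interior_node(H_left=50.0, Q_left=0.10, H_right=49.5, Q_right=0.10))
```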
Procedia PDF Downloads 232