Search results for: ‘Layer Diffusion Control’ model

416 Effects of Process Parameters on the Yield of Oil from Coconut Fruit

Authors: Ndidi F. Amulu, Godian O. Mbah, Maxwel I. Onyiah, Callistus N. Ude

Abstract:

The properties of coconut (Cocos nucifera) and its oil were evaluated in this work using standard analytical techniques. The analyses carried out include proximate composition of the fruit, extraction of oil from the fruit using different process parameters and physicochemical analysis of the extracted oil. The results showed the moisture, crude lipid, crude protein, ash and carbohydrate contents of the coconut as 7.59%, 55.15%, 5.65%, 7.35% and 19.51% respectively. The oil from the coconut fruit was an odourless, yellowish liquid at room temperature (30 °C). The treatment combinations used (leaching time, leaching temperature and solute:solvent ratio) showed significant differences (P < 0.05) in the yield of oil from coconut flour. The oil yield ranged between 36.25% and 49.83%. Lipid indices of the coconut oil indicated the acid value (AV) as 10.05 NaOH/g of oil, free fatty acid (FFA) as 5.03%, saponification value (SV) as 183.26 mg KOH/g of oil, iodine value (IV) as 81.00 I2/g of oil, peroxide value (PV) as 5.00 ml/g of oil and viscosity (V) as 0.002. The statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA), and to generate single effect, interaction effect and contour plots. The response (yield of oil from the coconut flour) was used to develop a mathematical model that correlates the yield to the process variables studied. The conditions that gave the highest yield of coconut oil were a leaching time of 2 h, a leaching temperature of 50 °C and a solute/solvent ratio of 0.05 g/ml.

Keywords: Coconut, oil-extraction, optimization, physicochemical, proximate.
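
As a rough illustration of the regression and ANOVA workflow described in the abstract above, the following Python/statsmodels sketch fits a model of oil yield against the three process variables; the data, column names and model form are invented stand-ins, not the authors' Minitab analysis.

```python
# Hypothetical sketch: response-surface style regression of oil yield on
# leaching time, temperature and solute:solvent ratio, plus an ANOVA table.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 30
df = pd.DataFrame({
    "time_h": rng.uniform(1, 3, n),        # leaching time, h
    "temp_c": rng.uniform(40, 60, n),      # leaching temperature, °C
    "ratio":  rng.uniform(0.05, 0.15, n),  # solute:solvent ratio, g/ml
})
# toy response: yield rises with time and temperature, falls with ratio
df["oil_yield"] = (20 + 8 * df.time_h + 0.4 * df.temp_c - 50 * df.ratio
                   + rng.normal(0, 1.5, n))

# main effects plus all interactions, analogous to a Minitab model fit
model = smf.ols("oil_yield ~ time_h * temp_c * ratio", data=df).fit()
print(model.summary())                  # fitted yield model coefficients
print(sm.stats.anova_lm(model, typ=2))  # ANOVA table for the process variables
```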

415 Environmental Management in Arid Regions: The Question of Water

Authors: Yousef Bakhbakhi, Mourad Boumaza

Abstract:

Only recently have water ethics received focused interest in the international water community. Because water is metabolically basic to life, an ethical dimension persists in every decision related to water. Water ethics at once express human society's approach to water and act as guidelines for behaviour. Ideas around water are often implicit and embedded as assumptions. They can be entrenched in behaviour and difficult to contest because they are difficult to "see". By explicitly revealing the ethical ideas underlying water-related decisions, human society's relationship with water, and with the natural systems of which water is part, can be contested and shifted, or be accepted with conscious intention by human society. In recent decades, improved understanding of water's importance for ecosystem functioning and ecological services for human survival is moving us beyond this growth-driven, supply-focused management paradigm. Environmental ethics challenge this paradigm by extending the ethical sphere to the environment and thus to water and water resources management per se. An ethical approach is a legitimate, important, and often ignored approach to effect change in environmental decision making. This qualitative research explores principles of water ethics and examines the underlying ethical precepts of selected water policy examples. The constructed water ethic principles act as a set of criteria against which a policy comparison can be established. This study shows that water resources management becomes a progressive issue by embracing full public participation, a new planning model, and knowledge-generation initiatives.

Keywords: water resources, environmental management, public participation.

414 Landfill Failure Mobility Analysis: A Probabilistic Approach

Authors: Ali Jahanfar, Brajesh Dubey, Bahram Gharabaghi, Saber Bayat Movahed

Abstract:

Ever increasing population growth of major urban centers and environmental challenges in siting new landfills have resulted in a growing trend in the design of mega-landfills, some with extraordinary heights and dangerously steep slopes. Landfill failure mobility risk analysis is one of the most uncertain types of dynamic rheology models due to very large inherent variabilities in the shear strength properties of the heterogeneous solid waste material. The waste flows of three historic dumpsite failures and two landfill failures were back-analyzed using run-out modeling with the DAN-W model. The travel distances of the waste flow during landfill failures were calculated with an approach that takes into account variability in material shear strength properties. The probability distribution functions for shear strength properties of the waste material were grouped into four major classes based on waste material compaction (landfills versus dumpsites) and composition (high versus low quantity of high shear strength waste materials such as wood, metal, plastic, paper and cardboard in the waste). This paper presents a probabilistic method for estimation of the spatial extent of waste avalanches, after a potential landfill failure, to create maps of vulnerability scores to inform property owners and residents of the level of the risk.

Keywords: Landfill failure, waste flow, Voellmy rheology, friction coefficient, waste compaction and type.
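
A toy Monte Carlo sketch of the probabilistic idea described above, using a simple energy-line (sliding-block) approximation of runout rather than the DAN-W model; the height, distribution parameters and friction-angle bounds are illustrative assumptions.

```python
# Toy sketch: sample a bulk friction angle for the waste mass and estimate
# runout distance with the energy-line relation  L ≈ H / tan(phi).
import numpy as np

rng = np.random.default_rng(42)

H = 60.0                      # assumed landfill height above the toe, m
n_sim = 100_000

# assumed lognormal spread of the bulk friction angle (degrees), standing in
# for the shear-strength classes described in the abstract
phi_deg = rng.lognormal(mean=np.log(20.0), sigma=0.25, size=n_sim)
phi_rad = np.radians(np.clip(phi_deg, 5.0, 45.0))

runout = H / np.tan(phi_rad)  # energy-line estimate of travel distance, m

for p in (50, 90, 99):
    print(f"{p}th percentile runout: {np.percentile(runout, p):.0f} m")
```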

413 Investigating the Pedestrian Willingness to Pay to Choose Appropriate Policies for Improving the Safety of Pedestrian Facilities

Authors: Babak Mirbaha, Mahmoud Saffarzadeh, Fatemeh Mohajeri

Abstract:

Road traffic accidents lead to high rates of death and injury, especially among vulnerable road users such as pedestrians. Improving the safety of facilities for pedestrians is a major concern for policymakers because of the high number of pedestrian fatalities and the direct and indirect costs imposed on society. This study focuses on determining pedestrians' willingness to pay for increasing their safety while crossing the street. Three different scenarios are presented: crossing the street with zebra crossing facilities, crossing the street with zebra crossing facilities plus a pedestrian traffic light, and crossing via a pedestrian bridge with an escalator. The research was conducted based on the stated preferences method. The required data were collected from a questionnaire that consisted of three parts: pedestrians' demographic characteristics, travel characteristics and the scenarios. Four different payment amounts are presented for each scenario, and a logit model has been built for each proposed payment. The results show that sex, age, education, average household income and individual salary have significant effects on choosing a scenario. Among the policies mentioned in the questionnaire scenarios, the scenario of crossing the street with zebra crossing facilities plus a traffic light is chosen most frequently, with a willingness to pay of 10,000 Rials, and the scenario of crossing the street with a zebra crossing alone, at a willingness to pay of 100,000 Rials, has the least frequency. For all scenarios, as the payment amount increases, the willingness to pay decreases.

Keywords: Pedestrians, willingness to pay, safety, immunization.
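
A minimal sketch of fitting one of the binary logit models mentioned above, on synthetic responses rather than the survey data; the variable names and coefficients are assumptions for illustration.

```python
# Illustrative logit fit: willingness to accept one payment level for one
# scenario, regressed on respondent attributes.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "male":   rng.integers(0, 2, n),
    "age":    rng.integers(18, 70, n),
    "income": rng.normal(30, 10, n),   # household income, arbitrary units
})
# toy response: richer and younger respondents are more willing to pay
logit_p = -1.0 + 0.05 * df.income - 0.02 * df.age + 0.3 * df.male
df["accept"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["male", "age", "income"]])
model = sm.Logit(df["accept"], X).fit(disp=False)
print(model.summary())   # significance of each attribute at this payment level
```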

412 Achieving Design-Stage Elemental Cost Planning Accuracy: Case Study of New Zealand

Authors: Johnson Adafin, James O. B. Rotimi, Suzanne Wilkinson, Abimbola O. Windapo

Abstract:

An aspect of client expenditure management that requires attention is the level of accuracy achievable in design-stage elemental cost planning. This has been a major concern for construction clients and practitioners in New Zealand (NZ). Pre-tender estimating inaccuracies are significantly influenced by the level of risk information available to estimators. Proper cost planning activities should ensure the production of a project's likely construction costs (initial and final), and subsequent cost control activities should prevent the unpleasant consequences of cost overruns, disputes and project abandonment. If risks were properly identified and priced at the design stage, the observed variance between design-stage elemental cost plans (ECPs) and final tender sums (FTS) (initial contract sums) could be reduced. This study investigates the variations between design-stage ECPs and FTS of construction projects, with a view to identifying the risk factors responsible for the observed variance. Data were sourced through interviews, and risk factors were identified using thematic analysis. Access was obtained to project files from the records of study participants (consultant quantity surveyors), and document analysis was employed to complement the responses from the interviews. Study findings revealed discrepancies between ECPs and FTS in the range of -14% to +16%. This study suggests that the identified risk factors were responsible for the variability observed. The values obtained from the analysis would enable greater accuracy in the forecast of FTS by quantity surveyors. Further, whilst inherent risks in construction project developments are observed globally, these findings have important ramifications for construction projects by expanding existing knowledge on what is needed for reasonable budgetary performance and successful delivery of construction projects. The findings provide quantitative confirmation of the theoretical conclusions generated in the literature from around the world, thereby adding to and consolidating existing knowledge.

Keywords: Accuracy, design-stage, elemental cost plan, final tender sum, New Zealand.
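
A minimal arithmetic sketch of the variance measure discussed above, under the assumed definition of variance as the percentage departure of the final tender sum from the design-stage cost plan; the figures are made up.

```python
# Assumed definition: variance (%) = (FTS - ECP) / ECP * 100
def ecp_variance(ecp: float, fts: float) -> float:
    """Percentage by which the final tender sum departs from the cost plan."""
    return (fts - ecp) / ecp * 100.0

print(ecp_variance(ecp=4_000_000, fts=4_480_000))  # +12.0 -> within the +16% band
print(ecp_variance(ecp=4_000_000, fts=3_520_000))  # -12.0 -> within the -14% band
```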

411 Stability Optimization of Functionally Graded Pipes Conveying Fluid

Authors: Karam Y. Maalawi, Hanan E.M EL-Sayed

Abstract:

This paper presents an exact analytical model for optimizing the stability of thin-walled, composite, functionally graded pipes conveying fluid. The critical flow velocity at which divergence occurs is maximized for a specified total structural mass in order to ensure the economic feasibility of the attained optimum designs. The composition of the material of construction is optimized by defining the spatial distribution of volume fractions of the material constituents using piecewise variations along the pipe length. The major aim is to tailor the material distribution in the axial direction so as to avoid the occurrence of divergence instability without the penalty of increasing structural mass. Three types of boundary conditions have been examined; namely, hinged-hinged, clamped-hinged and clamped-clamped pipelines. The resulting optimization problem has been formulated as a nonlinear mathematical programming problem solved by invoking the MATLAB optimization toolbox routines, which implement the constrained function minimization routine "fmincon" interacting with the associated eigenvalue problem routines. In fact, the proposed mathematical models have succeeded in maximizing the critical flow velocity without mass penalty and producing efficient and economic designs having enhanced stability characteristics as compared with the baseline designs.

Keywords: Functionally graded materials, pipe flow, optimum design, fluid-structure interaction.
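
A hedged sketch of the kind of constrained optimization set-up described above, using SciPy's SLSQP as an analogue of MATLAB's fmincon; the surrogate objective standing in for the critical-velocity eigenvalue problem, the densities and the mass budget are all assumptions.

```python
# Sketch only: maximize a surrogate "critical velocity" over segment volume
# fractions subject to a structural-mass budget.
import numpy as np
from scipy.optimize import minimize

rho_a, rho_b = 2700.0, 7800.0     # assumed densities of the two constituents
mass_target = 15000.0             # assumed structural-mass budget (arbitrary units)
seg_volume = np.array([1.0, 1.0, 1.0])

def critical_velocity(v):
    # surrogate: stiffer (higher-fraction) segments near the support help more;
    # in the real problem this would come from the divergence eigenvalue solver
    weights = np.array([3.0, 2.0, 1.0])
    return float(weights @ v)

def total_mass(v):
    return float(seg_volume @ (rho_a * (1 - v) + rho_b * v))

res = minimize(
    fun=lambda v: -critical_velocity(v),          # maximize via the negative
    x0=np.array([0.5, 0.5, 0.5]),
    bounds=[(0.0, 1.0)] * 3,
    constraints=[{"type": "ineq", "fun": lambda v: mass_target - total_mass(v)}],
    method="SLSQP",
)
print(res.x, -res.fun)   # optimal volume fractions and surrogate critical velocity
```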

410 Pushover Analysis of Reinforced Concrete Buildings Using Full Jacket Technics: A Case Study on an Existing Old Building in Madinah

Authors: Tarek M. Alguhane, Ayman H. Khalil, M. N. Fayed, Ayman M. Ismail

Abstract:

The retrofitting of existing buildings to resist seismic loads is very important to avoid loss of life or financial disaster. The aim of retrofitting is to increase the total strength of the structure by increasing its stiffness or ductility ratio. In addition, the response modification factors (R) have to satisfy the code requirements for the suggested retrofitting types. In this study, two types of jackets are used, i.e. full reinforced concrete jackets and surrounding steel plate jackets. The study is carried out on an existing building in Madinah by performing static pushover analysis before and after retrofitting the columns. The selected model building represents a typical structure, lacking seismic provisions, of the kind built about 30 years ago in Madinah City, KSA. The comparison of the results indicates a good enhancement of the structure with respect to the applied seismic forces. Also, the response modification factor of the RC building is evaluated for the studied cases before and after retrofitting. The design of all vertical elements (columns) is given. The results show that the design of the retrofitted columns satisfied the code's design stress requirements. However, for some retrofitting types, the ductility requirements represented by the response modification factor do not satisfy the KSA design code (SBC-301).

Keywords: Concrete jackets, steel jackets, RC buildings, pushover analysis, non-linear analysis.

409 Further Development in Predicting Post-Earthquake Fire Ignition Hazard

Authors: Pegah Farshadmanesh, Jamshid Mohammadi, Mehdi Modares

Abstract:

In nearly all earthquakes of the past century that resulted in moderate to significant damage, the occurrence of post-earthquake fire ignition (PEFI) has imposed a serious hazard and caused severe damage, especially in urban areas. In order to reduce the loss of life and property caused by post-earthquake fires, there is a crucial need for predictive models to estimate the PEFI risk. The parameters affecting PEFI risk can be categorized as: 1) factors influencing fire ignition in normal (non-earthquake) conditions, including floor area, building category, ignitability, type of appliance, and prevention devices, and 2) earthquake-related factors contributing to the PEFI risk, including building vulnerability and earthquake characteristics such as intensity, peak ground acceleration, and peak ground velocity. State-of-the-art statistical PEFI risk models are based solely on limited available earthquake data, and therefore they cannot predict the PEFI risk for areas with insufficient earthquake records, since such records are needed to estimate the PEFI model parameters. In this paper, the correlation between normal condition ignition risk, peak ground acceleration, and PEFI risk is examined in an effort to offer a means for predicting post-earthquake ignition events. An illustrative example is presented to demonstrate how such a correlation can be employed in a seismic area to predict the PEFI hazard.

Keywords: Fire risk, post-earthquake fire ignition (PEFI), risk management, seismicity.
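
A toy illustration of how the correlation mentioned above might be exploited, under an assumed functional form (not the authors' model) in which the normal-condition ignition rate is amplified by a PGA-dependent factor.

```python
# Assumed relationship: post-earthquake ignition rate grows linearly with PGA
# from the normal-condition baseline; coefficients a and b are invented.
import numpy as np

def pefi_rate(normal_rate, pga_g, a=1.0, b=25.0):
    """Assumed amplification of the normal-condition ignition rate with PGA (g)."""
    return normal_rate * (a + b * pga_g)

normal_rate = 2e-4                   # assumed annual ignition rate per building
for pga in (0.1, 0.3, 0.5):          # peak ground acceleration, in g
    print(f"PGA {pga:.1f} g -> PEFI rate {pefi_rate(normal_rate, pga):.2e}")
```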

408 Thermo-Mechanical Approach to Evaluate Softening Behavior of Polystyrene: Validation and Modeling

Authors: Salah Al-Enezi, Rashed Al-Zufairi, Naseer Ahmad

Abstract:

A thermo-mechanical technique was developed to determine the softening point temperature/glass transition temperature (Tg) of polystyrene exposed to high pressures. The design utilizes the ability of carbon dioxide to lower the glass transition temperature of polymers by acting as a plasticizer. In this apparatus, the sorption of carbon dioxide to induce softening of polymers as a function of temperature/pressure is performed, and the extent of softening is measured in three-point flexural bending mode. The polymer strip was placed in the cell in contact with a linear variable differential transformer (LVDT). CO2 was pumped into the cell from a supply cylinder to reach high pressure. The results clearly showed the full softening point of the samples, accompanied by a large deformation of the polymer strip. The deflection curves are initially relatively flat and then undergo a dramatic increase as the temperature is elevated. It was found that increasing the pressure of CO2 shifts the temperature curves lower by increments of about 45 K over the pressure range of 0-120 bar. The obtained experimental Tg values were validated against values reported in the literature. Finally, it is concluded that the deflection model fits the generated experimental results consistently, describing in more detail how the central deflection of a thin polymer strip is affected by CO2 diffusion in the polymeric samples.

Keywords: Softening, high-pressure, polystyrene, CO2 diffusions.
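
A small sketch of one way a softening onset could be picked off a deflection-versus-temperature curve like the one described above; the synthetic curve, the slope threshold and the baseline window are assumptions.

```python
# Toy onset detection: find where the local slope departs sharply from the
# initial flat region of the deflection curve.
import numpy as np

temp = np.linspace(300.0, 400.0, 201)                                     # K
deflection = 0.02 * (temp - 300) + 2.0 / (1 + np.exp(-(temp - 360) / 3))  # mm, synthetic

slope = np.gradient(deflection, temp)
baseline = slope[:40].mean()                     # slope of the flat region
onset_idx = np.argmax(slope > 5 * baseline)      # first strong departure
print(f"Estimated softening onset: {temp[onset_idx]:.1f} K")
```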

407 Retrospective Reconstruction of Time Series Data for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

The development, operation and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability concerns of every region. The features of such systems have great influence on all of the components of sustainability. In order to optimize these processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modeling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modeling the sustainability and operational effectiveness of a certain IWMS is not in the scope of the present research. The complexity of the systems and the large number of variables require the utilization of a complex approach to model the outcomes and future risks. This complex method should be able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modeling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus with all their interconnections. The other input data set is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop time series by content analysis.

Keywords: Content analysis, factors, integrated waste management system, time series.

406 Allometric Models for Biomass Estimation in Savanna Woodland Area, Niger State, Nigeria

Authors: Abdullahi Jibrin, Aishetu Abdulkadir

Abstract:

The development of allometric models is crucial to accurate forest biomass/carbon stock assessment. The aim of this study was to develop a set of biomass prediction models that will enable the determination of total tree aboveground biomass for the savannah woodland area in Niger State, Nigeria. Based on data collected through biometric measurements of 1816 trees and destructive sampling of 36 trees, five species-specific models and one site-specific model were developed. The sample size was distributed equally between the five most dominant species in the study site (Vitellaria paradoxa, Irvingia gabonensis, Parkia biglobosa, Anogeissus leiocarpus, Pterocarpus erinaceous). Firstly, equations were developed for the five individual species. Secondly, the data for these five species were pooled to develop a mixed-species allometric equation. Overall, there was a strong positive relationship between total tree biomass and stem diameter. Coefficients of determination (R2 values) ranging from 0.93 to 0.99 (P < 0.001) were realised for the models, with considerably low standard errors of the estimates (SEE), which confirms that total tree aboveground biomass has a significant relationship with dbh. F-test values for the biomass prediction models were also significant at P < 0.001, which indicates that the biomass prediction models are valid. This study recommends that, for improved biomass estimates in the study site, the site-specific biomass models should preferably be used instead of generic models.

Keywords: Allometry, biomass, carbon stock, model, regression equation, woodland, inventory.
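
An illustrative sketch of the usual log-log allometric fit, ln(AGB) = a + b·ln(DBH), with R² and F-test reporting as in the abstract above; the trees are simulated, not the Niger State inventory data.

```python
# Fit a power-law allometric model on synthetic tree data via OLS in log space.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
dbh = rng.uniform(5, 60, 200)                                       # diameter at breast height, cm
agb = np.exp(-2.0 + 2.4 * np.log(dbh) + rng.normal(0, 0.2, 200))    # toy aboveground biomass, kg

X = sm.add_constant(np.log(dbh))
fit = sm.OLS(np.log(agb), X).fit()

a, b = fit.params
print(f"ln(AGB) = {a:.2f} + {b:.2f} ln(DBH),  R² = {fit.rsquared:.3f}")
print(f"F-statistic p-value: {fit.f_pvalue:.2e}")
```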

405 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper utilized Poisson-modulated weighted generalized Laguerre polynomials with controlling parameters and uncorrelated noise assumptions. In this paper, we investigate the statistics of a multi-diversity stochastically local area fading channel when the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.

Keywords: Cellular communication, femto- and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process.
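
A minimal simulation sketch of the kind of model described above: a Poisson number of scatterers whose mean intensity is itself lognormally distributed (a doubly stochastic, or Cox, process), each scatterer carrying a Rayleigh-amplitude mark with random phase, plus a line-of-sight term; all parameter values are assumptions.

```python
# Simulate fading-envelope samples from a marked Cox (doubly stochastic
# Poisson) scattering model and inspect their statistics.
import numpy as np

rng = np.random.default_rng(3)
n_trials = 20_000
los = 0.8                        # assumed line-of-sight amplitude

envelope = np.empty(n_trials)
for i in range(n_trials):
    lam = rng.lognormal(mean=np.log(8.0), sigma=0.5)      # lognormal intensity
    k = rng.poisson(lam)                                   # number of scatterers
    amp = rng.rayleigh(scale=0.3, size=k)                  # Rayleigh marks
    phase = rng.uniform(0, 2 * np.pi, size=k)
    field = los + np.sum(amp * np.exp(1j * phase))         # coherent sum of paths
    envelope[i] = np.abs(field)

print(f"mean envelope {envelope.mean():.3f}, std {envelope.std():.3f}")
```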

404 Understanding and Designing Situation-Aware Mobile and Ubiquitous Computing Systems

Authors: Kai Häussermann, Christoph Hubig, Paul Levi, Frank Leymann, Oliver Siemoneit, Matthias Wieland, Oliver Zweigle

Abstract:

Using spatial models as a shared common basis of information about the environment for different kinds of context-aware systems has been a heavily researched topic in recent years. This research has focused on how to create, update, and merge spatial models so as to enable highly dynamic, consistent and coherent spatial models at large scale. In this paper, however, we concentrate on how context-aware applications could use this information to adapt their behavior according to the situation they are in. The main idea is to provide the spatial model infrastructure with a situation recognition component based on generic situation templates. A situation template is – as part of a much larger situation template library – an abstract, machine-readable description of a certain basic situation type, which could be used by different applications to evaluate their situation. In this paper, different theoretical and practical issues – technical, ethical and philosophical ones – that are important for understanding and developing situation-dependent systems based on situation templates are discussed. A basic system design is presented which allows reasoning with uncertain data using an improved version of a learning algorithm for the automatic adaptation of situation templates. Finally, to support the development of adaptive applications, we present a new situation-aware adaptation concept based on workflows.

Keywords: context-awareness, ethics, facilitation of system use through workflows, situation recognition and learning based on situation templates and situation ontologies, theory of situation-aware systems.

403 Software Vulnerability Markets: Discoverers and Buyers

Authors: Abdullah M. Algarni, Yashwant K. Malaiya

Abstract:

Some of the key aspects of vulnerability—discovery, dissemination, and disclosure—have received some attention recently. However, the role of interaction among the vulnerability discoverers and vulnerability acquirers has not yet been adequately addressed. Our study suggests that a major percentage of discoverers, a majority in some cases, are unaffiliated with the software developers and thus are free to disseminate the vulnerabilities they discover in any way they like. As a result, multiple vulnerability markets have emerged. In some of these markets, the exchange is regulated, but in others, there is little or no regulation. In recent vulnerability discovery literature, the vulnerability discoverers have remained anonymous individuals. Although there has been an attempt to model the level of their efforts, information regarding their identities, modes of operation, and what they are doing with the discovered vulnerabilities has not been explored.

Reports of buying and selling of the vulnerabilities are now appearing in the press; however, the existence of such markets requires validation, and the natures of the markets need to be analyzed. To address this need, we have attempted to collect detailed information. We have identified the most prolific vulnerability discoverers throughout the past decade and examined their motivation and methods. A large percentage of these discoverers are located in Eastern and Western Europe and in the Far East. We have contacted several of them in order to collect firsthand information regarding their techniques, motivations, and involvement in the vulnerability markets. We examine why many of the discoverers appear to retire after a highly successful vulnerability-finding career. The paper identifies the actual vulnerability markets, rather than the hypothetical ideal markets that are often examined. The emergence of worldwide government agencies as vulnerability buyers has significant implications. We discuss potential factors that can impact the risk to society and the need for detailed exploration.

Keywords: Risk management, software security, vulnerability discoverers, vulnerability markets.

402 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity

Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Mujeeb Ur Rehman, Saifur Rahman Sabuj

Abstract:

This paper examines relationships between solar activity and earthquakes by applying machine learning techniques: K-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, and the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger with intermediate or deep depth.

Keywords: K-Nearest Neighbour, Support Vector Regression, Random Forest Regression, Long Short-Term Memory Network, earthquakes, solar activity, sunspot number, solar wind, solar flares.
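
A hedged sketch comparing the three classical regressors named above on a toy solar-activity regression task with scikit-learn; the features and target are synthetic, not the SILSO/NOAA/GOES/OMNIWeb/USGS data, and the LSTM baseline is omitted for brevity.

```python
# Compare kNN, SVR and random forest regressors on a synthetic
# "solar activity -> seismicity" regression task.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(0, 250, n),      # daily sunspot number
    rng.uniform(250, 800, n),    # solar wind velocity, km/s
    rng.uniform(0, 20, n),       # proton density, cm^-3
])
y = 0.01 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0, 0.5, n)   # toy target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = [("kNN", make_pipeline(StandardScaler(), KNeighborsRegressor())),
          ("SVR", make_pipeline(StandardScaler(), SVR())),
          ("Random forest", RandomForestRegressor(random_state=0))]
for name, model in models:
    model.fit(X_tr, y_tr)
    print(f"{name:14s} MAE = {mean_absolute_error(y_te, model.predict(X_te)):.3f}")
```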

401 Strategic Thinking to Change Behavior and Improve Sanitation in Jodipan and Kesatrian, Malang, East Java, Indonesia

Authors: Prasanti Widyasih Sarli, Prayatni Soewondo

Abstract:

Greater access to sanitation in developing countries is urgent. However, even though sanitation is crucial, the overall budget for sanitation is limited. With this budget limitation, it is important to (1) allocate resources strategically to maximize impact and (2) take communal agency into account as a potential source of sanitation improvements. The Jodipan and Kesatrian Project in Malang, Indonesia is an interesting alternative for solving the sanitation problem, in which resources were allocated strategically and communal agency was also observed. Although the project's initial goal was only to visually improve the situation in the slums, the area became a new tourist destination, and the economic benefit that came with it also changed the behavior of the residents and the government towards sanitation. It also grew from including only the Kesatrian Village to encompassing the Jodipan Village in the course of less than a year. To investigate the success of this project, a descriptive model is used in this paper, and data are drawn from intensive interviews with the initiators of the project, residents affected by the project and government officials. In this research it is argued that three points mark the success of the project: (1) the strategic initial impact due to the choice of location, (2) the influx of tourists that triggered behavioral change among residents, and (3) the direct economic impact, which ensured its sustainability and growth by gaining government officials' support and attention for more public spending in the area for slum development and sanitation improvement.

Keywords: Behavior change, sanitation, slum, strategic thinking.

400 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators

Authors: Andrea Bellucci, Martina Tofi

Abstract:

The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance by using its strength in the distribution channel while the market share of independent agents is decreasing. Starting with the main business model of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market by balance sheet indicators and by the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the classification of business models. The statistical classification algorithm proposed by Ward is employed to design business model profiles. The result of the analysis is a representation of the main business models, each built from its profile of indicators. In this way, an unsupervised analysis is developed; its limitation is a judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.

Keywords: Balance sheet indicators, bancassurance, business models, Ward algorithm.
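
An illustrative sketch of profiling business models with Ward's hierarchical clustering on standardized balance-sheet indicators, as in the approach above; the indicator matrix is random, not the ANIA database.

```python
# Ward hierarchical clustering of companies by balance-sheet indicators.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# rows = life insurance companies, columns = assumed balance-sheet indicators
indicators = rng.normal(size=(40, 6))

Z = linkage(StandardScaler().fit_transform(indicators), method="ward")
labels = fcluster(Z, t=4, criterion="maxclust")   # cut the tree into 4 profiles

for k in np.unique(labels):
    print(f"business-model profile {k}: {np.sum(labels == k)} companies")
```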

399 Low Resolution Face Recognition Using Mixture of Experts

Authors: Fatemeh Behjati Ardakani, Fatemeh Khademian, Abbas Nowzari Dalini, Reza Ebrahimpour

Abstract:

Human activity is a major concern in a wide variety of applications, such as video surveillance, human computer interfaces and face image database management. Detecting and recognizing faces is a crucial step in these applications. Furthermore, major advancements and initiatives in security applications in the past years have propelled face recognition technology into the spotlight. The performance of existing face recognition systems declines significantly if the resolution of the face image falls below a certain level. This is especially critical in surveillance imagery where, often, for many reasons, only low-resolution video of faces is available. If these low-resolution images are passed to a face recognition system, the performance is usually unacceptable. Hence, resolution plays a key role in face recognition systems. In this paper we introduce a new low resolution face recognition system based on mixture of experts neural networks. In order to produce the low resolution input images we down-sampled the 48 × 48 ORL images to 12 × 12 using the nearest neighbor interpolation method; applying the bicubic interpolation method afterwards yields enhanced images, which are given to the Principal Component Analysis feature extractor. Comparison with some of the most closely related methods indicates that the proposed novel model yields an excellent recognition rate in low resolution face recognition: 100% for the training set and 96.5% for the test set.

Keywords: Low resolution face recognition, Multilayered neural network, Mixture of experts neural network, Principal component analysis, Bicubic interpolation, Nearest neighbor interpolation.
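
A sketch of the preprocessing pipeline described above, with random arrays standing in for the ORL faces: nearest-neighbour down-sampling to 12 × 12, cubic-spline up-sampling back to 48 × 48, then PCA feature extraction; the mixture-of-experts classifier itself is omitted.

```python
# Degrade-and-enhance preprocessing followed by PCA feature extraction.
import numpy as np
from scipy.ndimage import zoom
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
faces = rng.random((100, 48, 48))            # placeholder for 48x48 ORL images

def degrade_and_enhance(img):
    low = zoom(img, 12 / 48, order=0)        # nearest-neighbour down to 12x12
    return zoom(low, 48 / 12, order=3)       # cubic interpolation back to 48x48

enhanced = np.stack([degrade_and_enhance(f) for f in faces])
features = PCA(n_components=40).fit_transform(enhanced.reshape(len(faces), -1))
print(features.shape)                        # (100, 40) feature vectors for the experts
```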

398 Preparation of Corn Flour Based Extruded Product and Evaluate Its Physical Characteristics

Authors: C. S. Saini

Abstract:

A composite flour blend consisting of corn, pearl millet, black gram and wheat bran in the ratio 80:5:10:5 was used to prepare an extruded product, and the effect of process parameters on the physical properties of the extrudate was studied. The extrusion process was conducted in the laboratory using a twin screw extruder. The physical characteristics evaluated include lateral expansion, bulk density, water absorption index, water solubility index, rehydration ratio and moisture retention. A Central Composite Rotatable Design (CCRD) was used to decide the levels of the processing variables, i.e. feed moisture content (%), screw speed (rpm), and barrel temperature (°C), for the experiment. The data obtained after the extrusion process were analyzed using response surface methodology. A second order polynomial model for the dependent variables was established to fit the experimental data. The numerical optimization studies resulted in a barrel temperature of 127 °C, a screw speed of 246 rpm, and a feed moisture of 14.5% as the optimum variables to produce an acceptable extruded product. The responses predicted by the software for the optimum process condition were lateral expansion 126%, bulk density 0.28 g/cm3, water absorption index 4.10 g/g, water solubility index 39.90%, rehydration ratio 544% and moisture retention 11.90%, with 75% desirability.

Keywords: Black gram, corn flour, extrusion, physical characteristics.

397 Design and Development of Graphene Oxide Modified by Chitosan Nanosheets Showing pH-Sensitive Surface as a Smart Drug Delivery System for Controlled Release of Doxorubicin

Authors: Parisa Shirzadeh

Abstract:

Traditional drug delivery systems, in which drugs are taken by patients in multiple doses at specified intervals, do not meet the needs of up-to-date drug delivery. Today we are dealing with a huge number of recombinant peptide and protein drugs and analogues of the body's hormones, most of which are made with genetic engineering techniques. Most of these drugs are used to treat critical diseases such as cancer. Because of the limitations of the traditional method, researchers have sought ways to solve its problems to a large extent. Following these efforts, controlled drug release systems were introduced, which have many advantages. With controlled release of the drug in the body, the concentration of the drug is kept at a certain level, and this is achieved at a higher rate in a short time. Graphene is a natural material that is biodegradable and non-toxic, and the wide surface of graphene plates makes graphene more effective to modify than carbon nanotubes. Graphene oxide is often synthesized using concentrated oxidizers such as sulfuric acid, nitric acid, and potassium permanganate, based on the Hummers method. Graphene oxide is very hydrophilic and easily dissolves in water, creating a stable solution. In this work, graphene oxide (GO) was covalently modified by chitosan (CS) and developed for the controlled release of doxorubicin (DOX). In this study, GO is produced by the Hummers method under acidic conditions. Then, it is chlorinated by oxalyl chloride to increase its reactivity toward amines. After that, the amidation reaction was performed in the presence of CS to form amide linkages, and DOX was attached to the carrier surface by π-π interaction in phosphate buffer. GO, GO-CS, and GO-CS-DOX were characterized by FT-IR and TGA to identify the new functional groups that show the bonding of CS to GO, and by Raman spectroscopy and SEM to characterize the changes in the size and number of layers. The loading and release ability was determined by UV-visible spectroscopy. The loading results showed a high DOX absorption capacity (99%), and pH-dependent release of DOX from the GO-CS nanosheets was identified at pH 5.3 and 7.4, with a fast release rate in acidic conditions.

Keywords: Graphene oxide, chitosan, nanosheet, controlled drug release, doxorubicin.

396 Web Proxy Detection via Bipartite Graphs and One-Mode Projections

Authors: Zhipeng Chen, Peng Zhang, Qingyun Liu, Li Guo

Abstract:

With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked using web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, it has become an increasingly challenging task to detect proxy services due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavior similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of the bipartite graphs for discovering the social-behavior similarity of web proxy users. Based on the similarity matrices of end-users from the derived one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. The web proxy URLs may vary from time to time, but the users' inherent interests do not. Based on this intuition, and using our private tools implemented with WebDriver, we examine whether the top URLs visited by web proxy users are themselves web proxies. Our experimental results based on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown web proxies.

Keywords: Bipartite graph, clustering, one-mode projection, web proxy detection.
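
An illustrative sketch of the graph pipeline described above on a tiny synthetic traffic graph: a client-server bipartite graph, its one-mode projection onto clients, and spectral clustering of clients by the servers they share; the hosts and edges are invented.

```python
# Bipartite graph -> weighted one-mode projection -> spectral clustering.
import networkx as nx
from networkx.algorithms import bipartite
from sklearn.cluster import SpectralClustering

clients = [f"c{i}" for i in range(6)]
servers = [f"s{i}" for i in range(4)]
# two groups of clients sharing two servers each, plus one weak cross link
edges = [(c, s) for c in ("c0", "c1", "c2") for s in ("s0", "s1")]
edges += [(c, s) for c in ("c3", "c4", "c5") for s in ("s2", "s3")]
edges += [("c2", "s2")]

B = nx.Graph()
B.add_nodes_from(clients, bipartite=0)
B.add_nodes_from(servers, bipartite=1)
B.add_edges_from(edges)

# one-mode projection: clients linked by the number of servers they share
P = bipartite.weighted_projected_graph(B, clients)
A = nx.to_numpy_array(P, nodelist=clients, weight="weight")

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
print(dict(zip(clients, labels)))        # behavior clusters of the clients
```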

395 Origins of Strict Liability for Abnormally Dangerous Activities in the United States, Rylands v. Fletcher and a General Clause of Strict Liability in the UK

Authors: Maria Lubomira Kubica

Abstract:

The paper reveals the birth and evolution of the British precedent Rylands v. Fletcher which, once adopted on the other side of the ocean (in the United States), gave rise to a general clause of liability for abnormally dangerous activities recognized by §20 of the American Restatement of the Law Third, Liability for Physical and Emotional Harm. The main goal of the paper was to analyze the development of the legal doctrine and of the case law subsequent to the precedent, together with the intent of the British judicature to leapfrog from the traditional rule contained in Rylands v. Fletcher to a general clause similar to that introduced in the United States and recently also on the European level. As is well known, within the scope of tort law two different initiatives compete with the aim of harmonizing European laws: the European Group on Tort Law with its Principles of European Tort Law (hereinafter PETL), in which article 5:101 sets forth a general clause for strict liability for abnormally dangerous activities, and the Study Group on a European Civil Code with its Common Frame of Reference (CFR), which promotes rather an ad hoc model of listing out determined cases of strict liability. The very narrow scope of application of art. 5:101 PETL, restricted only to abnormally dangerous activities, stands in opposition to the very broad spectrum of strict liability cases governed by the CFR. The former is a perfect example of a general clause that offers a minimum and basic standard, possibly acceptable also in those countries in which, as in the United Kingdom, this regime of liability is completely marginalized.

Keywords: Dangerous activities, general clause, risk, strict liability.

394 Ownership, Management Responsibility and Corporate Performance of the Listed Firms in Kazakhstan

Authors: Gulnara Moldasheva

Abstract:

The research explores the relationship between management responsibility and corporate governance of listed companies in Kazakhstan. This research employs firm-level data on selected listed non-financial firms and firm-level data for the "operational" financial sector, consisting of the banking sector, insurance companies and accumulated pension funds, using multivariate regression analysis under a fixed effect model approach. The ownership structure includes institutional ownership, managerial ownership and private investors' ownership. Management responsibility of the firm is expressed by the firm's decision on the amount of leverage. Results of the cross-sectional panel study for non-financial firms showed that only institutional shareholding is significantly negatively correlated with the debt to equity ratio. Findings from the "operational" financial sector show that leverage is significantly affected only by CEO/Chair duality and the size of the financial institution, and insignificantly affected by ownership structure. Also, the findings show that there is a significant negative relationship between profitability and the debt to equity ratio for non-financial firms, which is consistent with pecking order theory. Generally, the results suggest that corporate governance and management responsibility play an important role in the corporate performance of listed firms in Kazakhstan.

Keywords: Corporate governance, corporate performance, debt to equity ratio, ownership.
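
A minimal sketch of a firm fixed-effects regression of leverage on ownership variables, in the spirit of the analysis above; it is implemented with firm dummies in statsmodels on a synthetic panel, not the Kazakhstan data.

```python
# Firm fixed-effects ("within") regression via entity dummies.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
firms, years = 30, 8
df = pd.DataFrame({
    "firm": np.repeat(np.arange(firms), years),
    "year": np.tile(np.arange(2008, 2008 + years), firms),
    "inst_own": rng.uniform(0, 1, firms * years),    # institutional ownership share
    "mgr_own":  rng.uniform(0, 0.5, firms * years),  # managerial ownership share
})
firm_effect = rng.normal(0, 0.3, firms)[df.firm]
df["debt_to_equity"] = (1.2 - 0.8 * df.inst_own + 0.1 * df.mgr_own
                        + firm_effect + rng.normal(0, 0.2, len(df)))

fe_model = smf.ols("debt_to_equity ~ inst_own + mgr_own + C(firm)", data=df).fit()
print(fe_model.params[["inst_own", "mgr_own"]])   # within-firm ownership effects
```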

393 Public Policy for Quality School Lunch Development in Thailand

Authors: W. Kongnoo, J. Loysongkroa, S. Chotivichien, N. Viriyautsahakul, N. Saiwongse

Abstract:

Obesity, stunting and wasting problems among Thai school-aged children are increasing due to inappropriate food consumption behavior and poor environments for desirable nutritional behavior. Because of a low school lunch budget of only 0.40 USD per person per day, food quality is not up to nutritional standards. Therefore, the Health Department, together with the Education Ministry and the Thai Health Promotion Foundation, developed a quality school lunch project during 2009–2013. The program objectives were the development and management of public policy to increase the school lunch budget. The methods used a healthy public policy motivation process and movement in 241 local administrative organizations and 538 schools. Problem and solution research was organized to study school food and nutrition management, create a best-practice policy mobilization model and hold a public hearing to motivate an increase in school meal funding. The results showed that local public policy was mobilized during 2009-2011 to increase the school meal budget using local budgets. The proportion of school children with desirable food consumption behavior and exercise increased from 13.2% in 2009 to 51.6% in 2013, and stunting decreased from 6.0% in 2009 to 4.7% in 2013. As a result of national policy motivation (2012-2013), the cabinet meeting on October 22, 2013 approved an increase of the school lunch budget from 0.40 USD to 0.62 USD per person per day. Thus, 5,800,469 school children nationwide have benefited from the budget increase.

Keywords: Public policy, Quality school lunch, Thailand.

392 Statistical Modeling of Accelerated Pavement Failure Using Response Surface Methodology

Authors: Anshu Manik, Kasthurirangan Gopalakrishnan, Siddhartha K. Khaitan

Abstract:

Rutting is one of the major load-related distresses in airport flexible pavements. Rutting in paving materials develops gradually with an increasing number of load applications, usually appearing as longitudinal depressions in the wheel paths, and it may be accompanied by small upheavals to the sides. Significant research has been conducted to determine the factors which affect rutting and how they can be controlled. Using experimental design concepts, a series of tests can be conducted while varying the levels of different parameters which could be the cause of rutting in airport flexible pavements. If proper experimental design is done, the results obtained from these tests can give a better insight into the causes of rutting and the presence of interactions and synergisms among the system variables which have an influence on rutting. Although, traditionally, laboratory experiments are conducted in a controlled fashion to understand the statistical interaction of variables in such situations, this study is an attempt to identify the critical system variables influencing airport flexible pavement rut depth from a statistical DoE perspective using real field data from a full-scale test facility. The test results strongly indicate that the response (rut depth) has too much noise in it to allow determination of a good model. From a statistical DoE perspective, two major changes proposed for this experiment are: (1) actual replication of the tests is definitely required, and (2) nuisance variables need to be identified and blocked properly. Further investigation is necessary to determine possible sources of noise in the experiment.

Keywords: Airport Pavement, Design of Experiments, Rutting, NAPTF.

391 A Corpus-Based Approach to Understanding Market Access in Fisheries and Aquaculture: A Systematic Literature Review

Authors: Cheryl Marie Cordeiro

Abstract:

Although fisheries and aquaculture studies might seem marginal to international business (IB) studies in general, fisheries and aquaculture IB (FAIB) management is currently facing increasing pressure to meet global demand and consumption for fish in the coming decades. To address this challenge in part, the purpose of this systematic literature review (SLR) study is to investigate the use of the term ‘market access’ in its context of use in the generic literature and business sector discourse, in comparison to the more specific literature and discourse in fisheries, aquaculture and seafood. This SLR aims to uncover the knowledge/interest gaps between the academic subject discourses and business sector practices. Corpus-driven in methodology and using a triangulation of three different text analysis tools, namely AntConc, VOSviewer and Web of Science (WoS) analytics, the SLR results indicate a gap in conceptual knowledge and business practices in how ‘market access’ is conceived and used in the context of the pharmaceutical healthcare industry versus FAIB research and practice. While it is acknowledged that the product orientation of different business sectors might differ, this SLR study works with the assumption that both business sectors are global in orientation. These business sectors are complex in their operations from product to market. This SLR suggests a conceptual model for understanding the challenges, the potential barriers, as well as avenues for solutions, to developing market access for FAIB.

Keywords: Market access, fisheries and aquaculture, international business, systematic literature review.

390 Simulation and Assessment of Carbon Dioxide Separation by Piperazine Blended Solutions Using E-NRTL and Peng-Robinson Models: A Study of Regeneration Heat Duty

Authors: Arash Esmaeili, Zhibang Liu, Yang Xiang, Jimmy Yun, Lei Shao

Abstract:

High pressure carbon dioxide (CO2) absorption from a specific off-gas in a conventional column has been evaluated, in view of environmental concerns, with the Aspen HYSYS simulator using a wide range of single absorbents and piperazine (PZ) blended solutions to estimate the outlet CO2 concentration, CO2 loading, reboiler power supply and regeneration heat duty, and to choose the most efficient solution in terms of CO2 removal and required heat duty. The property package, which is compatible with all the solutions applied in this simulation study, estimates the properties based on the electrolyte non-random two-liquid (E-NRTL) model for electrolyte thermodynamics and the Peng-Robinson equation of state for vapor phase and liquid hydrocarbon phase properties. The results of the simulation indicate that PZ and the mixture of PZ and monoethanolamine (MEA) demand the highest regeneration heat duty among the studied single and blended amine solutions, respectively. The blended amine solutions with the lowest PZ concentrations (5 wt% and 10 wt%) were considered and compared to reduce the cost of the process, among which the blended solution of 10 wt% PZ + 35 wt% MDEA (methyldiethanolamine) was found to be the most appropriate solution in terms of CO2 content in the outlet gas, rich-CO2 loading and regeneration heat duty.

Keywords: Absorption, amine solutions, Aspen HYSYS, CO2 loading, piperazine, regeneration heat duty.

389 Expert Based System Design for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

Recently, an increasing number of researchers have been focusing on working out realistic solutions to sustainability problems. As sustainability issues gain higher importance for organisations, the management of such decisions becomes critical. Knowledge representation is a fundamental issue of complex knowledge based systems. Many types of sustainability problems would benefit from models based on experts' knowledge. Cognitive maps have been used for analyzing and aiding decision making. A cognitive map can be made of almost any system or problem. A fuzzy cognitive map (FCM) can successfully represent knowledge and human experience, introducing concepts to represent the essential elements and the cause-and-effect relationships among the concepts to model the behaviour of any system. Integrated waste management systems (IWMS) are complex systems that can be decomposed into related and non-related subsystems and elements, where many factors have to be taken into consideration that may be complementary, contradictory, and competitive; these factors influence each other and determine the overall decision process of the system. The goal of the present paper is to construct an efficient IWMS which considers various factors. The authors' intention is to propose an expert-based system design approach for implementing expert decision support in the area of IWMSs and to introduce an appropriate methodology for the development and analysis of group FCMs. A framework for such a methodology, consisting of the development and application phases, is presented.

Keywords: Factors, fuzzy cognitive map, group decision, integrated waste management system.
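
A minimal sketch of how a fuzzy cognitive map is iterated once a connection matrix is agreed: activations are repeatedly pushed through the weighted matrix and squashed with a sigmoid until they settle; the four concepts and weights below are invented, not the experts' IWMS map.

```python
# Iterate a toy fuzzy cognitive map to a steady state.
import numpy as np

# rows -> columns: signed causal weights between concepts, e.g.
# [collection coverage, treatment capacity, landfilled fraction, public support]
W = np.array([
    [0.0,  0.5, -0.4,  0.3],
    [0.2,  0.0, -0.6,  0.2],
    [0.0, -0.3,  0.0, -0.5],
    [0.4,  0.3,  0.0,  0.0],
])

def run_fcm(W, state, n_iter=50, lam=1.0):
    for _ in range(n_iter):
        state = 1.0 / (1.0 + np.exp(-lam * (state + state @ W)))  # sigmoid update
    return state

initial = np.array([0.5, 0.5, 0.5, 0.5])
print(run_fcm(W, initial))     # steady-state activation of each concept
```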

388 Dynamic Stability Assessment of Different Wheel Sized Bicycles Based on Current Frame Design Practice with ISO Requirement for Bicycle Safety

Authors: Milan Paudel, Fook Fah Yap, Anil K. Bastola

Abstract:

The difficulties in riding small wheel bicycles and their lesser stability have been perceived for a long time. Although small wheel bicycles are designed using the same approach and guidelines that have worked well for big wheel bicycles, the performance of the big wheelers and the smaller wheelers is markedly different. Since both the big wheelers and small wheelers have the same fundamental geometry, most blame the small wheels for this discrepancy in performance. This paper reviews existing guidelines for bicycle design, especially the front steering geometry, and provides a systematic and quantitative analysis of different wheel sized bicycles. A validated mathematical model has been used as a tool to assess the dynamic performance of the bicycles in terms of their self-stability. The results obtained were found to corroborate the subjective perception of cyclists of small wheel bicycles. With the current design approach, small wheel bicycles require a higher speed to be self-stable. However, it was found that increasing the head tube angle and selecting a proper trail could improve the dynamic performance of small wheel bicycles. A range of parameters for the front steering geometry has been identified for small wheel bicycles that gives comparable stability to big wheel bicycles. Interestingly, most of the identified geometries are found to be beyond the ISO recommended range and seem to counter the current approach to small wheel bicycle design. Therefore, it was successfully shown that the guidelines for big wheelers do not translate directly to small wheelers, but careful selection of the front geometry could make small wheel bicycles as stable as big wheel bicycles.

Keywords: Big wheel bicycle, design approach, ISO requirements, small wheel bicycle, stability and performance.

387 Hybrid Advanced Oxidative Pretreatment of Complex Industrial Effluent for Biodegradability Enhancement

Authors: K. Paradkar, S. N. Mudliar, A. Sharma, A. B. Pandit, R. A. Pandey

Abstract:

The study explores a hybrid combination of hydrodynamic cavitation (HC) and subcritical wet air oxidation-based pretreatment of complex industrial effluent to selectively enhance biodegradability (without major COD destruction) and so facilitate enhanced downstream processing via anaerobic or aerobic biological treatment. Advanced oxidation based techniques can be less efficient as standalone options, and a hybrid approach combining hydrodynamic cavitation (HC) and wet air oxidation (WAO) can lead to a synergistic effect, since both options are based on a common free radical mechanism. HC can be used for initial turbulence and generation of hotspots which begin the free radical attack, and this agitated mixture can then be subjected to less intense WAO, since the initial heat (to raise the activation energy) can be taken care of by HC alone. A lab-scale venturi-based hydrodynamic cavitation set-up and wet air oxidation reactor, with biomethanated distillery wastewater (BMDWW) as a model effluent, was examined to establish the proof of concept. The results indicated that, for a desirable biodegradability index (BOD:COD, BI) enhancement (up to 0.4), the standalone cavitation pretreatment condition was 5 bar and 88 min reaction time, with a COD reduction of 36% and a BI enhancement of up to 0.27 (initial BI 0.17). The optimum standalone WAO condition was 150 °C, 6 bar and 30 minutes, with 31% COD reduction and a BI of 0.33. The hybrid pretreatment (combined cavitation + WAO) worked out to be 23.18 min HC (at 5 bar) followed by 30 min WAO at 150 °C and 6 bar, at which around 50% of the COD was retained, yielding a BI of 0.55. FTIR and NMR analyses of the pretreated effluent indicated dissociation and/or reorientation of complex organic compounds in the untreated effluent to simpler organic compounds post-pretreatment.

Keywords: BI, hybrid, hydrodynamic cavitation, wet air oxidation.
