Search results for: interval valued function

5617 Organization of the Purchasing Function for Innovation

Authors: Jasna Prester, Ivana Rašić Bakarić, Božidar Matijević

Abstract:

Prominent scholars and a substantial practitioner-oriented literature on innovation orientation have shown positive effects on firm performance. A myriad of factors influence and enhance innovation, but the literature reports that new product innovations account for an average of 14 percent of sales revenues across firms. If there is one thing that has changed in innovation management during the last decade, it is the growing reliance on external partners. As a consequence, a new task for purchasing arises, as firms need to understand which suppliers actually have high potential to contribute to the innovativeness of the firm and which do not. The purchasing function in an organization is extremely important, as it handles on average 50% or more of a firm's expenditures. In the nineties the purchasing department was largely seen as a transaction-oriented, clerical function, but today purchasing integration provides a formal interface mechanism between purchasing and the other firm functions it serves within the company. The purchasing function has to be organized differently to enable the firm's innovation potential. However, innovations are inherently risky. There are behavioral risks (that one partner will take advantage of the other party), technological risks in terms of the complexity of products, manufacturing processes and incoming materials, and finally market risks, which in fact judge the value of the innovation. These risks are investigated in this work, since the literature finds that the higher the technological risk, the higher the centralization of the purchasing function as an interface with other supply chain members. Most research on the organization of the purchasing function has been done through case-study analysis of innovative firms. This work tests, on a larger scale, the results found in the case-study-based literature. A large data set of 1,493 companies from 25 countries, collected in the GMRG 4 survey, served as the basis for the analysis.

Keywords: purchasing function organization, innovation, technological risk, GMRG 4 survey

Procedia PDF Downloads 479
5616 Fuzzy Time Series-Markov Chain Method for Corn and Soybean Price Forecasting in North Carolina Markets

Authors: Selin Guney, Andres Riquelme

Abstract:

One of the main purposes of optimal and efficient forecasts of agricultural commodity prices is to guide firms in economic decision making, such as planning business operations and marketing decisions. Governments are also beneficiaries and suppliers of agricultural price forecasts. They use this information to establish a proper agricultural policy, and hence the forecasts affect social welfare; systematic errors in forecasts could lead to a misallocation of scarce resources. Various empirical approaches, using different methodologies, have been applied to forecast commodity prices. The most commonly used approaches for forecasting commodity sectors depend on classical time series models that assume the values of the response variables are precise, which is quite often not true in reality. Recently, this literature has mostly evolved toward fuzzy time series models, which provide more flexibility with respect to classical time series assumptions such as stationarity and the large sample size requirement. Moreover, the fuzzy modeling approach allows decision making with estimated values under incomplete information or uncertainty. A number of fuzzy time series models have been developed and implemented over the last decades; however, most of them are not appropriate for forecasting repeated and nonconsecutive transitions in the data. The modeling scheme used in this paper eliminates this problem by introducing a Markov modeling approach that takes into account both repeated and nonconsecutive transitions. Also, the determination of the length of interval is crucial for the accuracy of forecasts. The problem of determining the length of interval arbitrarily is overcome, and a methodology to determine the proper length of interval based on the distribution or mean of the first differences of the series is proposed to improve forecast accuracy. The specific purpose of this paper is to propose and investigate the potential of a new forecasting model that integrates a methodology for determining the proper length of interval based on the distribution or mean of the first differences of the series with the fuzzy time series-Markov chain model. Moreover, the forecasting performance of the proposed integrated model is compared to different univariate time series models, and the superiority of the proposed method over competing methods in modelling and forecasting, on the basis of forecast evaluation criteria, is demonstrated. The application is to daily corn and soybean prices observed at three commercially important North Carolina markets: Candor, Cofield and Roaring River for corn, and Fayetteville, Cofield and Greenville City for soybeans, respectively. One main conclusion of this paper is that using fuzzy logic improves forecast performance and accuracy; the effectiveness and potential benefits of the proposed model are confirmed by small selection criterion values such as MAPE. The paper concludes with a discussion of the implications of integrating fuzzy logic and a nonarbitrary determination of the length of interval for the reliability and accuracy of price forecasts. The empirical results represent a significant contribution to our understanding of the applicability of fuzzy modeling in commodity price forecasts.
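As an illustration of the interval-length idea described in the abstract, the sketch below derives an interval length from the mean of the absolute first differences of a price series, partitions the universe of discourse, and builds a first-order Markov transition matrix between the resulting fuzzy intervals. It is a minimal sketch of the general approach, not the authors' implementation; the prices, rounding heuristic and helper names are assumptions.

```python
import numpy as np

def interval_length_from_first_differences(prices):
    """Base the interval length on half the mean absolute first difference,
    rounded to one significant digit (an assumed, common heuristic)."""
    base = np.abs(np.diff(prices)).mean() / 2.0
    magnitude = 10 ** np.floor(np.log10(base))
    return np.round(base / magnitude) * magnitude

def fuzzify(prices, length):
    """Assign each observation to an interval (fuzzy set) of the universe of discourse."""
    lo, hi = prices.min() - length, prices.max() + length
    edges = np.arange(lo, hi + length, length)
    return np.digitize(prices, edges) - 1, edges

def markov_transition_matrix(states, n_states):
    """First-order transition probabilities, covering repeated and non-consecutive moves."""
    counts = np.zeros((n_states, n_states))
    for s, t in zip(states[:-1], states[1:]):
        counts[s, t] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Hypothetical daily corn prices (USD/bushel)
prices = np.array([4.12, 4.15, 4.09, 4.20, 4.26, 4.31, 4.28, 4.35, 4.40, 4.33])
length = interval_length_from_first_differences(prices)
states, edges = fuzzify(prices, length)
P = markov_transition_matrix(states, len(edges) - 1)
# One-step-ahead forecast: expected interval midpoint under the last state's transition row
mids = (edges[:-1] + edges[1:]) / 2
print("interval length:", length, " forecast:", mids @ P[states[-1]])
```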

Keywords: commodity, forecast, fuzzy, Markov

Procedia PDF Downloads 216
5615 The Impact of Transformational Leadership and Interpersonal Interaction on Mentoring Function

Authors: Ching-Yuan Huang, Rhay-Hung Weng, Yi-Ting Chen

Abstract:

Mentoring functions improve new nurses' job performance, provide support to new nurses, and thereby reduce their turnover rate. This study explored the impact of transformational leadership and interpersonal interaction on mentoring functions. We employed a questionnaire survey to collect data and selected a sample of new nurses from three hospitals in Taiwan. A total of 306 valid surveys were obtained. Multiple regression analysis was conducted to test the study hypotheses. Inspirational motivation, idealized influence, and individualized consideration had a positive influence on the overall mentoring function, but intellectual stimulation had a positive influence on the career development function only. Perceived similarity and interaction frequency also had positive influences on mentoring functions. When the shift overlap rate exceeded 80%, it had a negative effect on the mentoring function. The transformational leadership of mentors did improve the mentoring functions experienced by new staff nurses. Perceived similarity and interaction frequency between mentees and mentors also had a positive influence on mentoring functions. Managers should enhance the transformational leadership of mentors by designing leadership training and motivation programs. Furthermore, nursing managers should promote interaction between new staff nurses and their mentors, but the shift overlap rate should not exceed 80%.

Keywords: interpersonal interaction, mentoring function, mentor, new nurse, transformational leadership

Procedia PDF Downloads 328
5614 System Identification and Controller Design for a DC Electrical Motor

Authors: Armel Asongu Nkembi, Ahmad Fawad

Abstract:

The aim of this paper is to determine in a concise way the transfer function that characterizes a DC electrical motor with a helix. In practice, it can be obtained by applying a particular input to the system and then, based on the observation of its output, determining an approximation to the transfer function of the system. In our case, we use a step input and find the transfer function parameters that give the simulated first-order time response. The simulation of the system is done using MATLAB/Simulink. In order to determine the parameters, we assume a first-order system and use the Broida approximation to determine the parameters and then compute its mean square error (MSE). Furthermore, we design a PID controller for the control process, first in the continuous time domain, and tune it using the Ziegler-Nichols open-loop method. We then digitize the controller to obtain a digital controller, since most systems are implemented using computers, which are digital in nature.
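The identification and tuning steps can be sketched as follows: a Broida first-order-plus-dead-time fit from a step response, Ziegler-Nichols open-loop PID settings, and a backward-difference digital form. This is a minimal sketch under the stated first-order assumption, not the authors' MATLAB/Simulink code; the step-response samples and sample time are hypothetical.

```python
import numpy as np

def broida_fopdt(t, y, u_step):
    """Broida approximation: fit K*exp(-L*s)/(T*s + 1) from a (monotonic) step response,
    using the times at which the output reaches 28% and 40% of its final value."""
    y_final = y[-1]
    K = y_final / u_step                    # static gain
    t28 = np.interp(0.28 * y_final, y, t)
    t40 = np.interp(0.40 * y_final, y, t)
    T = 5.5 * (t40 - t28)                   # time constant
    L = 2.8 * t28 - 1.8 * t40               # apparent dead time
    return K, T, L

def ziegler_nichols_open_loop(K, T, L):
    """Ziegler-Nichols open-loop (reaction-curve) PID settings."""
    return 1.2 * T / (K * L), 2.0 * L, 0.5 * L   # Kp, Ti, Td

# Hypothetical step-response data of the motor (unit step input)
t = np.linspace(0, 5, 200)
y = 2.0 * (1 - np.exp(-np.maximum(t - 0.2, 0) / 0.8))   # true K=2, T=0.8, L=0.2
K, T, L = broida_fopdt(t, y, u_step=1.0)
y_hat = K * (1 - np.exp(-np.maximum(t - L, 0) / T))
mse = np.mean((y - y_hat) ** 2)                          # fit quality of the approximation

Kp, Ti, Td = ziegler_nichols_open_loop(K, T, L)
# Digital (velocity-form) PID at sample time Ts: u[k] = u[k-1] + q0*e[k] + q1*e[k-1] + q2*e[k-2]
Ts = 0.01
q0 = Kp * (1 + Ts / Ti + Td / Ts)
q1 = -Kp * (1 + 2 * Td / Ts)
q2 = Kp * Td / Ts
print(K, T, L, mse, Kp, Ti, Td, q0, q1, q2)
```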

Keywords: transfer function, step input, MATLAB, Simulink, DC electrical motor, PID controller, open-loop process, mean square error, digital controller, Ziegler-Nichols

Procedia PDF Downloads 49
5613 Fuzzy Time Series Forecasting Based on Fuzzy Logical Relationships, PSO Technique, and Automatic Clustering Algorithm

Authors: A. K. M. Kamrul Islam, Abdelhamid Bouchachia, Suang Cang, Hongnian Yu

Abstract:

Forecasting models have a great impact on prediction and will continue to do so in the future. Although many forecasting models have been studied in recent years, most researchers focus on different forecasting methods based on fuzzy time series to solve forecasting problems. The accuracy of a forecasting model depends fully on two factors: the length of the intervals in the universe of discourse and the content of the forecast rules. Moreover, a hybrid forecasting method can be a more effective and efficient way to improve forecasts than an individual forecasting model. There are different hybrid forecasting models which combine fuzzy time series with evolutionary algorithms, but their performance is not quite satisfactory. In this paper, we propose a hybrid forecasting model which deals with first-order as well as high-order fuzzy time series and uses particle swarm optimization to improve forecast accuracy. The proposed method uses the historical enrollments of the University of Alabama as the dataset in the forecasting process. Firstly, we consider an automatic clustering algorithm to calculate appropriate intervals for the historical enrollments. Then particle swarm optimization and fuzzy time series are combined, which shows better forecasting accuracy than other existing forecasting models.
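The particle swarm component can be illustrated with the minimal sketch below: particles encode candidate interior interval boundaries for the universe of discourse, and a simple within-interval variance objective stands in for the forecast-error criterion. The enrollment figures, objective function and PSO constants are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical yearly enrollments (stand-ins for the real data)
data = np.array([13055, 13563, 13867, 14696, 15460, 15311, 15603,
                 15861, 16807, 16919, 16388, 15433, 15497, 15145], dtype=float)
lo, hi, n_intervals = data.min() - 100, data.max() + 100, 6

def fitness(boundaries):
    """Stand-in objective: total within-interval variance (lower is better)."""
    edges = np.concatenate(([lo], np.sort(boundaries), [hi]))
    total = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        chunk = data[(data >= a) & (data < b)]
        if chunk.size > 1:
            total += chunk.var() * chunk.size
    return total

# Basic global-best PSO over the interior boundaries
n_particles, n_dims, iters = 20, n_intervals - 1, 100
pos = rng.uniform(lo, hi, (n_particles, n_dims))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5
for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("optimized interval boundaries:", np.sort(gbest))
```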

Keywords: fuzzy time series (fts), particle swarm optimization, clustering algorithm, hybrid forecasting model

Procedia PDF Downloads 246
5612 The Benefits of Security Culture for Improving Physical Protection Systems at Detection and Radiation Measurement Laboratory

Authors: Ari S. Prabowo, Nia Febriyanti, Haryono B. Santosa

Abstract:

The security function, called the Physical Protection System (PPS), has the functions of detection, delay and response. The Physical Protection System (PPS) in the Detection and Radiation Measurement Laboratory needs to be improved continually by using internal resources. The nuclear security culture provides some potential to support this research. The study starts by identifying the weaknesses of the security functions and the strengths of the security culture. Secondly, the strengths of the security culture are implemented in the laboratory management. Finally, a simulation was done to measure its effectiveness. Some changes happened in laboratory personnel behaviors and procedures; all became more prudent. The results showed a good influence of the nuclear security culture on the laboratory security functions.

Keywords: laboratory, physical protection system, security culture, security function

Procedia PDF Downloads 181
5611 Effect of Tissue Preservation Chemicals on Decomposition in Different Soil Types

Authors: Onyekachi Ogbonnaya Iroanya, Taiye Abdullahi Gegele, Frank Tochukwu Egwuatu

Abstract:

Introduction: Forensic taphonomy is a multifaceted area that incorporates decomposition and chemical and biological cadaver exposure into post-mortem event chronology and reconstruction to predict the post-mortem interval (PMI). The aim of this study was to evaluate the integrity of DNA extracted from the remains of embalmed decomposed Sus domesticus tissues buried in different soil types. Method: A total of 12 limbs of Sus domesticus weighing between 0.7 and 1.4 kg were used. Each of the samples across the groups was treated with 10% formaldehyde, absolute methanol and 50% pine oil for 24 hours before burial, except the control samples, which were buried immediately. All samples were buried in shallow simulated clay, sandy and loamy soil graves for 12 months. The DNA of each sample was extracted and quantified with a Nanodrop spectrophotometer (6305 JENWAY spectrometer). The rate of decomposition was examined through a modified qualitative decomposition analysis. The extracted DNA was amplified through PCR, and bands were visualized via gel electrophoresis. A biochemical enzyme assay was done for each burial grave soil. Result: The limbs in all burial groups lost weight over the burial period. There was a significant increase in the soil urease level in the samples preserved in formaldehyde across the three soil type groups (p≤0.01). Also, the control grave soils recorded significantly higher alkaline phosphatase, dehydrogenase and calcium carbonate values compared to the experimental grave soils (p≤0.01). The experimental samples showed a significant decrease in DNA concentration and purity when compared to the control groups (p≤0.01). The findings of the soil biochemical analysis showed that the embalming treatment altered the relationship between organic matter decomposition and soil biochemical properties, as observed in the fluctuations recorded in the soil biochemical parameters. The PCR-amplified DNA showed no bands on the gel electrophoresis plates. Conclusion: In criminal investigations, factors such as burial grave soil, grave soil biochemical properties, and antemortem exposure to embalming chemicals should be considered in post-mortem interval (PMI) determination.

Keywords: forensic taphonomy, post-mortem interval (PMI), embalmment, decomposition, grave soil

Procedia PDF Downloads 162
5610 Virtual Routing Function Allocation Method for Minimizing Total Network Power Consumption

Authors: Kenichiro Hida, Shin-Ichi Kuribayashi

Abstract:

In a conventional network, most network devices, such as routers, are dedicated devices that do not have much variation in capacity. In recent years, a new concept of network functions virtualisation (NFV) has come into use. The intention is to implement a variety of network functions with software on general-purpose servers, and this allows the network operator to select their capacities and locations without any constraints. This paper focuses on the allocation of NFV-based routing functions, which are among the critical network functions, and presents a virtual routing function allocation algorithm that minimizes the total power consumption. In addition, this study presents a useful allocation policy for virtual routing functions, based on an evaluation with a ladder-shaped network model. This policy takes into consideration the ratio of the power consumption of a routing function to that of a circuit, and the traffic distribution between areas. Furthermore, the present paper shows that there are cases where the use of NFV-based routing functions makes it possible to reduce the total power consumption dramatically, in comparison to a conventional network, in which it is not economically viable to distribute small-capacity routing functions.

Keywords: NFV, resource allocation, virtual routing function, minimum power consumption

Procedia PDF Downloads 338
5609 Nonlinear Triad Interactions in Magnetohydrodynamic Plasma Turbulence

Authors: Yasser Rammah, Wolf-Christian Mueller

Abstract:

Nonlinear triad interactions in incompressible three-dimensional magnetohydrodynamic (3D-MHD) turbulence are studied by analyzing data from high-resolution direct numerical simulations of decaying isotropic (512³ grid points) and forced anisotropic (1024² × 256 grid points) turbulence. An accurate numerical approach toward analyzing the nonlinear turbulent energy transfer function and triad interactions is presented. It involves the direct numerical examination of every wavenumber triad that is associated with the nonlinear terms in the differential equations of MHD in the inertial range of turbulence. The technique allows us to compute the spectral energy transfer and energy fluxes, as well as the spectral locality property of the energy transfer function. To this end, the geometrical shape of each underlying wavenumber triad that contributes to the statistical transfer density function is examined to infer the locality of the energy transfer. Results show that the total energy transfer is local via nonlocal triad interactions in decaying macroscopically isotropic MHD turbulence. In anisotropic MHD turbulence subject to a strong mean magnetic field, the nonlinear transfer is generally weaker and exhibits a moderate increase of nonlocality in both the perpendicular and parallel directions compared to the isotropic case. These results support recent mathematical findings, which also claim the locality of nonlinear energy transfer in MHD turbulence.

Keywords: magnetohydrodynamic (MHD) turbulence, transfer density function, locality function, direct numerical simulation (DNS)

Procedia PDF Downloads 381
5608 Human Resources Development and Management: A Guide to School Owners

Authors: Charita B. Lasala, Lakambini G. Reluya

Abstract:

The human factor composing the organization is an asset that needs to be managed conscientiously and to be in tune with the organization's needs. Thus, human resources add value to the organization by using their talents, skills and knowledge to transform the other resources of the organization in order to produce or deliver products and services that generate profits or other valued forms of return. Keeping these kinds of employees has always been the main goal of every Human Resources Department in every company worldwide, regardless of the work being done. They are the most important resource a company can have, and treating them well will make them priceless assets that can help make a business a success. Larmen de Guia Memorial College (LGMC) and Royal Oaks International School (ROIS) are among the many organizations that seek ways to keep the human factor; they are in the process of formalization, and people management is at the top of the list. This study was therefore made, since there was a need for the creation of a Human Resources Department, which was absent in the organization, to help the organization keep these valued employees. The study was anchored on the concept that human resources consist of the people who perform its activities and that all decisions that affect the workforce concern the organization's human resources functions. In conducting this study, a mixed method was used, combining qualitative and quantitative approaches with focus group discussions. The design has three stages, namely: problem conceptualization, case analysis, and output. The output from the survey and interviews presents the abstracted ideas for the proposed HR program for the said institution. Based on the findings of the study, it can be concluded that the personnel in the institution are not viewed in the correct perspective, and moreover that the personnel have no specific job descriptions. The hiring procedure is not extensive, nor were the personnel given the chance to be exposed to training that would aid them in job development and in the enhancement of their skills and talents. The compensation package offered by the institution is not commensurate with the services rendered. Lastly, it is concluded that the opinions/decisions rendered by the grievance committee are not fair and that the institution failed to provide good motivation/initiatives for the employees to be more productive.

Keywords: employee benefits, employee relations, human resources and management, people management, recruitment, trainings

Procedia PDF Downloads 315
5607 The Predictive Implication of Executive Function and Language in Theory of Mind Development in Preschool Age Children

Authors: Michael Luc Andre, Célia Maintenant

Abstract:

Theory of mind is a milestone in child development which allows children to understand that others can have different mental states than theirs. Understanding the developmental stages of theory of mind in children led researchers to two connected research problems: on the one hand, the link between executive function and theory of mind, and on the other hand, the relationship between theory of mind and syntax processing. These two lines of research have produced a large literature, full of important results, despite a certain level of disagreement between researchers. For a long time, these two research perspectives continued to grow separately, despite research conclusions suggesting that the three variables should implicate the same developmental period. Indeed, our goal was to study the relation between theory of mind, executive function, and language via a unique research question. We supposed that, between executive function and language, one of the two variables could play a critical role in the relationship between theory of mind and the other variable. Thus, 112 children aged between three and six years old were recruited to complete a receptive and an expressive vocabulary task, a syntax understanding task, a theory of mind task, and three executive function tasks (inhibition, cognitive flexibility and working memory). The results showed significant correlations between performance on the theory of mind task and performance on the executive function tasks, except for the cognitive flexibility task. We also found significant correlations between success on the theory of mind task and performance on all language tasks. Multiple regression analysis identified only syntax and general language abilities as possible predictors of theory of mind performance in our sample of preschool-age children. The results are discussed from the perspective of a major role of language abilities in theory of mind development. We also discuss possible reasons that could explain the non-significance of the executive domains in predicting theory of mind performance, and the meaning of our results for the literature.

Keywords: child development, executive function, general language, syntax, theory of mind

Procedia PDF Downloads 59
5606 Process Optimization for Albanian Crude Oil Characterization

Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici

Abstract:

Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have exact information regarding crude oil quality. This includes the crude oil TBP curve as the main data for the correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized based on a distillation assay. This procedure is reasonably well-defined and is based on the representation of the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400 °C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the modelled and the experimental data. Most commercial simulators use a different number of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is to draw true boiling point curves for different crude oil resources in Albania and to compare the differences between the modelled and the experimental data for the optimal characterization of crude oil.

Keywords: TBP distillation curves, crude oil, optimization, simulation

Procedia PDF Downloads 299
5605 Gaussian Probability Density for Forest Fire Detection Using Satellite Imagery

Authors: S. Benkraouda, Z. Djelloul-Khedda, B. Yagoubi

Abstract:

We present a method for the early detection of forest fires from a thermal infrared satellite image, using the image matrix of the probability of belonging. The principle of the method is to compare a theoretical mathematical model to an experimental model. We considered each line of the image matrix as a realization of a non-stationary random process. Since the distribution of pixels in the satellite image is statistically dependent, we divided these lines into small stationary and ergodic intervals in order to characterize the image by an adequate mathematical model. A standard deviation was chosen to generate random variables, so that each interval behaves naturally like white Gaussian noise. The latter has been selected as the mathematical model that represents the large majority of pixels, which can be considered as the image background. Before modeling the image, we applied a few pre-treatments; then the parameters of the theoretical Gaussian model were extracted from the modeled image, and these settings are used to calculate the probability that each interval of the modeled image belongs to the theoretical Gaussian model. High-intensity pixels are regarded as elements foreign to this model, so they will have a low probability, while the pixels that belong to the background image will have a high probability. Finally, we present the inverse of the matrix of probabilities of these intervals for better fire detection.
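A minimal numerical sketch of the interval-wise Gaussian background model is given below: each image line is split into short intervals, a Gaussian is fitted per interval, and the probability of belonging to that background is computed per pixel. The synthetic "thermal" image, interval length and thresholds are assumptions, and the code illustrates the idea rather than the authors' processing chain.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
# Synthetic thermal image: Gaussian background plus a small hot (fire-like) region
image = rng.normal(loc=120.0, scale=8.0, size=(64, 256))
image[30:33, 100:104] += 90.0          # hot spot

interval_len = 32                       # assumed stationary sub-intervals along each line
prob_map = np.zeros_like(image)

for i, line in enumerate(image):
    for start in range(0, line.size, interval_len):
        seg = line[start:start + interval_len]
        mu, sigma = seg.mean(), seg.std(ddof=1)
        # Two-sided probability that each pixel belongs to the Gaussian background
        z = np.abs(seg - mu) / sigma
        prob_map[i, start:start + interval_len] = 2.0 * (1.0 - norm.cdf(z))

# Low probability = foreign to the background model; invert so fire candidates stand out
detection_map = 1.0 - prob_map
print("suspect pixels:", np.argwhere(detection_map > 0.999).shape[0])
```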

Keywords: forest fire, forest fire detection, satellite image, normal distribution, theoretical gaussian model, thermal infrared matrix image

Procedia PDF Downloads 137
5604 Mobile Payment over NFC: The M-Check System Case

Authors: Karima Maazouz, Habib Benlahmer, Naceur Achtaich

Abstract:

The realization of mobile payments will make possible new and unforeseen ways of convenience and m-commerce. Mobile payment today benefits from technology and trends. NFC technology is creating a new era of contactless mobile payment. The "M-check" is a mobile payment system that provides a new way of facilitating transactions with high-value payments and enables new m-commerce. The objective of the paper is to propose a new solution for m-payment. The proposed combination of the M-check system and NFC offers acceptable security for mobile payment and client satisfaction, and simplifies the payment process between clients and merchants.

Keywords: M-payment, NFC, M-check, M-commerce, security

Procedia PDF Downloads 590
5603 Frailty Models for Modeling Heterogeneity: Simulation Study and Application to Quebec Pension Plan

Authors: Souad Romdhane, Lotfi Belkacem

Abstract:

When referring to actuarial analysis of lifetimes, only models accounting for observable risk factors have been developed. Within this context, the Cox proportional hazards model (CPH model) is commonly used to assess the effects of observable covariates, such as gender, age and smoking habits, on the hazard rates. These covariates may fail to fully account for the true lifetime interval. This may be due to the existence of another random variable (frailty) that is still being ignored. The aim of this paper is to examine the shared frailty issue in the Cox proportional hazards model by including two different parametric forms of frailty in the hazard function. Four estimation methods are used to fit them. The performance of the parameter estimates is assessed and compared between the classical Cox model and these frailty models through a real-life data set from the Quebec Pension Plan and then using a more general simulation study. This performance is investigated in terms of the bias of the point estimates and their empirical standard errors in both the fixed and random effect parts. Both the simulation and the real dataset studies showed differences between the classical Cox model and the shared frailty model.
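The frailty idea can be sketched numerically: the code below simulates clustered lifetimes with a shared gamma frailty and then fits a classical Cox model that ignores the frailty, which is one way to see the kind of discrepancy the abstract refers to. The covariates, parameter values and the use of the lifelines package are illustrative assumptions, not the paper's setup.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n_groups, group_size = 200, 5                    # clustered lifetimes (e.g. plan members)
theta = 0.5                                      # frailty variance
frailty = rng.gamma(shape=1 / theta, scale=theta, size=n_groups)   # mean 1, variance theta

rows = []
for g in range(n_groups):
    for _ in range(group_size):
        age = rng.normal(60, 8)
        male = rng.integers(0, 2)
        # Exponential baseline hazard scaled by covariates and the shared frailty
        hazard = 0.01 * frailty[g] * np.exp(0.03 * (age - 60) + 0.4 * male)
        time = rng.exponential(1 / hazard)
        censor = rng.exponential(80)             # independent censoring
        rows.append({"duration": min(time, censor), "event": int(time <= censor),
                     "age": age, "male": male})

df = pd.DataFrame(rows)
# Classical Cox model ignoring the shared frailty
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()
```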

Keywords: life insurance-pension plan, survival analysis, risk factors, cox proportional hazards model, multivariate failure-time data, shared frailty, simulations study

Procedia PDF Downloads 356
5602 Jensen's Inequality and M-Convex Functions

Authors: Yamin Sayyari

Abstract:

In this paper, we generalize Jensen's inequality for m-convex functions, and we also present a refinement of Jensen's inequality that is better than the generalization of this inequality for m-convex functions. Finally, we obtain new lower and upper bounds for Jensen's discrete inequality.
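For context, the two standard notions combined in the abstract can be stated as follows; these are the textbook (Toader-type) definition of m-convexity and the discrete Jensen inequality it generalizes, not the paper's new bounds.

```latex
% m-convexity (Toader): for a fixed m in [0,1], f : [0,b] -> R is m-convex if
\[
  f\bigl(tx + m(1-t)y\bigr) \;\le\; t\,f(x) + m(1-t)\,f(y),
  \qquad x,y\in[0,b],\; t\in[0,1],\; m\in[0,1].
\]
% Discrete Jensen inequality for an ordinary convex function (the m = 1 case):
\[
  f\!\left(\sum_{i=1}^{n}\lambda_i x_i\right) \;\le\; \sum_{i=1}^{n}\lambda_i f(x_i),
  \qquad \lambda_i \ge 0,\;\; \sum_{i=1}^{n}\lambda_i = 1.
\]
```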

Keywords: Jensen's inequality, m-convex function, convex function, inequality

Procedia PDF Downloads 142
5601 An Application of Sinc Function to Approximate Quadrature Integrals in Generalized Linear Mixed Models

Authors: Altaf H. Khan, Frank Stenger, Mohammed A. Hussein, Reaz A. Chaudhuri, Sameera Asif

Abstract:

This paper discusses a novel approach to approximating the quadrature integrals that arise in the estimation of likelihood parameters for generalized linear mixed models (GLMM); Bayesian methodology likewise requires the computation of multidimensional integrals with respect to the posterior distributions, computations which are not only tedious and cumbersome but in some situations impossible to solve because of singularities, irregular domains, etc. An attempt has been made in this work to apply Sinc function based quadrature rules to approximate such intractable integrals, as there are several advantages of using Sinc based methods: for example, the order of convergence is exponential, they work very well in the neighborhood of singularities, they are in general quite stable, and they provide highly accurate, double-precision estimates. To our knowledge, the Sinc function based approach is utilized here for the first time in the statistical domain, and its viability and future scope are discussed for application to the estimation of parameters for GLMM models as well as some other statistical areas.
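As a small illustration of Sinc-type quadrature, the sketch below uses the single-exponential (tanh) transformation to map a finite interval onto the real line and applies the trapezoidal Sinc rule, whose error decays roughly like exp(-c√N); the integrand, which has integrable endpoint singularities, and the step-size choice are assumptions for demonstration, not the paper's GLMM integrals.

```python
import numpy as np

def sinc_quadrature(f, a, b, N=40):
    """Tanh (single-exponential) Sinc quadrature on the finite interval (a, b).
    Nodes cluster near the endpoints, so endpoint singularities are handled well."""
    h = np.pi / np.sqrt(N)                  # standard step-size choice
    t = np.arange(-N, N + 1) * h
    x = (a + b) / 2 + (b - a) / 2 * np.tanh(t / 2)
    w = (b - a) / 4 / np.cosh(t / 2) ** 2   # derivative of the transformation
    return h * np.sum(w * f(x))

# Example with endpoint singularities: int_0^1 x^(-1/2) (1-x)^(-1/2) dx = pi
f = lambda x: 1.0 / np.sqrt(x * (1.0 - x))
approx = sinc_quadrature(f, 0.0, 1.0, N=40)
print(approx, np.pi, abs(approx - np.pi))
```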

Keywords: generalized linear mixed model, likelihood parameters, quadrature, Sinc function

Procedia PDF Downloads 391
5600 [Keynote Speech]: Bridge Damage Detection Using Frequency Response Function

Authors: Ahmed Noor Al-Qayyim

Abstract:

During the past decades, bridge structures have been considered very important portions of transportation networks due to fast urban sprawl. Failures of bridges under operating conditions have led to a focus on updating the default bridge inspection methodology. Structural health monitoring (SHM) using the vibration response has appeared as a promising method to evaluate the condition of structures. The rapid development in sensor technology and in condition assessment techniques based on vibration-based damage detection has made SHM an efficient and economical way to assess bridges. SHM is set to assess the state of designated bridges and to anticipate probable failures. In this paper, the frequency response function method, which uses the captured vibration test information of structures to evaluate the structure's condition, is presented. Furthermore, the main steps of the assessment of a bridge using the vibration information are presented. The frequency response function method is applied to the experimental data of a full-scale bridge.
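A common way to obtain an FRF from measured excitation and response signals is the H1 estimator (cross-spectrum divided by the input auto-spectrum). The sketch below, with a simulated single-degree-of-freedom "bridge mode", is a generic illustration of that estimator, not the author's processing; the sampling rate, modal parameters and noise level are assumptions.

```python
import numpy as np
from scipy import signal

fs = 256.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(0)
x = rng.normal(size=t.size)                  # broadband excitation (e.g. ambient/impact)

# Simulated response of one structural mode: fn = 3 Hz, 2% damping
fn, zeta = 3.0, 0.02
wn = 2 * np.pi * fn
sys = signal.TransferFunction([wn ** 2], [1, 2 * zeta * wn, wn ** 2])
_, y, _ = signal.lsim(sys, U=x, T=t)
y += 0.05 * rng.normal(size=t.size)          # measurement noise

# H1 estimator: FRF = Sxy / Sxx (Welch-averaged spectra)
f, Sxy = signal.csd(x, y, fs=fs, nperseg=4096)
_, Sxx = signal.welch(x, fs=fs, nperseg=4096)
frf = Sxy / Sxx

peak = f[np.argmax(np.abs(frf))]
print(f"identified natural frequency ~ {peak:.2f} Hz")
```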

Keywords: bridge assessment, health monitoring, damage detection, frequency response function (FRF), signal processing, structure identification

Procedia PDF Downloads 342
5599 Efficient Subgoal Discovery for Hierarchical Reinforcement Learning Using Local Computations

Authors: Adrian Millea

Abstract:

In hierarchical reinforcement learning, one of the main issues encountered is the discovery of subgoal states or options (which are policies reaching subgoal states) by partitioning the environment in a meaningful way. This partitioning usually requires an expensive global clustering operation or an eigendecomposition of the Laplacian of the state graph. We propose a local solution to this issue, much more efficient than algorithms using global information, which successfully discovers subgoal states by computing a simple function, which we call heterogeneity, for each state as a function of its neighbors. Moreover, we construct a value function using the difference in heterogeneity from one step to the next as reward, such that we are able to explore the state space much more efficiently than, say, epsilon-greedy exploration. The same principle can then be applied to higher levels of the hierarchy, where the states are now the subgoals discovered at the level below.
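The abstract does not give the exact form of the heterogeneity function, so the sketch below uses one plausible local definition, namely the fraction of a state's passable neighbors carrying a different region label in a two-room grid world, purely to illustrate how subgoal candidates such as doorways can be scored with local computation only; the grid, labels and scoring rule are assumptions.

```python
import numpy as np

# Two-room grid world: 0 = left room, 1 = right room, -1 = wall; one doorway joins the rooms
grid = np.zeros((7, 13), dtype=int)
grid[:, 7:] = 1          # right room label
grid[:, 6] = -1          # dividing wall
grid[3, 6] = 1           # doorway cell (passable)

def neighbors(r, c):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < grid.shape[0] and 0 <= cc < grid.shape[1] and grid[rr, cc] != -1:
            yield rr, cc

def heterogeneity(r, c):
    """Local score: fraction of passable neighbors with a different label.
    High values flag 'border' states such as doorways (subgoal candidates)."""
    labels = [grid[rr, cc] for rr, cc in neighbors(r, c)]
    if not labels:
        return 0.0
    return float(np.mean([lab != grid[r, c] for lab in labels]))

scores = {(r, c): heterogeneity(r, c)
          for r in range(grid.shape[0]) for c in range(grid.shape[1])
          if grid[r, c] != -1}
subgoal = max(scores, key=scores.get)
print("highest-heterogeneity state (subgoal candidate):", subgoal)
```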

Keywords: exploration, hierarchical reinforcement learning, locality, options, value functions

Procedia PDF Downloads 169
5598 Storage Influence on Physico-Chemical Composition and Antioxidant Activity of Jamun Drink Prepared From Two Types of Pulp

Authors: Muhammad Atif Randhawa, Mahreen Akhtar, Sidrah

Abstract:

In this paper, a jamun (Syzygium cumini; Myrtaceae) drink enriched with jamun pulp and seed was assessed for different physicochemical parameters (titratable acidity, pH, TSS, ascorbic acid, total sugars and reducing sugars) and phytochemical aspects at 15-day intervals over a 60-day storage period. Jamun pulp, both with seed and without seed, was used at levels of 7, 10 and 13 percent to prepare the jamun drink in six combinations: T1 (7% pulp without seed), T2 (10% pulp without seed), T3 (13% pulp without seed), T4 (7% pulp with seed), T5 (10% pulp with seed), T6 (13% pulp with seed). Storage resulted in significant decreases in pH (4.18 to 4.08) and ascorbic acid (21.92%), along with phenolic contents (6.13 to 4.85 g of GAE/kg) and antioxidant activity (70.68 to 48.62 percent) within treatments. All treatments showed significant increases in total sugars (11.59 to 11.80%), reducing sugars (2.30 to 2.50%), TSS (12.2 to 13.32 °B) and acidity (0.23% to 0.31%) during storage. Treatments T3, T5 and T6 showed the best results in terms of all physicochemical parameters during storage. Statistically significant differences were obtained among sensory parameters as a function of pulp type and concentration, with treatment T5 (10% pulp with seed) obtaining the highest score (7.16) across all sensory parameters. It can be concluded that a nutrient-rich jamun drink can be prepared as an attempt to add value to the underutilized jamun fruit of Pakistan.

Keywords: antioxidant activity, Jamun beverage, physicochemical, storage

Procedia PDF Downloads 307
5597 Bayesian Optimization for Reaction Parameter Tuning: An Exploratory Study of Parameter Optimization in Oxidative Desulfurization of Thiophene

Authors: Aman Sharma, Sonali Sengupta

Abstract:

The study explores the utility of Bayesian optimization in tuning the physical and chemical parameters of reactions in an offline experimental setup. A comparative analysis of the influence of the acquisition function on the optimization performance is also studied. For proxy first and second-order reactions, the results are indifferent to the acquisition function used, whereas, while studying the parameters for oxidative desulphurization of thiophene in an offline setup, upper confidence bound (UCB) provides faster convergence along with a marginal trade-off in the maximum conversion achieved. The work also demarcates the critical number of independent parameters and input observations required for both sequential and offline reaction setups to yield tangible results.
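A minimal sketch of Bayesian optimization with an upper-confidence-bound acquisition function is shown below; the Gaussian-process surrogate, the kappa value and the synthetic "conversion" objective stand in for the actual reactor experiments and are assumptions, not the study's setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def conversion(x):
    """Synthetic stand-in for thiophene conversion vs. one scaled reaction parameter
    (e.g. temperature); the optimizer does not know the true optimum."""
    return np.exp(-(x - 0.65) ** 2 / 0.02) + 0.05 * rng.normal()

grid = np.linspace(0, 1, 500).reshape(-1, 1)       # candidate parameter settings
X = rng.uniform(0, 1, 4).reshape(-1, 1)            # initial experiments
y = np.array([conversion(x[0]) for x in X])

kappa = 2.0                                         # exploration weight for UCB
for _ in range(15):                                 # sequential experiment budget
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True,
                                  alpha=1e-3).fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    ucb = mu + kappa * sigma                        # upper confidence bound
    x_next = grid[np.argmax(ucb)]
    X = np.vstack([X, x_next])
    y = np.append(y, conversion(x_next[0]))

print("best observed setting:", X[np.argmax(y)][0], " conversion:", y.max())
```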

Keywords: acquisition function, Bayesian optimization, desulfurization, kinetics, thiophene

Procedia PDF Downloads 181
5596 Estimation of a Finite Population Mean under Random Non Response Using Improved Nadaraya and Watson Kernel Weights

Authors: Nelson Bii, Christopher Ouma, John Odhiambo

Abstract:

Non-response is a potential source of errors in sample surveys. It introduces bias and large variance in the estimation of finite population parameters. Regression models have been recognized as one of the techniques of reducing bias and variance due to random non-response using auxiliary data. In this study, it is assumed that random non-response occurs in the survey variable in the second stage of cluster sampling, assuming full auxiliary information is available throughout. Auxiliary information is used at the estimation stage via a regression model to address the problem of random non-response. In particular, the auxiliary information is used via an improved Nadaraya-Watson kernel regression technique to compensate for random non-response. The asymptotic bias and mean squared error of the estimator proposed are derived. Besides, a simulation study conducted indicates that the proposed estimator has smaller values of the bias and smaller mean squared error values compared to existing estimators of finite population mean. The proposed estimator is also shown to have tighter confidence interval lengths at a 95% coverage rate. The results obtained in this study are useful, for instance, in choosing efficient estimators of the finite population mean in demographic sample surveys.
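The Nadaraya-Watson estimator at the heart of the compensation step can be sketched as follows: a generic illustration of kernel-regression imputation for non-responding units using a fully observed auxiliary variable, with simulated data and a simple rule-of-thumb bandwidth, not the authors' improved weights.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 10, n)                          # auxiliary variable (fully observed)
y = 3 + 2 * np.sin(x) + rng.normal(0, 0.5, n)      # survey variable
respond = rng.random(n) > 0.3                      # ~30% random non-response

def nadaraya_watson(x0, x_obs, y_obs, h):
    """Gaussian-kernel Nadaraya-Watson regression estimate at the points x0."""
    w = np.exp(-0.5 * ((x0[:, None] - x_obs[None, :]) / h) ** 2)
    return (w @ y_obs) / w.sum(axis=1)

h = 1.06 * x[respond].std() * respond.sum() ** (-1 / 5)   # rule-of-thumb bandwidth
y_imputed = y.copy()
y_imputed[~respond] = nadaraya_watson(x[~respond], x[respond], y[respond], h)

# Estimated finite-population mean after compensating for non-response
print("naive respondent mean :", y[respond].mean())
print("kernel-imputed mean   :", y_imputed.mean())
print("true mean             :", y.mean())
```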

Keywords: mean squared error, random non-response, two-stage cluster sampling, confidence interval lengths

Procedia PDF Downloads 133
5595 Analytical Design of Fractional-Order PI Controller for Decoupling Control System

Authors: Truong Nguyen Luan Vu, Le Hieu Giang, Le Linh

Abstract:

The FOPI controller is proposed based on the main properties of the decoupling control scheme, as well as on fractional calculus. By using the simplified decoupling technique, the transfer function of the decoupled apparent process is first separated into a set of n equivalent independent processes in terms of a ratio of the diagonal elements of the original open-loop transfer function to those of the dynamic relative gain array, and the fractional-order PI controller is then developed for each control loop based on Bode's ideal transfer function, which gives the desired fractional closed-loop response in the frequency domain. The simulation studies were carried out to evaluate the proposed design approach in a fair comparison with other existing methods, in accordance with structured singular value (SSV) theory, which is used to measure the robust stability of control systems under multiplicative output uncertainty. The simulation results indicate that the proposed method consistently performs well, with fast and well-balanced closed-loop time responses.
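For reference, Bode's ideal loop transfer function and a fractional-order PI form of the kind mentioned above can be written as follows; these are standard textbook forms (with ω_c the gain-crossover frequency and γ the fractional slope), and the paper's specific tuning rules are not reproduced here.

```latex
% Bode's ideal (fractional-slope) loop transfer function
\[
  L(s) = \left(\frac{\omega_c}{s}\right)^{\gamma}, \qquad 1 < \gamma < 2,
\]
% which gives a constant phase margin of pi(1 - gamma/2), robust to gain variations.
% Fractional-order PI (FOPI) controller for each decoupled loop:
\[
  C(s) = K_p\left(1 + \frac{K_i}{s^{\lambda}}\right), \qquad 0 < \lambda < 2.
\]
```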

Keywords: Bode's ideal transfer function, fractional calculus, fractional order proportional integral (FOPI) controller, decoupling control system

Procedia PDF Downloads 328
5594 Control of Biofilm Formation and Inorganic Particle Accumulation on Reverse Osmosis Membrane by Hypochlorite Washing

Authors: Masaki Ohno, Cervinia Manalo, Tetsuji Okuda, Satoshi Nakai, Wataru Nishijima

Abstract:

Reverse osmosis (RO) membranes have been widely used for desalination to purify water for drinking and other purposes. Although at present most RO membranes have no resistance to chlorine, chlorine-resistant membranes are being developed. Therefore, direct chlorine treatment or chlorine washing will be an option in preventing biofouling on chlorine-resistant membranes. Furthermore, if particle accumulation control is possible by using chlorine washing, expensive pretreatment for particle removal can be removed or simplified. The objective of this study was to determine the effective hypochlorite washing condition required for controlling biofilm formation and inorganic particle accumulation on RO membrane in a continuous flow channel with RO membrane and spacer. In this study, direct chlorine washing was done by soaking fouled RO membranes in hypochlorite solution and fluorescence intensity was used to quantify biofilm on the membrane surface. After 48 h of soaking the membranes in high fouling potential waters, the fluorescence intensity decreased to 0 from 470 using the following washing conditions: 10 mg/L chlorine concentration, 2 times/d washing interval, and 30 min washing time. The chlorine concentration required to control biofilm formation decreased as the chlorine concentration (0.5–10 mg/L), the washing interval (1–4 times/d), or the washing time (1–30 min) increased. For the sample solutions used in the study, 10 mg/L chlorine concentration with 2 times/d interval, and 5 min washing time was required for biofilm control. The optimum chlorine washing conditions obtained from soaking experiments proved to be applicable also in controlling biofilm formation in continuous flow experiments. Moreover, chlorine washing employed in controlling biofilm with suspended particles resulted in lower amounts of organic (0.03 mg/cm2) and inorganic (0.14 mg/cm2) deposits on the membrane than that for sample water without chlorine washing (0.14 mg/cm2 and 0.33 mg/cm2, respectively). The amount of biofilm formed was 79% controlled by continuous washing with 10 mg/L of free chlorine concentration, and the inorganic accumulation amount decreased by 58% to levels similar to that of pure water with kaolin (0.17 mg/cm2) as feed water. These results confirmed the acceleration of particle accumulation due to biofilm formation, and that the inhibition of biofilm growth can almost completely reduce further particle accumulation. In addition, effective hypochlorite washing condition which can control both biofilm formation and particle accumulation could be achieved.

Keywords: reverse osmosis, washing condition optimization, hypochlorous acid, biofouling control

Procedia PDF Downloads 344
5593 Combined Odd Pair Autoregressive Coefficients for Epileptic EEG Signals Classification by Radial Basis Function Neural Network

Authors: Boukari Nassim

Abstract:

This paper describes the use of odd pair autoregressive coefficients (Yule-Walker and Burg) for the feature extraction of electroencephalogram (EEG) signals. For the classification, the radial basis function neural network (RBFNN) is employed. The RBFNN is described by its architecture and its characteristics: the RBF is defined by the spread, which is modified to improve the results of the classification. Five types of EEG signals are defined for this work: Set A and Set B for normal signals, Set C and Set D for interictal signals, and Set E for ictal signals (available from the University of Bonn). As outputs, two-class combinations are given (AC, AD, AE, BC, BD, BE, CE, DE); the best accuracy is calculated at 99% for the combined odd pair autoregressive coefficients. Our method is very effective for the diagnosis of epileptic EEG signals.
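The feature-extraction and classification pipeline can be sketched roughly as follows: Yule-Walker and Burg AR coefficients are concatenated as features, and a simple RBF network (K-means centres, Gaussian activations with a chosen spread, least-squares output weights) separates two classes. The synthetic "EEG" segments, AR order and spread value are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker, burg
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def ar_features(segment, order=5):
    """Concatenate Yule-Walker and Burg AR coefficients as a feature vector."""
    rho_yw, _ = yule_walker(segment, order=order)
    rho_bg, _ = burg(segment, order=order)
    return np.concatenate([rho_yw, rho_bg])

def make_segments(freq, n=60, length=512, fs=173.61):
    """Synthetic stand-ins for EEG segments dominated by a given rhythm."""
    t = np.arange(length) / fs
    return [np.sin(2 * np.pi * freq * t + rng.uniform(0, 2 * np.pi))
            + 0.5 * rng.normal(size=length) for _ in range(n)]

normal = make_segments(10.0)          # alpha-like activity (stand-in for sets A/B)
ictal = make_segments(3.0)            # slower activity (stand-in for set E)
X = np.array([ar_features(s) for s in normal + ictal])
y = np.array([0] * len(normal) + [1] * len(ictal))

# Simple RBF network: K-means centres, Gaussian hidden layer, least-squares output weights
centres = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
spread = 1.0
H = np.exp(-np.linalg.norm(X[:, None] - centres[None, :], axis=2) ** 2 / (2 * spread ** 2))
w, *_ = np.linalg.lstsq(H, y, rcond=None)
print("training accuracy:", np.mean((H @ w > 0.5) == y))
```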

Keywords: epilepsy, EEG signals classification, combined odd pair autoregressive coefficients, radial basis function neural network

Procedia PDF Downloads 341
5592 Functions of Public Policy in Private International Law

Authors: Fedorova Elena

Abstract:

In this article, we draw a distinction between two important functions of public policy in private international law. The first function is widely recognized and relates to the prevention of the application of foreign laws and of the enforcement of foreign court judgments whenever their effects are incompatible with the domestic legal system of the forum. This effectively protects the sovereign rights of the forum state, as it allows it to resist the undesirable effects of foreign law-making and law-enforcement policies. The second function is less obvious, but no less important. Like internal private legal relationships, international private relationships are usually governed by rules of public policy from which the parties cannot derogate by mutual agreement. Therefore, for international private law relations, public policy has a different function than the one previously mentioned: in this case, public policy acts as a defense against unacceptable effects of party autonomy. Thus, this second function of public policy consists in the limitation of party autonomy whose effects would be unacceptable for the local legal system. In the frame of this second function, the author analyses two types of public policy which can limit party autonomy: 'substantial' public policy (which regulates the substance of the international legal relationship) and 'conflictual' public policy (which regulates the parties' autonomy to choose the law applicable to the substance of the relationship). The author provides an analysis of these functions of public policy in the field of international contract law because of the important role of the principle of party autonomy in international contract relations.

Keywords: public policy, general theory of private international law, substantial public policy, conflictual public policy

Procedia PDF Downloads 568
5591 Exploring the In-Between: An Examination of the Contextual Factors That Impact How Young Children Come to Value and Use the Visual Arts in Their Learning and Lives

Authors: S. Probine

Abstract:

The visual arts have been proven to be a central means through which young children can communicate their ideas, reflect on experience, and construct new knowledge. Despite this, perceptions of, and the degree to which the visual arts are valued within education, vary widely within political, educational, community and family contexts. These differing perceptions informed my doctoral research project, which explored the contextual factors that affect how young children come to value and use the visual arts in their lives and learning. The qualitative methodology of narrative inquiry with inclusion of arts-based methods was most appropriate for this inquiry. Using a sociocultural framework, the stories collected were analysed through the sociocultural theories of Lev Vygotsky as well as the work of Urie Bronfenbrenner, together with postmodern theories about identity formation. The use of arts-based methods such as teacher’s reflective art journals and the collection of images by child participants and their parent/caregivers allowed the research participants to have a significant role in the research. Three early childhood settings at which the visual arts were deeply valued as a meaning-making device in children’s learning, were purposively selected to be involved in the research. At each setting, the study found a unique and complex web of influences and interconnections, which shaped how children utilised the visual arts to mediate their thinking. Although the teachers' practices at all three centres were influenced by sociocultural theories, each settings' interpretations of these theories were unique and resulted in innovative interpretations of the role of the teacher in supporting visual arts learning. These practices had a significant impact on children’s experiences of the visual arts. For many of the children involved in this study, visual art was the primary means through which they learned. The children in this study used visual art to represent their experiences, relationships, to explore working theories, their interests (including those related to popular culture), to make sense of their own and other cultures, and to enrich their imaginative play. This research demonstrates that teachers have fundamental roles in fostering and disseminating the importance of the visual arts within their educational communities.

Keywords: arts-based methods, early childhood education, teacher's visual arts pedagogies, visual arts

Procedia PDF Downloads 137
5590 A Perspective on Teaching Mathematical Concepts to Freshman Economics Students Using 3D-Visualisations

Authors: Muhammad Saqib Manzoor, Camille Dickson-Deane, Prashan Karunaratne

Abstract:

The Cobb-Douglas production (utility) function is a fundamental function widely used in economics teaching and research. The key reason is the function's ability to describe actual production using inputs like labour and capital. Characteristics of the function such as returns to scale and marginal and diminishing marginal productivities are covered in the introductory units of both microeconomics and macroeconomics with a 2-dimensional static visualisation of the function. However, less insight is provided regarding the three-dimensional surface, the changes in curvature properties due to returns to scale, the linkage of the short-run production function with its long-run counterpart and marginal productivities, the level curves, and the constrained optimisation. Since (freshman) learners have diverse prior knowledge and cognitive skills, the existing "one size fits all" approach is not very helpful. The aim of this study is to bridge this gap by introducing a technological intervention with interactive animations of the three-dimensional surface and sequential unveiling of the characteristics mentioned above using Python software. A small classroom intervention has helped students enhance their analytical and visualisation skills towards active and authentic learning of this topic. However, to authenticate the strength of our approach, a quasi-Delphi study will be conducted to ask domain-specific experts, "What value to the learning process in economics is there in using a 2-dimensional static visualisation compared to using a 3-dimensional dynamic visualisation?" Here, three perspectives on the intervention were reviewed by a panel comprising novice students, experienced students, novice instructors, and experienced instructors, in an effort to determine the learnings from each type of visualisation within a specific domain of knowledge. The value of this approach is key to suggesting different pedagogical methods which can enhance learning outcomes.
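A minimal Python/Matplotlib version of the kind of 3-D visualisation discussed (the paper itself uses Python) is sketched below; the output elasticities, fixed-capital level and plotted ranges are illustrative assumptions, not the classroom materials themselves.

```python
import numpy as np
import matplotlib.pyplot as plt

A, alpha, beta = 1.0, 0.3, 0.7           # illustrative technology and elasticities
K, L = np.meshgrid(np.linspace(0.1, 10, 60), np.linspace(0.1, 10, 60))
Q = A * K ** alpha * L ** beta            # Cobb-Douglas output surface

fig = plt.figure(figsize=(7, 5))
ax = fig.add_subplot(projection="3d")
ax.plot_surface(K, L, Q, cmap="viridis", alpha=0.9)
# Level curves (isoquants) projected onto the K-L plane
ax.contour(K, L, Q, levels=8, zdir="z", offset=0, cmap="viridis")
ax.set_xlabel("Capital K"); ax.set_ylabel("Labour L"); ax.set_zlabel("Output Q")
ax.set_title(r"Cobb-Douglas $Q = AK^{\alpha}L^{\beta}$ ($\alpha+\beta=1$: constant returns)")

# Short-run view: marginal product of labour with K fixed, showing diminishing returns
K_fixed, L_path = 5.0, np.linspace(0.1, 10, 60)
MPL = beta * A * K_fixed ** alpha * L_path ** (beta - 1)
plt.figure(); plt.plot(L_path, MPL)
plt.xlabel("Labour L"); plt.ylabel("MPL (K fixed)")
plt.show()
```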

Keywords: cobb-douglas production function, quasi-Delphi method, effective teaching and learning, 3D-visualisations

Procedia PDF Downloads 139
5589 Optimal Mother Wavelet Function for Shoulder Muscles of Upper Limb Amputees

Authors: Amanpreet Kaur

Abstract:

The wavelet transform (WT) is a powerful statistical tool used in applied mathematics for signal and image processing. Different mother wavelet basis functions have been compared in order to select the optimal wavelet function that represents the electromyogram signal characteristics of upper limb amputees. Four EMG electrodes were placed at different locations on the shoulder muscles. Twenty-one wavelet functions from different wavelet families were investigated. These functions included Daubechies (db1-db10), Symlets (sym1-sym5), Coiflets (coif1-coif5) and the discrete Meyer wavelet. Using the mean square error value, the significance of the mother wavelet functions has been determined for the teres, pectorals, and infraspinatus muscles around the shoulder. The results show that the best mother wavelet is db3 from the Daubechies family for efficient classification of the signal.
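One common way to rank candidate mother wavelets by mean square error is to decompose the signal, threshold the detail coefficients, reconstruct, and score the reconstruction error; the sketch below follows that generic recipe with PyWavelets on a synthetic EMG-like signal and is an assumed protocol for illustration, not the authors' exact procedure (note that PyWavelets starts the Symlet family at sym2, sym1 being equivalent to db1).

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
fs, dur = 1000, 2.0
t = np.arange(0, dur, 1 / fs)
# Synthetic surface-EMG-like signal: noise bursts gated on and off
emg = rng.normal(0, 1, t.size) * (np.sin(2 * np.pi * 1.5 * t) > 0)

candidates = ([f"db{i}" for i in range(1, 11)] + [f"sym{i}" for i in range(2, 6)]
              + [f"coif{i}" for i in range(1, 6)] + ["dmey"])

def reconstruction_mse(sig, wavelet, level=4):
    """Decompose, soft-threshold the detail coefficients, reconstruct, score by MSE."""
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    thr = np.median(np.abs(coeffs[-1])) / 0.6745 * np.sqrt(2 * np.log(sig.size))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    rec = pywt.waverec(denoised, wavelet)[: sig.size]
    return np.mean((sig - rec) ** 2)

scores = {w: reconstruction_mse(emg, w) for w in candidates}
best = min(scores, key=scores.get)
print("lowest-MSE mother wavelet:", best)
```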

Keywords: Daubechies, upper limb amputation, shoulder muscles, Symlets, Coiflets

Procedia PDF Downloads 231
5588 Characterizing Solid Glass in Bending, Torsion and Tension: High-Temperature Dynamic Mechanical Analysis up to 950 °C

Authors: Matthias Walluch, José Alberto Rodríguez, Christopher Giehl, Gunther Arnold, Daniela Ehgartner

Abstract:

Dynamic mechanical analysis (DMA) is a powerful method to characterize viscoelastic properties and phase transitions for a wide range of materials. It is often used to characterize polymers and their temperature-dependent behavior, including thermal transitions like the glass transition temperature Tg, via determination of storage and loss moduli in tension (Young's modulus, E) and shear or torsion (shear modulus, G) or other testing modes. While production and application temperatures for polymers are often limited to several hundred degrees, material properties of glasses usually require characterization at temperatures exceeding 600 °C. This contribution highlights a high temperature setup for rotational and oscillatory rheometry as well as for DMA in different modes. The implemented standard convection oven enables the characterization of glass in different loading modes at temperatures up to 950 °C. Three-point bending, tension and torsional measurements on different glasses, with E and G moduli as a function of frequency and temperature, are presented. Additional tests include superimposing several frequencies in a single temperature sweep ("multiwave"). This type of test results in a considerable reduction of the experiment time and allows the evaluation of structural changes of the material and their frequency dependence. Furthermore, DMA in torsion and tension was performed to determine the complex Poisson's ratio as a function of frequency and temperature within a single test definition. Tests were performed in a frequency range from 0.1 to 10 Hz and at temperatures up to the glass transition. While variations in the frequency did not reveal significant changes of the complex Poisson's ratio of the glass, a monotonic increase of this parameter was observed when increasing the temperature. This contribution outlines the possibilities of DMA in bending, tension and torsion for an extended temperature range. It allows the precise mechanical characterization of material behavior from room temperature up to the glass transition and the softening temperature interval. Compared to other thermo-analytical methods, like Differential Scanning Calorimetry (DSC), where mechanical stress is neglected, the frequency dependence links measurement results (e.g. relaxation times) to real applications.
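For an isotropic linear-viscoelastic solid, the quantity obtained by combining the torsion and tension measurements follows the standard relation between the complex moduli; this is the textbook identity rather than a result specific to the instrument described.

```latex
\[
  \nu^{*}(\omega) \;=\; \frac{E^{*}(\omega)}{2\,G^{*}(\omega)} \;-\; 1,
  \qquad E^{*} = E' + iE'', \quad G^{*} = G' + iG''.
\]
```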

Keywords: dynamic mechanical analysis, oscillatory rheometry, Poisson's ratio, solid glass, viscoelasticity

Procedia PDF Downloads 78