Search results for: entropy minimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 610


400 Investment Project Selection Problem under a Hesitant Fuzzy Environment

Authors: Irina Khutsishvili

Abstract:

In the present research, a decision support methodology for the multi-attribute group decision-making (MAGDM) problem is developed, namely for the selection of investment projects. The objective of the investment project selection problem is to choose the best project among the set of projects seeking investment, or to rank all projects in descending order. The project selection is made considering a set of weighted attributes. To evaluate the attributes in our approach, expert assessments are used. In the proposed methodology, lingual expressions (linguistic terms) given by all experts are used as initial attribute evaluations, since they are the most natural and convenient representation of experts' evaluations. The lingual evaluations are then converted into trapezoidal fuzzy numbers, and the aggregate trapezoidal hesitant fuzzy decision matrix is built. The case is considered in which information on the attribute weights is completely unknown. The attribute weights are identified based on the De Luca and Termini information entropy concept, determined in the context of hesitant fuzzy sets. The decisions are made using the extended Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method under a hesitant fuzzy environment. Hence, the methodology is based on a trapezoidal-valued hesitant fuzzy TOPSIS decision-making model with entropy weights. The ranking of alternatives is performed by the proximity of their distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS). For this purpose, the weighted hesitant Hamming distance is used. An example of investment decision-making is shown that clearly explains the procedure of the proposed methodology.
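The entropy-weight and TOPSIS steps described above can be illustrated with a minimal crisp (non-fuzzy) sketch in Python. It uses Shannon entropy as a stand-in for the De Luca-Termini hesitant fuzzy entropy and L1 (Hamming-like) distances to the ideal solutions; the decision matrix is invented for illustration, and the paper's trapezoidal hesitant fuzzy machinery is not reproduced.

```python
import numpy as np

# Rows = projects, columns = benefit-type attributes (illustrative values).
X = np.array([[7.0, 5.0, 9.0],
              [8.0, 6.0, 4.0],
              [6.0, 8.0, 7.0]])

# Entropy weights for the "weights completely unknown" case:
P = X / X.sum(axis=0)                       # column-wise normalization
k = 1.0 / np.log(X.shape[0])                # m = number of alternatives
entropy = -k * (P * np.log(P)).sum(axis=0)  # entropy per attribute
weights = (1 - entropy) / (1 - entropy).sum()

# TOPSIS: weighted normalized matrix, ideal solutions, closeness coefficient.
V = weights * X / np.linalg.norm(X, axis=0)
fpis, fnis = V.max(axis=0), V.min(axis=0)   # positive / negative ideal
d_pos = np.abs(V - fpis).sum(axis=1)        # Hamming-like L1 distances
d_neg = np.abs(V - fnis).sum(axis=1)
closeness = d_neg / (d_pos + d_neg)
print("ranking, best project first:", np.argsort(-closeness))
```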

Keywords: multi-attribute group decision-making, hesitant fuzzy sets, information entropy, TOPSIS, investment project selection

Procedia PDF Downloads 89
399 Ill-Posed Inverse Problems in Molecular Imaging

Authors: Ranadhir Roy

Abstract:

Inverse problems arise in medical (molecular) imaging. These problems are characterized by their large scale in three dimensions and by the diffusion equation that models the physical phenomena within the media. The inverse problems are posed as nonlinear optimization, where the unknown parameters are found by minimizing the difference between the predicted data and the measured data. To obtain a unique and stable solution to an ill-posed inverse problem, a priori information must be used. Mathematical conditions for obtaining stable solutions are established in Tikhonov's regularization method, where the a priori information is introduced via a stabilizing functional, which may be designed to incorporate some relevant information about the inverse problem. Effective determination of the Tikhonov regularization parameter requires knowledge of the true solution, or, in the case of optical imaging, the true image. Yet, in clinically based imaging, the true image is not known. To alleviate these difficulties, we have applied the penalty/modified barrier function (PMBF) method instead of the Tikhonov regularization technique to make the inverse problems well-posed. Unlike the Tikhonov regularization method, the constrained optimization technique, which is based on simple bounds on the optical properties of the tissue, can easily be implemented in the PMBF method. Imposing the constraints on the optical properties of the tissue explicitly restricts the solution set and can restore uniqueness. Like the Tikhonov regularization method, the PMBF method limits the size of the condition number of the Hessian matrix of the given objective function. The accuracy and rapid convergence of the PMBF method require a good initial guess of the Lagrange multipliers. To obtain this initial guess, we use a least-squares unconstrained minimization problem. Three-dimensional images of fluorescence absorption coefficients and lifetimes were reconstructed from contact and noncontact experimentally measured data.
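To make the contrast above concrete, here is a minimal Python sketch, on an invented toy linear problem, of the two regularization routes: a Tikhonov penalty whose parameter must be tuned, versus simple bounds on the parameters in the spirit of the constrained PMBF approach. The diffusion-based nonlinear forward model of the paper is not reproduced; scipy's `lsq_linear` stands in for the bound-constrained solver.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy ill-conditioned linear inverse problem A x = b, a stand-in for the
# nonlinear diffusion-based imaging problem (all data here is synthetic).
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 20))
A[:, 1] = A[:, 0] + 1e-6 * rng.normal(size=50)     # nearly collinear columns
x_true = np.clip(rng.normal(0.5, 0.2, size=20), 0.0, 1.0)
b = A @ x_true + 1e-3 * rng.normal(size=50)

# (a) Tikhonov: minimize ||Ax - b||^2 + lam ||x||^2 via an augmented system;
# the parameter lam must be tuned, ideally against a known true image.
lam = 1e-2
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(20)])
b_aug = np.concatenate([b, np.zeros(20)])
x_tik = np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

# (b) Simple bounds on the unknowns (the PMBF spirit: constrain the optical
# properties to a physically admissible box); no regularization weight needed.
x_box = lsq_linear(A, b, bounds=(0.0, 1.0)).x

print("Tikhonov error:   ", np.linalg.norm(x_tik - x_true))
print("bounded LS error: ", np.linalg.norm(x_box - x_true))
```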

Keywords: constrained minimization, ill-conditioned inverse problems, Tikhonov regularization method, penalty modified barrier function method

Procedia PDF Downloads 247
398 Bioclimatic Niches of Endangered Garcinia indica Species on the Western Ghats: Predicting Habitat Suitability under Current and Future Climate

Authors: Malay K. Pramanik

Abstract:

In recent years, climate change has become a major threat and has been widely documented in the geographic distribution of many plant species. However, the impacts of climate change on the distribution of ecologically vulnerable medicinal species remain largely unknown. The identification of a suitable habitat for a species under a climate change scenario is a significant step towards the mitigation of biodiversity decline. The study, therefore, aims to predict the impact of current and future climatic scenarios on the distribution of the threatened Garcinia indica across the northern Western Ghats using Maximum Entropy (MaxEnt) modelling. The future projections were made for the years 2050 and 2070 under all Representative Concentration Pathway (RCP) scenarios (2.6, 4.5, 6.0, and 8.5), using 56 species occurrence records and 19 bioclimatic predictors from the BCC-CSM1.1 model of the Intergovernmental Panel on Climate Change's (IPCC) 5th assessment. The bioclimatic variables were reduced to a smaller number of variables after a multicollinearity test, and their contributions were assessed using a jackknife test. The AUC value of 0.956 ± 0.023 indicates that the model performs with excellent accuracy. The study identified that temperature seasonality (39.5 ± 3.1%), isothermality (19.2 ± 1.6%), and annual precipitation (12.7 ± 1.7%) would be the major influencing variables in the current and future distribution. The model predicted 10.5% (19318.7 sq. km) of the study area as moderately to very highly suitable, while 82.60% (151904 sq. km) of the study area was identified as 'unsuitable' or of 'very low suitability'. Our predictions of climate change impact on habitat suitability suggest that there will be a drastic reduction in suitability, by 5.29% and 5.69% under RCP 8.5 for 2050 and 2070, respectively. Finally, the results signify that the model might be an effective tool for biodiversity protection, ecosystem management, and species re-habitation planning under future climate change scenarios.
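The multicollinearity pruning step mentioned above is commonly done by dropping one variable from each highly correlated pair. A minimal Python sketch of such a greedy correlation filter follows; the 0.8 cutoff, the random stand-in data, and the bio1..bio19 column names (the usual WorldClim labels) are assumptions for illustration.

```python
import numpy as np
import pandas as pd

def drop_collinear(df: pd.DataFrame, threshold: float = 0.8) -> pd.DataFrame:
    """Greedy correlation filter: keep a predictor unless its absolute
    Pearson correlation with an already-kept predictor exceeds threshold."""
    kept = []
    for col in df.columns:
        if all(abs(df[col].corr(df[k])) <= threshold for k in kept):
            kept.append(col)
    return df[kept]

# Illustrative stand-in for the 19 bioclimatic layers sampled at the 56
# occurrence points; one engineered collinear pair shows the filter at work.
rng = np.random.default_rng(1)
data = pd.DataFrame(rng.normal(size=(56, 19)),
                    columns=[f"bio{i}" for i in range(1, 20)])
data["bio2"] = data["bio1"] * 0.95 + 0.05 * rng.normal(size=56)

print(drop_collinear(data).columns.tolist())  # bio2 is dropped
```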

Keywords: Garcinia indica, maximum entropy modelling, climate change, MaxEnt, Western Ghats, medicinal plants

Procedia PDF Downloads 129
397 A Holistic Approach for Technical Product Optimization

Authors: Harald Lang, Michael Bader, A. Buchroithner

Abstract:

Holistic methods covering the development process as a whole – e.g. systems engineering – have established themselves in product design. However, technical product optimization, representing improvements in efficiency and/or minimization of loss, usually applies to single components of a system. A holistic approach is being defined based on a hierarchical point of view of systems engineering. This is subsequently presented using the example of an electromechanical flywheel energy storage system for automotive applications.

Keywords: design, product development, product optimization, systems engineering

Procedia PDF Downloads 596
396 Implementation of Statistical Parameters to Form Entropic Mathematical Models

Authors: Gurcharan Singh Buttar

Abstract:

It has been discovered that although statistics and information theory are independent in nature, they can be combined to create applications in multidisciplinary mathematics. In the field of statistics, statistical parameters (measures) play an essential role with reference to the population (distribution) under investigation, while information measures are crucial in the study of the ambiguity, assortment, and unpredictability present in an array of phenomena. The following communication is a link between the two, and it demonstrates that the well-known conventional statistical measures can be used as measures of information.

Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency

Procedia PDF Downloads 134
395 Optimization of Numerical Fracture Mechanics

Authors: H. Hentati, R. Abdelmoula, Li Jia, A. Maalej

Abstract:

In this work, we present numerical simulations of quasi-static crack propagation based on the variational approach. We perform numerical simulations of a piece of brittle material without an initial crack. An alternate minimization algorithm is used. Based on these numerical results, we determine the influence of the numerical parameters on the location of the crack. We show the importance of optimizing the computation time and present a first attempt to develop a simple numerical method for doing so.

Keywords: fracture mechanics, optimization, variational approach, mechanics

Procedia PDF Downloads 575
394 Exploring Error-Minimization Protocols for Upper-Limb Function During Activities of Daily Life in Chronic Stroke Patients

Authors: M. A. Riurean, S. Heijnen, C. A. Knott, J. Makinde, D. Gotti, J. VD. Kamp

Abstract:

Objectives: The current study is done in preparation for a randomized controlled study investigating the effects of an implicit motor learning protocol implemented using an extension-supporting glove. It will explore different protocols to find out which is preferred when studying motor learning in the chronic stroke population that struggles with hand spasticity. Design: This exploratory study will follow 24 individuals with a chronic stroke (> 6 months) during their usual care journey. We will record the results of two 9-Hole Peg Tests (9HPT) done during their therapy sessions with a physiotherapist, or in their home, before and after 4 weeks of wearing an extension-supporting glove used to employ the to-be-studied protocols. The participants will wear the glove 3 times/week for one hour while performing their activities of daily living and record the times they wore it in a diary. Their experience will be monitored through telecommunication once every week. Subjects: Individuals who have had a stroke at least 6 months prior to participation, with hand spasticity of at most 3 on the modified Ashworth Scale and finger flexion motor control of at least 19/33 on the Motricity Index. Exclusion criteria: extreme hemi-neglect. Methods: The participants will be randomly divided into 3 groups: one group using the glove with a pre-set decrease in support (implicit motor learning), one group using the glove with a self-controlled decrease in support (autonomous motor learning), and a third using the glove with constant support (as control). Before and after the 4-week period, there will be an intake session and a post-assessment session. Analysis: We will compare the results of the two 9HPTs to check whether the protocols were effective. Furthermore, we will compare the results between the three groups to find the preferred one. A qualitative analysis of the participants' experience throughout the 4-week period will be run. Expected results: We expect that the group using the implicit learning protocol will show superior results.

Keywords: implicit learning, hand spasticity, stroke, error minimization, motor task

Procedia PDF Downloads 31
393 Instructional Information Resources

Authors: Parveen Kumar

Abstract:

This article discusses institute information resources. Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message; information can be recorded as signs or transmitted as signals. Information is any kind of event that affects the state of a dynamic system. Conceptually, information is the message being conveyed. This concept has numerous other meanings in different contexts. Moreover, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, representation, and especially entropy.

Keywords: institutions, information institutions, information services for mission-oriented institute, pattern

Procedia PDF Downloads 330
392 A Combined Meta-Heuristic with Hyper-Heuristic Approach to Single Machine Production Scheduling Problem

Authors: C. E. Nugraheni, L. Abednego

Abstract:

This paper is concerned with the minimization of mean tardiness and flow time in a real single-machine production scheduling problem. Two variants of a genetic algorithm, used as meta-heuristics combined with a hyper-heuristic approach, are proposed to solve this problem. These methods are used to solve instances generated from real-world data from a company. Encouraging results are reported.
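As a concrete illustration of the meta-heuristic ingredient, here is a minimal Python genetic algorithm for single-machine sequencing that minimizes mean tardiness. The job data, the operators (order crossover plus swap mutation), and the parameter settings are invented for the sketch; the paper's hyper-heuristic layer and flow-time objective are omitted.

```python
import random

# Illustrative jobs as (processing_time, due_date) pairs.
JOBS = [(4, 10), (7, 9), (2, 25), (5, 12), (6, 30), (3, 8)]

def mean_tardiness(seq):
    t, total = 0, 0
    for j in seq:
        p, d = JOBS[j]
        t += p
        total += max(0, t - d)
    return total / len(seq)

def crossover(a, b):
    # Order crossover (OX): keep a slice of parent a, fill the rest from b.
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    rest = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

def mutate(seq, rate=0.2):
    # Swap mutation with a small probability.
    if random.random() < rate:
        i, j = random.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]
    return seq

random.seed(0)
pop = [random.sample(range(len(JOBS)), len(JOBS)) for _ in range(30)]
for _ in range(100):
    pop.sort(key=mean_tardiness)          # elitist selection
    pop = pop[:10] + [mutate(crossover(*random.sample(pop[:10], 2)))
                      for _ in range(20)]
pop.sort(key=mean_tardiness)
print("best sequence:", pop[0], "mean tardiness:", round(mean_tardiness(pop[0]), 2))
```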

Keywords: hyper-heuristics, evolutionary algorithms, production scheduling, meta-heuristic

Procedia PDF Downloads 353
391 Physics of the Riemann Zeros: The Lower Bound for the Zeta Derivative via Quantum Field Theory

Authors: Andrey Egorov

Abstract:

A product of the specific Lagrangian and the entropy factor is defined. Its positive definiteness is stated for the proper coupling constant. The passage from statistical mechanics to quantum field theory is performed by Wick rotation. The Green function (a convolution of the spectral amplitude and the propagator) is positive. Masses of quasiparticles are computed as residues. The role of the zeta derivative at the zeta zeros is then highlighted, and the corresponding lower bound is obtained.

Keywords: mass gap, positive definite kernels, quantum fields, Riemann zeta zeros

Procedia PDF Downloads 20
390 Two-Stage Fuzzy Methodology to Evaluate the Credit Risks of Investment Projects

Authors: O. Badagadze, G. Sirbiladze, I. Khutsishvili

Abstract:

The work proposes a decision support methodology for credit risk minimization in the selection of investment projects. The methodology provides two stages of project evaluation. A preliminary selection of projects with minor credit risks is made using the Expertons Method. The second stage ranks the chosen projects using the Possibilistic Discrimination Analysis Method. The latter is a new modification of the well-known Method of Fuzzy Discrimination Analysis.

Keywords: expert valuations, expertons, investment project risks, positive and negative discriminations, possibility distribution

Procedia PDF Downloads 636
389 Design of a Fuzzy Luenberger Observer for Fault Nonlinear System

Authors: Mounir Bekaik, Messaoud Ramdani

Abstract:

We present in this work a new stabilization technique for nonlinear systems with faults. The approach we adopt focuses on a fuzzy Luenberger observer. The T-S approximation of the nonlinear observer is based on the fuzzy C-Means clustering algorithm to find local linear subsystems. The MOESP identification approach was applied to design an empirical model describing the subsystems' state variables. The gain of the observer is given by the minimization of the estimation error through a Lyapunov-Krasovskii functional and the LMI approach. We consider a three-tank hydraulic system as an illustrative example.

Keywords: nonlinear system, fuzzy, faults, TS, Lyapunov-Krasovskii, observer

Procedia PDF Downloads 301
388 Minimizing Fresh and Wastewater Using Water Pinch Technique in Petrochemical Industries

Authors: Wasif Mughees, Malik Al-Ahmad, Muhammad Naeem

Abstract:

This research involves the design and analysis of pinch-based water/wastewater networks to minimize water utility in the petrochemical and petroleum industries. A study was done on the Tehran Oil Refinery to analyze the feasibility of regeneration, reuse, and recycling in the water network. COD is considered as a single key contaminant. The amount of freshwater was reduced by about 149 m³/h (43.8%) with respect to COD. A re-design (or retrofitting) of the water allocation in the networks was undertaken. The results were analyzed through a graphical method and a mathematical programming technique, which clearly demonstrated that the amount of required water is determined by the mass transfer of COD.

Keywords: minimization, water pinch, water management, pollution prevention

Procedia PDF Downloads 419
387 Properties of Magnesium-Based Hydrogen Storage Alloy Added with Palladium and Titanium Hydride

Authors: Jun Ying Lin, Tzu Hsiang Yen, Cha'o Kuang Chen

Abstract:

Nowadays, the great majority believe that hydrogen storage alloys, which store hydrogen by physical and chemical absorption, have great potential. However, hydrogen storage alloys are limited by their high operation temperature. Scientists have found that adding transition elements can improve the properties of hydrogen storage alloys. In this research, outstanding improvements in kinetic and thermal properties are obtained by adding palladium and titanium hydride to a magnesium-based hydrogen storage alloy. The magnesium-based alloy is the main material, into which TiH2 and Pd are added separately. The materials are then milled in a planetary ball mill at 650 rpm. TGA/DSC and PCT measurements give the capacity, duration, and temperature of absorption/desorption. Additionally, SEM and XRD are used to analyze the structures and components of the material. It is clearly shown that Pd is beneficial to the kinetic properties. 2MgH2-0.1Pd has the highest capacity of all the alloys listed, approximately 5.5 wt%. Secondly, no new Ti-related compounds are found by XRD analysis. Thus TiH2, acting as a catalyst, enables 2MgH2-TiH2 and 2MgH2-TiH2-0.1Pd to absorb hydrogen efficiently at low temperature: 2MgH2-TiH2 can reach roughly 3.0 wt% in 82.4 minutes at 50°C and in 8 minutes at 100°C, while 2MgH2-TiH2-0.1Pd can reach 2.0 wt% in 400 minutes at 50°C and in 48 minutes at 100°C. The lowest desorption temperatures of 2MgH2-0.1Pd and 2MgH2-TiH2 are similar (320°C), whereas that of 2MgH2-TiH2-0.1Pd is lower by 20°C. From XRD, it can be observed that PdTi2 and Pd3Ti are produced by mechanical alloying when both Pd and TiH2 are added to MgH2. Due to the synergistic effects between Pd and TiH2, 2MgH2-TiH2-0.1Pd has the lowest dehydrogenation temperature. Furthermore, the pressure-composition-temperature (PCT) curves of 2MgH2-TiH2-0.1Pd are measured at different temperatures: 370°C, 350°C, 320°C, and 300°C. The plateau pressure is obtained from these PCT curves, and from the plateau pressures at different temperatures the enthalpy and entropy in the Van't Hoff equation can be solved. For 2MgH2-TiH2-0.1Pd, the enthalpy is 74.9 kJ/mol and the entropy is 122.9 J/(mol·K). Activation means that the hydrogen storage alloy undergoes repeated absorption/desorption processes; it plays an important role in absorption/desorption, shortening the absorption/desorption time through the increase in surface area. From SEM, it is clear that the grains become smaller and the surface rougher.
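The thermodynamic extraction step works as follows: each PCT plateau pressure measured at temperature T supplies one point on a Van't Hoff plot, and a linear fit yields the enthalpy and entropy quoted above. A sketch of the relation for the desorption equilibrium, with P_0 a reference pressure of 1 atm:

```latex
% Van't Hoff relation for the desorption equilibrium (P_0 = 1 atm reference):
\ln\!\left(\frac{P_{\mathrm{eq}}}{P_{0}}\right)
  \;=\; -\,\frac{\Delta H}{R\,T} \;+\; \frac{\Delta S}{R}
% A plot of ln(P_eq/P_0) against 1/T is a straight line with slope
% -\Delta H / R and intercept \Delta S / R; the reported 74.9 kJ/mol and
% 122.9 J/(mol K) follow from such a fit over the 300-370 degree C plateaus.
```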

Keywords: hydrogen storage materials, magnesium hydride, absorption/desorption performance, plateau pressure

Procedia PDF Downloads 228
386 Discovering Event Outliers for Drugs as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortages. A shortage event unbalances sales and requires a recovery period, which is too long. A critical issue, therefore, is that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods. To shorten the recovery period, the authors suggest predicting average sales per sales day, which helps to protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply the Grubbs test for significance evaluation. The researched sample is 100 drugs over a one-month time frame. The authors detected that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) drugs with high demand variability have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) drugs with mid demand variability have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be set up. Even though some outlier detection methods exist for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs' test, a real-life data cleaning method, for outlier adjustment. In the paper, the outlier adjustment method is applied with a confidence level of 99%. In practice, Grubbs' test has been used to detect outliers for cancer drugs, with positive results reported. Grubbs' test detects outliers which exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test expresses the difference between the sample mean and the most extreme data point in units of the standard deviation, and it detects one outlier at a time, with different probabilities, from a data set with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with the Grubbs' test method. The suggested framework is applicable during shortage events and recovery periods; it has practical value and could be used to minimize the recovery period required after a shortage event occurs.
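A minimal Python sketch of the detection step: one pass of the two-sided Grubbs test at the 99% confidence level used in the paper. The sales series is invented; in practice the test is applied repeatedly, removing one detected outlier at a time.

```python
import numpy as np
from scipy import stats

def grubbs_max_outlier(x, alpha=0.01):
    """One pass of the two-sided Grubbs test. Returns the index of the
    detected outlier, or None if the most extreme point is not significant."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    idx = np.argmax(np.abs(x - x.mean()))
    g = abs(x[idx] - x.mean()) / x.std(ddof=1)
    # Critical value from the t distribution (standard Grubbs formula).
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return int(idx) if g > g_crit else None

# Illustrative daily sales of one drug: a shortage day shows up as a
# depressed value, which the test flags.
sales = [52, 49, 55, 50, 48, 53, 51, 4, 47, 50]
print(grubbs_max_outlier(sales))  # -> 7 (the shortage day)
```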

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 109
385 Effects of Subsidy Reform on Consumption and Income Inequalities in Iran

Authors: Pouneh Soleimaninejadian, Chengyu Yang

Abstract:

In this paper, we use data from the Household Income and Expenditure Survey of the Statistical Centre of Iran, conducted from 2005 to 2014, to calculate several inequality measures and to estimate the effects of Iran's targeted subsidy reform act on consumption and income inequality. We first calculate Gini coefficients for income and consumption in order to study the relation between the two, as well as the effects of the subsidy reform. Results show that consumption inequality has not always mirrored changes in income inequality. However, both Gini coefficients indicate that the subsidy reform improved inequality. We then calculate the Generalized Entropy Index based on consumption and income for the years before and after the Subsidy Reform Act of 2010, in order to take a closer look at the changes in the internal structure of inequality after the subsidy reforms. We find that the improvement in income inequality is mostly caused by the decrease in inequality among lower-income individuals. At the same time, consumption inequality decreased as a result of more equal consumption in both lower- and higher-income groups. Moreover, the increase in the Engel coefficient after the subsidy reform shows that a bigger portion of income is allocated to food consumption, which is a sign of a lower living standard in general. This increase in the Engel coefficient is due to the rise in the inflation rate and the relative increase in the price of food, which is itself partially a consequence of the subsidy reform. We have conducted experiments on the effect of subsidy payments, and of possible changes in the distribution pattern and amount of cash subsidy payments, on income inequality. The results show that cash payments lead to a definite decrease in income inequality and contributed more to the improvement in rural areas than in urban households. We also examine the possible effect of constant payments on the increase in income inequality in the years after 2011, and conclude that the reduction in the real value of payments as a result of inflation plays an important role, although there may be other reasons. We finally experiment with alternative allocations of transfers, keeping the total amount of cash transfers constant or making it smaller by eliminating the three highest deciles from the cash payment program; the results show that income equality would improve significantly.
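The two inequality measures used above are straightforward to compute. A small Python sketch follows, with invented incomes and a flat cash transfer added to every household to mimic, qualitatively, the equalizing effect of the subsidy payments; the choice alpha = 1 (the Theil index) is one common member of the Generalized Entropy family.

```python
import numpy as np

def gini(y):
    """Gini coefficient via the sorted-index formula."""
    y = np.sort(np.asarray(y, dtype=float))
    n = len(y)
    return (2 * np.arange(1, n + 1) - n - 1) @ y / (n * y.sum())

def generalized_entropy(y, alpha=1.0):
    """GE(alpha); alpha=1 is the Theil index, alpha=0 the mean log deviation."""
    y = np.asarray(y, dtype=float)
    r = y / y.mean()
    if alpha == 1.0:
        return float(np.mean(r * np.log(r)))
    if alpha == 0.0:
        return float(-np.mean(np.log(r)))
    return float(np.mean(r**alpha - 1) / (alpha * (alpha - 1)))

# Illustrative incomes before and after a flat cash transfer to everyone:
# both measures fall, mirroring the equalizing effect found in the paper.
income = np.array([120.0, 200.0, 340.0, 560.0, 900.0])
print(gini(income), gini(income + 150))
print(generalized_entropy(income), generalized_entropy(income + 150))
```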

Keywords: consumption inequality, generalized entropy index, income inequality, Iran's subsidy reform

Procedia PDF Downloads 202
384 An Approximation Method for the Exact Boundary Controllability of the Euler-Bernoulli Beam Equation

Authors: A. Khernane, N. Khelil, L. Djerou

Abstract:

The aim of this work is to study the numerical implementation of the Hilbert uniqueness method for the exact boundary controllability of the Euler-Bernoulli beam equation. This implementation may be difficult, depending on the problem under consideration (geometry, control, and dimension) and on the numerical method used. Knowledge of the asymptotic behaviour of the control governing the system at time T may be useful for its calculation, and this idea is developed in this study. As a first step, we characterize the solution by a minimization principle; we then propose a method for its resolution to approximate the control steering the considered system to rest at time T.

Keywords: boundary control, exact controllability, finite difference methods, functional optimization

Procedia PDF Downloads 315
383 A New Distribution and an Application to Lifetime Data

Authors: Gamze Ozel, Selen Cakmakyapan

Abstract:

We introduce a new model, called the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution using the Marshall-Olkin transformation and has increasing and decreasing shapes for the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, some entropy measures, and the order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potentiality of the new model is illustrated by means of a real-life data set.
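For reference, the Marshall-Olkin transformation referred to above maps a baseline survival function into a new one with an extra tilt parameter. A sketch with the Rayleigh baseline follows; the parameter names alpha and sigma are our notation and may differ from the paper's.

```latex
% Marshall-Olkin transformation of a baseline survival function \bar F
% (alpha > 0 is the added tilt parameter; alpha = 1 recovers the baseline):
\bar G(x) \;=\; \frac{\alpha\,\bar F(x)}{1-(1-\alpha)\,\bar F(x)}
% With the Rayleigh baseline \bar F(x) = \exp(-x^2/(2\sigma^2)), x > 0:
\bar G(x) \;=\; \frac{\alpha\,e^{-x^{2}/(2\sigma^{2})}}
                     {1-(1-\alpha)\,e^{-x^{2}/(2\sigma^{2})}}
% The extra parameter is what allows both increasing and decreasing
% hazard-rate shapes, as stated in the abstract.
```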

Keywords: Marshall-Olkin distribution, Rayleigh distribution, estimation, maximum likelihood

Procedia PDF Downloads 465
382 Exergy Model for a Solar Water Heater with Flat Plate Collector

Authors: P. Sathyakala, G. Sai Sundara Krishnan

Abstract:

The objective of this paper is to derive an exergy model for a solar water heater with a honeycomb structure in order to identify the element with the largest irreversibility in the system. This helps in finding means to reduce the wasted work potential so that the overall efficiency of the system can be improved.

Keywords: exergy, energy balance, entropy balance, work potential, degradation, honeycomb, flat plate collector

Procedia PDF Downloads 452
381 Application of GIS Techniques for Analysing Urban Built-Up Growth of Class-I Indian Cities: A Case Study of Surat

Authors: Purba Biswas, Priyanka Dey

Abstract:

Worldwide, rapid urbanisation has accelerated city expansion in both developed and developing nations. This unprecedented urbanisation trend, due to the increasing population and economic growth, has caused challenges for decision-makers in city planning and urban management. Metropolitan cities, class-I towns, and major urban centres undergo a continuous process of evolution due to the interaction between socio-cultural and economic attributes. This constant evolution leads to urban expansion in all directions. Understanding the patterns and dynamics of urban built-up growth is crucial for policymakers, urban planners, and researchers, as it aids in resource management, decision-making, and the development of sustainable strategies to address the complexities associated with rapid urbanisation. Identifying spatio-temporal patterns of urban growth has emerged as a crucial challenge in monitoring and assessing present and future trends in urban development. Analysing urban growth patterns and tracking changes in land use is an important aspect of urban studies. This study analyses spatio-temporal urban transformations and land-use and land-cover changes using remote sensing and GIS techniques. A built-up growth analysis has been done for the city of Surat as a case example, using the NDBI GIS tool and the GIS models of the Built-up Urban Density Index and the Shannon Entropy Index to identify trends and the geographical direction of transformation from 2005 to 2020. Surat is one of the fastest-growing urban centres in both the state and the nation, ranking as the 4th fastest-growing city globally. This study analyses the dynamics of urban built-up area transformations both zone-wise and direction-wise, for which their trend, rate, and magnitude were calculated over the 15-year period. The study also highlights the need for analysing and monitoring the urban growth patterns of class-I cities in India using spatio-temporal and quantitative techniques such as GIS for improved urban management.
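Shannon's entropy, as used in sprawl studies of this kind, measures how evenly built-up area is spread across zones: after normalization by log n, a value near 1 indicates dispersed, sprawling growth, and a value near 0 compact growth. A minimal Python sketch with invented zone areas:

```python
import numpy as np

def shannon_entropy(built_up_areas):
    """Relative Shannon entropy over n zones: 0 = compact (all built-up
    area in one zone), 1 = maximally dispersed (even across zones)."""
    p = np.asarray(built_up_areas, dtype=float)
    p = p / p.sum()
    p = p[p > 0]                      # ignore empty zones
    return -(p * np.log(p)).sum() / np.log(len(built_up_areas))

# Illustrative built-up area (sq. km) per zone for two snapshot years;
# rising entropy indicates increasingly dispersed (sprawling) growth.
zones_2005 = [30.0, 4.0, 3.0, 2.0, 1.0]
zones_2020 = [42.0, 20.0, 18.0, 15.0, 14.0]
print(round(shannon_entropy(zones_2005), 3),
      round(shannon_entropy(zones_2020), 3))
```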

Keywords: urban expansion, built-up, geographic information system, remote sensing, Shannon’s entropy

Procedia PDF Downloads 26
380 Input-Output Analysis in Laptop Computer Manufacturing

Authors: H. Z. Ulukan, E. Demircioğlu, M. Erol Genevois

Abstract:

The scope of this paper and the aim of the proposed model are to apply monetary input-output (I-O) analysis to point out the importance of reusing know-how and other requirements in order to reduce production costs in a manufacturing process for a laptop computer. The I-O approach, using the monetary input-output model, is employed to demonstrate the impacts of different factors in a manufacturing process. A sensitivity analysis showing the correlation between these different factors is also presented. It is expected that the recommended model will have an advantageous effect on the cost minimization process.
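The monetary I-O machinery rests on the Leontief system x = Ax + d: given a matrix A of input coefficients and a final-demand vector d, total output is x = (I - A)^(-1) d, and sensitivities follow by perturbing coefficients. A minimal numpy sketch with an invented three-sector matrix (the sector labels are ours, not the paper's):

```python
import numpy as np

# Illustrative input-coefficient matrix between three stages of the
# laptop process and a final-demand vector (monetary units).
A = np.array([[0.10, 0.30, 0.05],   # components
              [0.20, 0.05, 0.25],   # assembly
              [0.05, 0.10, 0.10]])  # know-how / services
d = np.array([100.0, 50.0, 25.0])

x = np.linalg.solve(np.eye(3) - A, d)   # total output: x = (I - A)^-1 d
print("total output:", x)

# Sensitivity: perturb one coefficient (costlier know-how input) and
# observe the ripple through total output, as in the paper's analysis.
A2 = A.copy()
A2[2, 0] += 0.05
print("delta:", np.linalg.solve(np.eye(3) - A2, d) - x)
```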

Keywords: input-output analysis, monetary input-output model, manufacturing process, laptop computer

Procedia PDF Downloads 364
379 Co-Gasification of Petroleum Waste and Waste Tires: A Numerical and CFD Study

Authors: Thomas Arink, Isam Janajreh

Abstract:

The petroleum industry generates significant amounts of waste in the form of drill cuttings, contaminated soil, and oily sludge. Drill cuttings are a product of off-shore drilling rigs, containing wet soil and total petroleum hydrocarbons (TPH). Contaminated soil comes from different on-shore sites and also contains TPH. The oily sludge is mainly residue or tank-bottom sludge from storage tanks. The two main treatment methods currently used are incineration and thermal desorption (TD). Thermal desorption is a method where the waste material is heated to 450ºC in an anaerobic environment to release volatiles; the condensed volatiles can be used as a liquid fuel. For the thermal desorption unit, dry contaminated soil is mixed with moist drill cuttings to generate a suitable mixture. By thermogravimetric analysis (TGA) of the TD feedstock, it was found that less than 50% of the TPH is released; the discharged material is stored in landfill. This study proposes co-gasification of petroleum waste with waste tires as an alternative to thermal desorption. Co-gasification with a high-calorific material is necessary since the petroleum waste consists of more than 60 wt% ash (soil/sand), causing its calorific value to be too low for gasification. Since the gasification process occurs at 900ºC and higher, close to 100% of the TPH can be released, according to the TGA. This work consists of three parts: (1) a mathematical gasification model, (2) a reactive-flow CFD model, and (3) experimental work on a drop tube reactor. Extensive material characterization was done by means of proximate analysis (TGA), ultimate analysis (CHNOS flash analysis), and calorific value measurements (bomb calorimeter) for the input parameters of the mathematical and CFD models. The mathematical model is a zero-dimensional model based on Gibbs energy minimization together with Lagrange multipliers; it is used to find the product species composition (molar fractions of CO, H2, CH4, etc.) for different tire/petroleum feedstock mixtures and equivalence ratios. The results of the mathematical model act as a reference for the CFD model of the drop-tube reactor. With the CFD model, the efficiency and product species composition can be predicted for different mixtures and particle sizes. Finally, both models are verified by experiments on a drop tube reactor (1540 mm long, 66 mm inner diameter, 1400 K maximum temperature).
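A zero-dimensional Gibbs-minimization equilibrium model of the kind described can be sketched in a few lines of Python: minimize the total dimensionless Gibbs energy of an ideal-gas mixture subject to elemental balances (here enforced directly as equality constraints rather than via explicit Lagrange multipliers). The species set, the g values (rough dG_f/RT magnitudes near 1100 K), and the feed vector are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import minimize

# Species and rough dimensionless Gibbs energies g_i = dG_f/(R*T) near
# 1100 K; all numbers (g, element balance, feed) are illustrative.
species = ["CO", "CO2", "H2", "H2O", "CH4"]
g = np.array([-22.7, -43.2, 0.0, -20.3, 0.3])
E = np.array([[1, 1, 0, 0, 1],    # carbon atoms per molecule
              [0, 0, 2, 2, 4],    # hydrogen
              [1, 2, 0, 1, 0]])   # oxygen
feed = np.array([1.0, 3.0, 1.5])  # moles of C, H, O in the feedstock

def total_gibbs(n):
    # Ideal-gas mixture at 1 atm: G/RT = sum n_i * (g_i + ln(n_i / n_tot)).
    n = np.maximum(n, 1e-12)
    return float(n @ (g + np.log(n / n.sum())))

res = minimize(total_gibbs, np.full(5, 0.5),
               bounds=[(1e-12, None)] * 5,
               constraints={"type": "eq", "fun": lambda n: E @ n - feed},
               method="SLSQP")
for s, n in zip(species, res.x):
    print(f"{s}: {n:.3f} mol")
```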

Keywords: computational fluid dynamics (CFD), drop tube reactor, gasification, Gibbs energy minimization, petroleum waste, waste tires

Procedia PDF Downloads 492
378 Issues in Travel Demand Forecasting

Authors: Huey-Kuo Chen

Abstract:

Travel demand forecasting, comprising four travel choices (trip generation, trip distribution, modal split, and traffic assignment), constitutes the core of transportation planning. In its current application, travel demand forecasting is associated with three important issues: interface inconsistencies among the four travel choices, inefficiency of the commonly used solution algorithms, and undesirable multiple-path solutions. In this paper, each of the three issues is extensively elaborated. An ideal unified framework for the combined model, consisting of the four travel choices and variable demand functions, is also suggested. A few remarks are provided at the end of the paper.

Keywords: travel choices, B algorithm, entropy maximization, dynamic traffic assignment

Procedia PDF Downloads 420
377 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis

Authors: A. J. S. P. Nileema, S. Kulatunga , S. H. Palihawadana

Abstract:

A computer-aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis using digital chest X-rays with MATLAB image processing techniques and a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases, Welisara, Sri Lanka. Pre-processing was done to remove identification details. Lung fields were segmented and then divided into four quadrants (right upper, left upper, right lower, and left lower) using the image processing techniques in MATLAB. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray-level co-occurrence matrix method. Descriptive statistics and normal distribution analysis were performed using SPSS. Depending on the radiologists' interpretation, chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were analyzed using an independent-samples t-test for PTBP and PTBN chest radiographs. Among the six features tested, the contrast, correlation, energy, entropy, and maximum probability features showed a statistically significant difference between the two classes at the 95% confidence interval and therefore could be used in the classification of chest radiographs for PTB diagnosis. With the resulting value ranges of the five normally distributed texture features, a classification algorithm was then defined to recognize and classify the quadrant images: if the texture feature values of the quadrant image being tested fall within the defined region, it is identified as a PTBP (abnormal) quadrant and labeled 'Abnormal' in red, with its border highlighted in red; if the values fall outside the defined range, it is identified as PTBN (normal) and labeled 'Normal' in blue, with no changes to the image outline. The developed classification algorithm has shown a high sensitivity of 92%, which makes it an efficient CAD system, with a modest specificity of 70%.
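The six texture features named above come from the gray-level co-occurrence matrix (GLCM). The sketch below computes them in Python with scikit-image on a random stand-in image; the abstract's pipeline is in MATLAB, so this is an equivalent, not the authors' code. Entropy and maximum probability are not built into `graycoprops`, so they are computed directly from the normalized matrix.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Random 8-bit stand-in for one segmented lung-field quadrant.
rng = np.random.default_rng(0)
quadrant = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

glcm = graycomatrix(quadrant, distances=[1], angles=[0],
                    levels=256, symmetric=True, normed=True)

features = {name: graycoprops(glcm, name)[0, 0]
            for name in ("contrast", "correlation", "homogeneity", "energy")}
p = glcm[:, :, 0, 0]                               # normalized co-occurrences
features["entropy"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))
features["maximum probability"] = p.max()
print(features)
```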

Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis

Procedia PDF Downloads 93
376 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation

Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim

Abstract:

In this article, a portfolio optimization problem is performed in a Solvency II context: it illustrates how advanced optimization techniques can help to tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress tests on interest rate, equity, property, credit, and FX factors, as well as concentration on counter-parties. The market SCR is non-convex and non-differentiable, which does not make it a natural candidate as an optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR, to reduce non-invested capital, but also to ensure the stability of the SCR. Some optimizations have already been performed in the literature, simplifying the standard formula into a quadratic function. But to our knowledge, it is the first time that the standard formula of the market SCR is used in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement, compared to a classical Markowitz approach based on the historical volatility. A comparative analysis of different optimization models (equi-risk-contribution portfolio, minimum-volatility portfolio, and minimum value-at-risk portfolio) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities. This was expected, since the market SCR strongly penalizes this type of financial instrument. It was shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, proving the interest of having a portfolio construction approach that can incorporate such features. The present results are further explained by the market SCR modelling.
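The aggregation at the heart of the standard formula is SCR_mkt = sqrt(s' C s), with s the vector of sub-module SCRs and C the fixed correlation matrix; the non-differentiability the paper wrestles with enters inside the sub-modules (max(., 0) terms and scenario switches), not in this aggregation step. A small numpy sketch with invented sub-module figures and an illustrative, non-regulatory correlation matrix:

```python
import numpy as np

# Sub-module SCRs (stressed losses, EUR m) and correlations are
# illustrative placeholders, not the Solvency II regulatory values.
sub_modules = ["interest", "equity", "property", "spread", "fx"]
s = np.array([40.0, 120.0, 25.0, 35.0, 20.0])
C = np.array([[1.00, 0.50, 0.50, 0.50, 0.25],
              [0.50, 1.00, 0.75, 0.75, 0.25],
              [0.50, 0.75, 1.00, 0.50, 0.25],
              [0.50, 0.75, 0.50, 1.00, 0.25],
              [0.25, 0.25, 0.25, 0.25, 1.00]])

scr_market = float(np.sqrt(s @ C @ s))        # standard-formula aggregation
print(f"market SCR: {scr_market:.1f} (sum of parts: {s.sum():.1f})")
# The gap between the two figures is the diversification benefit; note how
# the large equity sub-module dominates, echoing the paper's observation.
```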

Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement

Procedia PDF Downloads 93
375 Design Optimization of the Primary Containment Building of a Pressurized Water Reactor

Authors: M. Hossain, A. H. Khan, M. A. R. Sarkar

Abstract:

The primary containment structure is one of the five safety layers of a nuclear facility and must be designed in such a manner that it can withstand the pressure and excessive radioactivity during accidental situations. It is also necessary to ensure cost minimization with maximum possible safety in order to make the design economically feasible and attractive. This paper attempts to identify the optimum design conditions for the primary containment structure, considering both mechanical and radiation safety while keeping the economic aspects in mind. The work takes advantage of commercial simulation software to identify suitable conditions without the requirement of costly experiments. The generated data may be helpful for further studies.

Keywords: PWR, concrete containment, finite element approach, neutron attenuation, Von Mises stress

Procedia PDF Downloads 155
374 Modeling the Current and Future Distribution of Anthus Pratensis under Climate Change

Authors: Zahira Belkacemi

Abstract:

One of the most important tools in conservation biology is information on the geographic distribution of species and the variables determining those patterns. In this study, we used maximum-entropy niche modeling (Maxent) to predict the current and future distribution of Anthus pratensis using climatic variables. The results showed that the species would not be highly affected by climate change through a shift in its distribution; however, the results of this study should be improved by taking other predictors into account. The NATURA 2000 protected sites were found to be only 42% efficient in protecting the species.

Keywords: Anthus pratensis, climate change, Europe, species distribution model

Procedia PDF Downloads 107
373 Enumerative Search for Crane Schedule in Anodizing Operations

Authors: Kanate Pantusavase, Jaramporn Hassamontr

Abstract:

This research aims to develop an algorithm that generates a schedule for multiple cranes maximizing load throughput in an anodizing operation. The proposed algorithm uses an enumerative strategy to search over the constant time between successive loads and the crane covering ranges over the baths. The computer program developed is able to generate a near-optimal crane schedule within reasonable time, i.e., within 10 minutes. Its results are compared with existing solutions from an aluminum extrusion industry. The program can be used to generate crane schedules for mixed products, thus allowing mixed-model line balancing to improve overall cycle times.

Keywords: crane scheduling, anodizing operations, cycle time minimization

Procedia PDF Downloads 435
372 Multi-Objective Near-Optimal Trajectory Planning of Mobile Robot

Authors: Amar Khoukhi, Mohamed Shahab

Abstract:

This paper formulates the optimal control problem of mobile robot motion as a nonlinear programming problem (NLP), solved using a direct method of numerical optimal control. The NLP is initialized with a B-spline whose node locations are optimized using a genetic search. The system acceleration inputs and sampling periods are considered as optimization variables. Different scenarios with different objective weights are implemented and investigated. Interesting results are found in terms of complying with the expected behavior of a mobile robot system and of time-energy minimization.

Keywords: multi-objective control, non-holonomic systems, mobile robots, nonlinear programming, motion planning, B-spline, genetic algorithm

Procedia PDF Downloads 341
371 Nurse-Patient Assignment: Case of Pediatrics Department

Authors: Jihene Jlassi, Ahmed Frikha, Wazna Kortli

Abstract:

The objectives of nurse-patient assignment are the minimization of the overall hospital cost and the maximization of nurses' preferences. This paper aims to assess nurses' satisfaction related to the implementation of patient acuity tool-based assignments. We use an integer linear program that assigns patients to nurses while balancing nurse workloads. The proposed model is then applied to the Paediatrics Department at Kasserine Hospital, Tunisia, where patients need special acuity care and high-level nursing skills. The numerical results suggest that the proposed nurse-patient assignment model can achieve a balanced assignment.
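An integer linear program of the kind described can be sketched compactly. Below is a hedged Python example (using the PuLP library) that assigns each patient to exactly one nurse while minimizing the maximum acuity workload carried by any nurse, one common way to formalize "balanced workloads". The acuity scores, the nurse and patient names, and the min-max objective are illustrative assumptions, not the paper's exact model.

```python
import pulp

# Illustrative patient acuity scores and nurse roster.
acuity = {"p1": 3, "p2": 5, "p3": 2, "p4": 4, "p5": 1, "p6": 5}
patients = list(acuity)
nurses = ["n1", "n2", "n3"]

prob = pulp.LpProblem("nurse_patient_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", (nurses, patients), cat="Binary")
w_max = pulp.LpVariable("max_workload", lowBound=0)

prob += w_max                                         # minimize the peak load
for p in patients:                                    # each patient covered once
    prob += pulp.lpSum(x[n][p] for n in nurses) == 1
for n in nurses:                                      # peak-load definition
    prob += pulp.lpSum(acuity[p] * x[n][p] for p in patients) <= w_max

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for n in nurses:
    mine = [p for p in patients if x[n][p].value() == 1]
    print(n, mine, "load =", sum(acuity[p] for p in mine))
```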

Keywords: nurse-patient assignment, mathematical model, logistics, pediatrics department, balanced assignment

Procedia PDF Downloads 117