Search results for: Latent Dirichlet Allocation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1001

851 Deep learning with Noisy Labels : Learning True Labels as Discrete Latent Variable

Authors: Azeddine El-Hassouny, Chandrashekhar Meshram, Geraldin Nanfack

Abstract:

In recent years, learning from data with noisy labels (label noise) has been a major concern in supervised learning. The problem is even more worrying in deep learning, whose generalization capabilities have lately been questioned. Indeed, deep learning requires large amounts of data, generally collected by search engines, which frequently return data with unreliable labels. In this paper, we investigate label noise in deep learning using variational inference. Our contributions are: (1) exploiting the label-noise concept, the true labels are learned by reparameterized variational inference, while the observed labels are learned discriminatively; (2) the noise transition matrix is learned during training without any dedicated process, heuristic, or preliminary phase. The theoretical results show how the true label distribution can be learned by variational inference in any discriminative neural network, and the effectiveness of our approach is demonstrated on several target datasets, such as MNIST and CIFAR32.
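As a minimal sketch of the noise-transition idea (plain NumPy; the 3-class matrix and class priors below are invented for illustration, not taken from the paper), a row-stochastic transition matrix maps the distribution over true labels to the distribution over observed noisy labels:

```python
import numpy as np

def apply_noise_transition(p_true, T):
    """Map a distribution over true labels to a distribution over
    observed (noisy) labels via a row-stochastic transition matrix T,
    where T[i, j] = P(observed = j | true = i)."""
    return p_true @ T

# Hypothetical 3-class setting: 80% of each class is labeled correctly,
# the remaining 20% is split evenly between the other two classes.
T = np.full((3, 3), 0.1)
np.fill_diagonal(T, 0.8)

p_true = np.array([0.5, 0.3, 0.2])   # assumed true-label distribution
p_obs = apply_noise_transition(p_true, T)
print(np.round(p_obs, 3))            # distribution actually seen in the noisy data
```

Learning T jointly with the classifier, as the paper proposes, amounts to making this matrix a trainable parameter constrained to stay row-stochastic.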

Keywords: label noise, deep learning, discrete latent variable, variational inference, MNIST, CIFAR32

Procedia PDF Downloads 87
850 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application

Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro

Abstract:

This paper presents a novel statistical methodology for measuring and finding constructs in latent regression analysis. The approach uses the qualities of factor analysis for binary data, with interpretations grounded in Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and drawing on a convergence of many ideas from IRT, we propose an algorithm that not only addresses the dimensionality problem (still an open discussion) but also opens a new research field promising fairer and more realistic qualifications for examiners and a revolution in IRT and educational research. Finally, the methodology is applied to a real data set, yielding impressive results in terms of coherence, speed, and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs'; both authors belong to the SICS Research Group at Universidad Nacional de Colombia.

Keywords: item response theory, dimensionality, submodel theory, factorial analysis

Procedia PDF Downloads 342
849 Surge in U.S. Citizens' Expatriation: Testing Structural Equation Modeling to Explain the Underlying Policy Rationale

Authors: Marco Sewald

Abstract:

Compared to the past, the number of Americans renouncing U.S. citizenship has risen. Even though these numbers are small relative to immigration, U.S. citizen expatriations have historically been much lower, making the uptick worrisome. In addition, the lists and numbers published by the U.S. government seem incomplete, with many renunciants not counted. Different branches of the U.S. government report different figures, and no one seems to know exactly how big the real number is, even though the IRS and the FBI both track and/or publish numbers of Americans who renounce. While there is no single explanation, anecdotal evidence suggests the uptick is caused by global tax law and the increased compliance burdens imposed by U.S. lawmakers on U.S. citizens abroad. Within a research project, the question arose why a constantly growing number of U.S. citizens are expatriating; the answers are believed to help explain the underlying governmental policy rationale leading to such activities. Since it is impossible to locate former U.S. citizens to survey their reasons, and the U.S. government does not comment on the reasons given during the expatriation process, the chosen methodology is Structural Equation Modeling (SEM), in a first step re-using recent surveys conducted by different researchers among U.S. citizens residing abroad, questioning their personal situation with regard to tax, compliance, citizenship, and the likelihood of repatriating to the U.S. In general, SEM allows: (1) representing, estimating, and validating a theoretical model with linear (unidirectional or not) relationships; (2) modeling causal relationships between multiple predictors (exogenous variables) and multiple dependent (endogenous) variables; (3) including unobservable latent variables; (4) modeling measurement error, i.e., the degree to which observable variables describe the latent variables.
Moreover, SEM is appealing because the results can be represented either by matrix equations or graphically. Results: the observed variables (items) of the construct are caused by various latent variables. The given surveys exhibited high correlation, making it impossible to identify the distinct effect of each indicator on the latent variable, which was one desired result. Since every SEM comprises two parts, (1) a measurement (outer) model and (2) a structural (inner) model, it appears necessary to extend the given data by conducting additional research and surveys to validate the outer model and obtain the desired results.
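The measurement (outer) model can be made concrete with a small simulation. In the sketch below (loadings and error scales are assumed, not taken from the surveys), each observed item is a loading times the latent variable plus measurement error, and with a standardized latent variable the item-latent covariance recovers the loading:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# One latent variable (e.g. "propensity to expatriate"), unit variance.
latent = rng.normal(size=n)

# Measurement model: each observed item = loading * latent + error.
loadings = np.array([0.9, 0.7, 0.5])                    # assumed values
errors = rng.normal(size=(n, 3)) * np.array([0.3, 0.5, 0.8])
items = latent[:, None] * loadings + errors

# With a standardized latent variable, cov(item_k, latent) estimates loading_k.
est = np.array([np.cov(items[:, k], latent)[0, 1] for k in range(3)])
print(np.round(est, 2))
```

Highly correlated indicators, as reported above, would make these per-item effects inseparable in practice, which is exactly the identification problem the abstract describes.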

Keywords: expatriation of U. S. citizens, SEM, structural equation modeling, validating

Procedia PDF Downloads 186
848 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, rather than directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in a single, semi-analytic step, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills, to the best of our knowledge, a gap in the literature on accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are shown to be significantly superior to Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, driven primarily by the number of factors instead of the number of obligors, as is the case in Monte Carlo simulation.
The limitation of this method lies in the "curse of dimensionality" that is intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential applications of this method have a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even to risk types other than credit risk.
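The COS inversion step can be illustrated on a toy case. The sketch below (truncation interval and series length are arbitrary choices, and a standard normal stands in for the loss distribution) recovers a cumulative distribution function from its characteristic function by integrating the cosine expansion term by term:

```python
import numpy as np

def cos_cdf(phi, x, a=-10.0, b=10.0, N=128):
    """Recover F(x) from a characteristic function phi on [a, b]
    using the COS expansion of the density (Fang-Oosterlee style)."""
    k = np.arange(N)
    u = k * np.pi / (b - a)
    A = (2.0 / (b - a)) * np.real(phi(u) * np.exp(-1j * u * a))
    A[0] *= 0.5                              # first coefficient is halved
    # Integrate each cosine term analytically from a to x.
    terms = np.empty(N)
    terms[0] = x - a
    terms[1:] = (b - a) / (k[1:] * np.pi) * np.sin(u[1:] * (x - a))
    return float(A @ terms)

# Standard normal: phi(u) = exp(-u^2 / 2); compare with the known CDF.
phi_normal = lambda u: np.exp(-0.5 * u ** 2)
print(round(cos_cdf(phi_normal, 0.0), 6))    # → 0.5
```

In the factor-copula setting, `phi` would be the (conditional) characteristic function of the portfolio loss; only the input changes, not the inversion machinery.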

Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method

Procedia PDF Downloads 118
847 Human Leukocyte Antigen Class 1 Phenotype Distribution and Analysis in Persons from Central Uganda with Active Tuberculosis and Latent Mycobacterium tuberculosis Infection

Authors: Helen K. Buteme, Rebecca Axelsson-Robertson, Moses L. Joloba, Henry W. Boom, Gunilla Kallenius, Markus Maeurer

Abstract:

Background: The Ugandan population is heavily affected by infectious diseases, and human leukocyte antigen (HLA) diversity plays a crucial role in the host-pathogen interaction, affecting rates of disease acquisition and outcome. Identifying HLA class 1 alleles and determining which are associated with tuberculosis (TB) outcomes would help screen individuals in TB-endemic areas for susceptibility and predict resistance or progression to TB, inevitably leading to better clinical management of TB. Aims: To determine the HLA class 1 phenotype distribution in a Ugandan TB cohort and to establish the relationship between these phenotypes and active and latent TB. Methods: Blood samples were drawn from 32 HIV-negative individuals with active TB and 45 HIV-negative individuals with latent MTB infection. DNA was extracted from the blood samples and HLA typed by the polymerase chain reaction-sequence specific primer method. Allelic frequencies were determined by direct count. Results: HLA-A*02, A*01, A*74, A*30, B*15, B*58, C*07, C*03 and C*04 were the dominant phenotypes in this cohort. There were differences in the distribution of HLA types between individuals with active TB and individuals with LTBI, with only the HLA-A*03 allele showing a statistically significant difference (p=0.0136). However, after FDR computation, the corresponding q-value (0.2176) is above the expected proportion of false discoveries. Key findings: We identified a number of HLA class 1 alleles in a population from Central Uganda, which will enable a functional characterization of CD8+ T-cell mediated immune responses to MTB. Our results also suggest a possible positive association between the HLA-A*03 allele and TB, implying that individuals carrying HLA-A*03 may be at higher risk of developing active TB.
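The FDR computation referred to above is a standard Benjamini-Hochberg adjustment; a sketch follows (the p-values other than 0.0136 are made up, but note that 0.0136 × 16 / 1 = 0.2176, so the reported q-value is what BH yields for the smallest of 16 tests):

```python
import numpy as np

def bh_qvalues(pvals):
    """Benjamini-Hochberg FDR q-values: q_(i) = min over j >= i of
    p_(j) * m / j, where p_(1) <= ... <= p_(m)."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)
    # Enforce monotonicity from the largest p-value downwards.
    q = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty(m)
    out[order] = np.clip(q, 0.0, 1.0)
    return out

# One nominally significant test among 16, as with the HLA-A*03 comparison.
q = bh_qvalues([0.0136] + [0.5] * 15)
print(round(q[0], 4))   # → 0.2176
```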

Keywords: HLA, phenotype, tuberculosis, Uganda

Procedia PDF Downloads 379
846 A Cognitive Approach to the Optimization of Power Distribution across an Educational Campus

Authors: Mrinmoy Majumder, Apu Kumar Saha

Abstract:

The ever-increasing human population and its demand for energy are placing stress upon conventional energy sources, and as demand for power continues to outstrip supply, the need to optimize energy distribution and utilization is emerging as an important focus for various stakeholders. The distribution of available energy must be achieved in such a way that the needs of the consumer are satisfied. However, if the available resources are not sufficient to satisfy consumer demand, a method is needed to select consumers based on factors such as their socio-economic or environmental impacts. Weighting consumer types in this way can separate them by relative importance, and cognitive optimization of the allocation process can then be carried out so that, even on days of particularly scarce supply, the socio-economic impact of not satisfying consumers' needs is minimized. In this context, the present study used fuzzy logic to assign weights to the different consumer types at an educational campus in India, and then established the optimal allocation by applying the non-linear mapping capability of neuro-genetic algorithms. The outputs of the algorithms were compared with similar outputs from particle swarm optimization and differential evolution algorithms. The results demonstrate an option for the optimal utilization of available energy based on the socio-economic importance of consumers.

Keywords: power allocation, optimization problem, neural networks, environmental and ecological engineering

Procedia PDF Downloads 446
845 Global Health, Humanitarian Medical Aid, and the Ethics of Rationing

Authors: N. W. Paul, S. Michl

Abstract:

In our globalized world, we need to appreciate the fact that questions of health and justice must be addressed on a global scale, too. The ways in which diverse governmental and non-governmental initiatives are trying to answer the need for humanitarian medical aid have long been a visible result of globalized responsibility. While the intention of humanitarian medical aid seems evident, the allocation of resources has become more and more of an ethical and societal challenge. With a rising number and growing dimension of humanitarian catastrophes around the globe, the search for ethically justifiable ways to decide who might benefit from limited resources has become a pressing question. Rooted in theories of justice (Rawls) and concepts of social welfare (Sen), we developed and implemented a model for the ethically sound distribution of a limited annual budget for humanitarian care at one of the largest medical universities in Germany. Based on our long-standing experience with civil casualties of war (Afghanistan) and civil war (Libya), as well as with under- and uninsured and/or stateless patients, we are now facing the ongoing refugee crisis as our most recent challenge in terms of global health and justice. Against this background, the paper strives to: a) explain key issues of humanitarian medical aid in the 21st century; b) explore the problem of rationing from an ethical point of view; c) suggest a tool for the rational allocation of scarce resources in humanitarian medical aid; d) present actual cases of humanitarian care that have been managed with our toolbox; and e) discuss the international applicability of our model beyond local contexts.

Keywords: humanitarian care, medical ethics, allocation, rationing

Procedia PDF Downloads 375
844 Sensing to Respond & Recover in Emergency

Authors: Alok Kumar, Raviraj Patil

Abstract:

The ability to respond to an incident of a disastrous event in a vulnerable area is a crucial aspect of emergency management. Constantly predicting the likelihood of an event, along with its severity in an area, and reacting to those significant events that are likely to have a high impact allows the authorities to respond by allocating resources optimally and in a timely manner. The solution provides measuring, monitoring, and modeling facilities that integrate the underlying systems to improve operational efficiency, planning, and coordination. We were particularly involved in this innovative incubation work on the current state of research and development in collaboration technologies and systems for disasters.

Keywords: predictive analytics, advanced analytics, area flood likelihood model, area flood severity model, level of impact model, mortality score, economic loss score, resource allocation, crew allocation

Procedia PDF Downloads 290
843 A Multi-Objective Programming Model to Supplier Selection and Order Allocation Problem in Stochastic Environment

Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh

Abstract:

This paper develops a multi-objective model for the supplier selection and order allocation problem in a stochastic environment, where the purchasing cost, the percentage of items delivered late, and the percentage of items rejected for each supplier are stochastic parameters following arbitrary probability distributions. Dependent chance programming is used, which maximizes the probability of the event that total purchasing cost, total late deliveries, and total rejected items are less than or equal to pre-determined values given by the decision maker. The resulting stochastic multi-objective programming problem is then transformed into a stochastic single-objective programming problem using the minimum deviation method. Next, this problem is solved by a genetic algorithm, which runs a simulation process to evaluate the stochastic objective function as its fitness function. Finally, the impact of the stochastic parameters on the solution is examined via a sensitivity analysis based on the coefficient of variation. The results show that the greater the coefficient of variation of a stochastic parameter, the more the objective value of the stochastic single-objective programming problem deteriorates.
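A minimal sketch of the dependent-chance fitness evaluation (supplier price distributions, order quantities, and budget are invented; only purchasing cost is simulated here, whereas the paper's model also covers delays and rejections):

```python
import random

def chance_fitness(order_qty, suppliers, budget, n_sim=20_000, seed=1):
    """Estimate P(total purchasing cost <= budget) by Monte Carlo
    simulation - the dependent-chance objective a genetic algorithm
    would use as its fitness function."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        cost = sum(q * rng.gauss(s["mean_price"], s["sd_price"])
                   for q, s in zip(order_qty, suppliers))
        hits += cost <= budget
    return hits / n_sim

# Two hypothetical suppliers with stochastic unit prices.
suppliers = [{"mean_price": 10.0, "sd_price": 1.0},
             {"mean_price": 9.0, "sd_price": 2.5}]
print(chance_fitness([60, 40], suppliers, budget=1000.0))
```

A GA would vary `order_qty` (the chromosome) and keep allocations whose probability of staying within budget is highest.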

Keywords: supplier selection, order allocation, dependent chance programming, genetic algorithm

Procedia PDF Downloads 286
842 The Analysis of the Education Sector and Poverty Alleviation with the Benefit Incidence Analysis Approach: Budget Allocation Policy in East Java

Authors: Wildan Syafitri

Abstract:

The main purpose of development is to promote public welfare, as indicated by increases in public prosperity, which relate to consumption levels as a consequence of rising public income. One of the government's efforts to increase public welfare is to create development equity in order to alleviate poverty. The problem of poverty is not merely the number and percentage of poor people; it also includes the gap and the severity of poverty. The analysis method used is Benefit Incidence Analysis (BIA), which discloses the impact of government policy or individual access based on the income distribution in society. The findings reveal that the largest group of poor people in the villages are those who are unemployed and have family members still in junior high school. The income distribution calculation shows a fairly good budget allocation, with a mass ratio of 0.31. In addition, the findings disclose that the Indonesian government's policy of subsidizing education costs for elementary and junior high school students has reached the right target, as indicated by the greater benefits received by elementary and junior high school students who are poor or very poor than by other income groups.
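The core BIA calculation can be sketched in a few lines (enrolment counts by quintile below are hypothetical): each income group's benefit share is simply its share of the uniformly subsidized students.

```python
def benefit_incidence(students_per_group, subsidy_per_student):
    """Share of a uniform per-student education subsidy captured by
    each income group: group benefit = students_in_group * subsidy,
    normalized by the total subsidy outlay."""
    total = sum(n * subsidy_per_student for n in students_per_group)
    return [n * subsidy_per_student / total for n in students_per_group]

# Hypothetical enrolment counts by income quintile (poorest first):
# pro-poor incidence means the poorest quintiles capture larger shares.
students = [320, 280, 200, 130, 70]
shares = benefit_incidence(students, subsidy_per_student=1.0)
print([round(s, 2) for s in shares])
```

With real survey data, per-group subsidy rates differ by schooling level, but the accounting identity is the same.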

Keywords: benefit incidence analysis, budget allocation, poverty, education

Procedia PDF Downloads 362
841 Comprehensive Analysis of Power Allocation Algorithms for OFDM Based Communication Systems

Authors: Rakesh Dubey, Vaishali Bahl, Dalveer Kaur

Abstract:

The spiralling demand for high-rate data transmission over wireless media requires intelligent use of electromagnetic resources, considering restrictions such as power consumption, spectral efficiency, robustness against multipath propagation, and implementation complexity. Orthogonal frequency division multiplexing (OFDM) is a capable technique for next-generation wireless communication systems. Such high-rate data transfers require proper allocation of resources, like power and capacity, among the subchannels. This paper surveys the available methods of allocating power, and the resulting capacity, under the constraint of the Shannon limit.
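Of the allocation methods surveyed, classic water-filling has a compact closed form; a sketch (channel gains and power budget are illustrative):

```python
import numpy as np

def water_filling(gains, total_power):
    """Allocate power across subchannels to maximize the sum capacity
    sum_k log2(1 + g_k * p_k): p_k = max(mu - 1/g_k, 0), where the
    water level mu makes the allocations sum to the power budget."""
    g = np.asarray(gains, dtype=float)
    inv = np.sort(1.0 / g)
    # Find mu by assuming k channels are active, from most to fewest.
    for k in range(len(g), 0, -1):
        mu = (total_power + inv[:k].sum()) / k
        if mu > inv[k - 1]:          # all k assumed channels stay "above water"
            break
    return np.maximum(mu - 1.0 / g, 0.0)

p = water_filling([1.0, 0.5, 0.1], total_power=4.0)
print(np.round(p, 3))   # strongest subchannel gets the most power; weakest gets none
```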

Keywords: Additive White Gaussian Noise, Multi-Carrier Modulation, Orthogonal Frequency Division Multiplexing (OFDM), Signal to Noise Ratio (SNR), Water Filling

Procedia PDF Downloads 526
840 An Efficient Subcarrier Scheduling Algorithm for Downlink OFDMA-Based Wireless Broadband Networks

Authors: Hassen Hamouda, Mohamed Ouwais Kabaou, Med Salim Bouhlel

Abstract:

The growth of wireless technology has made opportunistic scheduling a widespread theme in recent research. Providing high system throughput without sacrificing fairness of allocation is a very challenging task, and a suitable policy for resource allocation among users is of crucial importance. This study focuses on scheduling multiple streaming flows on the downlink of a WiMAX system based on orthogonal frequency division multiple access (OFDMA). In this paper, we take a first step in formulating and analyzing this problem scrupulously, and we propose a new scheduling scheme based on the Round Robin (RR) algorithm. Because of its non-opportunistic operation, RR does not take radio conditions into account and consequently affects both system throughput and multi-user diversity. Our contribution, called MORRA (Modified Round Robin Opportunistic Algorithm), addresses this issue: MORRA not only exploits the concept of an opportunistic scheduler but also takes two additional parameters into account in the allocation process, a courtesy coefficient (CC) and the buffer occupancy (BO). Performance evaluation shows that this well-balanced scheme outperforms both the RR and MaxSNR schedulers and demonstrates that choosing between system throughput and fairness is not required.
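The flavor of such an opportunistic scheduler can be sketched as follows; the multiplicative scoring rule is an assumption for illustration, since the abstract does not give MORRA's exact weighting of the courtesy coefficient and buffer occupancy:

```python
def pick_user(users):
    """Opportunistic pick: weight each user's instantaneous rate by a
    courtesy coefficient (boosting recently under-served users) and by
    buffer occupancy (boosting backlogged queues), then serve the user
    with the highest score. Illustrative scoring only - the actual
    MORRA weighting is not specified in the abstract."""
    def score(u):
        return u["rate"] * u["courtesy"] * u["buffer_occupancy"]
    return max(users, key=score)["name"]

users = [
    {"name": "A", "rate": 8.0, "courtesy": 1.0, "buffer_occupancy": 0.2},
    {"name": "B", "rate": 3.0, "courtesy": 2.5, "buffer_occupancy": 0.9},
    {"name": "C", "rate": 6.0, "courtesy": 1.2, "buffer_occupancy": 0.5},
]
print(pick_user(users))   # B wins: moderate rate, but starved and backlogged
```

A pure MaxSNR scheduler would pick A here; the extra factors are what trade raw throughput for fairness.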

Keywords: OFDMA, opportunistic scheduling, fairness hierarchy, courtesy coefficient, buffer occupancy

Procedia PDF Downloads 264
839 Mathematical Modeling and Algorithms for the Capacitated Facility Location and Allocation Problem with Emission Restriction

Authors: Sagar Hedaoo, Fazle Baki, Ahmed Azab

Abstract:

In supply chain management, network design for scalable manufacturing facilities is an emerging field of research. Facility location-allocation assigns facilities to customers so as to optimize the overall cost of the supply chain. To further optimize costs, the capacities of these facilities can be changed in accordance with customer demands. A mathematical model is formulated to fully express the problem at hand and to solve small-to-mid-range instances. A dedicated constraint has been developed to restrict emissions in line with the Kyoto Protocol. The problem is NP-hard; hence, a simulated annealing metaheuristic has been developed to solve larger instances. A case study of the USA-Canada border crossing is used.
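The simulated annealing mechanics can be shown on a deliberately tiny one-dimensional location problem (sites, customers, and the cooling schedule below are all invented; the paper's model is a capacitated MILP):

```python
import math, random

def anneal(candidates, customers, p, iters=5000, seed=7):
    """Simulated annealing for a toy p-facility location problem on a
    line: choose p of the candidate sites minimizing total distance
    from each customer to its nearest open facility."""
    rng = random.Random(seed)
    cost = lambda open_: sum(min(abs(c - f) for f in open_) for c in customers)
    cur = rng.sample(candidates, p)
    cur_c = cost(cur)
    best, best_c = list(cur), cur_c
    for i in range(iters):
        T = max(1e-3, 1.0 - i / iters)                  # linear cooling
        nxt = list(cur)
        nxt[rng.randrange(p)] = rng.choice(candidates)  # move one site
        if len(set(nxt)) < p:
            continue                                    # skip duplicate sites
        d = cost(nxt) - cur_c
        if d < 0 or rng.random() < math.exp(-d / T):    # accept worse moves early
            cur, cur_c = nxt, cost(nxt)
            if cur_c < best_c:
                best, best_c = list(cur), cur_c
    return sorted(best), best_c

sites, total = anneal(candidates=list(range(0, 101, 10)),
                      customers=[1, 2, 3, 48, 52, 97, 99], p=3)
print(sites, total)
```

The same accept/reject loop scales to the capacitated, emission-constrained model by swapping in its cost function and neighborhood moves.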

Keywords: emission, mixed integer linear programming, metaheuristic, simulated annealing

Procedia PDF Downloads 281
838 Commitment Based Revenue Sharing Contract

Authors: Muhammad Shafiq, Huynh Trung Luong

Abstract:

In this paper, we propose a commitment-based revenue sharing contract for a supply chain comprising one manufacturer and one retailer facing highly uncertain demand for a short-life-span fashionable product. In our model, the retailer reserves a commitment level with the manufacturer prior to the selling season. In response, the manufacturer allocates and produces a specific quantity, which is the maximum quantity available to the retailer. The retailer is motivated to commit more by a higher revenue sharing percentage for reserved capacity than for non-reserved capacity. Due to asymmetric information, it is found that the manufacturer can optimize the quantity allocation decision, while the retailer's commitment level decision may not be optimal.

Keywords: supply chain coordination, revenue sharing contract, commitment based revenue sharing, quantity allocation

Procedia PDF Downloads 460
837 An AI-Based Dynamical Resource Allocation Calculation Algorithm for Unmanned Aerial Vehicle

Authors: Zhou Luchen, Wu Yubing, Burra Venkata Durga Kumar

Abstract:

As the scale of networks becomes larger and more complex than before, the density of user devices is also increasing. Unmanned Aerial Vehicle (UAV) networks can collect and transform data efficiently by using software-defined networking (SDN) technology. This paper proposes a three-layer, distributed, and dynamic cluster architecture that manages UAVs with an AI-based resource allocation algorithm to address the network overloading problem. By separating the services of each UAV, the hierarchical UAV cluster system reduces the network load and transfers user requests through three sub-tasks: data collection, communication channel organization, and data relaying. In each cluster, a head node and a vice head node UAV are selected considering the devices' Central Processing Unit (CPU), operational memory (RAM), permanent memory (ROM), battery charge, and capacity; the vice head node acts as a backup that stores all the data held by the head node. The k-means clustering algorithm is used to detect high-load regions and form the layered UAV clusters. The whole process of detecting high-load areas, forming and selecting UAV clusters, and moving the selected UAV cluster to that area is proposed as the traffic offloading algorithm.
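The high-load-region detection step is plain k-means over device positions; a sketch with two synthetic dense regions (all coordinates invented):

```python
import numpy as np

def kmeans(points, k, iters=50, seed=3):
    """Plain k-means: cluster device positions so UAV clusters can be
    placed over high-load regions (one head node per cluster)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute means.
        d = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

# Two hypothetical dense user regions a UAV cluster should each cover.
rng = np.random.default_rng(0)
region_a = rng.normal([0, 0], 0.5, size=(100, 2))
region_b = rng.normal([10, 10], 0.5, size=(100, 2))
centers, labels = kmeans(np.vstack([region_a, region_b]), k=2)
print(np.round(np.sort(centers[:, 0]), 1))   # one center near 0, one near 10
```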

Keywords: k-means, resource allocation, SDN, UAV network, unmanned aerial vehicles

Procedia PDF Downloads 75
836 Effect of Compost Application on Uptake and Allocation of Heavy Metals and Plant Nutrients and Quality of Oriental Tobacco Krumovgrad 90

Authors: Violina R. Angelova, Venelina T. Popova, Radka V. Ivanova, Givko T. Ivanov, Krasimir I. Ivanov

Abstract:

A comparative study of the impact of compost on the uptake and allocation of nutrients and heavy metals, and on the quality, of Oriental tobacco Krumovgrad 90 has been carried out. The experiment was performed on an agricultural field contaminated by the lead-zinc smelter near the town of Kardzali, Bulgaria, after the closure of lead production. The compost treatments had significant effects on the uptake and allocation of plant nutrients and heavy metals. The incorporation of compost decreased the amounts of heavy metals present in the tobacco leaves, by 36%, 12% and 6% for Cd, Pb and Zn, respectively. Application of the compost increased the content of potassium, calcium and magnesium in the tobacco leaves and may therefore favorably affect the burning properties of the tobacco. However, the incorporation of compost had a negative impact on the quality and typicality of the Oriental tobacco variety Krumovgrad 90: leaf size increased, and the leaves became darker in colour, less fleshy, and (much) broader in the second, third and fourth stalk positions, accompanied by a decrease in tobacco quality. The incorporation of compost also increased the mineral substances (pure ash), total nicotine and nitrogen, and reduced the amount of reducing sugars, causing the quality of the tobacco leaves to deteriorate (particularly in the third and fourth harvests).

Keywords: chemical composition, compost, heavy metals, oriental tobacco, quality

Procedia PDF Downloads 238
835 Operations Research Applications in Audit Planning and Scheduling

Authors: Abdel-Aziz M. Mohamed

Abstract:

This paper presents a state-of-the-art survey of the operations research models developed for internal audit planning. Two alternative approaches have been followed in the literature for audit planning: (1) identifying the optimal audit frequency; and (2) determining the optimal audit resource allocation. The first approach identifies the elapsed time between two successive audits, which can be presented as the optimal number of audits in a given planning horizon, or the optimal number of transactions after which an audit should be performed. It also includes the optimal audit schedule. The second approach determines the optimal allocation of audit frequency among all auditable units in the firm. In our review, we discuss both the deterministic and probabilistic models developed for audit planning. In addition, game theory models are reviewed to find the optimal auditing strategy based on the interactions between the auditors and the clients.
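As a toy illustration of the audit-frequency idea (this EOQ-style model is our own simplification, not one of the surveyed models): if each audit costs a fixed amount and undetected-error exposure grows linearly between audits, the optimal interval balances the two costs.

```python
import math

def optimal_audit_interval(audit_cost, loss_rate):
    """Illustrative deterministic model: auditing every t periods costs
    audit_cost / t per period, while undetected-error exposure grows
    linearly, costing loss_rate * t / 2 per period on average.
    Minimizing audit_cost/t + loss_rate*t/2 over t gives
    t* = sqrt(2 * audit_cost / loss_rate), the EOQ-style square root."""
    return math.sqrt(2.0 * audit_cost / loss_rate)

# Hypothetical numbers: an audit costs 800, exposure accrues at 25/period.
print(round(optimal_audit_interval(audit_cost=800.0, loss_rate=25.0), 2))  # → 8.0
```

The probabilistic models in the survey replace the linear exposure term with the expected loss from stochastic error processes, but the trade-off has the same shape.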

Keywords: operations research applications, audit frequency, audit-staff scheduling, audit planning

Procedia PDF Downloads 788
834 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion

Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut

Abstract:

This paper considers a hub location problem where the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori, as a hub selection criterion, in order to investigate congestion effects on the network. The objective is to design the hub network by determining all required hub locations in the node clusters and allocating non-hub nodes to hubs such that the total cost, including transportation cost, the opening cost of hubs, and a penalty cost for exceeding the capacity level at hubs, is minimized. A mixed-integer linear programming model is developed by introducing additional constraints into the traditional model of the capacitated multiple allocation hub location problem, and it is tested empirically.

Keywords: hub location problem, p-hub median problem, clustering, congestion

Procedia PDF Downloads 457
833 Artificial Intelligence-Based Approaches for Task Offloading, Resource Allocation and Service Placement of Internet of Things Applications: State of the Art

Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib

Abstract:

In order to support the continued growth and critical latency of IoT applications, and to overcome various obstacles of traditional data centers, mobile edge computing (MEC) has emerged as a promising solution that extends cloud data processing and decision-making to edge devices. By adopting a MEC structure, IoT applications can be executed locally: on an edge server, on different fog nodes, or in distant cloud data centers. However, we are often faced with optimizing conflicting criteria, such as minimizing the energy consumption of the limited local capabilities (in terms of CPU, RAM, storage, and bandwidth) of mobile edge devices while maintaining high performance (reducing response time, increasing throughput and service availability). Achieving one goal may affect the other, making task offloading (TO), resource allocation (RA), and service placement (SP) complex processes, and studying the trade-off between conflicting criteria is a nontrivial multi-objective optimization problem. This paper surveys recent multi-objective optimization (MOO) approaches to TO, SP, and RA in edge computing environments, particularly artificial intelligence (AI) ones, that satisfy the various objectives, constraints, and dynamic conditions of IoT applications.

Keywords: mobile edge computing, multi-objective optimization, artificial intelligence approaches, task offloading, resource allocation, service placement

Procedia PDF Downloads 84
832 A Sustainable Supplier Selection and Order Allocation Based on Manufacturing Processes and Product Tolerances: A Multi-Criteria Decision Making and Multi-Objective Optimization Approach

Authors: Ravi Patel, Krishna K. Krishnan

Abstract:

In global supply chains, appropriate and sustainable suppliers play a vital role in supply chain development and feasibility. In a large organization with a huge number of suppliers, it is necessary to classify suppliers based on their past quality and delivery history for each product category. Since the performance of any organization depends heavily on its suppliers, well-evaluated selection criteria and decision-making models lead to improved supplier assessment and development. In this paper, the SCOR® performance evaluation approach and ISO standards are used to determine selection criteria for better supplier assessment, using a hybrid model of the Analytic Hierarchy Process (AHP) and the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS). AHP determines the global weights of the criteria, which TOPSIS then uses to score suppliers via triangular fuzzy set theory. Both qualitative and quantitative criteria are taken into consideration in the proposed model. In addition, a multi-product, multi-period model is adopted for order allocation. The optimization model integrates multi-objective integer linear programming (MOILP) for order allocation with the hybrid approach for supplier selection. The proposed MOILP model optimizes order allocation based on manufacturing processes and product tolerances, per the manufacturer's requirements for a quality product. The integrated model and solution approach are tested on different scenarios, and the detailed analysis shows the superiority of the proposed model over solutions based on individual decision-making models.
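The AHP weighting step can be sketched with the row geometric mean approximation (the 3x3 comparison matrix below is hypothetical; the paper combines such weights with fuzzy TOPSIS scoring):

```python
import numpy as np

def ahp_weights(M):
    """Approximate AHP priority weights from a pairwise comparison
    matrix via the row geometric mean, normalized to sum to 1."""
    g = np.prod(M, axis=1) ** (1.0 / M.shape[0])
    return g / g.sum()

# Hypothetical 3-criteria comparison (quality vs delivery vs cost):
# quality is judged 3x as important as delivery and 5x as important
# as cost; delivery is 2x as important as cost.
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(M)
print(np.round(w, 3))   # quality dominates, cost matters least
```

These global weights are what TOPSIS multiplies into each supplier's normalized criterion scores before computing distances to the ideal solution.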

Keywords: AHP, fuzzy set theory, multi-criteria decision making, multi-objective integer linear programming, TOPSIS

Procedia PDF Downloads 146
831 Life Table and Functional Response of Scolothrips takahashii (Thysanoptera: Thripidae) on Tetranychus urticae (Acari: Tetranychidae)

Authors: Kuang-Chi Pan, Shu-Jen Tuan

Abstract:

Scolothrips takahashii Priesner (Thysanoptera: Thripidae) is a common predatory thrips that feeds on spider mites; it is considered an important natural enemy and a potential biological control agent against spider mites. To evaluate the efficacy of S. takahashii against tetranychid mites, life table and functional response studies were conducted at 25±1°C with Tetranychus urticae Koch as prey. The intrinsic rate of increase (r), finite rate of increase (λ), net reproduction rate (R₀), and mean generation time (T) were 0.1674 d⁻¹, 1.1822 d⁻¹, 62.26 offspring/individual, and 24.68 d, respectively. The net consumption rate (C₀) was 846.15; the mean daily consumption rate was 51.92 eggs for females and 19.28 eggs for males. S. takahashii exhibited a type III functional response when offered T. urticae deutonymphs. Based on the random predator equation, the estimated maximum attack rate (a) and handling time (Th) were 0.1376 h⁻¹ and 0.7883 h. In addition, a life table experiment was conducted to evaluate the offspring sex allocation and population dynamics of Tetranychus ludeni Zacher under group-rearing conditions with different sex ratios. All bisexual groups produced offspring with similar sex allocation patterns: broods began female-biased, transitioned during the middle of the oviposition period, and turned male-biased by the end of the oviposition period.
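The attack rate and handling time above parameterize Rogers' random predator equation, which corrects for prey depletion during a trial. As a rough numerical sketch, the number of prey eaten can be solved implicitly; the values a = 0.1376 h⁻¹ and Th = 0.7883 h are those reported in the abstract, but the 24 h foraging period is an assumption, and the simple constant-attack-rate form is used here rather than the density-dependent type III form.

```python
import math
from scipy.optimize import brentq

def prey_eaten(n0, a=0.1376, th=0.7883, t=24.0):
    """Rogers' random predator equation: Na = N0 * (1 - exp(-a (T - Th * Na))).

    Solved numerically for Na, the number of prey attacked; a (h^-1) and
    Th (h) are from the abstract, T (h) is an assumed foraging time.
    """
    f = lambda na: na - n0 * (1.0 - math.exp(-a * (t - th * na)))
    # Na is bounded by the prey supply and by the handling-time limit T / Th
    return brentq(f, 0.0, min(n0, t / th))

eaten = [prey_eaten(n) for n in (10, 30, 60)]  # rises, then saturates with density
```

Fitting a and Th from observed counts would invert this same equation by nonlinear least squares.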

Keywords: Scolothrips takahashii, Tetranychus urticae, Tetranychus ludeni, two-sex life table, functional response, sex allocation

Procedia PDF Downloads 47
830 Supplier Selection and Order Allocation Using a Stochastic Multi-Objective Programming Model and Genetic Algorithm

Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh

Abstract:

In this paper, we develop a multi-objective supplier selection and order allocation model in a stochastic environment, in which the purchasing cost, the percentage of items delivered with delay, and the percentage of rejected items provided by each supplier are stochastic parameters following arbitrary probability distributions. To do so, we use dependent chance programming (DCP), which maximizes the probability that total purchasing cost, total items delivered with delay, and total rejected items are less than or equal to pre-determined values given by the decision maker. After transforming this stochastic multi-objective programming problem into a stochastic single-objective problem using the minimum deviation method, we apply a genetic algorithm to solve the latter single-objective problem. The employed genetic algorithm performs a simulation process to calculate the stochastic objective function as its fitness function. Finally, we explore the impact of the stochastic parameters on the obtained solution via a sensitivity analysis based on the coefficient of variation. The results show that as the stochastic parameters' coefficients of variation grow, the value of the objective function in the stochastic single-objective programming problem worsens.
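As a toy illustration of the simulation-based fitness described here, the sketch below estimates, by Monte Carlo, the probability that total purchasing cost stays under a budget, and evolves an order allocation with a tiny genetic algorithm. All numbers (suppliers, costs, budget) are invented, normal cost distributions are assumed, and the delay/rejection objectives, the minimum deviation aggregation, and the demand constraints of the actual model are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sup, demand, budget = 3, 100, 1150           # invented problem data
cost_mu = np.array([10.0, 12.0, 11.0])         # per-unit cost means
cost_sd = np.array([1.0, 2.0, 1.5])            # per-unit cost std devs

def fitness(alloc, n_sim=200):
    # Monte Carlo estimate of P(total purchasing cost <= budget),
    # the dependent-chance objective for this single criterion.
    costs = rng.normal(cost_mu, cost_sd, size=(n_sim, n_sup))
    return float(np.mean(costs @ alloc <= budget))

def random_alloc():
    return np.round(rng.dirichlet(np.ones(n_sup)) * demand)

pop = [random_alloc() for _ in range(20)]
for _ in range(30):                             # tiny GA loop
    pop.sort(key=fitness, reverse=True)         # rank by simulated probability
    parents, children = pop[:10], []
    for _ in range(10):
        i, j = rng.choice(10, size=2, replace=False)
        child = np.round((parents[i] + parents[j]) / 2)    # average crossover
        child[rng.integers(n_sup)] += rng.integers(-5, 6)  # small mutation
        children.append(np.clip(child, 0, demand))
    pop = parents + children
best = max(pop, key=fitness)
```

The noisy fitness is re-simulated at every evaluation, mirroring the embedded simulation process the abstract describes.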

Keywords: dependent chance programming, genetic algorithm, minimum deviation method, order allocation, supplier selection

Procedia PDF Downloads 223
829 Authentic and Transformational Leadership Model of the Directors of Tambon Health Promoting Hospitals Affecting the Effectiveness of Southern Tambon Health Promoting Hospitals: Interaction and Invariance Tests of the Gender Factor

Authors: Suphap Sikkhaphan, Muwanga Zake, Johnnie Wycliffe Frank

Abstract:

The purposes of the study were a) to investigate the authentic and transformational leadership model of the directors of tambon health promoting hospitals, b) to evaluate the relation between the authentic and transformational leadership of the directors and the effectiveness of their hospitals, and c) to assess the invariance of the authentic and transformational leadership model. All 400 southern tambon health promoting hospital directors were enrolled in the study, half male (200) and half female (200), sampled via a stratified method. The research tool was a questionnaire containing 4 sections; Cronbach's alpha coefficient was equal to .98. Descriptive analysis was used for demographic data, and inferential statistics were used for the relation and invariance tests. The findings revealed that, overall, the authentic and transformational leadership model of the directors is related to the effectiveness of the hospitals. Only the factor of 'strong community support' was statistically significantly related to authentic leadership (p < .05). However, four latent variables were statistically related to transformational leadership: competency and work climate, management system, network cooperation, and strong community support (p = .01). Regarding the relation between leadership and hospital effectiveness, the four causal variables of authentic leadership were not related to those latent variables. In contrast, all four latent variables of transformational leadership were statistically significantly related to the effectiveness of tambon health promoting hospitals (p = .001). Furthermore, only the management system variable was significantly related to the causal variables of authentic leadership (p < .05). Regarding the invariance test, no statistically significant difference in the model was found between male and female directors (p > .05).

Keywords: authentic leadership, transformational leadership, tambon health promoting hospital

Procedia PDF Downloads 411
828 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement

Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti

Abstract:

Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow, typically resorting to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted with little thought is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
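The core idea, provisioning to an upper quantile of predicted demand instead of a fixed peak, can be sketched as follows. The trace is synthetic, and gradient-boosted quantile regression stands in for whichever of the paper's unspecified models is used; the 90th-percentile target is an illustrative choice, not the authors' setting.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# synthetic hourly CPU-demand trace: a daily cycle plus noise (illustrative only)
hours = np.arange(24 * 60)
X = np.column_stack([np.sin(2 * np.pi * hours / 24),
                     np.cos(2 * np.pi * hours / 24)])
y = 50 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

# fit the conditional 90th percentile of demand; provision at that level
model = GradientBoostingRegressor(loss="quantile", alpha=0.90)
model.fit(X[:-240], y[:-240])                  # train on all but the last 10 days
provision = model.predict(X[-240:])
actual = y[-240:]

sla_violation_rate = np.mean(actual > provision)  # expected near 0.10 by design
saving = 1 - provision.mean() / y.max()           # vs. static peak provisioning
```

Raising alpha tightens the SLA at the cost of energy; lowering it saves more energy but violates the SLA more often.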

Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing

Procedia PDF Downloads 77
827 Poly(Ethylene Glycol)-Silicone Containing Phase Change Polymer for Thermal Energy Storage

Authors: Swati Sundararajan, Asit B. Samui, Prashant S. Kulkarni

Abstract:

The global energy crisis has led to extensive research on alternative sources of energy. The gap between energy supply and demand can be met by thermal energy storage techniques, of which latent heat storage, in the form of phase change materials (PCMs), is the most effective. Phase change materials store energy as latent heat absorbed or released over the narrow temperature range in which the material undergoes phase transformation. The latent heat can be utilized for heating or cooling purposes, or converted to electricity; all these actions reduce the load on electricity demand. These materials retain this property over many repeated cycles. Different PCMs differ in their phase change temperatures and heat storage capacities. Poly(ethylene glycol) (PEG) was cross-linked to hydroxyl-terminated poly(dimethyl siloxane) (PDMS) in the presence of the cross-linker tetraethyl orthosilicate (TEOS) and the catalyst dibutyltin dilaurate. Four different ratios of PEG and PDMS were reacted together, and the composition with the lowest PEG concentration resulted in the formation of a flexible solid-solid phase change membrane; the other compositions were obtained in powder form. The enthalpy values of the prepared PCMs were studied using differential scanning calorimetry, and the crystallization properties were analyzed using X-ray diffraction and polarized optical microscopy. The incorporation of the silicone moiety was expected to reduce the hydrophilic character of PEG, which was evaluated by contact angle measurement. The membrane-forming ability of this crosslinked polymer can be extended to several smart packaging, building, and textile applications. The detailed synthesis, characterization, and performance evaluation of the crosslinked polymer blend will be incorporated in the presentation.

Keywords: phase change materials, poly(ethylene glycol), poly(dimethyl siloxane), thermal energy storage

Procedia PDF Downloads 331
826 Discrete Breeding Swarm for Cost Minimization of Parallel Job Shop Scheduling Problem

Authors: Tarek Aboueldahab, Hanan Farag

Abstract:

The Parallel Job Shop Scheduling Problem (JSP) is a multi-objective, multi-constraint NP-hard optimization problem. Traditional Artificial Intelligence techniques have been widely used; however, they can become trapped in local minima without reaching the optimal solution. We therefore propose a hybrid Artificial Intelligence (AI) model in which Discrete Breeding Swarm (DBS) is added to traditional Artificial Intelligence to avoid this trapping. The model is applied to cost minimization of the Car Sequencing and Operator Allocation (CSOA) problem. Practical experiments show that our model outperforms other techniques in cost minimization.

Keywords: parallel job shop scheduling problem, artificial intelligence, discrete breeding swarm, car sequencing and operator allocation, cost minimization

Procedia PDF Downloads 150
825 Multi-Criteria Based Robust Markowitz Model under Box Uncertainty

Authors: Pulak Swain, A. K. Ojha

Abstract:

Portfolio optimization deals with the problem of efficient asset allocation. Risk and expected return are two conflicting criteria in such problems: the investor prefers the return to be high and the risk to be low. Using a multi-objective approach, we can solve these problems. However, the information we have for the input parameters is generally ambiguous, and the input values can fluctuate around some nominal values. We cannot ignore the uncertainty in the input values, as it can affect the asset allocation drastically. We therefore apply a robust optimization approach to problems whose input parameters come under box uncertainty. In this paper, we solve the multi-criteria robust problem with the help of the ϵ-constraint method.
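Under box uncertainty with non-negative weights, the worst case of the uncertain return (μ̃ · w) over the box [μ − δ, μ + δ] is attained at (μ − δ) · w, so the robust counterpart becomes an ordinary nonlinear program: maximize worst-case return subject to a variance bound ε, which is how the ϵ-constraint method scalarizes the risk criterion. The sketch below illustrates this with invented returns, half-widths, and covariance; it is not the paper's data or solver.

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.10, 0.07, 0.04])       # nominal expected returns (illustrative)
delta = np.array([0.03, 0.02, 0.005])   # box half-widths on the returns
cov = np.array([[0.040, 0.006, 0.001],
                [0.006, 0.020, 0.002],
                [0.001, 0.002, 0.005]])
eps = 0.010                             # risk bound from the eps-constraint step

# worst-case return over the box (for w >= 0) uses the lower corner mu - delta
neg_worst_return = lambda w: -(mu - delta) @ w
cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},        # fully invested
        {"type": "ineq", "fun": lambda w: eps - w @ cov @ w}]  # variance <= eps
res = minimize(neg_worst_return, np.ones(3) / 3,
               bounds=[(0.0, 1.0)] * 3, constraints=cons)
w_opt = res.x
```

Sweeping eps traces out the robust efficient frontier, one single-objective solve per risk level.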

Keywords: portfolio optimization, multi-objective optimization, ϵ-constraint method, box uncertainty, robust optimization

Procedia PDF Downloads 111
824 Detecting Manipulated Media Using Deep Capsule Network

Authors: Joseph Uzuazomaro Oju

Abstract:

The ease with which manipulated media can be created, and the increasing difficulty of identifying fake media, make it a great threat. Most of the applications used to create these high-quality fake videos and images are built with deep learning; hence, the use of deep learning in building a detection mechanism cannot be overemphasized. Detecting fake media before it reaches the populace spares people the doubt over whether content is genuine or fake and helps ensure the credibility of videos and images. The methodology introduced in this paper approaches the manipulated media detection challenge using a combination of VGG-19 and a deep capsule network. Videos are converted into frames, which, in turn, are resized and cropped to the face region. These preprocessed images/frames are fed to the VGG-19 network to extract latent features. The extracted latent features are input to a deep capsule network enhanced with a 3D-convolution dynamic routing agreement. The 3D-convolution dynamic routing agreement algorithm helps to reduce the linkages between capsule networks, thereby limiting the poor learning shortcoming of stacked capsule network layers. The resultant output from the deep capsule network indicates whether a media item is genuine or fake.
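The capsule stage rests on routing-by-agreement, which the abstract's 3D-convolution variant modifies. The sketch below shows only the plain NumPy form of the squash non-linearity and the iterative routing (the 3D-convolution enhancement is not reproduced); the shapes and two-class setup are illustrative assumptions.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-9):
    # Capsule non-linearity: shrinks short vectors toward 0 and
    # long vectors toward unit length, so ||v|| acts as a probability.
    sq = np.sum(s * s, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * s / np.sqrt(sq + eps)

def routing_by_agreement(u_hat, n_iters=3):
    # u_hat: (n_in, n_out, dim) prediction vectors from lower-level capsules.
    b = np.zeros(u_hat.shape[:2])                             # routing logits
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # coupling coeffs
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum per output capsule
        v = squash(s)
        b = b + (u_hat * v[None]).sum(axis=-1)   # raise logits where predictions agree
    return v                                     # (n_out, dim); norm ~ confidence

rng = np.random.default_rng(0)
# e.g. 8 primary capsules routed to 2 output capsules (genuine / fake)
v = routing_by_agreement(rng.normal(size=(8, 2, 4)))
```

In the full detector, the VGG-19 latent features would produce u_hat, and the larger output-capsule norm would give the genuine/fake verdict.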

Keywords: deep capsule network, dynamic routing, fake media detection, manipulated media

Procedia PDF Downloads 98
823 Occult Haemolacria Paradigm in the Study of Tears

Authors: Yuliya Huseva

Abstract:

To investigate the contents of tears to determine latent blood. Methods: Tear samples from 72 women were studied by microscopy of tears aspirated with a capillary and stained by Nocht, and by a chemical method using test strips with chromogen. Statistical processing was carried out using Statistica 10.0 for Windows, with calculation of Pearson's chi-square test and Yule's association coefficient, and determination of sensitivity and specificity. Results: In 30.6% (22) of tear samples, erythrocytes were revealed microscopically. A correlation between the presence of erythrocytes in the tear and the phase of the menstrual cycle was discovered. In the follicular phase of the cycle, erythrocytes were found in 59.1% (13) of these women, significantly more (χ²=4.2, p=0.041) than in the luteal phase, 40.9% (9). The predominance of erythrocytes in the tears of the examined women during the first seven days of the follicular phase of the menstrual cycle testifies in favour of vicarious bleeding from the mucous membranes of extragenital organs in sync with menstruation. Among the other cellular elements in tear samples with latent haemolacria, neutrophils prevailed, in 45.5% (10), while lymphocytes were less common, in 27.3% (6), because neutrophil exudation is accompanied by vasodilatation of the conjunctiva and the release of erythrocytes into the conjunctival cavity. The prognostic significance of the chemical method was found to be 0.53 of that of the microscopic method: in contrast to microscopy, which detected blood in tear samples from 30.6% (22) of the women, blood was detected chemically in the tears of 16.7% (12). An association between latent haemolacria and endometriosis was found (k=0.75, p≤0.05). Microscopically, erythrocytes were detected in the tears of 70% of patients with endometriosis, versus 25% of healthy women without endometriosis. The proportion of women with erythrocytes in tears determined by the chemical method was 41.7% among patients with endometriosis, significantly more (χ²=6.5, p=0.011) than the 11.7% among women without endometriosis. These data can be explained by the etiopathogenesis of extragenital endometriosis, which is caused by haematogenous spread of endometrial tissue into the orbit. In endometriosis, erythrocytes are found against a background of accumulations of epithelial cells. In the tear samples of 4 women with endometriosis, glandular cuboidal epithelial cells morphologically similar to endometrial cells were found, which may indicate a generalization of the disease. Conclusions: Single erythrocytes can normally be found in tears; their number depends on the phase of the menstrual cycle, increasing in the follicular phase. Erythrocytes found in tears against a background of accumulations of epitheliocytes with glandular atypia may indicate a manifestation of extragenital endometriosis. Both methods (microscopic and chemical) are informative in revealing latent haemolacria. The microscopic method is more sensitive, reveals intact erythrocytes, and provides information about other cells; the chemical method is faster and technically simpler, determines the presence of haemoglobin and its metabolic products, and can be used as a screening tool.
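The method comparison can be reproduced schematically. The 2×2 table below is hypothetical: it assumes all 12 chemical positives fell within the 22 microscopy positives (treating microscopy as the reference standard), which the abstract does not state; it serves only to show how the sensitivity, specificity, and chi-square statistics mentioned above are computed.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table, chemical test vs. microscopy as reference:
# assumes the 12 chemical positives all lie within the 22 microscopy positives.
tp, fn = 12, 10   # microscopy-positive samples: chemical positive / negative
fp, tn = 0, 50    # microscopy-negative samples: chemical positive / negative

sensitivity = tp / (tp + fn)   # ~0.55, close to the reported 0.53 ratio
specificity = tn / (tn + fp)
chi2, p, dof, expected = chi2_contingency([[tp, fn], [fp, tn]])
```

Under these assumed counts, the two methods are strongly associated, consistent with both being informative for latent haemolacria.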

Keywords: tear, blood, microscopy, epitheliocytes

Procedia PDF Downloads 92
822 Identification and Prioritisation of Students Requiring Literacy Intervention and Subsequent Communication with Key Stakeholders

Authors: Emilie Zimet

Abstract:

During networking and NCCD moderation meetings, best practices for identifying students who require Literacy Intervention are often discussed. Once these students are identified, consideration is given to the most effective process for prioritising those with the greatest need for Literacy Support, along with the allocation of resources, the tracking of intervention effectiveness, and communication with teachers, external providers, and parents. Through a workshop, the group will investigate best practices for identifying students who require literacy support and strategies for communicating and tracking their progress. In groups, participants will examine what they do in their own settings and then compare with other models, including the researcher's model, to decide the most effective path to identification and communication. Participants will complete a worksheet at the beginning of the session to consider their current approaches in depth. They will be asked to critically analyse their own identification processes for Literacy Intervention, ensuring students are not overlooked if they fall into the borderline category. A cut-off for students to access intervention will be considered, so as not to place strain on already stretched resources, along with the most effective allocation of resources. Furthermore, communicating learning needs and differentiation strategies to staff is paramount to the success of an intervention, and participants will look at the frequency of communication used to share such strategies and updates. At the end of the session, the group will look at creating or evolving models that allow for best practice in the identification and communication of Literacy Interventions. The proposed outcome of this research is to develop a model for identifying students requiring Literacy Intervention that incorporates the allocation of resources and communication with key stakeholders. This will be done by pooling information and discussing a variety of models used in the participants' school settings.

Keywords: identification, student selection, communication, special education, school policy, planning for intervention

Procedia PDF Downloads 17