Search results for: Semi-Markov Decision Process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6374

3494 Variability in Near-Surface Ultraviolet Radiation and Its Dependence on Atmospheric Parameters

Authors: Yusuff Idowu Moshood, Sanni Mohammed

Abstract:

Natural radiation such as ultraviolet (UV) radiation from the sun is known to be a main cause of skin cancer, sunburn, eye damage, premature skin aging and other skin-related diseases. The fraction of this radiation reaching the populated surface of the earth, and its impacts, are not well known, and its near-surface variability, and hence its impact on the populace, depends on several atmospheric parameters. This work therefore set out to determine the variability in near-surface UV radiation and its dependence on atmospheric parameters at different times of the day in Offa, Nigeria. The variability was determined using data obtained from the meteorological garden of the Science Laboratory Technology Department, Federal Polytechnic Offa, Nigeria. The data comprised solar UV radiation, solar radiation, temperature, humidity and pressure recorded at 30-minute intervals. Relationships were determined and correlations derived using the SPSS Pearson correlation tool. The results showed significant correlations at the 0.01 and 0.05 levels, revealing good relationships between solar UV radiation and the other atmospheric parameters. Inferentially, interdependent relationships were found to exist. The nature of the relationships obtained could therefore serve as a yardstick for decision making in short-term environmental planning concerning solar UV radiation and its dependence on atmospheric parameters within the Offa locality.
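As a minimal illustration of the correlation step described above, the same analysis can be reproduced outside SPSS; the sketch below assumes a hypothetical CSV of the 30-minute readings with placeholder column names.

```python
# Minimal sketch of the Pearson correlation analysis described above, done
# with pandas in place of SPSS. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("offa_weather_30min.csv")  # placeholder dataset
params = ["uv_radiation", "solar_radiation", "temperature", "humidity", "pressure"]

corr = df[params].corr(method="pearson")    # pairwise Pearson correlations
print(corr["uv_radiation"].sort_values(ascending=False))
```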

Keywords: Correlation, inferential, radiation, yardstick.

3493 Load Balancing in Heterogeneous P2P Systems using Mobile Agents

Authors: Neeraj Nehra, R. B. Patel, V. K. Bhat

Abstract:

Use of the Internet and the World Wide Web (WWW) has become widespread in recent years, and mobile agent technology has proliferated at an equally rapid rate. In this scenario, load balancing becomes important for P2P systems. Moreover, P2P systems can be highly heterogeneous: they may consist of peers ranging from old desktops to powerful servers connected to the Internet through high-bandwidth lines. Various load balancing policies exist; a primitive one relies on the Message Passing Interface (MPI). Its wide availability and portability make it an attractive choice; however, the communication requirements are sometimes inefficient when implementing the primitives provided by MPI. We instead use mobile agents, because the mobile agent (MA) based approach offers high flexibility, efficiency, low network traffic and low communication latency, and is highly asynchronous. In this study we present a decentralized load balancing scheme using mobile agent technology in which, when a node is overloaded, tasks migrate to less utilized nodes so as to share the workload. The decision of which nodes receive migrating tasks is made in real time by defining certain load balancing policies. These policies are executed on PMADE (a Platform for Mobile Agent Distribution and Execution) in a decentralized manner using JuxtaNet, and various load balancing metrics are discussed.
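To make the migration decision concrete, the sketch below shows one plausible threshold-based policy of the kind such a scheme executes; the threshold and node metrics are assumptions, and PMADE/JuxtaNet specifics are not modeled.

```python
# Hedged sketch of a threshold-based migration decision: an overloaded node
# picks the least-utilized, capacity-weighted peer as the migration target.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    load: float      # normalized utilization in [0, 1]
    capacity: float  # relative processing capacity

HIGH_WATER = 0.80    # overload threshold (assumed value)

def pick_migration_target(peers: list[Node]) -> Node | None:
    """Pick the least-utilized peer, weighted by capacity, or None."""
    candidates = [p for p in peers if p.load < HIGH_WATER]
    if not candidates:
        return None
    return min(candidates, key=lambda p: p.load / p.capacity)

nodes = [Node("desktop", 0.95, 1.0), Node("server-a", 0.40, 4.0), Node("server-b", 0.55, 2.0)]
src = nodes[0]
if src.load > HIGH_WATER:
    target = pick_migration_target(nodes[1:])
    print(f"migrate task from {src.name} to {target.name if target else 'nowhere'}")
```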

Keywords: Mobile Agents, Agent host, Agent Submitter, PMADE.

3492 Implementation of Conceptual Real-Time Embedded Functional Design via Drive-by-Wire ECU Development

Authors: A. Ukaew, C. Chauypen

Abstract:

Design concepts for a real-time embedded system can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phases to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire (DBW) algorithm for an electronic control unit (ECU) is presented to demonstrate the conceptual design process, analysis, and functionality evaluation. The DBW ECU function concepts can be implemented in the vehicle system to improve the drivability of electric vehicle (EV) conversions. Within a new development process, however, conceptual ECU functions and parameters need to be evaluated, so the testing system was employed to support the evaluation of conceptual DBW ECU functions. In the current setup, the system components consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network (CAN) protocol. The vehicle models and the CAN bus interface were both implemented as real-time applications, and ECU and CAN protocol functionality were verified against the design requirements. The proposed system could potentially benefit rapid real-time analysis of design parameters in conceptual system or software algorithm development.

Keywords: Drive-by-wire ECU, in-the-loop testing, model-based design, real-time embedded system.

3491 A Software Tool Design for Cerebral Infarction of MR Images

Authors: Kyoung-Jong Park, Woong-Gi Jeon, Hee-Cheol Kim, Dong-Eog Kim, Heung-Kook Choi

Abstract:

A brain MR imaging-based clinical research and analysis system was built, targeting the development of a large-scale database from generally available clinical data. A region growing algorithm was used for selection of the lesion ROI during the registration stage, and a mesh-warp algorithm was implemented for matching; matching errors could be corrected individually to improve accuracy. In addition, large volumes of ROI research data can be accumulated through our compression method, and decision criteria for the research results were suggested accordingly. The data can easily be queried by experimental group attributes such as age, sex, MR type, patient ID and smoking status. Results were visualized as overlapped images using a color table, and statistics were calculated with a statistical package. The system was evaluated for its utility in studying chronic ischemic damage in patients with acute cerebral infarction; the locations implicated in neurologic disability were centered on the region facing the central portion of the lateral ventricle, where the corona radiata is found. Finally, system reliability was measured through both inter-user and intra-user registration correlation.
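The region growing step mentioned for lesion ROI selection can be sketched as follows; the 4-connectivity and intensity tolerance are assumed choices, not the authors' exact parameters.

```python
# Hedged sketch of 2-D region growing for lesion ROI selection: breadth-first
# growth from a seed, accepting 4-connected pixels within an intensity band.
import numpy as np
from collections import deque

def region_grow(img: np.ndarray, seed: tuple[int, int], tol: float) -> np.ndarray:
    """Grow a region from `seed`, adding 4-connected pixels whose
    intensity lies within `tol` of the seed intensity."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    ref = float(img[seed])
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if not (0 <= y < h and 0 <= x < w) or mask[y, x]:
            continue
        if abs(float(img[y, x]) - ref) > tol:
            continue
        mask[y, x] = True
        queue.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    return mask

# Usage: roi = region_grow(slice_array, seed=(120, 96), tol=18.0)  # assumed values
```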

Keywords: Software tool design, Cerebral infarction, Brain MR image, Registration

3490 Solid Waste Pollution and the Importance of Environmental Planning in Managing and Preserving the Public Environment in Benghazi City and Its Surrounding Areas

Authors: Abdelsalam Omran Gebril Ali

Abstract:

Pollution and solid waste are the most important environmental problems plaguing the city of Benghazi as well as other cities and towns in Libya. These problems are caused by the lack of environmental planning and sound environmental management. Environmental planning is very important at present for the development of projects that preserve the environment; therefore, the planning process should be prioritized over the management process. Pollution caused by poor planning and environmental management exists not only in Benghazi but also in all other Libyan cities. This study was conducted through various field visits to several neighborhoods and areas within Benghazi as well as its neighboring regions. Follow-ups in these areas were conducted from March 2013 to October 2013 as documented by photographs. The existing methods of waste collection and means of transportation were investigated. Interviews were conducted with relevant authorities, including the Environment Public Authority in Benghazi and the Public Service Company of Benghazi. The objective of this study is to determine the causes of solid waste pollution in Benghazi City and its surrounding areas. Results show that solid waste pollution in Benghazi and its surrounding areas is the result of poor planning and environmental management, population growth, and the lack of hardware and equipment for the collection and transport of waste from the city to the landfill site. One of the most important recommendations in this study is the development of a complete and comprehensive plan that includes environmental planning and environmental management to reduce solid waste pollution.

Keywords: Solid waste, pollution, environmental planning, management, Benghazi, Libya.

3489 Removal of Cationic Heavy Metal and HOC from Soil-Washed Water Using Activated Carbon

Authors: Chi Kyu Ahn, Young Mi Kim, Seung Han Woo, Jong Moon Park

Abstract:

Soil washing with a surfactant solution is a potential technology for the rapid removal of hydrophobic organic compounds (HOC) from soil. However, a large amount of washed water is produced during operation, and this should be treated effectively by proper methods. Washed water from a site with complex contamination may contain high amounts of pollutants such as HOC and heavy metals, as well as the spent surfactant. The heavy metals in the washed water have toxic effects on microbial activity, so they should be removed before the water proceeds to a biological wastewater treatment system. Moreover, the used surfactant solution needs to be recovered to reduce the cost of the soil washing operation. In the present study, activated carbon (AC) was used to simultaneously remove heavy metals and HOC from soil-washed water. In an anionic-nonionic mixed surfactant solution, Cd(II) and phenanthrene (PHE) were effectively removed by adsorption on activated carbon. The removal efficiency for Cd(II) increased from 0.027 mmol-Cd/g-AC to 0.142 mmol-Cd/g-AC as the mole ratio of SDS increased in the presence of PHE. The adsorptive capacity for PHE also increased with the SDS mole ratio, owing to the decrease in the molar solubilization ratio (MSR) for PHE in an anionic-nonionic surfactant mixture. Simultaneous adsorption of HOC and cationic heavy metals on activated carbon could thus be a useful method for surfactant recovery and for reducing heavy metal toxicity in a surfactant-enhanced soil washing process.

Keywords: Activated carbon, Anionic-nonionic surfactant mixture, Cationic heavy metal, HOC, Soil washing

3488 Vulnerability of Indian Agriculture to Climate Change: A Study of the Himalayan Region State

Authors: Rajendra Kumar Isaac, Monisha Isaac

Abstract:

Climate variability and change are emerging challenges for Indian agriculture, given the growing population whose national food security must be ensured. A study was conducted to assess the effects of climatic change in medium- to low-altitude areas of the Himalayan region, where changes in land use and cereal crop productivity accompany changes in various climatic parameters. Rainfall and temperature changes from 1951 to 2013 were studied at four locations of varying altitude, namely Hardwar, Rudra Prayag, Uttar Kashi and Tehri Garwal. A noticeable increase in temperature was observed at all four locations. Surprisingly, the mean rainfall intensity of 30-minute duration has increased at a rate of 0.1 mm/hour since 2000. The study shows that the combined effect of increasing temperature, rainfall, runoff and urbanization in the mid-Himalayan region is causing an increase in climatic disasters and changes in agricultural patterns. Noticeable changes in cropping patterns, crop productivity and land use were observed. Appropriate adaptation and mitigation strategies are necessary to ensure sustainable and climate-resilient agriculture, and appropriate information is necessary for farmers, as well as planners and decision makers, for developing, disseminating and adopting climate-smart technologies.

Keywords: Climate variability, agriculture, land use, mitigation strategies.

3487 The Model of the Genre of Literary Portrait in Modern Literary Criticism

Authors: B. K. Bazylova, Zh. D. Suleimenova

Abstract:

In modern literary criticism, the problem of genre remains a subject of discussion. Genre is a phenomenon located at the intersection of the synchronic and diachronic processes in the development of literature, which accounts for the complexity of resolving the problem. Genre defines the place of contact between literary works and the literary process.

Keywords: Literary criticism, literary portrait.

3486 The Survey Research and Evaluation of Green Residential Building Based on the Improved Group Analytical Hierarchy Process Method in Yinchuan

Authors: Yun-na Wu, Zhen Wang

Abstract:

Owing to the economic downturn and the deterioration of the living environment, the development of residential buildings, as high-energy-consuming buildings, is gradually changing from "extensive" construction to green building in China. The evaluation system for green building is continuously improving, but current evaluation work has the following problems: (1) there are differences between the cost of the actual investment and the purchasing power of residents, and the construction targets of green residential building are single-objective, lacking multi-objective performance development; (2) green building evaluation lacks regional characteristics and cannot reflect the demands of residents in different regions; (3) in the process of determining criteria weights, the experts' judgment matrix has difficulty meeting the consistency requirement. To solve these problems, questionnaires about green residential building were first distributed in the Ningxia area; their results capture the purchasing power of residents and the acceptance of green building costs. Secondly, combined with the geographical features of Ningxia's minority areas, an evaluation criteria system for green residential building was constructed. Finally, using the improved group AHP method and the grey clustering method, the criteria weights were determined and a real case, located in Xing Qing district, Ningxia, was evaluated. The conclusion is that the professional evaluation of this project and its good social recognition are basically consistent.
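For reference, the sketch below shows classical AHP weight derivation with Saaty's consistency check, the step where problem (3) arises; the pairwise matrix is a hypothetical three-criteria example, and the paper's improved group aggregation is not reproduced.

```python
# Hedged sketch of classical AHP: weights from the principal eigenvector of a
# pairwise comparison matrix, plus the consistency ratio (CR) check.
import numpy as np

# Pairwise comparison matrix for three hypothetical criteria
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # principal eigenvector -> criteria weights

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)   # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
CR = CI / RI                           # judgment matrix acceptable if CR < 0.1
print(w, CR)
```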

Keywords: Evaluation, green residential building, grey clustering method, group AHP.

3485 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools and social media have become an integral part of daily life as a result of rapid developments in information technology. However, this widespread use increases crimes committed in the digital environment, and digital forensics, which deals with such crimes, has therefore become an important research topic. It is within the scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all data on the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then analyze these data to figure out whether they are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, and since the overall result depends on the expertise and experience of the examiner, relevant items may be overlooked from case to case. In this study, a hash-based matching and digital evidence evaluation method is proposed. It aims to automatically classify evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human error.
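A minimal sketch of the hash-based block matching idea, assuming a fixed block size and an externally supplied list of known target hashes (both are illustration choices, not the paper's configuration):

```python
# Hedged sketch: hash fixed-size blocks of an evidence image and match them
# against a hash list of known target blocks.
import hashlib

BLOCK_SIZE = 4096  # bytes; a typical filesystem block size (assumed)

def block_hashes(path: str):
    """Yield (offset, sha256-hex) for each block of the evidence image."""
    with open(path, "rb") as f:
        offset = 0
        while block := f.read(BLOCK_SIZE):
            yield offset, hashlib.sha256(block).hexdigest()
            offset += len(block)

def match_blocks(image_path: str, known_hashes: set[str]) -> list[int]:
    """Return offsets in the image whose block hash appears in the list."""
    return [off for off, h in block_hashes(image_path) if h in known_hashes]

# Usage: hits = match_blocks("evidence.dd", known)  # paths are hypothetical
```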

Keywords: Block matching, digital evidence, hash list.

3484 Implementation of a Paraconsistent-Fuzzy Digital PID Controller in a Level Control Process

Authors: H. M. Côrtes, J. I. Da Silva Filho, M. F. Blos, B. S. Zanon

Abstract:

In modern society, rising quality requirements in industrial production demand new control and automation techniques. In this context, this work presents the implementation of a Paraconsistent-Fuzzy Digital PID controller. The controller is based on the treatment of inconsistencies in both Paraconsistent Logic and Fuzzy Logic. Paraconsistent analysis is performed on the signals applied to the system inputs using concepts from the Paraconsistent Annotated Logic with annotation of two values (PAL2v). The signals resulting from the paraconsistent analysis are two values, Dc (Degree of Certainty) and Dct (Degree of Contradiction), which are treated according to Fuzzy Logic theory; the resulting output of the logical actions is a single crisp value used to control the dynamic system. The application of the proposed model was demonstrated through an example. Initially, the Paraconsistent-Fuzzy Digital PID controller was built and tested in an isolated MATLAB environment and compared to the software's equivalent Digital PID function under standard step excitation. Next, a level control plant was modeled so that the controller could run against a physical model, bringing the tests closer to actual conditions. For this, the control parameters (proportional, integral and derivative) were determined for the configuration of both the conventional Digital PID controller and the Paraconsistent-Fuzzy Digital PID, and the control loops were assembled in MATLAB with the respective transfer function of the plant. Finally, the results of comparing the Paraconsistent-Fuzzy Digital PID controller with the conventional Digital PID controller on the level control process were presented.
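For orientation, a conventional discrete PID of the kind used as the baseline can be sketched as below, driving a toy first-order level plant; gains, sample time and plant constants are assumed values.

```python
# Hedged sketch of a conventional digital (discrete-time) PID loop.
class DigitalPID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy first-order level plant: dh/dt = (u - h) / tau
pid, h, tau, dt = DigitalPID(2.0, 0.5, 0.1, 0.1), 0.0, 5.0, 0.1
for _ in range(300):                 # simulate 30 s of step response
    u = pid.update(setpoint=1.0, measurement=h)
    h += (u - h) / tau * dt
print(round(h, 3))                   # should settle near the setpoint
```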

Keywords: Fuzzy logic, paraconsistent annotated logic, level control, digital PID.

3483 Removal of Malachite Green from Aqueous Solution using Hydrilla verticillata - Optimization, Equilibrium and Kinetic Studies

Authors: R. Rajeshkannan, M. Rajasimman, N. Rajamohan

Abstract:

In this study, the sorption of malachite green (MG) on Hydrilla verticillata biomass, a submerged aquatic plant, was investigated in a batch system. The effects of operating parameters such as temperature, adsorbent dosage, contact time, adsorbent size, and agitation speed on the sorption of malachite green were analyzed using response surface methodology (RSM). According to the ANOVA results, the proposed quadratic model for the central composite design (CCD) fitted the experimental data well enough to be used to navigate the design space. The optimum sorption conditions were determined as temperature 43.5 °C, adsorbent dosage 0.26 g, contact time 200 min, adsorbent size 0.205 mm (65 mesh), and agitation speed 230 rpm. The Langmuir and Freundlich isotherm models were applied to the equilibrium data. The maximum monolayer coverage capacity of Hydrilla verticillata biomass for MG was found to be 91.97 mg/g at an initial pH of 8.0, indicating that this is the optimum initial pH for sorption. The external and intra-particle diffusion models were also applied to the sorption data, and it was found that both external and intra-particle diffusion contribute to the actual sorption process. The pseudo-second-order kinetic model described the MG sorption process with a good fit.
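As a sketch of the isotherm-fitting step, the Langmuir model qe = qmax*KL*Ce/(1 + KL*Ce) can be fitted by nonlinear least squares; the data points below are placeholders for illustration, not the paper's measurements.

```python
# Hedged sketch of fitting the Langmuir isotherm to equilibrium data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    return qmax * kl * ce / (1.0 + kl * ce)

ce = np.array([5.0, 10.0, 25.0, 50.0, 100.0])   # mg/L, placeholder values
qe = np.array([20.0, 35.0, 60.0, 78.0, 88.0])   # mg/g, placeholder values

(qmax, kl), _ = curve_fit(langmuir, ce, qe, p0=(90.0, 0.05))
print(f"qmax = {qmax:.1f} mg/g, KL = {kl:.3f} L/mg")
```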

Keywords: Response surface methodology, Hydrilla verticillata, malachite green, adsorption, central composite design

3482 An Adaptive Memetic Algorithm With Dynamic Population Management for Designing HIV Multidrug Therapies

Authors: Hassan Zarei, Ali Vahidian Kamyad, Sohrab Effati

Abstract:

In this paper, a mathematical model of human immunodeficiency virus (HIV) is utilized and an optimization problem is proposed, with the final goal of implementing an optimal 900-day structured treatment interruption (STI) protocol. Two types of drugs commonly used in highly active antiretroviral therapy (HAART), reverse transcriptase inhibitors (RTI) and protease inhibitors (PI), are considered. To solve the proposed optimization problem, an adaptive memetic algorithm with population management (AMAPM) is proposed. The AMAPM uses a distance measure to control the diversity of the population in genotype space, thus preventing stagnation and premature convergence. Moreover, the AMAPM uses a diversity parameter in phenotype space to dynamically set the population size and the number of crossovers during the search process. Three crossover operators diversify the population simultaneously, and the progress of each crossover operator is used to set the number of applications of that crossover per generation. To escape local optima and introduce new search directions toward the global optimum, two local searchers assist the evolutionary process. In contrast to traditional memetic algorithms, the activation of these local searchers is not random but depends on the diversity parameters in both genotype space and phenotype space. The capability of AMAPM to find optimal solutions is compared with three popular metaheuristics.
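One way to realize the genotype-space diversity control described, sketched under assumed thresholds and a binary encoding (the paper's actual encoding and thresholds are not specified here):

```python
# Hedged sketch: mean pairwise Hamming distance as a genotype-space diversity
# measure, gating an adaptive population-size rule.
import itertools
import random

def genotype_diversity(pop: list[list[int]]) -> float:
    """Mean pairwise Hamming distance, normalized to [0, 1]."""
    pairs = list(itertools.combinations(pop, 2))
    total = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs)
    return total / (len(pairs) * len(pop[0]))

def adapt_population_size(diversity: float, size: int) -> int:
    if diversity < 0.10:          # stagnating: make room for new individuals
        return min(size + 10, 200)
    if diversity > 0.40:          # very diverse: shrink to intensify search
        return max(size - 10, 30)
    return size

pop = [[random.randint(0, 1) for _ in range(40)] for _ in range(50)]
d = genotype_diversity(pop)
print(d, adapt_population_size(d, len(pop)))
```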

Keywords: HIV therapy design, memetic algorithms, adaptive algorithms, nonlinear integer programming.

3481 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: G. Candel, D. Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely used. It supports two main tasks: displaying results by coloring items according to their class or a feature value, and, for forensics, giving a first overview of a dataset's distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, in which all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation in which a cluster's area is proportional to its size in number of items, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from high- to low-dimensional space is described but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this is costly, as the complexity of t-SNE is quadratic, and it becomes infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built from a subset of the data. While this approach is highly scalable, points could be mapped to exactly the same position, making them indistinguishable, and such a model could adapt neither to new outliers nor to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding's shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once using the newly obtained embedding, and the successive embeddings can be used to study the impact of one variable on the dataset's distribution or to monitor changes over time. The method has the same per-embedding complexity as t-SNE, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing one to observe the birth, evolution and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.
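A rough approximation of embedding reuse, assuming scikit-learn: seed t-SNE for the new data with positions inherited from nearest neighbors in the support set, so cluster positions stay comparable across runs. This is only a simple stand-in, not the authors' two-cost optimization.

```python
# Hedged sketch: initialize a new t-SNE run from a support embedding so that
# cluster positions remain roughly coherent between successive embeddings.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

def embed_with_support(X_support, Y_support, X_new, random_state=0):
    """Start each new point at its nearest support point's 2-D position,
    then let t-SNE refine from that coherent initial layout."""
    nn = NearestNeighbors(n_neighbors=1).fit(X_support)
    idx = nn.kneighbors(X_new, return_distance=False).ravel()
    jitter = 1e-4 * np.random.default_rng(random_state).standard_normal((len(X_new), 2))
    init = Y_support[idx] + jitter   # small noise avoids exact overlaps
    return TSNE(n_components=2, init=init, random_state=random_state).fit_transform(X_new)

# Usage: Y_support = TSNE(n_components=2).fit_transform(X_support)
#        Y_new = embed_with_support(X_support, Y_support, X_new)
```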

Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.

3480 Finding Pareto Optimal Front for the Multi-Mode Time, Cost Quality Trade-off in Project Scheduling

Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo

Abstract:

Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time, with minimum cost and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they would benefit from any scientific decision support tool. Our work tries to determine a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the desired way to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective and the purpose is finding the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost) whose activities belong to a start-to-finish activity relationship network (cpm) and can be done in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
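The Pareto filtering at the heart of the approach can be sketched directly; the candidate (time, cost, quality) tuples below are hypothetical schedule evaluations, with quality negated so that all three objectives are minimized.

```python
# Hedged sketch of extracting the Pareto front for (time, cost, -quality).
def dominates(a, b):
    """True if a is no worse than b in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

schedules = [(120, 9000, -0.90), (100, 12000, -0.85), (140, 7000, -0.80),
             (120, 9500, -0.88)]  # (days, cost, -quality), placeholder values
print(pareto_front(schedules))    # the last tuple is dominated by the first
```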

Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.

3479 The Modulation of Self-interest Instruction on the Fair-Proposing Behavior in Ultimatum Game

Authors: N. S. Yen, T. H. Yang, W. H. Huang, Y. F. Fang, H. W. Cho

Abstract:

The ultimatum game is an experimental paradigm for studying human decision making. Two players, a proposer and a responder, split a fixed amount of money. According to traditional economic theory, the proposer should make offers as selfish as possible to maximize his or her own outcome. However, most evidence shows that people choose fairer offers, and hence two hypotheses, fairness favoring and strategic concern, have been proposed. In the current study, we induced either a selfish or an altruistic motivation in participants, and manipulated the task variables, stake size (NT$100, 1000, 10000) and share size (40%, 30%, 20%, 10% of the sum as selfish offers, and 60%, 70%, 80%, 90% of the sum as altruistic offers), to examine the two hypotheses. The results showed that most proposers chose fairer offers with longer reaction times (RTs), whether choosing between fair and selfish offers or between fair and altruistic offers. However, proposers who received explicit self-interest instruction chose more selfish offers, accompanied by longer RTs, when choosing between fair and selfish offers. The results therefore support the strategic concern hypothesis: proposers who chose fair offers may have done so from fear of rejection by responders, and proposers become more self-interested when the fear of being rejected is eliminated.

Keywords: Ultimatum game, self-interest, altruistic, fear of rejection.

3478 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation

Authors: G. Settanni, A. Panarese, R. Vaira, A. Galiano

Abstract:

Nowadays, artificial intelligence is used successfully in the field of e-commerce for its ability to learn from large amounts of data. In this research study, a prototype software platform was designed and implemented to suggest to users the products most suitable for their needs. The platform includes a recommender system based on artificial intelligence algorithms that provides suggestions and decision support to the customer. Specifically, support vector machine algorithms have been implemented, combined with natural language processing techniques that allow the user to interact with the system, express requests and receive suggestions. An interested user can access the web platform over the Internet using a computer, tablet or mobile phone, register, provide the necessary information and view the products that the system deems most appropriate. The platform also integrates a dashboard that exposes its various functions in an intuitive and simple way. In addition, Long Short-Term Memory algorithms have been implemented and trained on historical data to predict customer scores for the different items; items with the highest scores are recommended to customers.
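The SVM-plus-NLP matching step can be sketched as a text classification pipeline; the training requests and categories below are hypothetical placeholders, not the platform's data.

```python
# Hedged sketch: vectorize free-text user requests with TF-IDF and classify
# them into product categories with a linear SVM.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

requests = ["light laptop for travel", "phone with a good camera",
            "quiet dishwasher for a small kitchen", "gaming laptop"]
categories = ["laptops", "phones", "appliances", "laptops"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(requests, categories)
print(clf.predict(["cheap laptop for university"]))  # -> ['laptops']
```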

Keywords: Deep Learning, Long Short-Term Memory, Machine Learning, Recommender Systems, Support Vector Machine.

3477 Studying the Value-Added Chain for the Fish Distribution Process at Quang Binh Fishing Port in Vietnam

Authors: Van Chung Nguyen

Abstract:

The purpose of this research is to study the current status of the value chain for fish distribution at Quang Binh Fishing Port, based on 360 research samples in which the research subjects are fishermen, traders, retailers, and businesses. The research applies the value chain theoretical framework of Kaplinsky and Morris to quantify and describe the market channels and the actors participating in the value chain, and to analyze the value-added process along these market channels. The analysis shows that fishermen who catch fish directly achieve high economic efficiency, but processing enterprises and especially retailers are the agents who capture the higher added value. The role of processing enterprises is not really clear owing to outdated processing technology; in contrast, retailers obtain the highest added value. This shows that the added value of the fish supply chain at Quang Binh fishing port is still limited, leading to low output quality; the selling price of fish to the market is therefore still high relative to the abundant fish resources, leading to low consumption and limited exports owing to the quality of processing enterprises. This reduces demand and fishing capacity, and productivity is lower than its potential. To improve the fish value chain at the port, it is necessary to focus on improving product quality, strengthening linkages between actors, building brands and product consumption markets, and improving the capacity of export processing enterprises.

Keywords: Quang Binh fishing port, value chain, fish market, distribution channel.

3476 Logic Programming and Artificial Neural Networks in Pharmacological Screening of Schinus Essential Oils

Authors: José Neves, M. Rosário Martins, Fátima Candeias, Diana Ferreira, Sílvia Arantes, Júlio Cruz-Morais, Guida Gomes, Joaquim Macedo, António Abelha, Henrique Vicente

Abstract:

Some plants of the genus Schinus have been used in folk medicine as topical antiseptics, digestives, purgatives, diuretics, analgesics or antidepressants, and also for respiratory and urinary infections. The chemical composition of the essential oils of S. molle and S. terebinthifolius has been evaluated and shows high variability according to the part of the plant studied and the geographic and climatic region. The pharmacological properties, namely antimicrobial, anti-tumoural and anti-inflammatory activities, are conditioned by the chemical composition of the essential oils. Given the difficulty of inferring the pharmacological properties of Schinus essential oils without a demanding experimental approach, this work focuses on the development of a decision support system, in terms of its knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centered on Artificial Neural Networks and the respective Degree of Confidence that one has in such an occurrence.

Keywords: Artificial neural networks, essential oils, knowledge representation and reasoning, logic programming, Schinus molle L., Schinus terebinthifolius Raddi.

3475 Application of Generalized Stochastic Petri Nets (GSPN) in Modeling and Evaluating a Resource Sharing Flexible Manufacturing System

Authors: Aryanejad Mir Bahador Goli, Zahra Honarmand Shah Zileh

Abstract:

In most fields of study, a phenomenon may not be studied directly; instead it is examined indirectly through a model of the phenomenon. With an accurate model of a system, new information about the modeled phenomenon can be obtained without cost, danger, etc. Many formalisms have been developed for describing and analyzing today's complicated systems, but few of them can analyze performance within the scope of the system description. Petri nets are one of the few formalisms that achieve such a union, and they are widely applied to problems of modeling and designing systems. The theory of Petri nets allows a system to be modeled mathematically, and analysis of the Petri net can then determine key information about the modeled system's structure and dynamic behavior. This information can be used to assess the performance of the system and to suggest corrections to it. In this paper, besides an introduction to Petri nets, a real case study is presented in order to show the application of generalized stochastic Petri nets to modeling a resource-sharing production system and evaluating the efficiency of its machines and robots. The modeling tool used here is the SHARP software, which calculates specific indicators that support decision making.
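The numeric core behind GSPN evaluation can be sketched as follows: the net's reachability graph yields a continuous-time Markov chain whose steady-state probabilities solve pi Q = 0 with the probabilities summing to one. The generator matrix here is a hypothetical three-marking example, not the case study's model.

```python
# Hedged sketch: steady-state probabilities of the CTMC underlying a GSPN.
import numpy as np

Q = np.array([[-2.0,  2.0,  0.0],    # transition rates between hypothetical markings
              [ 1.0, -3.0,  2.0],
              [ 0.0,  4.0, -4.0]])

# Solve pi @ Q = 0: replace one balance equation with the normalization row.
A = np.vstack([Q.T[:-1], np.ones(Q.shape[0])])
b = np.zeros(Q.shape[0])
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi)   # long-run probability of each marking; utilization metrics follow
```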

Keywords: Flexible manufacturing system, generalized stochastic Petri nets, Markov chain, performance evaluation.

3474 A New High Speed Neural Model for Fast Character Recognition Using Cross Correlation and Matrix Decomposition

Authors: Hazem M. El-Bakry

Abstract:

Neural processors have shown good results for detecting a certain character in a given input matrix. In this paper, a new idea to speed up the operation of neural processors for character detection is presented. Such processors are designed based on cross correlation in the frequency domain between the input matrix and the weights of the neural networks. The approach reduces the computation steps required by these fast neural networks for the searching process. The principle of the divide-and-conquer strategy is applied through image decomposition: each image is divided into small sub-images, and each one is tested separately by a single fast neural processor. Furthermore, faster character detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time, using the same number of fast neural networks. In contrast to using fast neural processors alone, the speed-up ratio increases with the size of the input image when fast neural processors and image decomposition are combined. Moreover, the problem of local sub-image normalization in the frequency domain is solved, and the effect of image normalization on the speed-up ratio of character detection is discussed. Simulation results show that local sub-image normalization through weight normalization is faster than sub-image normalization in the spatial domain, and the overall speed-up ratio of the detection process increases because the normalization of weights is done offline.
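The frequency-domain correlation the design relies on can be sketched directly: correlation in space becomes elementwise multiplication by a conjugate in the frequency domain. The arrays below are small hypothetical examples, not the paper's data.

```python
# Hedged sketch of frequency-domain cross correlation for template detection.
import numpy as np

def xcorr_fft(image: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Cross-correlate template with image via FFT (circular correlation)."""
    F_img = np.fft.fft2(image)
    F_tpl = np.fft.fft2(template, s=image.shape)  # zero-pad to image size
    return np.fft.ifft2(F_img * np.conj(F_tpl)).real

rng = np.random.default_rng(0)
img = rng.random((64, 64))
tpl = img[20:28, 30:38].copy()          # plant the "character" we search for
peak = np.unravel_index(np.argmax(xcorr_fft(img, tpl)), img.shape)
print(peak)                              # expected near (20, 30)
```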

Keywords: Fast Character Detection, Neural Processors, Cross Correlation, Image Normalization, Parallel Processing.

3473 Risk Management and Security Practice in Customs Supply Chain: Application of Cross ABC Method to the Moroccan Customs

Authors: Lamia Hammadi, Abdellah Ait Ouhman, Aomar Ibourk

Abstract:

It is widely assumed that the customs supply chain is a complex system, owing not only to the variety and large number of actors, but also to their complex structural links and the interactions between them; that is why this system is subject to various types of risk. The economic, political and social impacts of those risks are highly detrimental to countries, businesses and the public; for this reason, risk management in the customs supply chain is becoming a crucial issue for ensuring sustainability, security and safety. The main characteristics of a customs risk management approach are determining which goods and means of transport should be examined, to what extent, and where future compliance resources should be directed. The purposes of this article are, firstly, to deal with the concept of the customs supply chain; secondly, to present our risk management approach based on the Cross Activity Based Costing (ABC) method as an interactive tool to support decision making in customs risk management; and finally, to analyze a case study of Moroccan customs, putting theory into practice and drawing together the various elements of a structured and efficient risk management approach.

Keywords: Cross ABC Method, Customs Supply Chain, Risk, Risk Management.

3472 Predictive Analytics of Student Performance Determinants in Education

Authors: Mahtab Davari, Charles Edward Okon, Somayeh Aghanavesi

Abstract:

Every institute of learning is usually interested in the performance of its enrolled students, and the level of performance determines the approach the institute may adopt in rendering academic services. The focus of this paper is evaluating students' academic performance in given courses of study using machine learning methods. This study evaluated various supervised machine learning classification algorithms such as Logistic Regression (LR), Support Vector Machine (SVM), Random Forest, Decision Tree, K-Nearest Neighbors, Linear Discriminant Analysis (LDA), and Quadratic Discriminant Analysis, using selected features to predict study performance. The accuracy, precision, recall, and F1 score obtained from 5-fold cross-validation were used to determine the best classification algorithm. SVM (using a linear kernel), LDA, and LR were identified as the best-performing machine learning methods. Also, using the LR model, this study identified students' educational habits, such as reading and paying attention in class, as strong determinants of a student having above-average performance. Other important features include the academic history of the student and work. Demographic factors such as age, gender, high school graduation, etc., had no significant effect on a student's performance.
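A sketch of the described comparison via 5-fold cross-validation, with synthetic placeholder data standing in for the (unavailable) student features:

```python
# Hedged sketch: compare the three best-reported models with 5-fold CV.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=12, random_state=0)  # placeholder data

models = {
    "LR": LogisticRegression(max_iter=1000),
    "SVM-linear": SVC(kernel="linear"),
    "LDA": LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="f1")
    print(f"{name}: F1 = {scores.mean():.3f} ± {scores.std():.3f}")
```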

Keywords: Student performance, supervised machine learning, prediction, classification, cross-validation.

3471 Estimation of Asphalt Pavement Surfaces Using Image Analysis Technique

Authors: Mohammad A. Khasawneh

Abstract:

Asphalt concrete pavements gradually lose their skid resistance, causing safety problems, especially under wet conditions and at high driving speeds. In order to replicate the actual field polishing and wearing process of asphalt pavement surfaces in a laboratory setting, several laboratory-scale accelerated polishing devices have been developed by different agencies. To mimic the actual process, friction and texture measuring devices are needed to quantify surface deterioration at different polishing intervals that reflect different stages of the pavement life. The test could still be considered lengthy and, to some extent, labor-intensive. Therefore, there is a need for another method that can assist in investigating bituminous pavement surface characteristics in a practical and time-efficient test procedure.

The purpose of this paper is to utilize a well-developed image analysis technique to characterize asphalt pavement surfaces without the need to use conventional friction and texture measuring devices in an attempt to shorten and simplify the polishing procedure in the lab.

Promising findings showed the possibility of using image analysis in lieu of friction and texture measurements, which are labor-intensive and variable in nature. The exposed aggregate surface area of asphalt specimens made from limestone and gravel aggregates provided solid evidence of the validity of this method for describing asphalt pavement surfaces. Image analysis results correlated well with British Pendulum Number (BPN), Polish Value (PV) and Mean Texture Depth (MTD) values.
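One plausible reading of the image analysis step is estimating the exposed aggregate area fraction by intensity thresholding; the use of Otsu's threshold here is an assumption, not the paper's stated procedure.

```python
# Hedged sketch: fraction of bright pixels taken as exposed aggregate against
# the darker asphalt binder, using Otsu's threshold from scikit-image.
import numpy as np
from skimage import filters

def exposed_aggregate_fraction(gray: np.ndarray) -> float:
    """Fraction of pixels brighter than Otsu's threshold."""
    t = filters.threshold_otsu(gray)
    return float((gray > t).mean())

# Usage: frac = exposed_aggregate_fraction(img)  # img: 2-D grayscale array
```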

Keywords: Friction, Image Analysis, Polishing, Statistical Analysis, Texture.

3470 Packet Forwarding with Multiprotocol Label Switching

Authors: R. N. Pise, S. A. Kulkarni, R. V. Pawar

Abstract:

Multiprotocol Label Switching (MPLS) is an emerging technology that aims to address many of the existing issues associated with packet forwarding in today's internetworking environment. It provides a method of forwarding packets at high speed by combining the speed and performance of Layer 2 with the scalability and IP intelligence of Layer 3. In a traditional IP (Internet Protocol) routing network, a router analyzes the destination IP address contained in the packet header and independently determines the next hop for the packet using the destination IP address and the interior gateway protocol; this process is repeated at each hop to deliver the packet to its final destination. In contrast, in the MPLS forwarding paradigm, routers on the edge of the network (label edge routers) attach labels to packets based on the forwarding equivalence class (FEC). Packets are then forwarded through the MPLS domain based on their associated FECs, with the labels swapped by routers in the core of the network, called label switch routers. Simply swapping the label, instead of referencing the IP header of the packet in the routing table at each hop, provides a more efficient way of forwarding packets, which in turn allows traffic to be forwarded at tremendous speeds and gives granular control over the path taken by a packet. This paper deals with the MPLS forwarding mechanism, the implementation of the MPLS datapath, and test results comparing the performance of MPLS and IP routing. The discussion focuses primarily on MPLS IP packet networks, by far the most common application of MPLS today.
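The label-swap step can be sketched with the structures named in the keywords, an incoming label map holding next hop label forwarding entries; the labels and interfaces below are hypothetical.

```python
# Hedged sketch of a label switch router's swap operation: look up the
# incoming label and rewrite it per the next hop label forwarding entry.
from typing import NamedTuple

class NHLFE(NamedTuple):        # next hop label forwarding entry
    out_label: int
    out_interface: str

# Incoming label map (ILM) of one core router
ilm: dict[int, NHLFE] = {
    17: NHLFE(out_label=24, out_interface="ge-0/0/1"),
    18: NHLFE(out_label=99, out_interface="ge-0/0/2"),
}

def forward(in_label: int, payload: bytes) -> tuple[int, str, bytes]:
    """Swap the label; the IP header in `payload` is never inspected."""
    entry = ilm[in_label]
    return entry.out_label, entry.out_interface, payload

print(forward(17, b"...ip packet..."))  # -> (24, 'ge-0/0/1', b'...ip packet...')
```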

Keywords: Forwarding equivalence class, incoming label map, label, next hop label forwarding entry.

3469 Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design

Authors: Rhoann Kerh, Chen-Fu Chien, Kuo-Yi Lin

Abstract:

In the era of a rapidly growing notebook market, consumer electronics manufacturers face a highly dynamic and competitive environment. In particular, product appearance is the first attribute by which users distinguish a product from those of other brands; a notebook should differ in its appearance to engage users and contribute to the user experience (UX). UX evaluation compares various product concepts to find the design that meets user needs and, in addition, helps the designer further understand the appearance preferences of different market segments. However, few studies have explored the relationship between consumer background and reactions to product appearance. This study proposes a data mining framework to capture user information and the important relationships among product appearance factors. The proposed framework consists of problem definition and structuring, data preparation, rule generation, and results evaluation and interpretation. An empirical study was conducted in Taiwan that recruited 168 subjects from different backgrounds to experience the appearance of 11 different portable computers. The results assist designers in developing product strategies based on the characteristics of consumers and on the product concepts related to the UX, which helps launch products to the right customers and increase market share. The results demonstrate the practical feasibility of the proposed framework.

Keywords: Consumers Decision Making, Product Design, Rough Set Theory, User Experience.

3468 Evaluation of the Quality of Education Offered to Students with Special Needs in Public Schools in the City of Bauru, Brazil

Authors: V. L. M. F. Capellini, A. P. P. M. Maturana, N. C. M. Brondino, M. B. C. L. B. M. Peixoto, A. J. Broughton

Abstract:

A paradigm shift is a process. The process of implementing inclusive education, a system constructed to support all learners, requires planning, identification, experimentation, and evaluation. In this vein, the purpose of the present study was to evaluate the capacity of one Brazilian state school system to provide special education students with a quality inclusive education. This study originated at the behest of concerned families of students with special needs who filed complaints with the Municipality of Bauru, São Paulo. These families claimed that 1) children with learning differences and educational needs had not been identified for services, and 2) those who had been identified had not received sufficient specialized educational assistance (SEA) in schools across the City of Bauru. Hence, the Office of Civil Rights for the state of São Paulo (Ministério Público de São Paulo) summoned the local higher education institution, UNESP, to design a research study to investigate these allegations. In this exploratory study, descriptive data were gathered from all elementary and middle schools, comprising 58 state schools and 17 city schools, for a total of 75 schools. Data collection consisted of each school's annual strategic action plan, and of surveys and interviews with all school stakeholders to determine their perceptions of the inclusive education available to students with Special Education Needs (SEN). The data were collected as one of four stages of a larger study, which also included field observations of a focal student's experience and a continuing education course for all teachers and administrators in both state and city schools. For the purposes of this study, the researchers were interested in understanding the perceptions of school staff, parents, and students across all schools. Therefore, documents and surveys from the 75 schools were analyzed for adherence to federal legislation guaranteeing students with SEN the right to special education assistance within the regular school setting. Results show that while some schools recognized the legal rights of SEN students to receive special education, plans to actually deliver services were absent. In conclusion, the results of this study revealed that school staff and families have insufficient planning and accessibility resources, and that the schools have inadequate infrastructure for full-time support of SEN students, i.e., structures and systems to support the identification of SEN and the delivery of services within the schools of Bauru, SP. Having identified the areas of need, the city is now prepared to take the next steps toward preparing all schools to be inclusive.

Keywords: Inclusive education, special education, special needs.

3466 Ghost Frequency Noise Reduction through Displacement Deviation Analysis

Authors: Paua Ketan, Bhagate Rajkumar, Adiga Ganesh, M. Kiran

Abstract:

Low gear noise is an important sound quality feature in modern passenger cars. Annoying gear noise from the gearbox is influenced by the gear design, the gearbox shaft layout, manufacturing deviations in the components, assembly errors and the mounting arrangement of the complete gearbox. Geometrical deviations in the form of profile and lead errors are often present on the flanks of the inspected gears. Ghost frequencies of a gear are very challenging to identify in the standard gear measurement and analysis process because of the small wavelengths involved. In this paper, gear whine noise occurring at non-integral multiples of the gear mesh frequency of a passenger car gearbox is investigated, and the root cause is identified using the displacement deviation analysis (DDA) method. The DDA method is applied to identify ghost frequency excitations on the flanks of gears arising from generation grinding. The frequency identified through DDA correlated with the frequency of vibration and noise in end-of-line machine measurements as well as in vehicle-level measurements. With the application of the DDA method along with standard lead and profile measurement, gears with ghost frequency geometry deviations were identified on the production line, eliminating defective parts and thereby eliminating ghost frequency noise from the vehicle. Further, displacement deviation analysis can be used in conjunction with manufacturing process simulation to arrive at suitable countermeasures for arresting the ghost frequency.
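One way to flag ghost orders numerically, sketched under assumed signal parameters: compute an order spectrum over whole revolutions and report dominant peaks that do not sit at integral multiples of the mesh order (the tooth count). This illustrates the spectral side of the investigation, not the DDA method itself.

```python
# Hedged sketch: locate spectral peaks at non-integral multiples of the gear
# mesh order in a vibration signal sampled synchronously with rotation.
import numpy as np

def ghost_peaks(x: np.ndarray, samples_per_rev: int, teeth: int,
                n_top: int = 5, tol: float = 0.1):
    """FFT over an integer number of revolutions gives an order spectrum;
    return (order, amplitude) for dominant peaks whose ratio to the mesh
    order (= tooth count) is not close to an integer."""
    revs = len(x) // samples_per_rev
    spec = np.abs(np.fft.rfft(x[: revs * samples_per_rev]))
    orders = np.arange(len(spec)) / revs            # order axis (per rev)
    candidates = []
    for i in np.argsort(spec)[-n_top:]:
        ratio = orders[i] / teeth
        if orders[i] > 0 and abs(ratio - round(ratio)) > tol:
            candidates.append((float(orders[i]), float(spec[i])))
    return candidates

# Usage: ghosts = ghost_peaks(vibration, samples_per_rev=1024, teeth=37)
```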

Keywords: Displacement deviation analysis, gear whine, ghost frequency, sound quality.

3465 Evaluating Efficiency of Nina Distribution Company Using Window Data Envelopment Analysis and Malmquist Index

Authors: Hossein Taherian Far, Ali Bazaee

Abstract:

Achieving continuous, sustained economic growth and the ensuing economic development is a target for all countries that seek it. In this regard, the distribution industry plays an important role in the growth and development of any nation, so estimating the efficiency and productivity of this industry, and identifying the factors influencing it, is very necessary. The objective of the present study is to measure the efficiency and productivity of seven branches of the Nina Distribution Company using window data envelopment analysis and the Malmquist productivity index from spring 2013 to summer 2015. Fixed assets, payroll personnel, operating costs and the collection period of receivables were selected as inputs, while net sales, gross profit and percentage of customer coverage were selected as outputs. The window data envelopment analysis was then carried out, and efficiency changes were measured using the Malmquist index. The results indicate that the average technical efficiency in the window DEA model follows a sustained fluctuating trend, while the average management efficiency in the window DEA model shows negative growth (a decline) of about 13%. The mean scale efficiency in all windows, except the second, which faced an 8% decline, shows growth of 18% compared to the first window. On the other hand, the mean change in total factor productivity across all branches shows average negative growth (a decrease) of 12%, the result of a negative change in technology.
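For reference, the standard Malmquist productivity index between periods t and t+1, built from the distance functions that window DEA estimates, is given below; values above 1 indicate productivity growth, and the index factors into efficiency change and technical change.

```latex
M\left(x^{t+1}, y^{t+1}, x^{t}, y^{t}\right) =
\left[
\frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
\cdot
\frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
\right]^{1/2}
```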

Keywords: Nina Distribution Company branches, window data envelopment analysis, Malmquist productivity index.
