Search results for: maximum likelihood approach
5764 HIV Modelling - Parallel Implementation Strategies
Authors: Dimitri Perrin, Heather J. Ruskin, Martin Crane
Abstract:
We report on the development of a model to understand why the range of experience with HIV infection is so diverse, especially with respect to the latency period. To investigate this, an agent-based approach is used to extract high-level behaviour, which cannot be described analytically, from the set of interaction rules at the cellular level. A network of independent matrices mimics the chain of lymph nodes. Dealing with massively multi-agent systems requires major computational effort. However, parallelisation methods are a natural consequence and advantage of the multi-agent approach and, using the MPI library, are implemented, tested and optimized here. Our current focus is on the various implementations of the data transfer across the network. Three communication strategies are proposed and tested, showing that the most efficient approach is communication based on the natural lymph-network connectivity.
Keywords: HIV, Immune modelling, MPI, Parallelisation.
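As a rough illustration of the connectivity-based communication strategy described in this abstract, the following minimal mpi4py sketch exchanges migrating agents only with directly connected nodes; the chain topology, agent payload and counts are hypothetical, not taken from the paper.

# Minimal sketch: each MPI rank models one lymph node and exchanges migrating
# agents only with its neighbours in a hypothetical chain topology.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Hypothetical connectivity: a simple chain of lymph nodes, one node per rank.
neighbours = [r for r in (rank - 1, rank + 1) if 0 <= r < size]

# Hypothetical payload: counts of agents (by type) migrating towards each neighbour.
outgoing = {n: np.random.randint(0, 10, size=4) for n in neighbours}

incoming = {}
for n in neighbours:
    # Paired send/receive with each neighbour avoids deadlock.
    incoming[n] = comm.sendrecv(outgoing[n], dest=n, source=n)

total_in = sum(int(v.sum()) for v in incoming.values())
print(f"node {rank}: received {total_in} migrating agents from {len(neighbours)} neighbours")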
5763 Introduction to Techno-Sectoral Innovation System Modeling and Functions Formulating
Authors: S. M. Azad, H. Ghodsipour, F. Roshannafas
Abstract:
In recent years, 'technology management and policymaking' has been one of the most important problems in management science. In this field, different generations of innovation and technology management have been presented, the earliest of which is the Innovation System (IS) approach. In a general classification, innovation systems are divided into four approaches: technical, sectoral, regional, and national. There is much research relating to each of these approaches in different academic fields. Every approach has some benefits; if two or more approaches are hybridized, their benefits are combined. In addition, given the sectoral structure of the governance model in Iran, in many sectors, such as information technology, combining the three other approaches with the sectoral approach is essential. Hence, in this paper, combining two IS approaches (technical and sectoral) and using system dynamics, a generic model is presented for a sample of the software industry. As a complementary point, this article introduces a new hybrid approach called the Techno-Sectoral Innovation System. This TSIS model is accomplished by adapting the concept of 'functions', which comes from the technological IS literature, and using it in the sectoral system as measurable indicators.
Keywords: Innovation system, technology, techno-sectoral system, functional indicators, system dynamics.
5762 Decision Tree Based Scheduling for Flexible Job Shops with Multiple Process Plans
Authors: H.-H. Doh, J.-M. Yu, Y.-J. Kwon, J.-H. Shin, H.-W. Kim, S.-H. Nam, D.-H. Lee
Abstract:
This paper suggests a decision tree based approach for flexible job shop scheduling with multiple process plans, i.e. each job can be processed through alternative operations, each of which can be processed on alternative machines. The main decision variables are: (a) selecting an operation/machine pair; and (b) sequencing the jobs assigned to each machine. As an extension of the priority scheduling approach that selects the best priority rule combination after many simulation runs, this study suggests a decision tree based approach in which a decision tree is used to select a priority rule combination adequate for a specific system state; hence, the burdens required for developing simulation models and carrying out simulation runs can be eliminated. The decision tree based scheduling approach consists of construction and scheduling modules. In the construction module, a decision tree is constructed using a four-stage algorithm, and in the scheduling module, a priority rule combination is selected using the decision tree. To show the performance of the decision tree based approach suggested in this study, a case study was done on a flexible job shop with reconfigurable manufacturing cells and a conventional job shop, and the results are reported by comparing the approach with individual priority rule combinations for the objectives of minimizing total flow time and total tardiness.
Keywords: Flexible job shop scheduling, Decision tree, Priority rules, Case study.
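To make the rule-selection idea concrete, here is a minimal sketch that trains a decision tree to map a shop state to a priority rule combination; scikit-learn is assumed, and the features, rule labels and training data are hypothetical illustrations, not the authors' four-stage construction algorithm.

# Minimal sketch: learn which priority-rule combination suits a given shop state.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Hypothetical shop-state descriptors: utilisation, queue-length ratio, due-date tightness.
X = rng.random((200, 3))
rules = np.array(["SPT+EDD", "SPT+SLACK", "LPT+EDD", "FIFO+EDD"])
# Hypothetical "best rule" labels, e.g. obtained offline from simulation runs.
y = rules[(X[:, 0] * 4).astype(int).clip(0, 3)]

tree = DecisionTreeClassifier(max_depth=4).fit(X, y)         # construction module
state = [[0.8, 0.3, 0.5]]                                    # current shop state (hypothetical)
print("selected rule combination:", tree.predict(state)[0])  # scheduling module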
5761 Towards Development of Solution for Business Process-Oriented Data Analysis
Authors: M. Klimavicius
Abstract:
This paper proposes a modeling methodology for the development of a data analysis solution. The author introduces an approach to address data warehousing issues at the enterprise level. The methodology covers the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with models elaborated at different levels of detail, and makes it possible to gather requirements and present them to business users in an easy manner.
Keywords: Data warehouse, data analysis, business process management.
5760 Seamless Flow of Voluminous Data in High Speed Network without Congestion Using Feedback Mechanism
Abstract:
Continuously growing needs for Internet applications that transmit massive amounts of data have led to the emergence of high speed networks. Data transfer must take place without any congestion, and hence feedback parameters must be transferred from the receiver end to the sender end so as to restrict the sending rate and avoid congestion. Even though TCP tries to avoid congestion by restricting the sending rate and window size, it never informs the sender about the capacity of the data to be sent, and it halves the window size at the time of congestion, resulting in decreased throughput, low utilization of the bandwidth and maximum delay. In this paper, the XCP protocol is used and feedback parameters are calculated based on arrival rate, service rate, traffic rate and queue size; the receiver thus informs the sender about the throughput, the capacity of the data to be sent and the window size adjustment, so that there is no drastic decrease in window size and a better increase in sending rate, giving a continuous flow of data without congestion. As a result, there is a maximum increase in throughput, high utilization of the bandwidth and minimum delay. The result of the proposed work is presented as graphs of throughput, delay and window size. Thus, in this paper, the XCP protocol is well illustrated and the various parameters are thoroughly analyzed and adequately presented.
Keywords: Bandwidth-Delay Product, Congestion Control, Congestion Window, TCP/IP
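For illustration, a minimal sketch of an XCP-style efficiency controller follows; the formula and the constants alpha = 0.4 and beta = 0.226 are the generic ones from the XCP literature, and the link-state values are hypothetical, not the parameters used in this work.

# Minimal sketch: router computes aggregate feedback from spare bandwidth and
# persistent queue; the sender adjusts its window from the echoed feedback.
def xcp_aggregate_feedback(capacity_bps, input_rate_bps, queue_bytes, avg_rtt_s,
                           alpha=0.4, beta=0.226):
    # Aggregate feedback (bytes per control interval): reward spare bandwidth,
    # penalise the persistent queue, as in the XCP efficiency controller.
    spare_bytes_per_s = (capacity_bps - input_rate_bps) / 8.0
    return alpha * avg_rtt_s * spare_bytes_per_s - beta * queue_bytes

# Hypothetical link state: 100 Mbit/s link, 90 Mbit/s offered load, 30 kB standing queue.
phi = xcp_aggregate_feedback(100e6, 90e6, 30_000, avg_rtt_s=0.05)
cwnd = 64_000                                  # sender's current window (bytes)
cwnd = max(1_000, cwnd + phi)                  # sender applies the echoed feedback
print(f"aggregate feedback {phi:,.0f} B, new window {cwnd:,.0f} B")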
5759 Effect of Concrete Strength and Aspect Ratio on Strength and Ductility of Concrete Columns
Authors: Mohamed A. Shanan, Ashraf H. El-Zanaty, Kamal G. Metwally
Abstract:
This paper presents the effect of concrete compressive strength and rectangularity ratio on the strength and ductility of normal and high strength reinforced concrete columns confined with transverse steel under axial compressive loading. Nineteen normal strength concrete rectangular columns with different variables were tested in this research to study the effect of concrete compressive strength and rectangularity ratio on the strength and ductility of columns. The paper also presents a nonlinear finite element analysis, using ANSYS 15 finite element software, for these specimens and for another twenty high strength concrete square columns tested by other researchers. The results indicate that the axial force - axial strain relationship obtained from the analytical model using ANSYS is in good agreement with the experimental data. The comparison shows that ANSYS is capable of modeling and predicting the actual nonlinear behavior of confined normal and high-strength concrete columns under concentric loading. The maximum applied load and the maximum strain have also been confirmed to be satisfactory. Based on this agreement between the experimental and analytical results, a parametric numerical study was conducted with ANSYS 15 to clarify and evaluate the effect of each variable on the strength and ductility of the columns.
Keywords: ANSYS, concrete compressive strength effect, ductility, rectangularity ratio, strength.
5758 A Heuristic Algorithm Approach for Scheduling of Multi-criteria Unrelated Parallel Machines
Authors: Farhad Kolahan, Vahid Kayvanfar
Abstract:
In this paper, we address a multi-objective scheduling problem for unrelated parallel machines. In unrelated parallel systems, the processing cost/time of a given job may vary across machines. The objective of scheduling is to simultaneously determine the job-machine assignment and the job sequencing on each machine such that the total cost of the schedule is minimized. The cost function consists of three components, namely machining cost, earliness/tardiness penalties and makespan related cost. Such a scheduling problem is combinatorial in nature. Therefore, a Simulated Annealing approach is employed to provide good solutions within reasonable computational times. Computational results show that the proposed approach can efficiently solve such complicated problems.
Keywords: Makespan, Parallel machines, Scheduling, Simulated Annealing
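A minimal simulated annealing sketch for unrelated parallel machines is given below; the processing time/cost matrices and the objective (machining cost plus a makespan term, with earliness/tardiness penalties omitted) are hypothetical simplifications of the paper's model.

# Minimal sketch: simulated annealing over job-to-machine assignments.
import math, random

random.seed(1)
n_jobs, n_machines = 12, 3
p = [[random.randint(2, 9) for _ in range(n_machines)] for _ in range(n_jobs)]  # processing times
c = [[random.randint(1, 5) for _ in range(n_machines)] for _ in range(n_jobs)]  # machining costs

def objective(assign):
    loads, cost = [0.0] * n_machines, 0.0
    for j, m in enumerate(assign):
        loads[m] += p[j][m]
        cost += c[j][m]
    return cost + max(loads)                 # machining cost + makespan-related cost

assign = [random.randrange(n_machines) for _ in range(n_jobs)]
best_val = objective(assign)
T = 10.0
while T > 0.01:
    cand = assign[:]
    cand[random.randrange(n_jobs)] = random.randrange(n_machines)   # neighbourhood move
    delta = objective(cand) - objective(assign)
    if delta < 0 or random.random() < math.exp(-delta / T):         # Metropolis acceptance
        assign = cand
        best_val = min(best_val, objective(assign))
    T *= 0.95                                                       # geometric cooling
print("best objective found:", best_val)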
5757 Augmented Reality in Advertising and Brand Communication: An Experimental Study
Authors: O. Mauroner, L. Le, S. Best
Abstract:
Digital technologies offer many opportunities in the design and implementation of brand communication and advertising. Augmented reality (AR) is an innovative technology in marketing communication that builds on the fact that virtual interaction with a product ad offers additional value to consumers. AR enables consumers to obtain (almost) real product experiences by way of virtual information even before the purchase of a certain product. The aim of AR applications in advertising is the in-depth examination of product characteristics to enhance product knowledge as well as brand knowledge. Interactive design of advertising provides observers with an intense examination of a specific advertising message and therefore leads to better brand knowledge. The elaboration likelihood model and the central route to persuasion strongly support this argumentation. Nevertheless, AR in brand communication is still at an initial stage, and therefore scientific findings about the impact of AR on information processing and brand attitude are rare. The aim of this paper is to empirically investigate the potential of AR applications in combination with traditional print advertising. To that effect, an experimental design with different levels of interactivity is built to measure the impact of the interactivity of an ad on different variables of advertising effectiveness.
Keywords: Advertising effectiveness, augmented reality, brand communication, brand recall, interactivity.
5756 A Proposed Approach for Emotion Lexicon Enrichment
Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees
Abstract:
Document analysis is an important research field that aims to gather information by analyzing the data in documents. As one of the important targets in many fields is to understand what people actually want, sentiment analysis has become one of the vital fields tightly related to document analysis. This research focuses on analyzing text documents to classify each document according to its opinion. The aim of this research is to detect emotions in text documents by enriching the lexicon and adapting its content based on semantic pattern extraction. The proposed approach is presented, and different experiments are conducted from different perspectives to reveal the positive impact of the proposed approach on the classification results.
Keywords: Document analysis, sentiment analysis, emotion detection, WEKA tool, NRC Lexicon.
5755 Designing Pictogram for Food Portion Size
Authors: Y.C. Liu, S.J. Lu, Y.C. Weng, H. Su
Abstract:
The objective of this paper is to investigate a new approach based on the idea of pictograms for food portion size. This approach adopts the model of the United States Pharmacopeia-Drug Information (USP-DI). The representation of each food portion size is composed of three parts: frame, the connotation of dietary portion sizes, and layout. To investigate users' comprehension based on this approach, two experiments were conducted, involving 122 Taiwanese people, 60 male and 62 female, with ages between 16 and 64 (divided into age groups of 16-30, 31-45 and 46-64). In Experiment 1, the mean correct rate for the understanding level of food items is 48.54% (S.D. = 95.08) and the mean response time 2.89 s (S.D. = 2.14). The difference in the correct rates for different age groups is significant (P* = 0.00 < 0.05). In Experiment 2, the correct rate of selecting the right life-size measurement aid is 65.02% (S.D. = 21.31). The results showed the potential of the approach for certain food portion sizes. Issues raised for discussion include comprehension of numerous food varieties in an open environment, the selection of photograph or drawing, and reasons for the different correct rates for the measurement aid. This research could also be used by those interested in systematic and pictorial representation of dietary portion size information.
Keywords: Comprehension, Food Portion Size, Model of Dietary Information, Pictogram Design, USP-DI.
5754 Experimental Investigation of the Effect of Compression Ratio in a Direct Injection Diesel Engine Running on Different Blends of Rice Bran Oil and Ethanol
Authors: Perminderjit Singh, Randeep Singh
Abstract:
The performance, emission and combustion characteristics of a single cylinder four stroke variable compression ratio multi-fuel engine fueled with different blends of rice bran oil methyl ester and ethanol are investigated and compared with the results of standard diesel. Biodiesel produced from rice bran oil by the transesterification process has been used in this study. Experiments have been conducted at a fixed engine speed of 1500 rpm, 50% load and at compression ratios of 16.5:1, 17:1, 17.5:1 and 18:1. The impact of compression ratio on fuel consumption, brake thermal efficiency and exhaust gas emissions has been investigated and presented. The optimum compression ratio which gives the best performance has been identified. The results indicate a longer ignition delay, a higher maximum rate of pressure rise, a lower heat release rate and a higher mass fraction burnt at higher compression ratios for the methyl ester when compared to diesel. The brake thermal efficiency at 50% load for the rice bran oil methyl ester blends and diesel has been calculated, and the blend B40 is found to give the maximum thermal efficiency. The blends, when used as fuel, result in a reduction of carbon monoxide and hydrocarbon emissions and an increase in nitrogen oxide emissions.
Keywords: Biodiesel, Rice bran oil, Transesterification, Ethanol, Compression Ratio.
5753 Multi-Case Multi-Objective Simulated Annealing (MC-MOSA): New Approach to Adapt Simulated Annealing to Multi-objective Optimization
Authors: Abdelfatteh Haidine, Ralf Lehnert
Abstract:
In this paper, a new approach is proposed for adapting the simulated annealing search to the field of Multi-Objective Optimization (MOO). This new approach is called Multi-Case Multi-Objective Simulated Annealing (MC-MOSA). It uses some basics of a well-known recent Multi-Objective Simulated Annealing proposed by Ulungu et al., referred to in the literature as U-MOSA. However, some drawbacks of this algorithm have been identified and are replaced by other mechanisms, especially in the acceptance decision criterion. MC-MOSA has shown better performance than U-MOSA in the numerical experiments. This performance is further improved by several subvariants of MC-MOSA, such as Fast-annealing MC-MOSA, Re-annealing MC-MOSA and Two-Stage annealing MC-MOSA.
Keywords: Simulated annealing, multi-objective optimization, acceptance decision criteria, re-annealing, two-stage annealing.
5752 Entropy Based Spatial Design: A Genetic Algorithm Approach (Case Study)
Authors: Abbas Siefi, Mohammad Javad Karimifar
Abstract:
We study the spatial design of experiments, where we want to select a most informative subset, of prespecified size, from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely describes any objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as that of maximizing the determinant of the covariance matrix of the chosen subset. This problem is NP-hard. When such designs are used in computer experiments, the design space is in many cases very large and it is not possible to calculate the exact optimal solution. Heuristic optimization methods can discover efficient experiment designs in situations where traditional designs cannot be applied, exchange methods are ineffective and an exact solution is not possible. We developed a genetic algorithm (GA) to take advantage of the exploratory power of this class of algorithms. The successful application of this method is demonstrated on a large design space. We consider a real case of design of experiments; in our problem, the design space is very large, and the proposed GA is used to solve it.
Keywords: Spatial design of experiments, maximum entropy sampling, computer experiments, genetic algorithm.
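A minimal genetic-algorithm sketch of maximum entropy sampling follows: choose k of n candidate sites so that the log-determinant of the selected covariance submatrix is as large as possible. The covariance model, population size and GA operators are hypothetical, not the authors' implementation.

# Minimal sketch: GA search for the subset maximizing log det of the covariance.
import numpy as np

rng = np.random.default_rng(0)
n, k = 30, 6
pts = rng.random((n, 2))                                   # candidate monitoring sites
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
cov = np.exp(-3.0 * dist)                                  # hypothetical covariance model

def fitness(idx):
    # Entropy criterion: log-determinant of the covariance of the chosen subset.
    sign, logdet = np.linalg.slogdet(cov[np.ix_(idx, idx)])
    return logdet if sign > 0 else -np.inf

pop = [rng.choice(n, size=k, replace=False) for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents, children = pop[:20], []
    while len(children) < 20:
        a, b = rng.choice(20, size=2, replace=False)
        pool = np.union1d(parents[a], parents[b])          # crossover: recombine sites
        child = rng.choice(pool, size=k, replace=False)
        if rng.random() < 0.3:                             # mutation: swap in a new site
            outside = np.setdiff1d(np.arange(n), child)
            child[rng.integers(k)] = rng.choice(outside)
        children.append(child)
    pop = parents + children
best = max(pop, key=fitness)
print("selected sites:", sorted(best.tolist()), "log det:", round(fitness(best), 3))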
5751 Technological Environment - International Marketing Strategy Relationship
Authors: Suthawan Chirapanda
Abstract:
International trade involves both large and small firms engaged in business overseas. Possible drivers that force companies to enter international markets include increasing competition in the domestic market, maturing domestic markets, and limited domestic market opportunities. Technology is an important factor in shaping international marketing strategy as well as a driving force towards a more global marketplace, especially technology in communication, which includes telephones, the internet, computer systems and e-mail. There are three main marketing strategy choices, namely the standardization approach, the adaptation approach and the middle-of-the-road approach, that companies implement in overseas markets. The decision depends on the situations and factors facing the companies in international markets. In this paper, the contingency concept is considered: no single strategy can be effective in all contexts, and the effect of a strategy on performance depends on specific situational variables. Strategic fit is employed to investigate export marketing strategy adaptation under certain environmental conditions, which in turn can lead to superior performance.
Keywords: Contingency approach, international marketing strategy, strategic fit, technological environment
5750 Some New Bounds for a Real Power of the Normalized Laplacian Eigenvalues
Authors: Ayşe Dilek Maden
Abstract:
For a given simple connected graph, we present some new bounds, via a new approach, for a special topological index given by the sum of a real power of the non-zero normalized Laplacian eigenvalues. This approach not only allows old and new bounds on this topic to be derived, but also gives an idea of how some previous results in a similar area can be developed.
Keywords: Degree Kirchhoff index, normalized Laplacian eigenvalue, spanning tree.
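As a small worked illustration, the sketch below computes s_alpha(G), the sum of a real power alpha of the non-zero normalized Laplacian eigenvalues, for a 4-cycle; the alpha = -1 case is related to the degree Kirchhoff index through Kf*(G) = 2m * s_{-1}(G), as recalled in this literature.

# Minimal sketch: sum of a real power of the non-zero normalized Laplacian eigenvalues.
import numpy as np

# Example graph: the 4-cycle C4, given by its adjacency matrix.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L_norm = np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt   # normalized Laplacian

eig = np.linalg.eigvalsh(L_norm)
nonzero = eig[eig > 1e-10]                              # drop the zero eigenvalue

alpha = -1.0
s_alpha = np.sum(nonzero ** alpha)                      # s_alpha(G) = sum(lambda_i ** alpha)
m = A.sum() / 2                                         # number of edges
print("s_alpha =", round(s_alpha, 4))
print("2m * s_{-1} (degree Kirchhoff index) =", round(2 * m * s_alpha, 4))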
5749 Multimethod Approach to Research in Interlanguage Pragmatics
Authors: Saad Al-Gahtani, Ghassan H Al Shatter
Abstract:
Argument over the use of particular methods in interlanguage pragmatics has increased recently. Researchers have argued the advantages and disadvantages of each method, whether natural or elicited. Findings of different studies indicate that the use of one method may not provide enough data to answer all questions. The current study investigated the validity of using a multimethod approach in interlanguage pragmatics to understand the development of requests in Arabic as a second language (Arabic L2). To this end, the study adopted two methods belonging to two types of data sources: institutional discourse (natural data) and role play (elicited data). Participants were 117 learners of Arabic L2 at the university level, representing four levels (beginner, low-intermediate, high-intermediate, and advanced). Results showed that using two or more methods in interlanguage pragmatics affects the size and nature of the data.
Keywords: Arabic L2, Development of requests, Interlanguage Pragmatics, Multimethod approach.
5748 Supplier Selection by Bi-Objectives Mixed Integer Program Approach
Authors: K.-H. Yang
Abstract:
In the past, many excellent research studies have been conducted on topics related to supplier selection. Because the factors considered in supplier selection are complicated and difficult to quantify, most researchers deal with supplier selection issues using qualitative approaches. Compared to qualitative approaches, quantitative approaches are less applied in the real world. This study applies a quantitative approach to a supplier selection problem that considers operation cost and delivery reliability. With those factors, this study applies the Normalized Normal Constraint Method to solve the bi-objective mixed integer program of the supplier selection problem.
Keywords: Bi-objective MIP, normalized normal constraint method, supplier selection, quantitative approach.
5747 Resource Allocation and Task Scheduling with Skill Level and Time Bound Constraints
Authors: Salam Saudagar, Ankit Kamboj, Niraj Mohan, Satgounda Patil, Nilesh Powar
Abstract:
Task assignment and scheduling is a challenging Operations Research problem when there is a limited number of resources and a comparatively higher number of tasks. The Cost Management team at Cummins needs to assign tasks based on a deadline and must prioritize some of the tasks as per business requirements. Moreover, there is a constraint on the resources: tasks should be assigned based on individual skill levels, which may vary across tasks. Another constraint is that the scheduled tasks should be evenly distributed in terms of working hours, which adds further complexity to the problem. The proposed greedy approach to the assignment and scheduling problem first assigns tasks by management priority and then by the closest deadline. This is followed by an iterative selection of an available resource with the least allocated total working hours for each task, i.e. finding the local optimal choice for each task with the goal of determining the global optimum. The greedy task allocation is compared with a variant of the Hungarian Algorithm, and it is observed that the proposed approach gives an equal allocation of working hours among the resources. A comparative study of the proposed approach is also done against manual task allocation, and it is noted that the visibility of the task timeline has increased from 2 months to 6 months. An interactive dashboard app was created for the greedy assignment and scheduling approach, and tasks with a horizon of more than 2 months that were initially waiting in a queue without a delivery date are now analyzed effectively by the business, with expected timelines for completion.
Keywords: Assignment, deadline, greedy approach, Hungarian algorithm, operations research, scheduling.
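A minimal sketch of the greedy idea described above, with a hypothetical task list, skill sets and hours (not Cummins data):

# Minimal sketch: order tasks by priority and deadline, then give each task to
# an eligible resource with the least allocated hours.
from datetime import date

tasks = [  # (name, priority: lower = more urgent, deadline, required skill, hours)
    ("T1", 1, date(2024, 7, 1), "costing", 16),
    ("T2", 2, date(2024, 6, 15), "costing", 8),
    ("T3", 1, date(2024, 6, 10), "analysis", 24),
    ("T4", 3, date(2024, 8, 1), "analysis", 12),
]
resources = {"R1": {"skills": {"costing"}, "hours": 0},
             "R2": {"skills": {"costing", "analysis"}, "hours": 0},
             "R3": {"skills": {"analysis"}, "hours": 0}}

plan = {}
# Greedy ordering: management priority first, then the closest deadline.
for name, prio, deadline, skill, hrs in sorted(tasks, key=lambda t: (t[1], t[2])):
    eligible = [r for r, v in resources.items() if skill in v["skills"]]
    # Local optimum: the eligible resource with the least total allocated hours.
    chosen = min(eligible, key=lambda r: resources[r]["hours"])
    resources[chosen]["hours"] += hrs
    plan[name] = chosen

print(plan)
print({r: v["hours"] for r, v in resources.items()})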
5746 A Simulation Modeling Approach for Optimization of Storage Space Allocation in Container Terminal
Authors: Gamal Abd El-Nasser A. Said, El-Sayed M. El-Horbaty
Abstract:
Container handling problems at container terminals are NP-hard problems. This paper presents an approach using discrete-event simulation modeling to optimize the solution for the storage space allocation problem, taking into account the various interrelated container terminal handling activities. The proposed approach is applied to real case study data from the container terminal at Alexandria port. The computational results show the effectiveness of the proposed model for optimization of storage space allocation in the container terminal, where a 54% reduction in container handling time in port is achieved.
Keywords: Container terminal, discrete-event simulation, optimization, storage space allocation.
5745 A Study on Human Musculoskeletal Model for Cycle Fitting: Comparison with EMG
Authors: Yoon- Ho Shin, Jin-Seung Choi, Dong-Won Kang, Jeong-Woo Seo, Joo-Hack Lee, Ju-Young Kim, Dae-Hyeok Kim, Seung-Tae Yang, Gye-Rae Tack
Abstract:
It is difficult to study the effect of various variables on cycle fitting through actual experiments. To overcome this difficulty, the forward dynamics of a musculoskeletal model was applied to cycle fitting in this study. The measured EMG data were compared with the muscle activities of the musculoskeletal model through forward dynamics. EMG data were measured from five cyclists without musculoskeletal diseases during three minutes of pedaling with a constant load (150 W) and cadence (90 RPM). The muscles used for the analysis were the Vastus Lateralis (VL), Tibialis Anterior (TA), Biceps Femoris (BF), and Gastrocnemius Medialis (GM). Pearson's correlation coefficients of the muscle activity patterns, the peak timing of the maximum muscle activities, and the total muscle activities were calculated and compared. The BIKE3D model of AnyBody (AnyBody Technology, Denmark) was used for the musculoskeletal model simulation. The comparison of the actual experiments with the simulation results showed significant correlations in the muscle activity patterns (VL: 0.789, TA: 0.503, BF: 0.468, GM: 0.670). The peak timings of the maximum muscle activities were distributed at particular phases. The total muscle activities were compared with the normalized muscle activities, and the comparison showed about a 10% difference in the VL (+10%), TA (+9.7%), and BF (+10%), excluding the GM (+29.4%). Thus, it can be concluded that the muscle activities of the model and the experiment showed similar results. The results of this study indicate that it is possible to apply the simulation of a further improved musculoskeletal model to cycle fitting.
Keywords: Cycle fitting, EMG, Musculoskeletal modeling, Simulation.
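As an illustration of the comparison metrics, the following minimal sketch computes a Pearson correlation and a peak-timing shift between a synthetic "measured" EMG envelope and a synthetic "simulated" activation; both signals are artificial stand-ins for the study's data.

# Minimal sketch: compare a simulated activation pattern with a measured EMG envelope.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
phase = np.linspace(0, 2 * np.pi, 360)                                  # crank angle, one cycle
emg = np.clip(np.sin(phase - 0.3), 0, None) + 0.05 * rng.random(360)   # "measured" envelope
model = np.clip(np.sin(phase - 0.4), 0, None)                           # "simulated" activation

r, _ = pearsonr(emg, model)                               # similarity of activity patterns
peak_shift = np.degrees(phase[np.argmax(model)] - phase[np.argmax(emg)])
print(f"Pearson r = {r:.3f}, peak timing shift = {peak_shift:.1f} deg")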
5744 DODR: Delay On-Demand Routing
Authors: Dong Wan-li, Gu Nai-jie, Tu Kun, Bi Kun, Liu Gang
Abstract:
As originally designed for wired networks, the TCP (transmission control protocol) congestion control mechanism is triggered into action when packet loss is detected. This implicit assumption that packet loss is mostly due to network congestion does not work well in a Mobile Ad Hoc Network, where there is a comparatively high likelihood of packet loss due to channel errors, node mobility, etc. Such non-congestion packet loss, when dealt with by the congestion control mechanism, causes poor TCP performance in MANETs. In this study, we continue to investigate the impact of the interaction between transport protocols and on-demand routing protocols on the performance and stability of 802.11 multihop networks. We evaluate the important wireless networking events that cause routing changes and propose a cross-layer method to delay unnecessary routing changes; it only requires the addition of a sensitivity parameter α, which represents the on-demand routing's reaction to link failure at the MAC layer. Our proposal is applicable to the plain 802.11 networking environment, and the simulation results show that this method can remarkably improve the stability and performance of TCP without any modification of the TCP and MAC protocols.
Keywords: Mobile ad hoc networks (MANET), on-demand routing, performance, transmission control protocol (TCP).
5743 Fuzzy Risk-Based Life Cycle Assessment for Estimating Environmental Aspects in EMS
Authors: Kevin Fong-Rey Liu, Ken Yeh, Cheng-Wu Chen, Han-Hsi Liang
Abstract:
Environmental aspects play a central role in an environmental management system (EMS) because they are the basis for the identification of an organization's environmental targets. The existing methods for the assessment of environmental aspects fall into three categories: risk assessment-based (RA-based), LCA-based and criterion-based methods. To combine the benefits of these three categories of research, this study proposes an integrated framework combining RA-, LCA- and criterion-based methods. The integrated framework incorporates LCA techniques for the identification of the causal linkage between aspect, pathway, receptor and impact, uses fuzzy logic to assess aspects, considers fuzzy conditions in likelihood assessment, and employs a new multi-criteria decision analysis method, multi-criteria and multi-connection comprehensive assessment (MMCA), to estimate significant aspects in an EMS. The proposed model is verified using a real case study, and the results show that this method successfully prioritizes the environmental aspects.
Keywords: Environmental management system, environmental aspect, risk assessment, life cycle assessment.
5742 A Combinatorial Model for ECG Interpretation
Authors: Costas S. Iliopoulos, Spiros Michalakopoulos
Abstract:
A new, combinatorial model for analyzing and interpreting an electrocardiogram (ECG) is presented. An application of the model is QRS peak detection. This is demonstrated with an online algorithm, which is shown to be space as well as time efficient. Experimental results on the MIT-BIH Arrhythmia Database show that this novel approach is promising. Further uses for this approach are discussed, such as taking advantage of its small memory requirements and interpreting large amounts of pre-recorded ECG data.
Keywords: Combinatorics, ECG analysis, MIT-BIH Arrhythmia Database, QRS Detection, String Algorithms
5741 Numerical Modelling of Dust Propagation in the Atmosphere of Tbilisi City in Case of Western Background Light Air
Authors: N. Gigauri, V. Kukhalashvili, A. Surmava, L. Intskirveli, L. Gverdtsiteli
Abstract:
Tbilisi, a large city of the South Caucasus, is a junction point connecting Asia and Europe, Russia and the republics of Asia Minor. Over recent years, its atmosphere has experienced an increasing anthropogenic load. A numerical modeling method is used to study Tbilisi's atmospheric air pollution. By means of a 3D non-linear non-steady numerical model, the peculiarities of city atmosphere pollution are investigated during background western light air. Spatial and temporal changes in dust concentration are determined. Zones of high, average and low pollution, dust accumulation areas, transfer directions, etc. are identified. Numerical modeling shows that the process of air pollution by dust proceeds in four stages, which depend on the intensity of motor traffic, the micro-relief of the city, and the location of the main city roads. In the interval 06:00-09:00 the dust concentrations grow intensively, from 09:00 to 15:00 they remain constant or decrease weakly, from 18:00 to 21:00 they increase, and from 21:00 to 06:00 they decrease. The highly polluted areas are located in the vicinity of the city center and at some peripheral territories of the city, where the maximum dust concentration at 9 PM is equal to twice the maximum allowable concentration. Similar investigations conducted for various meteorological situations will enable us to compile a map of background urban pollution and to elaborate practical measures for ambient air protection.
Keywords: Numerical modelling, source of pollution, dust propagation, western light air.
5740 A Neural Approach for Color-Textured Images Segmentation
Authors: Khalid Salhi, El Miloud Jaara, Mohammed Talibi Alaoui
Abstract:
In this paper, we present a neural approach for unsupervised natural color-texture image segmentation, based on both Kohonen maps and mathematical morphology, using a combination of the texture and the color information of the image: fractal features based on the fractal dimension are selected to represent the texture information, and the color features are represented in the RGB color space. These features are then used to train the Kohonen network, which represents the underlying probability density function; the segmentation of this map is performed by the morphological watershed transformation. The performance of our color-texture segmentation approach is compared first to color-based or texture-based methods only, and then to the k-means method.
Keywords: Segmentation, color-texture, neural networks, fractal, watershed.
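The fractal-dimension texture feature can be illustrated with a minimal box-counting sketch; the binary test image is synthetic and the estimator is a generic box-counting procedure, not necessarily the authors' exact fractal feature extraction.

# Minimal sketch: box-counting estimate of the fractal dimension of a binary image.
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((256, 256)) > 0.7                 # synthetic binary texture image

sizes, counts = [], []
for s in (2, 4, 8, 16, 32):
    # Count boxes of side s that contain at least one foreground pixel.
    blocks = img.reshape(256 // s, s, 256 // s, s).any(axis=(1, 3))
    sizes.append(s)
    counts.append(blocks.sum())

# Box-counting dimension: slope of log N(s) versus log(1/s).
slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
print("estimated fractal dimension:", round(slope, 3))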
5739 Multi-Level Meta-Modeling for Enabling Dynamic Subtyping for Industrial Automation
Authors: Zoltan Theisz, Gergely Mezei
Abstract:
Modern industrial automation relies on service-oriented concepts of Internet of Things (IoT) device modeling in order to provide a flexible and extendable environment for a service meta-repository. However, state-of-the-art meta-modeling techniques prefer design-time modeling, which results in heavy usage of classes and sometimes unnecessary static subtyping. Although this approach benefits from clear-cut object-oriented design principles, it also seals the model repository against further dynamic extensions. In this paper, a dynamic multi-level modeling approach is introduced that enables dynamic subtyping through a more relaxed partial instantiation mechanism. The approach is demonstrated on a simple sensor network example.
Keywords: Meta-modeling, dynamic subtyping, DMLA, industrial automation, Arrowhead.
5738 Utilization of Whey for the Production of β-Galactosidase Using Yeast and Fungal Culture
Authors: Rupinder Kaur, Parmjit S. Panesar, Ram S. Singh
Abstract:
Whey is the lactose-rich by-product of the dairy industry, with a good reservoir of nutrients. The most abundant nutrients are lactose, soluble proteins, lipids and mineral salts. The disposal of whey by milk plants that do not have a proper pre-treatment system is a major issue; as a result, there can be a significant loss of a potential food and energy source. Thus, whey has been explored as a substrate for the synthesis of different value added products such as enzymes. β-galactosidase is one of the important enzymes and has become a major focus of research due to its ability to catalyze both the hydrolytic and the transgalactosylation reaction simultaneously. The enzyme is widely used in the dairy industry as it catalyzes the transformation of lactose to glucose and galactose, making products suitable for lactose intolerant people. The enzyme is intracellular in both bacteria and yeast, whereas in molds it has an extracellular location. The present work was carried out to utilize whey for the production of the β-galactosidase enzyme using both yeast and fungal cultures. The yeast isolate Kluyveromyces marxianus WIG2 and various fungal strains have been used in the present study. Different disruption techniques have also been investigated for the extraction of the enzyme produced intracellularly from yeast cells. Among the different methods tested for the disruption of yeast cells, SDS-chloroform showed the maximum β-galactosidase activity. Of the tested fungal cultures, Aureobasidium pullulans NCIM 1050 was observed to be the best extracellular enzyme producer.
Keywords: β-galactosidase, fungus, yeast, whey.
5737 Application of a New Approach with Two Networks Slow and Fast on the Asynchronous Machine
Authors: Samia Salah, M’hamed Hadj Sadok, Abderrezak Guessoum
Abstract:
In this paper, we propose a new modular approach, called neuroglial, consisting of two neural networks, slow and fast, which emulates a recently discovered biological reality. The implementation is based on complex multi-time-scale systems; validation is performed on the model of the asynchronous machine. We applied the geometric approach based on the Gerschgorin circles for the decoupling of fast and slow variables, and the method of singular perturbations for the development of reduced models.
This new architecture allows for smaller networks with less complexity and better performance in terms of mean square error and convergence than the single network model.
Keywords: Gerschgorin's Circles, Neuroglial Network, Multi-time-scale systems, Singular perturbation method.
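As an illustration of the Gerschgorin-circle test used for decoupling, the sketch below computes the circle centre and radius of each state variable and labels it fast or slow; the state matrix and the threshold are hypothetical, not the asynchronous machine model.

# Minimal sketch: Gerschgorin circles (centre a_ii, radius sum_j |a_ij|, j != i)
# used to flag fast and slow state variables of a hypothetical system matrix.
import numpy as np

A = np.array([[ -2.0,   0.5,   0.1],
              [  0.3, -50.0,   2.0],
              [  0.2,   1.5, -40.0]])               # hypothetical state matrix

centres = np.diag(A)
radii = np.abs(A).sum(axis=1) - np.abs(centres)

for i, (c, r) in enumerate(zip(centres, radii)):
    label = "fast" if abs(c) > 10 else "slow"       # illustrative threshold
    print(f"x{i}: circle centre {c:6.1f}, radius {r:4.1f} -> {label} variable")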
5736 EASEL: Evaluation of Algorithmic Skills in an Environment Learning
Authors: A. Bey, T. Bensebaa, H. Benselem
Abstract:
This paper explores a new method to improve the teaching of algorithmics to beginners. It is well known that algorithmics is a difficult field for teachers to teach and complex for learners to assimilate. These difficulties are due to the intrinsic characteristics of this field and to the manner in which most teachers apprehend its bases. However, in a Technology Enhanced Learning (TEL) environment, assessment, which is important and indispensable, is the most delicate phase to implement, because of all the problems it generates (noise, etc.). Our objective lies at the confluence of these two axes. For this purpose, EASEL focuses essentially on elaborating an approach for assessing algorithmic competences in a TEL environment. This approach consists in modeling an algorithmic solution in terms of basic and elementary operations, which lets learners build their own solution steps with full autonomy and independently of any programming language. The approach supports a trilateral assessment: summative, formative and diagnostic.
Keywords: Algorithmics, assessment of competences, Technology Enhanced Learning (TEL).
5735 Another Approach of Similarity Solution in Reversed Stagnation-point Flow
Authors: Vai Kuong Sin, Chon Kit Chio
Abstract:
In this paper, the two-dimensional reversed stagnation-point flow is solved by means of an analytic approach. Similarity solutions exist when the similarity equation and the boundary condition are modified. The finite analytic method is applied to obtain the similarity velocity function.
Keywords: reversed stagnation-point flow, similarity solutions, asymptotic solution