Search results for: Fuzzy similarity approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5899

3799 Developing New Media Credibility Scale: A Multidimensional Perspective

Authors: Hanaa Farouk Saleh

Abstract:

The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, drawing on the evolution of credibility studies in Western research; to identify the determinants and components of media credibility by comparing traditional and new media credibility scales; and to build a cumulative scale for testing new media credibility. The approach builds on Western conceptualizations of media credibility, which focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), and adds the user and the cultural context as key components for assessing new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should apply this scale to test new media credibility, as it represents a promising new approach in the effort to define and measure the credibility of all media types.

Keywords: Credibility scale, media credibility components, new media credibility scale, scale development.

3798 Design Process and Real-Time Validation of an Innovative Autonomous Mid-Air Flight and Landing System

Authors: De Lellis E., Di Vito V., Garbarino L., Lai C., Corraro F.

Abstract:

This paper describes the design process and the real-time validation of an innovative autonomous mid-air flight and landing system developed by the Italian Aerospace Research Center in the framework of the Italian nationally funded project TECVOL (Technologies for the Autonomous Flight). The paper provides an insight into the whole development process of the system under study. In particular, the project framework is illustrated first; then the functional context and the adopted design and testing approach are described; finally, the purpose-designed on-ground validation test rig is addressed in detail. Furthermore, the hardware-in-the-loop validation of the autonomous mid-air flight and landing system by means of the real-time test rig is described and discussed.

Keywords: Autonomous landing, autonomous mid-air flight, design and test approach, real-time hardware-in-the-loop validation

3797 Augmented Reality Sandbox and Constructivist Approach for Geoscience Teaching and Learning

Authors: Muhammad Nawaz, Sandeep N. Kundu, Farha Sattar

Abstract:

The augmented reality sandbox adds new dimensions to the education and learning process. It can be a core component of geoscience teaching and learning for understanding geographic contexts and landform processes. The augmented reality sandbox is a useful tool not only for creating an interactive learning environment through spatial visualization but also for providing an active learning experience that enhances students' cognition. It can be used as an interactive learning tool to teach geomorphic and landform processes. This article explains the augmented reality sandbox and the constructivist approach for geoscience teaching and learning, and explores ways to teach geographic processes using a three-dimensional digital environment for interactive, deep learning of geoscience concepts.

Keywords: Augmented Reality Sandbox, constructivism, deep learning, geoscience.

3796 Formosa3: A Cloud-Enabled HPC Cluster in NCHC

Authors: Chin-Hung Li, Te-Ming Chen, Ying-Chuan Chen, Shuen-Tai Wang

Abstract:

This paper proposes a new approach to offering a private cloud service on HPC clusters. In particular, our approach relies on automatically scheduling users' customized environment requests as normal jobs in the batch system. After a virtualization request job finishes, its guest operating systems are dismissed so that the compute nodes are released again for computing. We present initial work on the innovative integration of an HPC batch system and virtualization tools, aiming at coexistence with the minimal interference required by a traditional HPC cluster. Given the design of the initial infrastructure, the proposed effort has the potential to positively impact this synergy model. The experimental results show that the goal of provisioning customized cluster environments can indeed be fulfilled using virtual machines, and that efficiency can be improved with proper setup and arrangement.

Keywords: Cloud Computing, HPC Cluster, Private Cloud, Virtualization

3795 Brain Image Segmentation Using Conditional Random Field Based On Modified Artificial Bee Colony Optimization Algorithm

Authors: B. Thiagarajan, R. Bremananth

Abstract:

A tumor is an uncontrolled growth of tissue in any part of the body. Tumors are of different types and have different characteristics and treatments. A brain tumor is inherently serious and life-threatening because it grows in the limited space of the intracranial cavity (the space formed inside the skull). Locating the tumor within an MR (magnetic resonance) image of the brain is an integral part of the treatment of brain tumors. This segmentation task requires the classification of each voxel as either tumor or non-tumor, based on the description of the voxel under consideration. Many studies in the medical field use Markov Random Fields (MRF) for the segmentation of MR images. Even though the segmentation quality is good, computing the probabilities and estimating the parameters are difficult. In order to overcome these issues, a Conditional Random Field (CRF) is used in this paper for segmentation, along with a modified artificial bee colony optimization and a modified fuzzy possibilistic c-means (MFPCM) algorithm. This work mainly focuses on reducing the computational complexity found in existing methods and on achieving higher accuracy. The efficiency of this work is evaluated using parameters such as region non-uniformity, correlation, and computation time. The experimental results are compared with existing methods such as MRF with an improved Genetic Algorithm (GA) and the MRF-Artificial Bee Colony (MRF-ABC) algorithm.
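
As an illustration of the fuzzy clustering component, the following is a minimal sketch of the standard fuzzy c-means updates on which modified fuzzy possibilistic c-means (MFPCM) variants build; the voxel features, cluster count, and fuzzifier value are assumptions for the example, not the paper's actual settings.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=50, eps=1e-9):
    """Plain fuzzy c-means on voxel feature vectors X (n_samples x n_features).

    MFPCM-style algorithms modify these membership updates; this sketch only
    shows the basic alternating optimization they start from.
    """
    n = X.shape[0]
    U = np.random.dirichlet(np.ones(c), size=n)      # fuzzy memberships, rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / (Um.sum(axis=0)[:, None] + eps)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + eps
        # standard FCM membership update: u_ik proportional to d_ik^(-2/(m-1))
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

# toy example: cluster intensities of a synthetic "image" into 3 tissue classes
voxels = np.random.rand(1000, 1)
memberships, centers = fuzzy_c_means(voxels, c=3)
labels = memberships.argmax(axis=1)
```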

Keywords: Conditional random field, Magnetic resonance, Markov random field, Modified artificial bee colony.

3794 Learning Undergraduate Mathematics in a Discovery-Enriched Approach

Authors: Kam-moon Liu, Kwok-chi Chim, Kwok-wai Chung, Daniel Wing-cheong Ho

Abstract:

Students often adopt routine practice as their learning strategy for mathematics, because in high school they are bound and trained to solve conventional-type mathematics questions. This becomes problematic if students further consolidate this practice at university. Therefore, the Department of Mathematics emphasized and integrated the discovery-enriched approach into the undergraduate curriculum. This paper presents the details of implementing the discovery-enriched curriculum by providing an adequate platform for project learning, expertise for guidance, and internship opportunities for students majoring in mathematics. The Department also provided project-learning opportunities in mathematics courses targeted at students majoring in other science or engineering disciplines. The outcome is promising: the research ability and problem-solving skills of students are enhanced.

Keywords: Discovery-enriched curriculum, higher education, mathematics education, project learning.

3793 Case Study Approach Using Scenario Analysis to Analyze Unabsorbed Head Office Overheads

Authors: K. C. Iyer, T. Gupta, Y. M. Bindal

Abstract:

Head office overhead (HOOH) is an indirect cost and is recovered through individual project billings by the contractor. Delay in a project impacts the absorption of the HOOH cost allocated to that particular project and thus diminishes the contractor's expected profit. This unabsorbed HOOH cost is later claimed by contractors as damages. The subjective nature of the available formulae for computing unabsorbed HOOH is the difficulty that contractors and owners face, and it is therefore disputed. The paper attempts to bring together the rationale of the various HOOH formulae by gathering a contractor's HOOH cost data on all of its projects, using a case study approach, and comparing the variation in HOOH values through scenario analysis. The case study uses project data collected from four construction projects of a contractor in India to calculate unabsorbed HOOH costs from the various available formulae. Scenario analysis provides further variations in HOOH values after considering two independent situations, namely scope changes and new projects during the delay period. Interestingly, one of the findings of this study reveals that, in spite of HOOH being absorbed by additional work available during the period of delay, a few formulae show an increase in the value of unabsorbed HOOH, neglecting any absorption by the increase in scope. This indicates that these formulae are inappropriate for use in case of a change to the scope of work. The results of this study can help both parties decide on an appropriate formula more objectively, considering the events causing the delay on a project and the contractor's position in respect of obtaining new projects.
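
As a hedged illustration of the kind of formula the paper compares, the sketch below computes unabsorbed HOOH with the widely cited Eichleay formula (not necessarily one of the formulae analyzed here); all figures are invented.

```python
def eichleay_unabsorbed_hooh(contract_billings, total_company_billings,
                             total_hooh, actual_days_of_performance,
                             compensable_delay_days):
    """Unabsorbed head office overhead per the classic Eichleay formula.

    Step 1: HOOH allocable to the contract, pro-rated by billings.
    Step 2: daily HOOH rate over the actual (extended) performance period.
    Step 3: unabsorbed HOOH for the compensable delay days.
    """
    allocable = (contract_billings / total_company_billings) * total_hooh
    daily_rate = allocable / actual_days_of_performance
    return daily_rate * compensable_delay_days

# invented figures for illustration only
print(eichleay_unabsorbed_hooh(contract_billings=50e6,
                               total_company_billings=400e6,
                               total_hooh=20e6,
                               actual_days_of_performance=900,
                               compensable_delay_days=120))
```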

Keywords: Absorbed and unabsorbed overheads, head office overheads, scenario analysis, scope variation

3792 Some Issues of Measurement of Impairment of Non-Financial Assets in the Public Sector

Authors: Mariam Vardiashvili

Abstract:

The economic significance of the asset impairment process is considerable. Impairment reflects a reduction in the future economic benefits or service potential embodied in an asset. The assets owned by public sector entities either bring economic benefits or are used for the delivery of free-of-charge services. Consequently, they are classified as cash-generating and non-cash-generating assets. IPSAS 21 (Impairment of Non-Cash-Generating Assets) and IPSAS 26 (Impairment of Cash-Generating Assets) have been designed with this specificity in mind. When measuring impairment of assets, it is important to select the relevant methods. For the measurement of impaired non-cash-generating assets, IPSAS 21 recommends three methods: the depreciated replacement cost approach, the restoration cost approach, and the service units approach. The value in use of cash-generating assets (as per IPSAS 26) is measured by the discounted value of the cash flows to be received in the future. The article classifies public sector assets as non-cash-generating and cash-generating assets and also deals with the factors which should be considered when evaluating impairment of assets. The essence of impairment of non-financial assets and the methods of its measurement are formulated according to IPSAS 21 and IPSAS 26. The main emphasis is put on the different methods of measuring the value in use of impaired cash-generating and non-cash-generating assets and on how these methods are selected. The traditional and the expected cash flow approaches for calculating the discounted value are reviewed. The article also discusses the recognition of impairment loss and its reflection in financial reporting. The article concludes that, regardless of the functional purpose of the impaired asset and whichever method is used for measuring it, the presentation of realistic information regarding the value of the assets should be ensured in the financial reporting. In the theoretical development of the issue, the methods of scientific abstraction, analysis, and synthesis were used. The research was carried out with a systemic approach. The research process uses international accounting standards, theoretical research, and publications of Georgian and foreign scientists.
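
For the cash-generating case, value in use is essentially a present-value calculation; the sketch below contrasts the traditional (single best-estimate cash flow) and expected cash flow (probability-weighted) approaches mentioned in the article, using invented numbers.

```python
def present_value(cash_flows, rate):
    """Discount a list of future cash flows (one per period) at a single rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# traditional approach: one best-estimate cash flow series
value_in_use_traditional = present_value([120.0, 110.0, 100.0], rate=0.08)

# expected cash flow approach: probability-weighted cash flows per period
scenarios = [(0.3, [150.0, 140.0, 130.0]),   # (probability, cash flow series)
             (0.7, [100.0, 95.0, 90.0])]
expected_cfs = [sum(p * cfs[t] for p, cfs in scenarios) for t in range(3)]
value_in_use_expected = present_value(expected_cfs, rate=0.08)

# an impairment loss arises when the carrying amount exceeds the recoverable amount
carrying_amount = 320.0
impairment_loss = max(0.0, carrying_amount - value_in_use_expected)
```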

Keywords: Non-cash-generating assets, cash-generating assets, recoverable value, recoverable service amount, value in use.

3791 Dynamic Threshold Adjustment Approach For Neural Networks

Authors: Hamza A. Ali, Waleed A. J. Rasheed

Abstract:

The use of neural networks for recognition applications is generally constrained by their inherent parameter inflexibility after the training phase: no adaptation is accommodated for input variations that have any influence on the network parameters. Attempts were made in this work to design a neural network that includes an additional mechanism adjusting the threshold values according to input pattern variations. The new approach is based on splitting the whole network into two subnets: a main traditional net and a supportive net. The first deals with the required output of trained patterns with predefined settings, while the second tolerates output generation dynamically, with a tuning capability for any newly applied input. This tuning comes in the form of an adjustment to the threshold values. Two levels of supportive net were studied: one implements an extended additional layer with an adjustable neuronal threshold setting mechanism, while the second implements an auxiliary net with a traditional architecture that performs dynamic adjustment of the threshold values of the main net, which is constructed in a dual-layer architecture. Experimental results and analysis of the proposed designs have given quite satisfactory performance. The supportive-layer approach achieved over 90% recognition rate, while the multiple-network technique shows a more effective and acceptable level of recognition. However, this is achieved at the price of network complexity and computation time. Recognition generalization may also be improved by accommodating capabilities involving all the innate structures in conjunction with intelligence abilities, at the cost of further, more advanced learning phases.

Keywords: Classification, Recognition, Neural Networks, Pattern Recognition, Generalization.

3790 High Quality Speech Coding using Combined Parametric and Perceptual Modules

Authors: M. Kulesza, G. Szwoch, A. Czyżewski

Abstract:

A novel approach to speech coding using a hybrid architecture is presented. The advantages of parametric and perceptual coding methods are combined in order to create a speech coding algorithm that ensures better signal quality than a traditional CELP parametric codec. Two approaches are discussed. One is based on selecting voiced signal components that are encoded using the parametric algorithm, unvoiced components that are encoded perceptually, and transients that remain unencoded. The second approach uses perceptual encoding of the residual signal in a CELP codec. The algorithm applied for precise transient selection is described. The signal quality achieved using the proposed hybrid codec is compared to the quality of some standard speech codecs.

Keywords: CELP residual coding, hybrid codec architecture, perceptual speech coding, speech codecs comparison.

3789 Intelligent Agent Communication by Using DAML to Build Agent Community Ontology

Authors: Cheng-Hsiung Hung, Hong-Jie Dai, Jason Jen-Yen Chen

Abstract:

This paper presents a new approach for intelligent agent communication based on an ontology for an agent community. The DARPA agent markup language (DAML) is used to build the community ontology. This paper extends the agent management specification of the Foundation for Intelligent Physical Agents (FIPA) to develop an agent role called the community facilitator (CF), which manages the community directory and the community ontology. The CF helps build the agent community, so that precise descriptions of agent services in this community can be achieved, which facilitates agent communication. Furthermore, through ontology updates, agents with different ontologies are capable of communicating with each other. An example of an advanced traveler information system is included to illustrate the practicality of this approach.

Keywords: Intelligent agent communication, DARPA agent markup language (DAML), Community ontology, Advanced Traveler Information System (ATIS).

3788 Analyzing Periurban Fringe with Rough Set

Authors: Benedetto Manganelli, Beniamino Murgante

Abstract:

The distinction among urban, periurban, and rural areas represents a classical example of uncertainty in land classification. Satellite images, geostatistical analysis, and all kinds of spatial data are very useful in urban sprawl studies, but it is important to define precise rules for combining great amounts of data to build complex knowledge about a territory. Rough set theory may be a useful method to employ in this field. It represents a different mathematical approach to uncertainty by capturing indiscernibility: two different phenomena can be indiscernible in some contexts and classified in the same way when the available information about them is combined. This approach has been applied in a case study, comparing the results achieved with the Map Algebra technique and with spatial rough sets. The study area, Potenza Province, is particularly suitable for the application of this theory because it includes 100 municipalities with different numbers of inhabitants and morphologic features.
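
A minimal sketch of the rough set idea the paper applies: given an indiscernibility relation induced by attribute values, a target set of municipalities classified as "periurban" is approximated from below and above. The attributes and data here are hypothetical.

```python
from collections import defaultdict

def equivalence_classes(objects, attributes):
    """Group objects that are indiscernible on the chosen attributes."""
    classes = defaultdict(set)
    for name, values in objects.items():
        key = tuple(values[a] for a in attributes)
        classes[key].add(name)
    return list(classes.values())

def rough_approximations(objects, attributes, target):
    blocks = equivalence_classes(objects, attributes)
    lower = set().union(*([b for b in blocks if b <= target] or [set()]))
    upper = set().union(*([b for b in blocks if b & target] or [set()]))
    return lower, upper   # boundary region = upper - lower

# hypothetical municipalities described by coarse attributes
municipalities = {
    "A": {"density": "high", "commuting": "high"},
    "B": {"density": "high", "commuting": "high"},
    "C": {"density": "low",  "commuting": "high"},
    "D": {"density": "low",  "commuting": "low"},
}
periurban = {"A", "C"}          # target classification from expert labels
low, up = rough_approximations(municipalities, ["density", "commuting"], periurban)
# low = {"C"}   (C's equivalence class lies entirely inside the target set)
# up  = {"A", "B", "C"}; the boundary {"A", "B"} captures the indiscernibility of A and B
```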

Keywords: Land Classification, Map Algebra, Periurban Fringe, Rough Set, Urban Planning, Urban Sprawl.

3787 Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal

Authors: Karima Siham Aoubid, Mohamed Boulemden

Abstract:

The development of signal compression algorithms is making continuous progress. These algorithms are improved by new tools and aim to reduce, on average, the number of bits necessary for the signal representation while minimizing the reconstruction error. This article proposes the compression of Arabic speech signals by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on one hand, on the decomposition of the original signal by means of analysis filters, followed by the compression stage, and, on the other hand, on the application of order-5 linear prediction to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is coded and transmitted. The decoding operation is then used to reconstruct the original signal. Thus, an adequate choice of the filter bank used in the transform is necessary to increase the compression rate and to keep the distortion imperceptible from an auditive point of view.
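
A minimal sketch of the wavelet-plus-linear-prediction idea, using PyWavelets for the analysis filter bank and an order-5 autocorrelation LPC on the resulting coefficients; the wavelet family, decomposition level, and test signal are assumptions for the example, not the paper's settings.

```python
import numpy as np
import pywt

def lpc_coefficients(x, order=5):
    """Order-p LPC via the autocorrelation method (solve the Yule-Walker system)."""
    r = np.correlate(x, x, mode="full")[len(x) - 1: len(x) + order]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    return a   # predictor: x[n] ~ sum_k a[k] * x[n-1-k]

def prediction_error(x, a):
    order = len(a)
    pred = np.zeros_like(x)
    for n in range(order, len(x)):
        pred[n] = np.dot(a, x[n - 1::-1][:order])
    return x - pred   # residual to be quantized/coded

# toy frame standing in for an Arabic speech segment
signal = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 512)) + 0.05 * np.random.randn(512)
coeffs = pywt.wavedec(signal, "db4", level=3)      # analysis filter bank
residuals = [prediction_error(np.asarray(c), lpc_coefficients(np.asarray(c)))
             for c in coeffs]
```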

Keywords: Compression, linear prediction analysis, multiresolution analysis, speech signal.

3786 Physical Modeling of Oil Well Fire Extinguishing Using a Turbojet on a Barge

Authors: M. Abbaspour, D. Mansouri, N. Mansouri

Abstract:

There are reports of gas and oil well fires due to different accidents. Many different methods are used for firefighting in the gas and oil industry. Traditional fire extinguishing techniques face many problems; they are usually time consuming and need a lot of equipment. Besides, they cause damage to facilities and create health and environmental problems. This article proposes an innovative approach to fire extinguishing in the oil and gas industry, especially applicable to burning oil wells located offshore. Fire extinguishment employing a turbojet is a novel approach which can help extinguish the fire in a short period of time. Divergent and convergent turbojets modeled at laboratory scale, along with a high-pressure flame, were used. Different experiments were conducted to determine the relationship between the output discharges of the turbojet and of the oil wells. The results were corrected, and the relationships between dimensionless parameters of flame and fire extinguishment distances, as well as between the output discharge of the turbojet and that of the oil wells at specified distances, are demonstrated by specific curves.

Keywords: Burning well, fire extinguishment, gas/oil industry, simulation.

3785 An Approach for Blind Source Separation using the Sliding DFT and Time Domain Independent Component Analysis

Authors: Koji Yamanouchi, Masaru Fujieda, Takahiro Murakami, Yoshihisa Ishida

Abstract:

The "cocktail party problem" is well known as one of the human auditory abilities: we can recognize the specific sound that we want to listen to even if many undesirable sounds or noises are mixed with it. Blind source separation (BSS) based on independent component analysis (ICA) is one of the methods by which we can separate a particular signal from a mixture of signals under simple hypotheses. In this paper, we propose an online approach for blind source separation using the sliding DFT and time-domain independent component analysis. The proposed method reduces computational complexity in comparison with conventional methods and can be applied to parallel processing using digital signal processors (DSPs). We evaluate this method and show its effectiveness.
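
A minimal sketch of the sliding DFT recursion that underlies the proposed online front end: each new sample updates the N-point DFT bins in O(N) instead of recomputing a full transform per hop. The window length and test signal are assumptions for the example.

```python
import numpy as np

class SlidingDFT:
    """Recursive sliding DFT: X_k(n) = (X_k(n-1) + x(n) - x(n-N)) * e^{j 2 pi k / N}."""

    def __init__(self, n_fft=256):
        self.n = n_fft
        self.twiddle = np.exp(2j * np.pi * np.arange(n_fft) / n_fft)
        self.bins = np.zeros(n_fft, dtype=complex)
        self.buffer = np.zeros(n_fft)   # last N input samples, oldest first

    def push(self, sample):
        oldest = self.buffer[0]
        self.buffer = np.roll(self.buffer, -1)
        self.buffer[-1] = sample
        self.bins = (self.bins + sample - oldest) * self.twiddle
        return self.bins    # current N-point spectrum, updated per sample

sdft = SlidingDFT(n_fft=256)
x = np.sin(2 * np.pi * 0.05 * np.arange(1024))
for s in x:
    spectrum = sdft.push(s)
# `spectrum` matches np.fft.fft of the last 256 samples (up to numerical drift)
```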

Keywords: Cocktail party problem, blind source separation (BSS), independent component analysis, sliding DFT, online processing.

3784 Energy Map Construction using Adaptive Alpha Grey Prediction Model in WSNs

Authors: Surender Kumar Soni, Dhirendra Pratap Singh

Abstract:

Wireless sensor networks can be used to monitor physical phenomena in areas where human access is nearly impossible. The limited power supply is the major constraint of WSNs due to the use of non-rechargeable batteries in sensor nodes, and a lot of research is going on to reduce the energy consumption of sensor nodes. An energy map can be used with clustering, data dissemination, and routing techniques to reduce the power consumption of WSNs. An energy map can also be used to know which part of the network is going to fail in the near future. In this paper, the energy map is constructed using a prediction-based approach. The adaptive alpha GM(1,1) model is used as the prediction model. GM(1,1) is used worldwide in many applications for predicting future values of a time series from a few past values, due to its high computational efficiency and accuracy.
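
A minimal sketch of the plain GM(1,1) grey predictor the energy map builds on (the paper's adaptive-alpha variant tunes the background coefficient, fixed at 0.5 here); the sample energy readings are invented.

```python
import numpy as np

def gm11_predict(x0, n_ahead=1, alpha=0.5):
    """Grey model GM(1,1): fit on series x0 and forecast n_ahead future values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                   # accumulated generating operation
    z1 = alpha * x1[1:] + (1 - alpha) * x1[:-1]          # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]          # develop coefficient, grey input
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time response function
    x0_hat = np.diff(x1_hat, prepend=0.0)                # inverse accumulation
    return x0_hat[len(x0):]                              # only the forecast part

# invented residual-energy readings reported by one sensor node (in joules)
energy = [98.0, 95.5, 93.2, 90.8, 88.5]
print(gm11_predict(energy, n_ahead=3))
```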

Keywords: Adaptive Alpha GM(1, 1) Model, Energy Map, Prediction Based Data Reduction, Wireless Sensor Networks

3783 Prioritization of Mutation Test Generation with Centrality Measure

Authors: Supachai Supmak, Yachai Limpiyakorn

Abstract:

Mutation testing can be applied for the quality assessment of test cases. Prioritization of mutation test generation has been a critical element of industry practice that contributes to the evaluation of test cases. Industry generally delivers products under time-to-market pressure and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach that applies a social network centrality measure, PageRank, to prioritize mutation test generation, as sketched below. The source code modules with the highest PageRank values are focused on first when developing test cases, as these modules are vulnerable to defects or anomalies which may cause consequent defects in many other associated modules. Moreover, the approach helps identify reducible test cases in the test suite while still maintaining the same criteria as the original set of test cases.
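
A minimal sketch of the centrality step: a power-iteration PageRank over a hypothetical module dependency graph, whose top-ranked modules would be targeted first for mutation test generation.

```python
import numpy as np

def pagerank(adjacency, damping=0.85, n_iter=100):
    """Power-iteration PageRank on a directed adjacency matrix (rows = outgoing edges)."""
    n = adjacency.shape[0]
    out_deg = adjacency.sum(axis=1, keepdims=True)
    # dangling nodes (no outgoing edges) link uniformly to every node
    transition = np.where(out_deg > 0, adjacency / np.maximum(out_deg, 1), 1.0 / n)
    rank = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        rank = (1 - damping) / n + damping * transition.T @ rank
    return rank

# hypothetical call/dependency graph among five source modules
modules = ["auth", "db", "api", "utils", "report"]
A = np.array([[0, 1, 1, 1, 0],
              [0, 0, 0, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 0, 0, 0, 0],
              [0, 1, 0, 1, 0]], dtype=float)
scores = pagerank(A)
priority = [modules[i] for i in np.argsort(-scores)]
print(priority)   # modules ranked for mutation test generation, highest PageRank first
```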

Keywords: Software testing, mutation test, network centrality measure, test case prioritization.

3782 A Discrete-Event-Simulation Approach for Logistic Systems with Real Time Resource Routing and VR Integration

Authors: Gerrit Alves, Jürgen Roßmann, Roland Wischnewski

Abstract:

Today, transport and logistic systems are often tightly integrated into production. Lean production and just-in-time delivery create multiple constraints that have to be fulfilled. As transport networks have often evolved over time, they are very expensive to change. This paper describes a discrete-event-simulation system which simulates transportation models using real-time resource routing and collision avoidance. It allows for the specification of custom control algorithms and the validation of new strategies. The simulation is integrated into a virtual reality (VR) environment and can be displayed in 3-D to show the progress. Simulation elements can be selected through VR metaphors. All data gathered during the simulation can be presented as a detailed summary afterwards. The included cost-benefit calculation can help to optimize the financial outcome. The operation of this approach is shown by the example of a timber harvest simulation.
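
A minimal sketch of the discrete-event core such a simulator is built around: a time-ordered event queue driving transport jobs through a shared vehicle resource. The network, timings, and job mix are invented; collision avoidance, the VR display, and the cost-benefit summary are out of scope here.

```python
import heapq

def simulate_transports(jobs, travel_time=4.0, n_vehicles=2):
    """Tiny discrete-event simulation: jobs wait for a free vehicle, travel, release it."""
    events = [(release_time, "job_ready", j) for j, release_time in enumerate(jobs)]
    heapq.heapify(events)
    free_vehicles, waiting, finished = n_vehicles, [], {}
    while events:
        t, kind, job = heapq.heappop(events)
        if kind == "job_ready":
            waiting.append(job)
        else:                                   # "vehicle_done"
            finished[job] = t
            free_vehicles += 1
        while free_vehicles and waiting:        # dispatch as long as resources allow
            free_vehicles -= 1
            nxt = waiting.pop(0)
            heapq.heappush(events, (t + travel_time, "vehicle_done", nxt))
    return finished                             # job id -> completion time

# three transport requests released at t = 0, 1, 2
print(simulate_transports([0.0, 1.0, 2.0]))
```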

Keywords: Discrete-Event-Simulation, Logistic, Simulation, Virtual Reality.

3781 Development of Genetic-based Machine Learning for Network Intrusion Detection (GBML-NID)

Authors: Wafa' S. Al-Sharafat, Reyadh Naoum

Abstract:

Society has grown to rely on Internet services, and the number of Internet users increases every day. As more and more users become connected to the network, the window of opportunity for malicious users to do damage becomes very large and lucrative. The objective of this paper is to incorporate different techniques into a classifier system to detect and classify intrusions versus normal network packets. Among several techniques, a Steady-State Genetic-based Machine Learning algorithm (SSGBML) is used to detect intrusions, and the Steady-State Genetic Algorithm (SSGA), the Simple Genetic Algorithm (SGA), a modified genetic algorithm, and a zeroth-level classifier system are investigated in this research. SSGA is used as the discovery mechanism instead of SGA, because SGA replaces all old rules with newly produced rules, preventing good old rules from participating in the next rule generation. The zeroth-level classifier system plays the role of detector by matching incoming environment messages with classifiers to determine whether the current message is normal or an intrusion, and by receiving feedback from the environment. Finally, in order to attain the best results, the modified SSGA enhances the discovery engine by using fuzzy logic to optimize the crossover and mutation probabilities. The experiments and evaluations of the proposed method were performed with the KDD 99 intrusion detection dataset.
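
A minimal sketch of one steady-state GA step as contrasted with a generational SGA: only the worst rule is replaced, so good old rules stay in the population. The bit-string rule encoding and fitness function are placeholders, not the paper's classifier-system details.

```python
import random

def steady_state_step(population, fitness, p_crossover=0.8, p_mutation=0.02):
    """One steady-state iteration: breed one child, replace the current worst rule."""
    a, b = random.sample(population, 2)                       # parent selection
    if random.random() < p_crossover:
        cut = random.randrange(1, len(a))
        child = a[:cut] + b[cut:]                             # one-point crossover
    else:
        child = a[:]
    child = [bit ^ (random.random() < p_mutation) for bit in child]
    worst = min(range(len(population)), key=lambda i: fitness(population[i]))
    if fitness(child) > fitness(population[worst]):           # keep good old rules
        population[worst] = child
    return population

# placeholder rule encoding (bit strings) and fitness (ones counting) for illustration
rules = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]
for _ in range(200):
    rules = steady_state_step(rules, fitness=sum)
print(max(sum(r) for r in rules))
```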

Keywords: MSSGBML, Network Intrusion Detection, SGA, SSGA.

3780 An Approach for Transient Response Calculation of large Nonproportionally Damped Structures using Component Mode Synthesis

Authors: Alexander A. Muravyov

Abstract:

A minimal-complexity version of component mode synthesis is presented that requires simplified computer programming but still provides adequate accuracy for modeling the lower eigenproperties of large structures and their transient responses. The novelty is that the structure is separated into components along a plane or surface that exhibits rigid-like behavior; thus only the normal modes of each component are sufficient, without computing any constraint, attachment, or residual-attachment modes. The approach requires only input information such as a few (lower) natural frequencies and the corresponding undamped normal modes of each component. A novel technique is shown for the formulation of the equations of motion, where a double transformation to generalized coordinates is employed and the formulation of the nonproportional damping matrix in generalized coordinates is shown.
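
A minimal sketch of the modal reduction underlying the approach: each component's equations are projected onto a few of its lower normal modes, and a nonproportional damping matrix is carried into the same generalized coordinates. The matrices here are small invented stand-ins, not a real structural model or the paper's full double-transformation scheme.

```python
import numpy as np
from scipy.linalg import eigh

def reduce_component(M, K, C, n_modes=2):
    """Project one component's M, K and nonproportional C onto its lower normal modes."""
    eigvals, Phi = eigh(K, M)                 # undamped normal modes, ascending frequency
    Phi = Phi[:, :n_modes]                    # keep only the lower modes
    Mg = Phi.T @ M @ Phi                      # generalized mass (identity if mass-normalized)
    Kg = Phi.T @ K @ Phi                      # generalized stiffness
    Cg = Phi.T @ C @ Phi                      # generalized damping: full matrix, not diagonal
    return Mg, Kg, Cg, Phi

# small invented component: 4-DOF chain with a concentrated (nonproportional) damper
M = np.diag([2.0, 1.0, 1.0, 1.5])
K = np.array([[ 4, -2,  0,  0],
              [-2,  4, -2,  0],
              [ 0, -2,  4, -2],
              [ 0,  0, -2,  2]], dtype=float) * 1e3
C = np.zeros((4, 4)); C[2, 2] = 30.0          # damper on one DOF only -> nonproportional
Mg, Kg, Cg, Phi = reduce_component(M, K, C, n_modes=2)
# the reduced system Mg q'' + Cg q' + Kg q = Phi.T f can be time-integrated for transients
```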

Keywords: component mode synthesis, finite element models, transient response, nonproportional damping

3779 Contemplating Preference Ratings of Corporate Social Responsibility Practices for Supply Chain Performance System Implementation

Authors: Mohit Tyagi, Pradeep Kumar

Abstract:

The objective of this research work is to identify and analyze the significant corporate social responsibility (CSR) practices with an aim to improve the supply chain performance of the automobile industry located in the National Capital Region (NCR) of India. To achieve this objective, six CSR practices have been considered and analyzed using the expert's preference rating (EPR) approach. The considered CSR practices are: top management and employee awareness of CSR (P1), employee involvement in social and environmental problems (P2), protection of human rights (P3), waste reduction, energy saving, and water conservation (P4), proper visibility of CSR guidelines (P5), and a broad perception towards CSR initiatives (P6). The outcomes of this research may help managers in decision-making processes and in framing policies for SCP implementation in a CSR context.
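
Since the mechanics of the expert's preference rating (EPR) approach are not spelled out in the abstract, the sketch below only illustrates the general idea of aggregating expert ratings over the six practices P1-P6 and ranking them; the rating matrix and expert weights are invented.

```python
import numpy as np

practices = ["P1", "P2", "P3", "P4", "P5", "P6"]
# invented preference ratings (rows = experts, columns = practices, scale 1-9)
ratings = np.array([[8, 6, 7, 9, 5, 6],
                    [7, 7, 6, 8, 6, 5],
                    [9, 5, 7, 8, 6, 6]], dtype=float)
expert_weights = np.array([0.5, 0.3, 0.2])      # e.g. weighted by expert seniority

scores = expert_weights @ ratings               # weighted average rating per practice
ranking = sorted(zip(practices, scores), key=lambda kv: -kv[1])
for practice, score in ranking:
    print(f"{practice}: {score:.2f}")           # highest-priority CSR practices first
```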

Keywords: Supply chain performance, corporate social responsibility, CSR practices, expert’s preference rating approach.

3778 Applying Case-Based Reasoning in Supporting Strategy Decisions

Authors: S. M. Seyedhosseini, A. Makui, M. Ghadami

Abstract:

Globalization, and the resulting increasingly tight competition among companies, has increased the importance of making well-timed decisions. Devising and employing effective strategies that are flexible and adaptive to a changing market stands a greater chance of being effective in the long term. On the other hand, a clear focus on managing the entire product lifecycle has emerged as a critical area for investment. Therefore, applying well-organized tools to employ past experience in new cases helps to make proper managerial decisions. Case-based reasoning (CBR) is a means of solving a new problem by using or adapting solutions to old problems. In this paper, an adapted CBR model with k-nearest neighbors (k-NN) is employed to provide suggestions for better decision making, adapted for a given product in the middle-of-life phase. The set of solutions is weighted by CBR on the principle of group decision making. A wrapper approach with a genetic algorithm is employed to generate optimal feature subsets. A dataset from a department store, covering various products collected over two years, has been used. A k-fold approach is used to evaluate the classification accuracy rate. Empirical results are compared with the classical case-based reasoning algorithm, which has no special process for feature selection, with a CBR-PCA algorithm based on filter-approach feature selection, and with an artificial neural network. The results indicate that the predictive performance of the model, compared with the two CBR algorithms, is more effective in this specific case.
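
A minimal sketch of the retrieve step of case-based reasoning with k-NN: the k most similar past cases vote for the suggested decision. The features, cases, and the GA-selected feature mask are invented placeholders, not the paper's department store data.

```python
import numpy as np
from collections import Counter

def retrieve_and_reuse(case_base, outcomes, query, k=3, feature_mask=None):
    """k-NN retrieval over past cases; a majority vote of the k nearest gives the suggestion."""
    X = np.asarray(case_base, dtype=float)
    q = np.asarray(query, dtype=float)
    if feature_mask is not None:                 # e.g. the subset chosen by the GA wrapper
        X, q = X[:, feature_mask], q[feature_mask]
    dists = np.linalg.norm(X - q, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(outcomes[i] for i in nearest)
    return votes.most_common(1)[0][0], nearest

# invented middle-of-life product cases: [price_index, sales_trend, return_rate]
cases = [[0.9, -0.1, 0.02], [0.7, -0.4, 0.05], [1.1, 0.2, 0.01], [0.8, -0.3, 0.04]]
decisions = ["keep", "discount", "keep", "discount"]
suggestion, used = retrieve_and_reuse(cases, decisions, query=[0.85, -0.25, 0.03])
print(suggestion)     # suggested strategy adapted from the most similar past cases
```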

Keywords: Case-based reasoning, genetic algorithm, group decision making, product management.

3777 Predictive Clustering Hybrid Regression(pCHR) Approach and Its Application to Sucrose-Based Biohydrogen Production

Authors: Nikhil, Ari Visa, Chin-Chao Chen, Chiu-Yue Lin, Jaakko A. Puhakka, Olli Yli-Harja

Abstract:

A predictive clustering hybrid regression (pCHR) approach was developed and evaluated using a dataset from an H2-producing sucrose-based bioreactor operated for 15 months. The aim was to model and predict the H2 production rate using the information available about the envirome and metabolome of the bioprocess. Self-organizing maps (SOM) and a Sammon map were used to visualize the dataset and to identify the main metabolic patterns and clusters in the bioprocess data. Three metabolic clusters were detected: acetate coupled with other metabolites, butyrate only, and transition phases. The developed pCHR model combines principles of k-means clustering, k-NN classification, and regression techniques. The model performed well in modeling and predicting the H2 production rate, with mean square error values of 0.0014 and 0.0032, respectively.
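
A minimal sketch of the pCHR idea as described: cluster the process data with k-means, then predict the target within the assigned cluster with a local k-NN regressor. The synthetic envirome/metabolome features below stand in for the real bioreactor data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))                      # stand-in envirome/metabolome features
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)   # stand-in H2 production rate

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
models = {}
for c in range(3):                                 # one local regressor per metabolic cluster
    idx = kmeans.labels_ == c
    models[c] = KNeighborsRegressor(n_neighbors=5).fit(X[idx], y[idx])

def predict(x_new):
    c = int(kmeans.predict(x_new.reshape(1, -1))[0])
    return models[c].predict(x_new.reshape(1, -1))[0]

print(predict(rng.normal(size=4)))                 # predicted production rate for a new sample
```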

Keywords: Biohydrogen, bioprocess modeling, clustering hybrid regression.

3776 Managing a Cross-Disciplinary Research Project in a University: The Case of LEARNIT

Authors: Yulia Stukalina

Abstract:

This paper explores the main issues related to implementing a cross-disciplinary research project (LEARNIT) based on collaboration between universities from three European countries. The paper discusses the importance of using a holistic approach to managing scientific projects, with due account for the complicated nature of the educational environment of a modern university. To illustrate this approach, the author describes some actions to be taken to support the different focus areas of the LEARNIT project, using the integrated tangible, non-tangible, and semi-tangible resources of the partner university. The methodology of the paper is based on the analysis of academic literature and research papers within the management discipline. The analysis reported in the paper also draws on the author's professional experience in managing international research projects in a university.

Keywords: LEARNIT, focus area, project management, resources.

3775 Community Perceptions and Attitudes Regarding Wildlife Crime in South Africa

Authors: Louiza C. Duncker, Duarte Gonçalves

Abstract:

Wildlife crime is a complex problem with many interconnected facets, which are generally responded to in parts or fragments in efforts to “break down” the complexity into manageable components. However, fragmentation increases complexity as coherence and cooperation become diluted. A whole-of-society approach has been developed towards finding a common goal and integrated approach to preventing wildlife crime. As part of this development, research was conducted in rural communities adjacent to conservation areas in South Africa to define and comprehend the challenges faced by them, and to understand their perceptions of wildlife crime. The results of the research showed that the perceptions of community members varied - most were in favor of conservation and of protecting rhinos, only if they derive adequate benefit from it. Regardless of gender, income level, education level, or access to services, conservation was perceived to be good and bad by the same people. Even though people in the communities are poor, a willingness to stop rhino poaching does exist amongst them, but their perception of parks not caring about people triggered an attitude of not being willing to stop, prevent or report poaching. Understanding the nuances, the history, the interests and values of community members, and the drivers behind poaching mind-sets (intrinsic or driven by transnational organized crime) is imperative to create sustainable and resilient communities on multiple levels that make a substantial positive impact on people’s lives, but also conserve wildlife for posterity.

Keywords: Conservation, community perceptions, wildlife crime, rhino poaching, interest and value creation, whole-of-society approach.

3774 Modular Data and Calculation Framework for a Technology-Based Mapping of the Manufacturing Process According to the Value Stream Management Approach

Authors: Tim Wollert, Fabian Behrendt

Abstract:

Value Stream Management (VSM) is a widely used methodology in the context of Lean Management for improving end-to-end material and information flows from a supplier to a customer from a company's perspective. Whereas the design principles, e.g. pull, value-adding, and customer orientation, remain valid against the background of an increasingly digitalized and dynamic environment, the methodology itself for mapping a value stream is time- and resource-intensive due to the high degree of manual activity. The digitalization of processes in the context of Industry 4.0 opens new opportunities to reduce these manual efforts and make the VSM approach more agile. This paper aims at providing a modular data and calculation framework that utilizes the available business data, provided by information and communication technologies, to automate the value stream mapping process, with a focus on the manufacturing process.

Keywords: Industry 4.0, lean management 4.0, value stream management 4.0, value stream mapping.

3773 Clustering based Voltage Control Areas for Localized Reactive Power Management in Deregulated Power System

Authors: Saran Satsangi, Ashish Saini, Amit Saraswat

Abstract:

In this paper, a new K-means clustering based approach for the identification of voltage control areas is developed. Voltage control areas are important for efficient reactive power management in power systems operating in a deregulated environment. Although voltage control areas are conventionally formed using a hierarchical clustering based method, the present paper investigates the capability of K-means clustering for forming voltage control areas. The proposed method is tested and compared on the IEEE 14-bus and IEEE 30-bus systems. The results show that this K-means based method is competitive with the conventional hierarchical approach.
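
A minimal sketch of forming voltage control areas by K-means: buses are grouped on features such as voltage-sensitivity or electrical-distance coordinates. The feature matrix below is a random stand-in, not actual IEEE 14-bus or 30-bus data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_buses, n_areas = 14, 4
# stand-in per-bus features, e.g. rows of a reduced voltage-sensitivity matrix
bus_features = rng.normal(size=(n_buses, 3))

areas = KMeans(n_clusters=n_areas, n_init=10, random_state=1).fit_predict(bus_features)
for area in range(n_areas):
    members = np.where(areas == area)[0] + 1       # 1-based bus numbers
    print(f"Voltage control area {area + 1}: buses {members.tolist()}")
```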

Keywords: Voltage control areas, reactive power management, K-means clustering algorithm

3772 NANCY: Combining Adversarial Networks with Cycle-Consistency for Robust Multi-Modal Image Registration

Authors: Mirjana Ruppel, Rajendra Persad, Amit Bahl, Sanja Dogramadzi, Chris Melhuish, Lyndon Smith

Abstract:

Multimodal image registration is a profoundly complex task, which is why deep learning has been widely used to address it in recent years. However, two main challenges remain: first, the lack of ground truth data calls for an unsupervised learning approach, which leads to the second challenge of defining a feasible loss function that can compare two images of different modalities to judge their level of alignment. To avoid this issue altogether, we implement a generative adversarial network consisting of two registration networks G_AB, G_BA and two discrimination networks D_A, D_B connected by spatial transformation layers. G_AB learns to generate a deformation field which registers an image of modality B to an image of modality A. To do so, it uses the feedback of the discriminator D_B, which learns to judge the quality of alignment of the registered image B. G_BA and D_A learn a mapping from modality A to modality B. Additionally, a cycle-consistency loss is implemented. For this, both registration networks are employed twice, resulting in images Â, B̂ which were registered to B̃, Ã, which in turn were registered to the initial image pair A, B. Thus the resulting and initial images of the same modality can be easily compared. A dataset of liver CT and MRI was used to evaluate the quality of our approach and to compare it against learning- and non-learning-based registration algorithms. Our approach achieves Dice scores of up to 0.80 ± 0.01 and is therefore comparable to, and slightly more successful than, algorithms such as SimpleElastix and VoxelMorph.
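
A hedged sketch of the cycle-consistency term only, with the two registration-plus-warping paths abstracted into callables mapping one modality to the other; in the full pipeline these would be the registration networks composed with spatial transformation layers, which are not reproduced here.

```python
import torch
import torch.nn.functional as F

def cycle_consistency_loss(map_ab, map_ba, a, b):
    """L1 cycle loss: A -> B -> A and B -> A -> B should return the original images.

    map_ab / map_ba stand for "register and warp" from one modality to the other
    (registration network plus spatial transformer in the full pipeline).
    """
    a_cycled = map_ba(map_ab(a))
    b_cycled = map_ab(map_ba(b))
    return F.l1_loss(a_cycled, a) + F.l1_loss(b_cycled, b)

# toy stand-ins for the two mapping paths (1x1 convolutions)
map_ab = torch.nn.Conv2d(1, 1, kernel_size=1)
map_ba = torch.nn.Conv2d(1, 1, kernel_size=1)
a = torch.rand(2, 1, 64, 64)     # batch of modality-A images (e.g. CT slices)
b = torch.rand(2, 1, 64, 64)     # batch of modality-B images (e.g. MRI slices)
loss = cycle_consistency_loss(map_ab, map_ba, a, b)
loss.backward()                  # gradients flow back into both mapping paths
```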

Keywords: Multimodal image registration, GAN, cycle consistency, deep learning.

3771 An Impulse-Momentum Approach to Swing-Up Control of Double Inverted Pendulum on a Cart

Authors: Thamer Ali Albahkali

Abstract:

The challenge in the swing-up problem of the double inverted pendulum on a cart (DIPC) is to design a controller that brings all of the DIPC's states, especially the joint angles of the two links, into the region of attraction of the desired equilibrium. This paper proposes a new method to swing up the DIPC based on a series of rest-to-rest maneuvers of the first link about its vertically upright configuration while holding the cart fixed at the origin. The rest-to-rest maneuvers are designed such that each one results in a net gain in the energy of the second link. This results in swing-up of the DIPC's configuration to the region of attraction of the desired equilibrium. A three-step algorithm is provided for swing-up control, followed by the stabilization step. Simulation results, with a comparison to an experimental work reported in the literature, are presented to demonstrate the efficacy of the approach.

Keywords: Double Inverted pendulum, Impulse, momentum, underactuated

3770 Boosting Method for Automated Feature Space Discovery in Supervised Quantum Machine Learning Models

Authors: Vladimir Rastunkov, Jae-Eun Park, Abhijit Mitra, Brian Quanz, Steve Wood, Christopher Codella, Heather Higgins, Joseph Broz

Abstract:

Quantum Support Vector Machines (QSVM) have become an important tool in research on and applications of quantum kernel methods. In this work we propose a boosting approach for building ensembles of QSVM models and assess the performance improvement across multiple datasets. This approach is derived from the best ensemble-building practices that have worked well in traditional machine learning and thus should push the limits of quantum model performance even further. We find that in some cases a single QSVM model with tuned hyperparameters is sufficient to model the data, while in others an ensemble of QSVMs that are forced to explore the feature space via the proposed method is beneficial.
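
A classical stand-in for the proposed idea: a small boosting loop over kernel SVMs (here an RBF-kernel SVC in place of a QSVM with a quantum kernel), reweighting samples so that successive models explore different parts of the feature space. The dataset and hyperparameters are illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=6, random_state=0)
y_signed = 2 * y - 1                                  # labels in {-1, +1} for boosting math

weights = np.full(len(y), 1.0 / len(y))
models, alphas = [], []
for _ in range(5):                                    # small ensemble of kernel SVMs
    clf = SVC(kernel="rbf", C=1.0)                    # stand-in for a QSVM / quantum kernel
    clf.fit(X, y_signed, sample_weight=weights)
    pred = clf.predict(X)
    err = np.clip(weights @ (pred != y_signed), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)             # AdaBoost-style model weight
    weights *= np.exp(-alpha * y_signed * pred)       # up-weight misclassified samples
    weights /= weights.sum()
    models.append(clf); alphas.append(alpha)

def ensemble_predict(Xq):
    votes = sum(a * m.predict(Xq) for a, m in zip(alphas, models))
    return np.sign(votes)

print((ensemble_predict(X) == y_signed).mean())       # training accuracy of the ensemble
```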

Keywords: QSVM, Quantum Support Vector Machines, quantum kernel, boosting, ensemble.
