Search results for: operational approach
5052 Signed Approach for Mining Web Content Outliers
Authors: G. Poonkuzhali, K. Thiagarajan, K. Sarukesi, G. V. Uma
Abstract:
The emergence of the Internet has revolutionized information storage and retrieval. Because most web data are unstructured and contain a mix of text, video, audio, etc., information must be mined to meet users' specific needs without losing important hidden information. Developing user-friendly, automated tools that deliver relevant information quickly has therefore become a major challenge in web mining research. Most existing web mining algorithms concentrate on finding frequent patterns while neglecting the less frequent ones, which are likely to contain outlying data such as noise and irrelevant or redundant data. This paper focuses on a Signed approach with full-word matching against an organized domain dictionary for mining web content outliers. The Signed approach identifies both relevant web documents and outlying web documents. Because the dictionary is organized by the number of characters in a word, searching and retrieving documents take less time and less space.
Keywords: Outliers, relevant document, Signed approach, web content mining, web documents.
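To illustrate the dictionary organization described above, here is a minimal Python sketch, assuming a tiny hypothetical domain dictionary keyed by word length and a simple match-ratio rule for labelling a document relevant or outlying; the actual Signed scoring scheme of the paper is not reproduced.

```python
from collections import defaultdict

# Hypothetical domain dictionary; the length-keyed layout mirrors the
# "organized by number of characters" idea described in the abstract.
DOMAIN_TERMS = ["web", "mining", "outlier", "document", "retrieval", "content"]

def build_length_index(terms):
    """Group dictionary words by their character count for faster lookup."""
    index = defaultdict(set)
    for term in terms:
        index[len(term)].add(term.lower())
    return index

def classify_document(text, index, threshold=0.2):
    """Full-word matching against the length-keyed dictionary.

    The relevance score and threshold are illustrative assumptions, not the
    paper's actual Signed scoring scheme.
    """
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    words = [w for w in words if w]
    hits = sum(1 for w in words if w in index.get(len(w), ()))
    score = hits / len(words) if words else 0.0
    return ("relevant" if score >= threshold else "outlier", score)

index = build_length_index(DOMAIN_TERMS)
print(classify_document("Web content mining finds outlier documents.", index))
print(classify_document("Recipe for chocolate cake with vanilla icing.", index))
```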
5051 Approach of Measuring System Analyses for Automotive Part Manufacturing
Authors: S. Homrossukon, S. Sansureerungsigun
Abstract:
This work aims to introduce an efficient, standardized measurement system analysis for the automotive industry. The study began with a literature review on the management and analysis of measurement systems, from which an approach to measurement system management was constructed. The approach was validated by collecting current measurement data from the equipment of interest, including a vernier caliper and a micrometer, and analyzing the accuracy and precision of their measurements. Finally, the measurement system was improved and evaluated. The study showed that the vernier caliper did not meet its measuring characteristics based on linearity, while all equipment lacked the required measurement precision. The causes of measurement variation in the equipment of interest were then identified. After the improvement, the measurement performance was found to meet the required standard. Finally, a standardized approach for analyzing automotive measurement systems was concluded.
Keywords: Automotive part manufacturing measurement, measuring accuracy, measuring precision, measurement system analyses.
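As a hedged illustration of the accuracy and precision characteristics analyzed above, the sketch below computes bias and repeatability from repeated readings of a reference gauge; the readings, tolerance, and acceptance rule are made-up stand-ins, not the study's procedure or data.

```python
import statistics

# Hypothetical repeated readings of a 25.000 mm reference gauge block taken
# with a vernier caliper; values are illustrative, not the paper's data.
reference = 25.000
readings = [25.02, 24.98, 25.01, 25.03, 24.99, 25.00, 25.02, 24.97, 25.01, 25.00]

mean_reading = statistics.mean(readings)
bias = mean_reading - reference               # accuracy: systematic offset
repeatability = statistics.stdev(readings)    # precision: spread of repeats

# A simple acceptance rule (illustrative): the precision-to-tolerance ratio,
# assuming a 0.10 mm part tolerance, should stay below roughly 0.3.
tolerance = 0.10
ptr = 6 * repeatability / tolerance

print(f"bias = {bias:+.4f} mm, repeatability = {repeatability:.4f} mm, P/T = {ptr:.2f}")
```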
5050 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images
Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge
Abstract:
Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored and transferred. Dimensionality reduction techniques can be used to reduce the volume of data. In this paper, an approach to band selection based on clustering algorithms is presented, which allows the volume of data to be reduced. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. New attributes in relation to other studies in the literature, such as kurtosis and low correlation, are also considered. A comparison of the results of the approach using Fuzzy C-Means and K-Means with different attributes is performed. Both algorithms show similarly good results, particularly when the variance and kurtosis attributes are used in the clustering process, and are therefore applicable to hyperspectral images.
Keywords: Band selection, fuzzy C-means, K-means, hyperspectral image.
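A minimal sketch of clustering-based band selection in the spirit of the approach above, assuming synthetic data and using scikit-learn K-Means on per-band variance, kurtosis, and mean correlation; the NWHFC step and the paper's exact attribute set are not reproduced.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cube = rng.normal(size=(64, 64, 50))          # synthetic hyperspectral cube (rows, cols, bands)

bands = cube.reshape(-1, cube.shape[2])       # pixels x bands
variance = bands.var(axis=0)
kurt = kurtosis(bands, axis=0)
corr = np.corrcoef(bands, rowvar=False)
mean_abs_corr = (np.abs(corr).sum(axis=0) - 1) / (corr.shape[0] - 1)

features = np.column_stack([variance, kurt, mean_abs_corr])
features = (features - features.mean(axis=0)) / features.std(axis=0)

n_selected = 10
km = KMeans(n_clusters=n_selected, n_init=10, random_state=0).fit(features)

# Keep, from each cluster, the band closest to the cluster centre.
selected = []
for c in range(n_selected):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(features[members] - km.cluster_centers_[c], axis=1)
    selected.append(int(members[np.argmin(dists)]))
print(sorted(selected))
```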
5049 A Distributed Approach to Extract High Utility Itemsets from XML Data
Authors: S. Kannimuthu, K. Premalatha
Abstract:
This paper investigates a new data mining capability that entails mining High Utility Itemsets (HUIs) in a distributed environment. Existing data mining research deals only with the presence or absence of items and does not consider semantic measures such as the weight or cost of items; HUI mining algorithms evolved to address this. HUI mining is a kind of utility mining that aims to identify itemsets whose utility satisfies a given threshold. However, mining HUIs in a distributed environment, and mining them from XML data, have not yet been explored. In this work, a novel approach is proposed to mine HUIs from XML-based data in a distributed environment. The work utilizes the Service-Oriented Computing (SOC) paradigm, which provides Knowledge as a Service (KaaS). The interesting patterns are provided via web services, with the help of a knowledge server, to answer consumers' queries. The performance of the approach is evaluated on various databases in terms of execution time and memory consumption.
Keywords: Data mining, Knowledge as a Service, service oriented computing, utility mining.
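To make the utility-threshold idea concrete, a brute-force sketch over a toy transaction database follows; item quantities, unit profits, and the threshold are illustrative, and the distributed/XML and KaaS aspects are not shown.

```python
from itertools import combinations

# Toy transaction database: item -> purchased quantity per transaction,
# plus external utilities (unit profits). Values are illustrative.
transactions = [
    {"A": 1, "B": 2, "C": 1},
    {"B": 4, "C": 3},
    {"A": 2, "C": 1, "D": 1},
    {"A": 1, "B": 1, "D": 2},
]
unit_profit = {"A": 5, "B": 2, "C": 1, "D": 4}
min_utility = 12

def itemset_utility(itemset, db):
    """Sum quantity * unit profit over transactions containing the whole itemset."""
    total = 0
    for t in db:
        if all(i in t for i in itemset):
            total += sum(t[i] * unit_profit[i] for i in itemset)
    return total

items = sorted(unit_profit)
huis = []
for k in range(1, len(items) + 1):
    for itemset in combinations(items, k):
        u = itemset_utility(itemset, transactions)
        if u >= min_utility:
            huis.append((itemset, u))

for itemset, u in huis:
    print(itemset, u)
```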
5048 Automata Theory Approach for Solving Frequent Pattern Discovery Problems
Authors: Renáta Iváncsy, István Vajk
Abstract:
The various types of frequent pattern discovery problems, namely the frequent itemset, sequence, and graph mining problems, are solved in different ways which are, however, in certain aspects similar. The main approaches to discovering such patterns can be classified into two classes: the level-wise methods and the database projection-based methods. The level-wise algorithms generally use clever indexing structures for discovering the patterns. In this paper a new approach is proposed, based on the level-wise idea, for discovering frequent sequences and tree-like patterns efficiently. Because the level-wise algorithms spend a lot of time on the subpattern testing problem, the new approach introduces the idea of using automata theory to solve this problem.
Keywords: Frequent pattern discovery, graph mining, pushdown automaton, sequence mining, state machine, tree mining.
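A minimal sketch of the subpattern (candidate) test that level-wise sequence miners spend most of their time on, written as a simple state machine; the paper's pushdown automaton for tree-like patterns is more general than this finite-state subsequence check.

```python
def contains_subsequence(sequence, candidate):
    """Finite-state style check: advance one state per matched candidate item.

    The candidate is accepted if the final state is reached, i.e. the
    candidate occurs in the data sequence in order (not necessarily
    contiguously). This mirrors the subpattern test of level-wise miners.
    """
    state = 0
    for item in sequence:
        if state < len(candidate) and item == candidate[state]:
            state += 1
    return state == len(candidate)

def support(database, candidate):
    """Count sequences of the database that contain the candidate."""
    return sum(contains_subsequence(seq, candidate) for seq in database)

db = [list("abcacd"), list("bacdca"), list("abdcab")]
print(support(db, list("acd")))   # number of sequences containing a..c..d in order
```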
5047 Foreign Direct Investment on Economic Growth by Industries in Central and Eastern European Countries
Authors: Shorena Pharjiani
Abstract:
This empirical paper investigates the relationship between FDI and economic growth across 10 selected industries in 10 Central and Eastern European countries over the period 1995 to 2012. Different estimation approaches were used to explore the connection between FDI and economic growth, including OLS, RE, and FE with and without time dummies. The empirical results lead to several main conclusions. First, the Central and Eastern European countries (CEEC) attracted foreign direct investment, which raised the productivity of the industries it entered; the linkage between FDI and output growth by industries is positive and significant enough to suggest that foreign firms' participation enhanced the productivity of the industries they occupied. There was an endogeneity problem in the regression, and a fixed effects estimation approach was used, which partially corrected the regression analysis and made the results less biased. Second, the results show that time plays an important role in making FDI operational for enhancing output growth by industries via total factor productivity. Third, R&D positively affected economic growth, and at the same time it takes some time for research and development to influence economic growth. Fourth, the general trends masked crucial differences at the country level: over the last 20 years, the analysis of the tables and figures at the country level shows that the main recipients of FDI among the 11 Central and Eastern European countries were Hungary, Poland and the Czech Republic, mainly because these countries had more open-door policies for attracting FDI. Fifth, according to the graphical analysis, while Hungary had the highest FDI inflow in this region, this was not reflected in GDP growth as much as in other Central and Eastern European countries.
Keywords: Central and East European countries (CEEC), economic growth, FDI, panel data.
5046 Connectionist Approach to Generic Text Summarization
Authors: Rajesh S. Prasad, U. V. Kulkarni, Jayashree R. Prasad
Abstract:
As the enormous amount of on-line text on the World-Wide Web grows, the development of methods for automatically summarizing this text becomes more important. The primary goal of this research is to create an efficient tool that is able to summarize large documents automatically. We propose an Evolving Connectionist System: an adaptive, incremental learning and knowledge representation system that evolves its structure and functionality. In this paper, we propose a novel approach to Part of Speech disambiguation using a recurrent neural network, a paradigm capable of dealing with sequential data. We observed that the connectionist approach to text summarization has a natural way of learning grammatical structures through experience. Experimental results show that our approach achieves acceptable performance.
5045 A Sub-Pixel Image Registration Technique with Applications to Defect Detection
Authors: Zhen-Hui Hu, Jyh-Shong Ju, Ming-Hwei Perng
Abstract:
This paper presents a useful sub-pixel image registration method using line segments and a sub-pixel edge detector. In this approach, straight line segments are first extracted from gray images at the pixel level before applying the sub-pixel edge detector. Next, all sub-pixel line edges are mapped onto the orientation-distance parameter space to solve for line correspondence between images. Finally, the registration parameters are solved analytically, with sub-pixel accuracy, via two linear least-squares problems. The present approach can be applied to various fields where fast registration with sub-pixel accuracy is required. To illustrate, the present approach is applied to the inspection of printed circuits on a flat panel. A numerical example shows that the present approach is effective and accurate when the target images contain a sufficient number of line segments, which is true in many industrial problems.
Keywords: Defect detection, image registration, straight line segment, sub-pixel.
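The final analytic step can be illustrated with a small least-squares sketch, assuming the line correspondences have already produced matched sub-pixel points; the coordinates below are synthetic, and the line-extraction and parameter-space matching stages are not shown.

```python
import numpy as np

# Hypothetical matched sub-pixel coordinates (e.g. line intersections) in the
# reference and target images; values are synthetic for illustration.
rng = np.random.default_rng(1)
ref = rng.uniform(0, 100, size=(8, 2))
theta, tx, ty = np.deg2rad(3.0), 1.25, -0.40
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
tgt = ref @ R.T + [tx, ty]

# Linearize x' = a*x - b*y + tx, y' = b*x + a*y + ty and solve A p = b
# with p = (a, b, tx, ty) in the least-squares sense.
n = ref.shape[0]
A = np.zeros((2 * n, 4))
A[0::2] = np.column_stack([ref[:, 0], -ref[:, 1], np.ones(n), np.zeros(n)])
A[1::2] = np.column_stack([ref[:, 1],  ref[:, 0], np.zeros(n), np.ones(n)])
b = tgt.reshape(-1)

p, *_ = np.linalg.lstsq(A, b, rcond=None)
a_hat, b_hat, tx_hat, ty_hat = p
print("rotation (deg):", np.degrees(np.arctan2(b_hat, a_hat)))
print("translation:", tx_hat, ty_hat)
```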
5044 Processing Web-Cam Images by a Neuro-Fuzzy Approach for Vehicular Traffic Monitoring
Authors: A. Faro, D. Giordano, C. Spampinato
Abstract:
Traffic management in an urban area is highly facilitated by knowledge of the traffic conditions in every street or highway involved in the vehicular mobility system. The aim of the paper is to propose a neuro-fuzzy approach able to compute the main parameters of a traffic system, i.e., car density, velocity and flow, using the images collected by web-cams located at the crossroads of the traffic network. The performance of this approach encourages its application when the traffic system is far from saturation. A fuzzy model is also outlined to evaluate when it is suitable to use more accurate, even if more time-consuming, algorithms for measuring traffic conditions near saturation.
Keywords: Neuro-fuzzy networks, computer vision, fuzzy systems, intelligent transportation systems.
5043 A Generic, Functionally Comprehensive Approach to Maintaining an Ontology as a Relational Database
Authors: Jennifer Leopold, Alton Coalter, Leong Lee
Abstract:
An ontology is a data model that represents a set of concepts in a given field and the relationships among those concepts. As the emphasis on achieving a semantic web continues to escalate, ontologies for all types of domains will increasingly be developed. These ontologies may become large and complex, and as their size and complexity grow, so will the need for multi-user interfaces for ontology curation. Herein a functionally comprehensive, generic approach to maintaining an ontology as a relational database is presented. Unlike many other ontology editors that utilize a database, this approach is entirely domain-generic and fully supports Web-based, collaborative editing, including the designation of different levels of authorization for users.
Keywords: Ontology editor, relational database, collaborative curation.
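A minimal sketch of the underlying idea, storing concepts and typed relationships in a relational database via Python's sqlite3; the table layout and data are illustrative and do not reproduce the paper's domain-generic schema or its authorization levels.

```python
import sqlite3

# Minimal schema sketch: concepts plus typed relationships between them.
# Table and column names are illustrative, not the paper's actual schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE concept (
    id    INTEGER PRIMARY KEY,
    label TEXT NOT NULL UNIQUE
);
CREATE TABLE relationship (
    id        INTEGER PRIMARY KEY,
    subject   INTEGER NOT NULL REFERENCES concept(id),
    predicate TEXT    NOT NULL,          -- e.g. 'is_a', 'part_of'
    object    INTEGER NOT NULL REFERENCES concept(id)
);
""")

concepts = ["vertebrate", "mammal", "limb"]
conn.executemany("INSERT INTO concept(label) VALUES (?)", [(c,) for c in concepts])
ids = {label: cid for cid, label in conn.execute("SELECT id, label FROM concept")}
conn.executemany(
    "INSERT INTO relationship(subject, predicate, object) VALUES (?, ?, ?)",
    [(ids["mammal"], "is_a", ids["vertebrate"]), (ids["limb"], "part_of", ids["mammal"])],
)

# Query all direct parents of 'mammal'.
for (parent,) in conn.execute("""
    SELECT c2.label FROM relationship r
    JOIN concept c1 ON c1.id = r.subject
    JOIN concept c2 ON c2.id = r.object
    WHERE c1.label = ? AND r.predicate = 'is_a'""", ("mammal",)):
    print(parent)
```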
5042 Fourier Galerkin Approach to Wave Equation with Absorbing Boundary Conditions
Authors: Alexandra Leukauf, Alexander Schirrer, Emir Talic
Abstract:
Numerical computation of wave propagation in a large domain usually requires significant computational effort. Hence, the considered domain must be truncated to a smaller domain of interest. In addition, special boundary conditions, which absorb the outward travelling waves, need to be implemented in order to describe the system domains correctly. In this work, the linear one-dimensional wave equation is approximated by utilizing the Fourier Galerkin approach. Furthermore, the artificial boundaries are realized with absorbing boundary conditions. Within this work, a systematic workflow for setting up the wave problem, including the absorbing boundary conditions, is proposed. As a result, a convenient modal system description with an effective absorbing boundary formulation is established. Moreover, the truncated model shows high accuracy compared to the global domain.
Keywords: Absorbing boundary conditions, boundary control, Fourier Galerkin approach, modal approach, wave equation.
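For reference, a hedged statement of the kind of model involved: the 1D wave equation on a truncated domain with first-order (Sommerfeld-type) absorbing boundary conditions; the paper's modal formulation may differ in detail.

```latex
% 1D wave equation on the truncated domain 0 < x < L:
\[
  \frac{\partial^2 u}{\partial t^2} = c^2\,\frac{\partial^2 u}{\partial x^2},
  \qquad 0 < x < L,\ t > 0.
\]
% First-order (Sommerfeld-type) absorbing boundary conditions, letting
% outgoing waves leave the truncated domain with (ideally) no reflection:
\[
  \frac{\partial u}{\partial t} - c\,\frac{\partial u}{\partial x} = 0 \ \text{at } x = 0,
  \qquad
  \frac{\partial u}{\partial t} + c\,\frac{\partial u}{\partial x} = 0 \ \text{at } x = L.
\]
```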
5041 A Post Processing Method for Quantum Prime Factorization Algorithm based on Randomized Approach
Authors: Mir Shahriar Emami, Mohammad Reza Meybodi
Abstract:
Prime factorization based on a quantum approach is performed in two phases. The first phase is carried out on a quantum computer and the second phase on a classical computer (post-processing). In the second phase, the goal is to estimate the period r of the equation x^r ≡ 1 (mod N) and to find the prime factors of the composite integer N on the classical computer. In this paper we present a method based on a randomized approach for estimating the period r with a satisfactory probability, so that the composite integer N can be factorized; with the randomized approach, even if the estimate of the period is not exactly the real period, at least one of the prime factors of the composite N can be found. Finally, we present some important points for designing an emulator for quantum computer simulation.
Keywords: Quantum prime factorization, randomized algorithms, quantum computer simulation, quantum computation.
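A classical sketch of the post-processing idea, assuming the period is found by brute force in place of the quantum phase-estimation output: pick a random base x, find the order r with x^r ≡ 1 (mod N), and extract a factor via gcd(x^(r/2) ± 1, N). The paper's randomized period-estimation method itself is not reproduced.

```python
import math
import random

def order(x, n):
    """Smallest r > 0 with x**r ≡ 1 (mod n); brute force stands in for the
    quantum phase-estimation result used in the first phase."""
    r, y = 1, x % n
    while y != 1:
        y = (y * x) % n
        r += 1
    return r

def factor(n, attempts=20):
    """Classical post-processing: random base, order finding, gcd extraction."""
    for _ in range(attempts):
        x = random.randrange(2, n - 1)
        g = math.gcd(x, n)
        if g > 1:
            return g                      # lucky: x already shares a factor with n
        r = order(x, n)
        if r % 2:
            continue                      # odd order: pick another base
        y = pow(x, r // 2, n)
        if y == n - 1:
            continue                      # trivial square root: retry
        return math.gcd(y - 1, n)
    return None

print(factor(15))   # expected: 3 or 5
```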
5040 Towards a Suitable and Systematic Approach for Component Based Software Development
Authors: Kuljit Kaur, Parminder Kaur, Jaspreet Bedi, Hardeep Singh
Abstract:
Software crisis refers to the situation in which developers are not able to complete projects within time and budget constraints, and moreover these over-schedule and over-budget projects are of low quality as well. Several methodologies have been adopted from time to time to overcome this situation, and the current focus is component-based software engineering. In this approach, the emphasis is on reuse of already existing software artifacts. But the results cannot be achieved just by preaching the principles; they need to be practiced as well. This paper highlights some of the very basic elements of this approach, which have to be in place to reach the desired goals of high-quality, low-cost software products with shorter time-to-market.
Keywords: Component model, software components, software repository, process models.
5039 A Model-Driven Approach of User Interface for MVP Rich Internet Application
Authors: Sarra Roubi, Mohammed Erramdani, Samir Mbarki
Abstract:
This paper presents an approach to the model-driven generation of Rich Internet Applications (RIAs), focusing on the graphical aspect. We used well-known Model-Driven Engineering (MDE) frameworks and technologies, such as the Eclipse Modeling Framework (EMF), the Graphical Modeling Framework (GMF), Query View Transformation (QVTo) and Acceleo, to enable the design and automatic code generation of the RIA. During the development of the approach, we focused on the graphical aspect of the application in terms of interfaces, while opting for the Model View Presenter pattern, which is designed for graphical interfaces. The paper describes the process followed to define the approach and the supporting tool, and presents the results from a case study.
Keywords: Code generation, design pattern, metamodel, Model-Driven Engineering, MVP, Rich Internet Application, transformation, user interface.
5038 Multimodal Biometric Authentication Using Choquet Integral and Genetic Algorithm
Authors: Anouar Ben Khalifa, Sami Gazzah, Najoua Essoukri BenAmara
Abstract:
The Choquet integral is a tool for information fusion that is very effective when the fuzzy measures associated with it are well chosen. In this paper, we propose a new approach for calculating the fuzzy measures associated with the Choquet integral in a context of data fusion in multimodal biometrics. The proposed approach is based on genetic algorithms. It has been validated on two databases: the first contains synthetic scores, and the second contains biometric data for the face, fingerprint and palmprint. The results achieved attest to the robustness of the proposed approach.
Keywords: Multimodal biometrics, data fusion, Choquet integral, fuzzy measures, genetic algorithm.
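A worked sketch of the discrete Choquet integral used for score fusion, with made-up matcher scores and a hand-picked monotone fuzzy measure; in the paper these measure values are what the genetic algorithm optimizes.

```python
def choquet(scores, mu):
    """Discrete Choquet integral of scores {criterion: value in [0,1]}
    with respect to a fuzzy measure mu given on frozensets of criteria."""
    items = sorted(scores.items(), key=lambda kv: kv[1])   # ascending scores
    total, prev = 0.0, 0.0
    remaining = set(scores)
    for name, value in items:
        total += (value - prev) * mu[frozenset(remaining)]
        prev = value
        remaining.remove(name)
    return total

# Hypothetical match scores from three biometric matchers.
scores = {"face": 0.60, "fingerprint": 0.85, "palmprint": 0.70}

# Illustrative fuzzy measure (monotone, mu(empty)=0, mu(all)=1); in the paper
# these values would be optimized by the genetic algorithm.
mu = {
    frozenset(): 0.0,
    frozenset({"face"}): 0.25,
    frozenset({"fingerprint"}): 0.45,
    frozenset({"palmprint"}): 0.30,
    frozenset({"face", "fingerprint"}): 0.70,
    frozenset({"face", "palmprint"}): 0.55,
    frozenset({"fingerprint", "palmprint"}): 0.80,
    frozenset({"face", "fingerprint", "palmprint"}): 1.0,
}
print(round(choquet(scores, mu), 4))   # fused score
```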
5037 Enhancement of Higher Order Thinking Skills among Teacher Trainers by Fun Game Learning Approach
Authors: Malathi Balakrishnan, Gananathan M. Nadarajah, Saraswathy Vellasamy, Evelyn Gnanam William George
Abstract:
The purpose of the study is to explore how the fun game-learning approach enhances teacher trainers' higher order thinking skills. A two-day, fun-filled game-learning approach was introduced to teacher trainers as a Continuous Professional Development (CPD) program. Twenty-six teacher trainers participated in this Transformation of Teaching and Learning Fun Way Program, organized by the Institute of Teacher Education Malaysia. A qualitative research technique was adopted, as the researchers observed the participants' higher order thinking skills developed during the program. Data were collected from an observational checklist, interview transcriptions of four participants, and participants' reflection notes. All the data were later analyzed with the NVivo data analysis process. The findings of this study present five main themes: critical thinking, hands-on activities, creating, application, and use of technology. The study showed that the teacher trainers' higher order thinking skills were enhanced after the two-day CPD program. Therefore, the Institute of Teacher Education will have more success using the fun game-learning approach to develop higher order thinking skills among its teacher trainers, who can impart these skills to their trainee teachers in future. This study also adds knowledge to constructivism learning theory, further highlighting the prominence of the fun learning approach for enhancing higher order thinking skills.
Keywords: Constructivism, game-learning approach, higher order thinking skill, teacher trainer.
5036 Performance and Availability Analyses of PV Generation Systems in Taiwan
Authors: H. S. Huang, J. C. Jao, K. L. Yen, C. T. Tsai
Abstract:
This article applies the monthly final energy yield and failure data of 202 PV systems installed in Taiwan to analyze PV operational performance and system availability. The data are collected by the Industrial Technology Research Institute through manual records. Bad-data detection and failure-data estimation approaches are proposed to guarantee the quality of the received information. The performance ratio value and system availability are then calculated and compared with those of other countries. The results indicate that the average performance ratio of Taiwan's PV systems is 0.74 and the availability is 95.7%. These results are similar to those of Germany, Switzerland, Italy and Japan.
Keywords: Availability, performance ratio, PV system, Taiwan.
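A hedged sketch of how such monthly figures can be computed from energy yield, in-plane irradiation, and recorded downtime; the numbers below are illustrative, not the ITRI dataset.

```python
# Illustrative monthly figures for one PV system (not the ITRI dataset).
e_ac_kwh = 1050.0        # final AC energy yield for the month
p_stc_kwp = 9.9          # installed capacity at STC
h_poa_kwh_m2 = 140.0     # in-plane irradiation for the month
g_stc_kw_m2 = 1.0        # reference irradiance at STC

final_yield = e_ac_kwh / p_stc_kwp            # kWh/kWp
reference_yield = h_poa_kwh_m2 / g_stc_kw_m2  # equivalent sun-hours
performance_ratio = final_yield / reference_yield

hours_in_month = 30 * 24
downtime_hours = 31.0                          # from recorded failures
availability = 1 - downtime_hours / hours_in_month

print(f"PR = {performance_ratio:.2f}, availability = {availability:.1%}")
```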
5035 New Approaches on Stability Analysis for Neural Networks with Time-Varying Delay
Authors: Qingqing Wang, Shouming Zhong
Abstract:
Utilizing the Lyapunov functional method and combining linear matrix inequality (LMI) techniques with the integral inequality approach (IIA) to analyze the global asymptotic stability of delayed neural networks (DNNs), a new sufficient criterion ensuring the global stability of DNNs is obtained. The criteria are formulated in terms of a set of linear matrix inequalities, which can be checked efficiently using standard numerical packages. Numerical examples are considered to show that the stability condition in this paper gives much less conservative results than those in the literature.
Keywords: Neural networks, globally asymptotic stability, LMI approach, IIA approach, time-varying delay.
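For reference, a standard delayed neural network model of the class such LMI criteria address (a hedged sketch; the paper's specific Lyapunov-Krasovskii functional and LMI are not reproduced):

```latex
% A standard delayed neural network model of the class addressed by such criteria:
\[
  \dot{x}(t) = -C\,x(t) + A\,f\bigl(x(t)\bigr) + B\,f\bigl(x(t-\tau(t))\bigr) + u,
  \qquad 0 \le \tau(t) \le h,\ \dot{\tau}(t) \le \mu,
\]
% where C is a positive diagonal matrix, A and B are the connection and
% delayed-connection weight matrices, f is the vector of activation functions,
% and the LMI-based criterion certifies global asymptotic stability of the
% equilibrium for all admissible time-varying delays tau(t).
```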
5034 BeamGA Median: A Hybrid Heuristic Search Approach
Authors: Ghada Badr, Manar Hosny, Nuha Bintayyash, Eman Albilali, Souad Larabi Marie-Sainte
Abstract:
The median problem is widely applied to derive the most reasonable rearrangement phylogenetic tree for many species. More specifically, the problem is concerned with finding a permutation that minimizes the sum of distances between itself and a set of three signed permutations. Genomes with an equal number of genes but a different order can be represented as permutations. In this paper, an algorithm, namely BeamGA median, is proposed that combines a heuristic search approach (local beam) as an initialization step to generate a number of solutions, after which a Genetic Algorithm (GA) is applied to refine the solutions, aiming to achieve a better median with the smallest possible reversal distance from the three original permutations. In this approach, any genome rearrangement distance can be applied; in this paper, we use the reversal distance. To the best of our knowledge, the proposed approach has not been applied before to solving the median problem. Our approach considers a true biological evolution scenario by applying the concept of common intervals during the GA optimization process. This allows us to imitate true biological behavior and improve the time convergence of the genetic approach. We were able to handle permutations with a large number of genes within an acceptable time and with the same or better accuracy compared to existing algorithms.
Keywords: Median problem, phylogenetic tree, permutation, genetic algorithm, beam search, genome rearrangement distance.
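A minimal sketch of the median objective, using the easily computed breakpoint distance as a stand-in for the reversal distance and exhaustive search in place of the BeamGA pipeline; it is feasible only for tiny, unsigned toy permutations.

```python
from itertools import permutations

def breakpoints(p, q):
    """Breakpoint distance: adjacencies of p (with end markers) absent from q.
    Used here as a simple stand-in for the reversal distance in the paper."""
    ext_q = [0] + list(q) + [len(q) + 1]
    adj_q = {frozenset((a, b)) for a, b in zip(ext_q, ext_q[1:])}
    ext_p = [0] + list(p) + [len(p) + 1]
    return sum(frozenset((a, b)) not in adj_q for a, b in zip(ext_p, ext_p[1:]))

def median_score(candidate, genomes):
    """Objective minimized by the median problem: total distance to the three genomes."""
    return sum(breakpoints(candidate, g) for g in genomes)

genomes = [(1, 2, 3, 4, 5), (2, 1, 3, 5, 4), (1, 3, 2, 4, 5)]

# Exhaustive search is feasible only for tiny genomes; BeamGA median replaces
# it with beam search followed by a genetic algorithm.
best = min(permutations(range(1, 6)), key=lambda c: median_score(c, genomes))
print(best, median_score(best, genomes))
```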
5033 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel
Authors: F. M. Pisano, M. Ciminello
Abstract:
Aerospace, mechanical, and civil engineering infrastructures can take advantage of damage detection and identification strategies in terms of maintenance cost reduction and operational life improvement, as well as for safety purposes. The challenge is to detect so-called "barely visible impact damage" (BVID), due to low/medium energy impacts, that can progressively compromise the structure's integrity. The occurrence of any local change in material properties that can degrade the structure's performance is to be monitored using so-called Structural Health Monitoring (SHM) systems, which compare the structure's states before and after damage occurs. SHM looks for any "anomalous" response collected by means of sensor networks and then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for information extraction about the current health conditions of the structure under investigation. Visual Analytics can support the processing of monitored measurements, offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system by integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structural health conditions examined by a Principal Component Analysis-based algorithm.
Keywords: Interactive dashboards, optical fibers, structural health monitoring, visual analytics.
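A hedged sketch of PCA-based outlier flagging of the kind such an SHM pipeline might feed into the dashboards, using synthetic sensor snapshots and a reconstruction-error threshold chosen for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic baseline strain readings from a 12-sensor optical-fiber network.
baseline = rng.normal(0.0, 1.0, size=(200, 12))

# New snapshots: the last one simulates a local anomaly on sensors 4-6.
new = rng.normal(0.0, 1.0, size=(5, 12))
new[-1, 4:7] += 6.0

pca = PCA(n_components=3).fit(baseline)

def reconstruction_error(x):
    """Squared residual after projecting onto the baseline principal subspace."""
    recon = pca.inverse_transform(pca.transform(x))
    return np.sum((x - recon) ** 2, axis=1)

threshold = np.percentile(reconstruction_error(baseline), 99)   # illustrative choice
flags = reconstruction_error(new) > threshold
print("anomalous snapshots:", np.where(flags)[0])
```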
5032 Qualification and Provisioning of xDSL Broadband Lines using a GIS Approach
Authors: Mavroidis Athanasios, Karamitsos Ioannis, Saletti Paola
Abstract:
In this paper, a Geographic Information System (GIS) approach is presented to qualify and monitor broadband lines in an efficient way. The methodology used for interpolation is the Delaunay Triangulated Irregular Network (TIN). The method is applied to a case study at an ISP in Greece, monitoring 120,000 broadband lines.
Keywords: GIS loop qualification, GIS xDSL, LLU, TIN.
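A minimal sketch of TIN-based interpolation with SciPy, whose LinearNDInterpolator builds a Delaunay triangulation over the measurement points; the coordinates and rates are made up, and the GIS layer itself is not shown.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Hypothetical measured lines: (x, y) cabinet coordinates in km and the
# attainable downstream rate in Mbit/s (values are illustrative).
points = np.array([[0.0, 0.0], [1.0, 0.2], [0.3, 1.1], [1.4, 1.3], [0.8, 0.7]])
rate_mbps = np.array([24.0, 18.5, 16.0, 9.0, 14.5])

# LinearNDInterpolator triangulates the points with Delaunay (a TIN) and
# interpolates linearly within each triangle.
tin = LinearNDInterpolator(points, rate_mbps)

query = np.array([[0.6, 0.5], [1.1, 1.0]])   # unqualified addresses to estimate
print(tin(query))                             # NaN is returned outside the convex hull
```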
5031 Diagnosis of Induction Machine Faults by DWT
Authors: Hamidreza Akbari
Abstract:
In this paper, for the detection of inclined eccentricity in an induction motor, a time-frequency analysis of the stator startup current is carried out. For this purpose, the discrete wavelet transform is used. Data are obtained from simulations using the winding function approach. The results show the validity of the approach for detecting the fault and discriminating it from other faults.
Keywords: Induction machine, Fault, DWT.
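A hedged sketch of a multilevel DWT decomposition with PyWavelets on a synthetic stand-in for a startup current, summarizing each sub-band by its energy; the fault signatures in the paper come from winding-function simulations instead.

```python
import numpy as np
import pywt

fs = 2000.0                          # sampling frequency, Hz
t = np.arange(0, 2.0, 1 / fs)

# Synthetic stand-in for a stator startup current: a 50 Hz component with a
# decaying envelope plus a small low-frequency fault-like modulation.
current = (1 + 2 * np.exp(-3 * t)) * np.sin(2 * np.pi * 50 * t) \
          + 0.05 * np.sin(2 * np.pi * 5 * t) + 0.01 * np.random.randn(t.size)

# Multilevel DWT: approximation cA_n plus detail coefficients cD_n ... cD_1.
coeffs = pywt.wavedec(current, wavelet="db8", level=6)

# Energy per sub-band is a common feature for spotting fault-related
# components that grow when an eccentricity is present.
labels = [f"A{len(coeffs) - 1}"] + [f"D{len(coeffs) - i}" for i in range(1, len(coeffs))]
for label, c in zip(labels, coeffs):
    print(f"{label}: energy = {np.sum(c ** 2):.2f}")
```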
5030 Comparative Study of Sedimentation in Hydraulic Structures using SHARC and SSIIM Software - A Case of the Dez and Hamidieh Intake Structures in Iran
Authors: A. H. Sajedipoor, N. Hedayat, M. Mashal, R. Nazarzadeh
Abstract:
Sedimentation is a complex hydraulic phenomenon that has emerged as a major operational and maintenance consideration in modern hydraulic engineering in general and river engineering in particular. Sediment accumulation along the river course, and its eventual storage in the form of islands, affects water intake in the canal systems that are fed by the storage reservoirs. Without proper management, sediment transport can lead to major operational challenges in the water distribution systems of arid regions like the Dez and Hamidieh command areas. The paper aims to investigate sedimentation in the Western Canal of the Dez Diversion Weir using the SHARC model and to compare the results with those for the two intake structures of the Hamidieh dam in Iran using the SSIIM model. The objective was to identify the factors which influence the process, check the reliability of the outcome, and provide ways to mitigate the implications for operation and maintenance of the structures. The results estimated sand and silt bed load concentrations to be 193 ppm and 827 ppm, respectively. This followed a more or less similar pattern in Hamidieh, where sediment formation impeded water intake in the canal system. Given the available data on average annual bed loads and average suspended sediment loads of 165 ppm and 837 ppm in the Dez, there was a statistically significant difference (16%) in the sand grains, whereas no significant difference (1.2%) was found in the silt grain sizes. One explanation for this finding is that along the 6 km river course there were considerable meandering effects, which explains the recent shift in the hydraulic behavior along the stream course under investigation. The sand concentration downstream, relative to the present state of the canal, showed a steeply descending curve, while sediment trapping indicated a steeply ascending curve; these occurred because the diversion weir was not considered in the simulation model. The comparative study showed very close similarities in the results, which indicates that both software packages can be used as accurate and reliable analytical tools for simulating sedimentation in hydraulic engineering.
Keywords: SHARC, SSIIM, sedimentation, Dez diversion weir, Hamidieh dam, intake structures.
5029 Quantification of Technology Innovation Using a Risk-Based Framework
Authors: Gerard E. Sleefe
Abstract:
There is significant interest in achieving technology innovation through new product development activities. It is recognized, however, that traditional project management practices focused only on performance, cost, and schedule attributes can often lead to risk mitigation strategies that limit new technology innovation. In this paper, a new approach is proposed for formally managing and quantifying technology innovation. This approach uses a risk-based framework that simultaneously optimizes innovation attributes along with traditional project management and systems engineering attributes. To demonstrate the efficacy of the new risk-based approach, a comprehensive product development experiment was conducted. This experiment simultaneously managed the innovation risks and the product delivery risks through the proposed risk-based framework. Quantitative metrics for technology innovation were tracked, and the experimental results indicate that the risk-based approach can simultaneously achieve both project deliverable and innovation objectives.
Keywords: Innovation, risk assessment, product development, technology management.
5028 An Approach for a Bidding Process Knowledge Capitalization
Authors: R. Chalal, A. R. Ghomari
Abstract:
Preparation and negotiation of innovative and future projects can be characterized as a strategic-type decision situation, involving many uncertainties and an unpredictable environment. In this paper we focus on the bidding process, which includes cooperative and strategic decisions. Our approach to bidding process knowledge capitalization is aimed at information management in project-oriented organizations, based on the MUSIC (Management and Use of Co-operative Information Systems) model. We show how to capitalize the company's strategic knowledge and also how to organize the corporate memory. The result of the adopted approach is an improvement in corporate memory quality.
Keywords: Bidding process, corporate memory, knowledge capitalization, knowledge acquisition, strategic decisions.
5027 An Approach to Improvement of Information Integrity in Key Areas of Portfolio Management
Authors: Victoria A. Bakhtina
Abstract:
At a time of growing market turbulence and a strong shift towards increasingly complex risk models and more stringent audit requirements, it is more critical than ever to maintain the highest quality of financial and credit information. IFC implemented an approach that helps increase data integrity and quality significantly. This approach is called "Screening". Screening is based on linking information from different sources to identify potential inconsistencies in key financial and credit data. That, in turn, can help to ease the trials of portfolio supervision and improve the company's overall global reporting and assessment systems. IFC's experience showed that, when used regularly, Screening led to improved information.
Keywords: Information integrity, information quality, business rules, portfolio management.
5026 Design and Social Innovation: A Systemic Approach
Authors: Marco Ogê Muniz, Luiz Fernando Gonçalves De Figueiredo
Abstract:
Design, as an area of knowledge, is subject to changes that affect it through different approaches, both theoretical and practical; these include matters related to responsibility, the environment, social concerns, and the like. Accordingly, such contemporary aspects open room for social initiatives, a scenario that is beginning to be examined, especially in creative communities. This proposal for a systemic approach to design is seen as a way to involve stakeholders in the processes of investigation and social innovation, which can contribute decisively to the development of traditional local communities. As a theoretical basis for the research, this paper outlines some special features of design and social innovation, in their particular and complementary aspects, as well as the way they relate to each other.
Keywords: Responsible design, social innovation, creative community, systemic approach, network.
5025 A Scenario-Based Approach for the Air Traffic Flow Management Problem with Stochastic Capacities
Authors: Soumia Ichoua
Abstract:
In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge about the near-future states of the National Airspace System (NAS). We present and discuss a scenario-based modeling approach, based on a time-space stochastic process, to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.
Keywords: Air traffic management, sample average approximation, scenario-based approach, stochastic capacity.
5024 A Co-writing Development Approach to Wikis: Pedagogical Issues and Implications
Authors: Said Hadjerrouit
Abstract:
Wikis are promoted as collaborative writing tools that allow students to transform a text into a collective document through information sharing and group reflection. However, despite the promising collaborative capabilities of wikis, their pedagogical value for collaborative writing is still questionable. A wiki alone cannot make collaborative writing happen, and students do not automatically become more active, participate, and collaborate with others when they use wikis. To foster collaborative writing and active involvement in wiki development, there is a need for a systematic approach to wikis. The main goal of this paper is to propose and evaluate a co-writing approach to the development of wikis, along with a study of three wiki applications, to report on the pedagogical implications of collaborative writing in higher education.
Keywords: Co-writing development approach, MediaWiki, socio-constructivist epistemology, wiki.
5023 Assessing the Theoretical Suitability of Sentinel-2 and WorldView-3 Data for Hydrocarbon Mapping of Spill Events, Using HYSS
Authors: K. Tunde Olagunju, C. Scott Allen, F.D. (Freek) van der Meer
Abstract:
Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection, in order to model an appropriate cleanup program. Identification with optical sensors allows not only detection but also characterization and quantification. Until recently, in optical remote sensing, quantification and characterization were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is at present mainly obtained via airborne survey. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the Hydrocarbon Spectra Slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. Spectral measurements of seven different hydrocarbon oils (crude and refined) taken on 10 different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolution using their full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the statistical significance of the qualitative analysis on WorldView-3 for all studied hydrocarbon oils was returned at the 95% confidence level (p-value < 0.01), except for diesel. Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations on Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of spectral bands with key hydrocarbon absorptions and the relatively coarse bandwidth (> 100 nm).
Keywords: Hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon-substrate combination, Sentinel-2, WorldView-3.
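A hedged sketch of the band-convolution and slope computation described above: a synthetic reflectance spectrum is resampled to two Gaussian bands defined by centre and FWHM, and a simple slope between the 1.73 µm and 2.30 µm features is taken. The band parameters and spectrum are illustrative, and HYSS's exact formulation may normalize differently.

```python
import numpy as np

# Illustrative lab-style reflectance spectrum (wavelengths in micrometres)
# with two absorption features near 1.73 and 2.30 um.
wl = np.arange(1.0, 2.5, 0.001)
reflectance = 0.45 - 0.05 * (wl - 1.0)
for centre, depth, width in [(1.73, 0.12, 0.02), (2.30, 0.10, 0.03)]:
    reflectance -= depth * np.exp(-0.5 * ((wl - centre) / width) ** 2)

def band_reflectance(wl, refl, centre, fwhm):
    """Resample the spectrum to a band with a Gaussian spectral response
    defined by the band centre and FWHM (the convolution step above)."""
    sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))
    response = np.exp(-0.5 * ((wl - centre) / sigma) ** 2)
    return np.sum(refl * response) / np.sum(response)

# Hypothetical SWIR band centres/FWHMs near the two diagnostic features
# (stand-ins, not the published Sentinel-2 or WorldView-3 band parameters).
r_1730 = band_reflectance(wl, reflectance, centre=1.73, fwhm=0.04)
r_2300 = band_reflectance(wl, reflectance, centre=2.30, fwhm=0.05)

# A simple spectral slope between the two diagnostic bands.
slope = (r_2300 - r_1730) / (2.30 - 1.73)
print(f"R(1.73 um) = {r_1730:.3f}, R(2.30 um) = {r_2300:.3f}, slope = {slope:.3f}")
```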