Search results for: Data Centric Approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11074

8104 Aircraft Selection Process Using Preference Analysis for Reference Ideal Solution (PARIS)

Authors: C. Ardil

Abstract:

Multiple criteria decision making analysis (MCDMA) methods are applied to many real-life problems in different fields of engineering science and technology. The preference analysis for reference ideal solution (PARIS) method is proposed for an efficient MCDMA evaluation of decision problems. The multiple criteria aircraft evaluation approach integrates the mean weight, entropy weight, PARIS, and TOPSIS methods, which eliminates the subjective importance-weight assignment process. The evaluation criteria were identified from an extensive literature review of the aircraft selection process. The aim of this study is to propose an efficient methodology for handling the aircraft selection process, in which the proposed method solves the MCDMA problem effectively. A numerical example is presented to demonstrate the applicability and validity of the proposed MCDMA approach.

Keywords: aircraft selection, aircraft, multiple criteria decision making, multiple criteria decision making analysis, mean weight, entropy weight, MCDMA, PARIS, TOPSIS, VIKOR, ELECTRE, PROMETHEE
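
Since the abstract names entropy weighting and TOPSIS, a minimal numerical sketch of that part of the pipeline may help; the PARIS step itself is not reproduced here, and the decision matrix below is invented for illustration (Python/NumPy):

import numpy as np

# Decision matrix: rows = candidate aircraft, columns = criteria (all treated
# as benefit-type here). Values are illustrative placeholders, not the paper's data.
X = np.array([[2100., 0.82, 7.5],
              [1800., 0.90, 6.8],
              [2400., 0.76, 8.1]])

# Entropy weights: criteria with more dispersion across alternatives get more weight.
P = X / X.sum(axis=0)                               # normalize each column to a distribution
E = -(P * np.log(P)).sum(axis=0) / np.log(len(X))   # entropy per criterion
w = (1 - E) / (1 - E).sum()                         # degree of divergence -> weights

# TOPSIS: rank by closeness to the ideal and distance from the anti-ideal solution.
R = X / np.sqrt((X**2).sum(axis=0))                 # vector-normalized matrix
V = R * w                                           # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)          # benefit criteria assumed
d_pos = np.sqrt(((V - ideal)**2).sum(axis=1))
d_neg = np.sqrt(((V - anti)**2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)
print("ranking (best first):", np.argsort(-closeness))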

8103 Basic Business-Forces behind the Surviving and Sustainable Organizations: The Case of Medium Scale Contractors in South Africa

Authors: Iruka C. Anugwo, Winston M. Shakantu

Abstract:

The objective of this study is to uncover the basic business-forces behind the survival and sustainable performance of medium scale contractors in the South African construction market. The study is essential, as it aims to contribute towards long-term strategic solutions for combating the incessant failure of start-up construction organizations in South Africa. A qualitative research methodology was used as the most appropriate approach to elicit and uncover the basic business-forces at work among the active contractors in the market. The study adopted a phenomenological approach: in-depth interviews were conducted with 20 medium scale contractors in Port Elizabeth, South Africa, between August and October 2015. This allowed for an in-depth understanding of the critical basic business-forces that influenced their survival and performance beyond the first five years of business operation. The findings showed that for potential contractors (start-ups) to survive in a competitive business environment such as the construction industry, they must possess the basic business-forces: educational knowledge in construction and business management related disciplines, adequate industrial experience, the competencies and capabilities to deliver excellent services and products, and a spirit of entrepreneurship. It can therefore be concluded that, as a strategic approach to minimizing the endless failure of start-up construction businesses, potential construction contractors must endeavour to acquire basic education, training and qualifications, and to gain industrial experience together with the required competencies, capabilities and entrepreneurial acumen. Without these basic business-forces, as discovered in this study, the majority of contractors entering the market will find it difficult to develop and grow a competitive and sustainable construction organization in South Africa.

Keywords: Basic business-forces, medium scale contractors, South Africa, sustainable organisations.

8102 Information Extraction from Unstructured and Ungrammatical Data Sources for Semantic Annotation

Authors: Quratulain N. Rajput, Sajjad Haider, Nasir Touheed

Abstract:

The internet has become an attractive avenue for global e-business, e-learning, knowledge sharing, etc. Due to the continuous increase in the volume of web content, it is not practical for a user to extract information by browsing and integrating data from the huge number of web sources retrieved by existing search engines. Semantic web technology enables advances in information extraction by providing a suite of tools to integrate data from different sources. To take full advantage of the semantic web, it is necessary to annotate existing web pages as semantic web pages. This research develops a tool, named OWIE (Ontology-based Web Information Extraction), for semantic web annotation using domain-specific ontologies. The tool automatically extracts information from HTML pages with the help of pre-defined ontologies and gives it a semantic representation. Two case studies have been conducted to analyze the accuracy of OWIE.

Keywords: Ontology, Semantic Annotation, Wrapper, Information Extraction.

8101 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet

Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel

Abstract:

Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution for sustainable sanitation with the development of an innovative toilet system, the Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. This technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in the SimaPro software employing the Ecoinvent v3.3 database. This study has determined the factors contributing most to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain operating parameters as critical to the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOx emissions, amount of ash and transportation of fertilizer. The analysis has provided the distributions and confidence intervals of the selected impact categories, so that more credible conclusions can be drawn on the respective LCIA (Life Cycle Impact Assessment) profile of the NMT system. Finally, the study also yields essential insights into a methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.

Keywords: Sanitation systems, nano membrane toilet, LCA, stochastic uncertainty analysis, Monte Carlo Simulations, artificial neural network.
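
The stochastic LCI step the abstract describes can be illustrated with a minimal Monte Carlo sketch; the distributions, the linear impact function standing in for the ANN surrogate, and all numbers below are assumptions, not values from the NMT study:

import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # Monte Carlo draws

# Hypothetical input distributions for the uncertain inventory parameters named
# in the abstract (units and ranges are illustrative only).
electricity_kwh = rng.normal(1.2, 0.15, N)                # produced electricity per functional unit
nox_kg          = rng.lognormal(np.log(0.01), 0.3, N)     # NOx emissions
ash_kg          = rng.triangular(0.05, 0.08, 0.12, N)     # amount of ash

# Stand-in linear characterization model mapping inputs to one impact score;
# the paper uses an ANN surrogate of the SimaPro/Ecoinvent model instead.
def impact(e, nox, ash):
    return -0.4 * e + 30.0 * nox + 0.8 * ash              # credit for electricity, burdens otherwise

scores = impact(electricity_kwh, nox_kg, ash_kg)
lo, hi = np.percentile(scores, [2.5, 97.5])
print(f"mean impact {scores.mean():.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")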

8100 The Evaluation of Production Line Performance by Using ARENA – A Case Study

Authors: Muhammad Marsudi, Hani Shafeek

Abstract:

The purpose of this paper is to simulate the production process of a metal stamping industry and to evaluate the utilization of the production line using the ARENA simulation software. The process time and the standard time for each process of the production line are obtained from data given by the company management. Other data are collected through direct observation of the line. There are three workstations performing ten different types of processes in order to produce a single product type. An ARENA simulation model is then developed based on the collected data. Verification and validation are applied to the ARENA model, and finally the results of the simulation are analyzed. It is found that utilization at each workstation increases if the batch size is increased while the throughput rate is kept constant. This study is very useful for the company because it needs to improve the efficiency and utilization of its production lines.

Keywords: Arena software, case study, production line, utilization.
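
As a minimal illustration of the utilization metric the study reports, per-station utilization can be computed from simulation output as busy time over observed time; the event records below are invented placeholders, not the case-study data:

# Utilization = total busy time / total observed time, per workstation.
busy_intervals = {
    "WS1": [(0.0, 4.5), (5.0, 9.0)],   # (start, end) of busy periods, minutes
    "WS2": [(1.0, 3.0), (4.0, 8.5)],
    "WS3": [(2.0, 7.0)],
}
horizon = 10.0  # simulated minutes observed

for ws, intervals in busy_intervals.items():
    busy = sum(end - start for start, end in intervals)
    print(f"{ws}: utilization {busy / horizon:.0%}")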

8099 Investigation of Optimal Parameter Settings in Super Duplex Welding

Authors: R. M. Chandima Ratnayake, Daniel Dyakov

Abstract:

Super duplex steel materials play a vital role in the construction and fabrication of structural, piping and pipeline components. By assuring the integrity of onshore and offshore operating systems, they enable life cycle costs to be minimized. In this context, duplex stainless steel (DSS) welding in construction and fabrication plays a significant role in maintaining and assuring integrity, at optimal expenditure, over the life cycle of production and process systems as well as associated structures. In DSS welding, factors such as gap geometry, shielding gas supply rate, welding current, and the type of welding process are vital to the final joint performance. Hence, an experimental investigation was performed using an engineering robust design approach (ERDA) to find the settings that generate optimal super DSS (i.e., UNS S32750) joint performance. This manuscript illustrates the mathematical approach and experimental design, the optimal parameter settings and the results of the verification experiment.

Keywords: Duplex stainless steel welding, engineering robust design, mathematical framework, optimal parameter settings.

8098 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider

Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón

Abstract:

The selection of diffractive events in the ALICE experiment during the first data-taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps occur. Better measurements would be possible by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the Detector Control System (DCS). The DCS must be designed to operate this detector safely and correctly. Furthermore, the DCS must provide optimum operating conditions for the acquisition and storage of physics data and ensure these are of the highest quality. The operation of AD0 implies the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating-condition data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for data taking in the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a number of parameters, such as the nominal voltages for each photomultiplier tube (PMT), their threshold levels for accepting or rejecting incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0, and they have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks, realized as operating panels and scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system integrated into the global control system of the ALICE experiment.

Keywords: AD0, ALICE, DCS, LHC.

8097 Applications of Genetic Programming in Data Mining

Authors: Saleh Mesbah Elkaffas, Ahmed A. Toony

Abstract:

This paper details the application of a genetic programming framework for induction of useful classification rules from a database of income statements, balance sheets, and cash flow statements for North American public companies. Potentially interesting classification rules are discovered. Anomalies in the discovery process merit further investigation of the application of genetic programming to the dataset for the problem domain.

Keywords: Genetic programming, data mining, classification rule.

8096 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition

Authors: A. Bayaga

Abstract:

This research provides a technical account of estimating transition intensities using a time-homogeneous Markov jump process, applied to South African HIV/AIDS data from Statistics South Africa. It employs a maximum likelihood estimator (MLE) model to explore the possible influence of transition probabilities on mortality cases, with the data based on actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic. The model was fitted to age-specific HIV prevalence data and recorded death data using the MLE model. Though previous model results suggest that HIV in South Africa has declined and that AIDS mortality rates have declined over 2002-2013, our results differ markedly from the generally accepted HIV models (Spectrum/EPP and ASSA2008) in South Africa. Supplementary research is therefore needed to enhance the demographic parameters in the model, and to apply it to each of the nine provinces of South Africa.

Keywords: AIDS mortality rates, Epidemiological model, Time-homogeneous Markov Jump Process, Transition Probability, Statistics South Africa.
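
For a time-homogeneous Markov jump process, the standard MLE of the transition intensity from state i to j is the number of observed i-to-j jumps divided by the total time spent in state i. A minimal sketch follows, with an invented toy trajectory rather than the Statistics South Africa data:

import numpy as np

# States and a single observed trajectory: (state index, entry time in years).
states = ["HIV-", "HIV+", "AIDS", "dead"]
trajectory = [(0, 0.0), (1, 2.5), (2, 6.0), (3, 8.2)]   # illustrative only

n_states = len(states)
N = np.zeros((n_states, n_states))   # jump counts N_ij
T = np.zeros(n_states)               # holding times T_i

for (s, t), (s_next, t_next) in zip(trajectory, trajectory[1:]):
    N[s, s_next] += 1
    T[s] += t_next - t

# MLE of the generator matrix: q_ij = N_ij / T_i, rows sum to zero.
Q = np.divide(N, T[:, None], out=np.zeros_like(N), where=T[:, None] > 0)
np.fill_diagonal(Q, -Q.sum(axis=1))
print(Q)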

8095 Determination and Comparison of Fabric Pills Distribution Using Image Processing and Spatial Data Analysis Tools

Authors: Lenka Techniková, Maroš Tunák, Jiří Janáček

Abstract:

This work deals with the determination and comparison of pill patterns in two sets of fabric samples which differ in the way the pills were created. The first set contains fabric samples with pills created by simulation on a Martindale abrasion machine, while the pills in the second set originated during normal wear and maintenance. The goal of the study is to determine whether the pattern of fabric pills created by simulation is the same as the pattern of naturally occurring pills. The system for determining and comparing the pills is based on image processing and spatial data analysis tools. First, a 3D reconstruction of the fabric surfaces with the pills is realized using a gradient fields method, which creates a 3D fabric surface from a set of four images. Thereafter, the pills are detected in the 3D fabric surfaces using image processing tools in the MATLAB software. The determination and comparison of the pill patterns of the two sets of fabric samples is based on spatial data analysis using tools in the R software.

Keywords: 3D reconstruction of the surface, image analysis tools, distribution of the pills, spatial data analysis tools.

8094 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone, enabling end users to program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also in Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel than in ISPM, since they directly selected the cells using the mouse. By using natural language for end-user software engineering, the present bottleneck of professional developers may be overcome.

Keywords: Natural language processing, end user development, natural language interfaces, human computer interaction, data recognition, dialog systems, spreadsheet.

8093 Implementation of Building Information Modeling in Turkish Government Sector Projects

Authors: Mohammad Lemar Zalmai, Mustafa Nabi Kocakaya, Cemil Akcay, Ekrem Manisali

Abstract:

In recent years, the Building Information Modeling (BIM) approach has developed expeditiously. As people see the benefits of this approach, it has begun to be used widely in construction projects, and some countries have made it mandatory in order to obtain more benefits from it. To promote the implementation of BIM in construction projects, it is helpful to gather relevant information from surveys and interviews. The purpose of this study is to investigate the current adoption and implementation of BIM in public projects in Turkey. This study identifies the challenges of BIM implementation in Turkey and proposes solutions to overcome them. In this context, the challenges for BIM implementation and the factors that affect BIM usage are determined based on previous academic research and expert opinions gathered through interviews and questionnaire surveys. Several methods are used to process this information in order to obtain weights for the different factors that would make BIM widespread in Turkey. The study summarizes the outcomes of the interviews and questionnaire surveys and proposes suggestions to promote the implementation of BIM in Turkey. We believe the research findings will be a good reference for boosting BIM implementation in Turkey.

Keywords: Building Information Modeling, BIM, BIM implementations, Turkish construction industry, Turkish government sector projects.

8092 MONARC: A Case Study on Simulation Analysis for LHC Activities

Authors: Ciprian Dobre

Abstract:

The scale, complexity and worldwide geographical spread of the LHC computing and data analysis problems are unprecedented in scientific research. The complexity of processing and accessing this data is increased substantially by the size and global span of the major experiments, combined with the limited wide area network bandwidth available. We present the latest generation of the MONARC (MOdels of Networked Analysis at Regional Centers) simulation framework as a design and modeling tool for large scale distributed systems applied to HEP experiments. We present simulation experiments designed to evaluate the capabilities of the current real-world distributed infrastructure to support existing physics analysis processes, and the means by which the experiments band together to meet the technical challenges posed by the storage, access and computing requirements of LHC data analysis within the CMS experiment.

Keywords: Modeling and simulation, evaluation, large scale distributed systems, LHC experiments, CMS.

8091 An Exploratory Study of Reliability of Ranking vs. Rating in Peer Assessment

Authors: Yang Song, Yifan Guo, Edward F. Gehringer

Abstract:

Fifty years of research has found great potential for peer assessment as a pedagogical approach. With peer assessment, not only do students receive more copious assessments; they also learn to become assessors. In recent decades, more educational peer assessments have been facilitated by online systems. Those online systems are designed differently to suit different class settings and student groups, but they basically fall into two categories: rating-based and ranking-based. Rating-based systems ask assessors to rate artifacts one by one following review rubrics. Ranking-based systems allow assessors to review a set of artifacts and give a rank to each of them. Though there are different systems and a large number of users in each category, there is no comprehensive comparison of which design leads to higher reliability. In this paper, we designed algorithms to evaluate assessors' reliability based on their rating/ranking against the global ranks of the artifacts they have reviewed. These algorithms are suitable for data from both rating-based and ranking-based peer assessment systems. The experiments were based on more than 15,000 peer assessments from multiple peer assessment systems. We found that assessors in ranking-based peer assessments are at least 10% more reliable than assessors in rating-based peer assessments. Further analysis also demonstrated that assessors in ranking-based assessments tend to assess the more differentiable artifacts correctly, but there is no such pattern for rating-based assessors.

Keywords: Peer assessment, peer rating, peer ranking, reliability.
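
One plausible way to implement the reliability score the abstract describes is rank agreement between an assessor's ratings (or ranks) and the global ranks of the artifacts they reviewed, e.g. via Kendall's tau; the numbers below are invented, and this is a generic sketch, not the authors' exact algorithm:

import numpy as np
from scipy.stats import kendalltau

global_rank = np.array([1, 4, 2, 3])          # consensus rank of 4 reviewed artifacts
assessor_rating = np.array([95, 60, 80, 82])  # higher rating should imply a better rank

implied_rank = (-assessor_rating).argsort().argsort() + 1  # ratings -> ranks
tau, _ = kendalltau(global_rank, implied_rank)
print(f"reliability (Kendall tau): {tau:.2f}")  # 1.0 = perfect agreement with global ranks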

8090 Assessment of the Number of Damaged Buildings from a Flood Event Using Remote Sensing Technique

Authors: Jaturong Som-ard

Abstract:

Heavy rainfall from 3 to 22 January 2017 swamped much of Ranot district in southern Thailand. Due to the heavy rainfall, the district was flooded, with considerable economic and social losses. The major objective of this study is to detect the flooding extent using Sentinel-1A data and to identify the number of damaged buildings. The data were collected in two stages: pre-flood and during the flood event. Calibration, speckle filtering, geometric correction, and histogram thresholding were performed on the data, based on intensity spectral values, to classify thematic maps. The maps were used to identify the flooding extent using change detection, along with buildings digitized and collected on the JOSM desktop. The number of damaged buildings within the flooding extent was counted with respect to the building data. The total flooded area was observed to be 181.45 sq.km. These areas mostly occurred in the Ban Khao, Ranot, Takhria, and Phang Yang sub-districts, respectively. The Ban Khao sub-district had a higher occurrence than the others because it is located at a lower altitude and closer to the Thale Noi and Thale Luang lakes. The numbers of damaged buildings were high in Khlong Daen (726 features), Tha Bon (645 features), and Ranot sub-district (604 features), respectively. The final flood extent map may be very useful for planning, prevention and management in areas of flood occurrence. The map of building damage can be used for quick response, recovery and mitigation in the affected areas by the organizations concerned.

Keywords: Flooding extent, Sentinel-1A data, JOSM desktop, damaged buildings.
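
The change-detection core of this workflow can be sketched as histogram thresholding of SAR backscatter, flagging pixels that turn dark (water-like) after the event; the arrays, threshold and pixel size below are synthetic stand-ins for the Sentinel-1A scenes, not the study's values:

import numpy as np

rng = np.random.default_rng(1)
pre  = rng.normal(-8.0, 2.0, (500, 500))   # pre-flood backscatter intensity, dB
post = pre.copy()
post[100:300, 100:300] = rng.normal(-17.0, 1.5, (200, 200))  # flooded block: open water scatters little

threshold = -14.0                           # dB, chosen from the post-event histogram
flooded = (post < threshold) & (pre >= threshold)   # newly dark pixels only
pixel_area_km2 = (10 * 10) / 1e6            # assuming 10 m Sentinel-1 pixels
print(f"flood extent: {flooded.sum() * pixel_area_km2:.1f} sq.km")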

8089 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks such as clustering and classification with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: Active Contour, Bayesian, Echocardiographic image, Feature vector.

8088 Smart Sustainable Cities: An Integrated Planning Approach towards Sustainable Urban Energy Systems, India

Authors: Adinarayanane Ramamurthy, Monsingh D. Devadas

Abstract:

Cities represent simultaneously a challenge and an opportunity for climate change policy. Cities are where most energy services are needed, because urbanization is closely linked to high population densities and the concentration of economic activities and production (urban energy demand). Consequently, it is critical to explain the role of cities within the world's energy systems and their correlation with the climate change issue. With more than half of the world's population already living in urban areas, and that percentage expected to rise to 75 per cent by 2050, it is clear that the path to sustainable development must pass through cities. Cities expanding in size and population pose increased challenges to the environment, of which energy is part as a natural resource, and to the quality of life. Nowadays, most cities have already understood the importance of sustainability, both at their local scale and in terms of their contribution to sustainability at higher geographical scales. This requires the perception of a city as a complex and dynamic ecosystem, an open system, or cluster of systems, where energy as well as the other natural resources is transformed to satisfy the needs of the different urban activities. In fact, buildings and transportation generally represent most of a city's direct energy demand, i.e., between 60 and 80 per cent of the overall consumption. Buildings, both residential and services, are usually influenced by the local physical and social conditions. In terms of transport, the energy demand is also strongly linked to the specific characteristics of a city (urban mobility). The concept of a "smart city" builds on statistics as seven key axes of a city's success in moving towards a common platform (brain nerve) of sustainable urban energy systems. With the aforesaid knowledge, the authors suggest a framework for the role of cities, as energy actors, in smart city management. The authors discuss the potential elements needed for energy in smart cities and also identify potential energy actions and relevant barriers. Furthermore, three levels of city smartness in cities' actions to overcome market and institutional failures with a local approach are distinguished. The authors attempt to conceive and implement concepts of city smartness by adopting the city or local government as the nerve center through an integrated planning approach. Finally, the paper concludes with recommendations for the organization of smart sustainable cities for positive change in urban India.

Keywords: Urbanization, Urban Energy Demand, Sustainable Urban Energy Systems, Integrated Planning Approach, Smart Sustainable City.

8087 A TFETI Domain Decomposition Solver for Von Mises Elastoplasticity Model with Combination of Linear Isotropic-Kinematic Hardening

Authors: Martin Cermak, Stanislav Sysala

Abstract:

In this paper we present an efficient parallel implementation of elastoplastic problems based on the TFETI (Total Finite Element Tearing and Interconnecting) domain decomposition method. This approach allows us to solve this nonlinear problem in parallel on supercomputers, decreasing the solution time and enabling problems with millions of DOFs. We consider an associated elastoplastic model with the von Mises plastic criterion and a combination of linear isotropic-kinematic hardening laws. The model is discretized by the implicit Euler method in time and by the finite element method in space. We consider a system of nonlinear equations with a strongly semismooth and strongly monotone operator. The semismooth Newton method is applied to solve this nonlinear system. The corresponding linearized problems arising in the Newton iterations are solved in parallel by the above-mentioned TFETI method. The implementation is realized in our in-house MatSol packages developed in MATLAB.

Keywords: Isotropic-kinematic hardening, TFETI, domain decomposition, parallel solution.

8086 Lean Thinking and E-Commerce as New Opportunities to Improve Partnership in Supply Chain of Construction Industries

Authors: Kaustav Kundu, Alberto Portioli Staudacher

Abstract:

The construction industry plays a vital role in the economy of the world. However, due to the high uncertainty and variability in the industry, its performance is not as efficient in terms of quality, lead times, productivity and costs as that of other industries. Moreover, there are continuous conflicts among the different actors in construction supply chains over profit sharing. Previous studies have suggested partnership as an important approach to promote cooperation among the different actors in construction supply chains and thereby improve overall performance. Construction practitioners have tried to focus on partnership, which can enhance the performance of construction supply chains, but they are not fully aware of the different approaches and techniques for improving partnership. In this research, a systematic review of partnership in relation to construction supply chains is carried out to understand the different elements influencing partnership. The research development of this domain is analyzed by reviewing selected articles published from 1996 to 2015. Based on these papers, three major elements influencing partnership in construction supply chains are identified: the lean approach, relationship building, and e-commerce applications. This study analyses the contributions in the areas within each element and provides suggestions for future development of partnership in construction supply chains.

Keywords: Partnership, construction, lean, SCM, supply chain management.

8085 Rapid Study on Feature Extraction and Classification Models in Healthcare Applications

Authors: S. Sowmyayani

Abstract:

The advancement of computer-aided design helps the medical and security forces. Some applications include biometric recognition, elderly fall detection, face recognition, cancer recognition, tumor recognition, etc. This paper deals with different machine learning algorithms that are generically used for any health care system. The problems most focused on are classification and regression. With the rise of big data, machine learning has become particularly important for solving problems. Machine learning uses two types of techniques: supervised learning and unsupervised learning. The former trains a model on known input and output data and predicts future outputs; classification and regression are supervised learning techniques. Unsupervised learning finds hidden patterns in input data; clustering is one such unsupervised learning technique. The above-mentioned models are discussed briefly in this paper.

Keywords: Supervised learning, unsupervised learning, regression, neural network.
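
The supervised/unsupervised distinction described in the abstract can be illustrated with a minimal scikit-learn sketch on synthetic data (not any of the clinical applications named above):

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Supervised learning: fit a classifier on known inputs/outputs, then predict.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)
print("predicted class:", clf.predict(X[:1]))

# Unsupervised learning: no labels; k-means finds hidden groupings in the inputs.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster assignments:", km.labels_[:5])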

8084 Towards an Understanding of Social Capital in an Online Community of Filipino Music Artists

Authors: Jerome V. Cleofas

Abstract:

Cyberspace has become a viable arena for budding artists to share musical acts in digital form. The increasing relevance of online communities has attracted scholars from various fields, demonstrating its influence on social capital. This paper extends this understanding of social capital to Filipino music artists belonging to the SoundCloud Philippines Facebook group. The study makes use of qualitative data obtained from key-informant interviews and participant observation of online and physical encounters, analyzed using the case study approach. SoundCloud Philippines has over seven hundred members and is composed of Filipino singers, instrumentalists, composers, arrangers, producers, multimedia artists and event managers. Group interactions are a mix of online encounters, based on Facebook and SoundCloud, and physical encounters through meet-ups and events. Benefits reaped from the community are informational, technical, instrumental, promotional, motivational and social support. Under the guidance of online group administrators, collaborative activities such as music productions, concerts and events transpire. Most conflicts and problems that arise are resolved peacefully. Social capital in SoundCloud Philippines is mobilized through recognition, respect and reciprocity.

Keywords: Facebook, music artists, online communities, social capital.

8083 Chaos Theory and Application in Foreign Exchange Rates vs. IRR (Iranian Rial)

Authors: M. A. Torkamani, S. Mahmoodzadeh, S. Pourroostaei, C. Lucas

Abstract:

The daily production of information, and the importance of the sequence of produced data in forecasting the future performance of a market, turn the analysis of data behavior into a problem of analyzing time series. Time series that are very complicated usually appear random, and as a result their changes are considered unpredictable. Yet these series might be the product of a deterministic, nonlinear dynamical (chaotic) process and therefore be predictable. From the point of view of chaos theory, complicated systems merely have a chaotic face: they seem unregulated and random, but they may abide by a specific mathematical formula. In this article, the probability of chaos in several foreign exchange rates vs. the IRR (Iranian Rial) is investigated by means of the strange attractor test and the largest Lyapunov exponent. Results show that data in this market have complex chaotic behavior with a large number of degrees of freedom.

Keywords: Chaos, Exchange Rate, Nonlinear Models.
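
A crude sketch of a largest-Lyapunov-exponent test is shown below on a logistic-map series standing in for the exchange-rate data; the neighbour-tracking scheme is a generic simplification, not the authors' exact procedure. The logistic map's true exponent is ln 2 ~ 0.69, so the estimate should come out clearly positive (one signature of chaos):

import numpy as np

x = np.empty(2000)
x[0] = 0.4
for t in range(len(x) - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])   # chaotic logistic map, r = 4

horizon, exclude = 3, 20   # steps to track divergence; temporal neighbours to skip
rates = []
for i in range(len(x) - horizon):
    d0 = np.abs(x[:len(x) - horizon] - x[i])
    d0[max(0, i - exclude):i + exclude] = np.inf   # don't pair a point with itself
    j = int(np.argmin(d0))                         # nearest non-adjacent neighbour
    d1 = np.abs(x[i + horizon] - x[j + horizon])   # separation a few steps later
    if d0[j] > 0 and d1 > 0:
        rates.append(np.log(d1 / d0[j]) / horizon)
print(f"largest Lyapunov exponent ~ {np.mean(rates):.2f} (positive suggests chaos)")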

8082 A New Heuristic Approach for Large Size Zero-One Multi Knapsack Problem Using Intercept Matrix

Authors: K. Krishna Veni, S. Raja Balachandar

Abstract:

This paper presents a heuristic to solve the large size 0-1 multi-constrained knapsack problem (01MKP), which is NP-hard. Many researchers have used heuristic operators to identify the redundant constraints of a linear programming problem before applying the regular procedure to solve it. We use the intercept matrix to identify the zero-valued variables of 01MKP, which are known as redundant variables. In this heuristic, first the dominance property of the intercept matrix of the constraints is exploited to reduce the search space for finding optimal or near-optimal solutions of 01MKP; second, the solution is improved using the pseudo-utility ratio based on the surrogate constraint of 01MKP. The heuristic is tested on benchmark problems of sizes up to 2500 taken from the literature, and the results are compared with optimum solutions. The space and computational complexity of solving 01MKP using this approach are also presented. The encouraging results, especially for relatively large test problems, indicate that this heuristic can successfully be used for finding good solutions to highly constrained NP-hard problems.

Keywords: 0-1 Multi constrained Knapsack problem, heuristic, computational complexity, NP-Hard problems.
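
The pseudo-utility improvement step can be sketched generically: score items by profit per unit of surrogate (aggregated) resource use and add them greedily while all constraints stay feasible. The intercept-matrix pruning step is not reproduced, and the instance below is invented:

import numpy as np

profit = np.array([10., 13., 7., 8.])
A = np.array([[2., 3., 1., 2.],        # m constraints x n items
              [4., 1., 3., 2.]])
b = np.array([6., 7.])

mu = 1.0 / b                           # surrogate multipliers (one simple choice)
pseudo_utility = profit / (mu @ A)     # higher = more profit per aggregate resource used

x = np.zeros(len(profit))
used = np.zeros_like(b)
for j in np.argsort(-pseudo_utility):  # try items in decreasing pseudo-utility
    if np.all(used + A[:, j] <= b):    # keep feasibility in every constraint
        x[j], used = 1, used + A[:, j]
print("solution:", x, "profit:", profit @ x)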

8081 Improving Similarity Search Using Clustered Data

Authors: Deokho Kim, Wonwoo Lee, Jaewoong Lee, Teresa Ng, Gun-Ill Lee, Jiwon Jeong

Abstract:

This paper presents a method for improving object search accuracy using a deep learning model. A major limitation in providing accurate similarity with deep learning is the requirement of a huge amount of data for training pairwise similarity scores (metrics), which is impractical to collect. Thus, similarity scores are usually trained with a relatively small dataset, often from a different domain, limiting the accuracy of the similarity measurement. For this reason, this paper proposes a deep learning model that can be trained with a significantly smaller amount of data: clustered data, in which each cluster contains a set of visually similar images. To measure similarity distance with the proposed method, the visual features of two images are extracted from intermediate layers of a convolutional neural network with various pooling methods, and the network is trained with pairwise similarity scores defined as zero for images in the same cluster. The proposed method outperforms the state-of-the-art object similarity scoring techniques in evaluations for finding exact items, achieving 86.5% accuracy compared to the 59.9% of the state-of-the-art technique. That is, an exact item can be found among four retrieved images with an accuracy of 86.5%, and the remaining retrievals are likely to be similar products. Therefore, the proposed method can reduce the amount of training data by an order of magnitude while providing a reliable similarity metric.

Keywords: Visual search, deep learning, convolutional neural network, machine learning.
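
The training signal described, target distance zero for images from the same cluster, matches a standard contrastive loss; a minimal PyTorch sketch follows, with a stand-in embedding network rather than the paper's pooled CNN features:

import torch
import torch.nn.functional as F

# Stand-in embedding network; the paper extracts pooled intermediate CNN features instead.
net = torch.nn.Sequential(torch.nn.Linear(128, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, 32))

def contrastive_loss(a, b, same_cluster, margin=1.0):
    d = F.pairwise_distance(net(a), net(b))
    pos = same_cluster * d.pow(2)                         # pull same-cluster pairs toward distance 0
    neg = (1 - same_cluster) * F.relu(margin - d).pow(2)  # push other pairs beyond the margin
    return (pos + neg).mean()

a, b = torch.randn(16, 128), torch.randn(16, 128)         # feature vectors of image pairs
same = torch.randint(0, 2, (16,)).float()                 # 1 = same cluster, 0 = different
loss = contrastive_loss(a, b, same)
loss.backward()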

8080 Intelligent Mobile Search Oriented to Global e-Commerce

Authors: Abdelkader Dekdouk

Abstract:

In this paper we propose a novel approach for searching e-commerce products using a mobile phone, illustrated by a prototype, eCoMobile. This approach aims to globalize mobile search by integrating the concept of user multilingualism into it. To show this, we deal in particular with the English and Arabic languages: the mobile user can formulate a query about a commercial product in either language. The description of the user's information need relies on an ontology that represents the conceptualization of the product catalogue knowledge domain, defined in both English and Arabic. A query expressed on the mobile device client specifies the concept that corresponds to the name of the product, followed by a set of (property, value) pairs describing the characteristics of the product. Once a query is submitted, it is communicated to the server side, which analyses it and in turn performs an HTTP request to an e-commerce application server (such as Amazon). The latter responds by returning an XML file representing a set of elements, where each element defines an item of the searched product with its specific characteristics. The XML file is analyzed on the server side, and the items are then displayed on the mobile device client along with their relevant characteristics in the chosen language.

Keywords: Mobile computing, search engine, multilingual global eCommerce, ontology, XML.
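
The client-display step, turning the returned XML elements into a product name plus (property, value) pairs, can be sketched as follows; the payload and tag names are invented for illustration:

import xml.etree.ElementTree as ET

payload = """<items>
  <item><name>phone</name><price>199</price><brand>Acme</brand></item>
  <item><name>phone</name><price>249</price><brand>Zenith</brand></item>
</items>"""

for item in ET.fromstring(payload):
    pairs = {child.tag: child.text for child in item}
    print(pairs.pop("name"), pairs)   # product name plus its (property, value) pairs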

8079 Evaluating Efficiency of Nina Distribution Company Using Window Data Envelopment Analysis and Malmquist Index

Authors: Hossein Taherian Far, Ali Bazaee

Abstract:

Achieving continuous sustained economic growth, and subsequently economic development, is the target of all countries pursuing it. In this regard, the distribution industry plays an important role in the growth and development of any nation, so estimating the efficiency and productivity of this industry, and identifying the factors influencing it, is very necessary. The objective of the present study is to measure the efficiency and productivity of seven branches of the Nina Distribution Company using window data envelopment analysis and the Malmquist productivity index from spring 2013 to summer 2015. In this study, fixed assets, payroll personnel, operating costs and the duration of collection of receivables were selected as inputs, and net sales, gross profit and percentage of coverage to customers were selected as outputs. The window data envelopment analysis was then carried out, and productivity change was measured using the Malmquist index. The results indicate that the average technical efficiency in the window DEA model follows a sustained fluctuating trend, while the average management efficiency in the window DEA model shows negative growth (a decline) of about 13%. The mean scale efficiency in all windows, except the second, which faced an 8% decline, shows growth of 18% compared to the first window. On the other hand, the mean change in total factor productivity in all branches of the industry shows average negative growth (a decrease) of 12%, which is the result of a negative change in technology.

Keywords: Nina Distribution Company branches, window data envelopment analysis, Malmquist productivity index.
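
A single-window building block of this analysis is the input-oriented CCR efficiency score, computable as a small linear program; the branch data below are illustrative, and the window/Malmquist machinery is not reproduced:

import numpy as np
from scipy.optimize import linprog

X = np.array([[4., 6., 5.],     # inputs (rows) x branches (cols)
              [3., 2., 4.]])
Y = np.array([[10., 9., 12.]])  # outputs (rows) x branches (cols)

def ccr_efficiency(k):
    # min theta  s.t.  X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0
    n = X.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # decision vector is [theta, lam...]
    A_ub = np.hstack([-X[:, [k]], X])           # X lam - theta x_k <= 0
    A_ub = np.vstack([A_ub, np.hstack([np.zeros((Y.shape[0], 1)), -Y])])
    b_ub = np.concatenate([np.zeros(X.shape[0]), -Y[:, k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                              # theta* in (0, 1]; 1 = efficient

for k in range(3):
    print(f"branch {k}: efficiency {ccr_efficiency(k):.2f}")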

8078 Best Starting Pitcher of the Chinese Professional Baseball League in 2009

Authors: Chih-Cheng Chen, Meng-Lung Lin, Yung-Tan Lee, Tien-Tze Chen, Ching-Yu Tseng

Abstract:

Baseball is unique among sports in Taiwan; it has become a "symbol of the Taiwanese spirit and Taiwan's national sport". Taiwan's first professional sports league, the Chinese Professional Baseball League (CPBL), was established in 1989. Starters pitch many more innings over the course of a season, and for a century teams have made all their best pitchers starters. In this study, we attempt to determine the on-field performance of these pitchers and which of them won the most CPBL games in 2009. We utilize a discriminant analysis approach, examining winning pitchers and their statistics, to reliably find the best starting pitcher. The data employed in this paper include innings pitched (IP), earned run average (ERA) and walks plus hits per inning pitched (WPHIP), provided by the official website of the CPBL. The results show that Aaron Rakers was the best starting pitcher of the CPBL. The top 10 CPBL starting pitchers won between 8 and 14 games in the 2009 season, whereas Fisher discriminant analysis predicted they would win between 9 and 20 games, from 1 to 7 games more than their actual counts.

Keywords: Chinese Professional Baseball League, starting pitcher, Fisher's discriminant analysis.
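
The discriminant step can be sketched with Fisher's linear discriminant analysis on the three statistics the abstract names; the five season lines below are invented, not actual 2009 CPBL records:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([[180.1, 2.85, 1.10],   # IP, ERA, WPHIP per pitcher (made-up values)
              [122.0, 4.60, 1.45],
              [165.2, 3.10, 1.22],
              [ 98.0, 5.20, 1.55],
              [171.0, 2.95, 1.18]])
y = np.array([1, 0, 1, 0, 1])        # 1 = top starter by wins, 0 = not

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[150.0, 3.40, 1.25]]))   # classify a new pitcher's season line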

8077 The Data Processing Electronics of the METIS Coronagraph aboard the ESA Solar Orbiter Mission

Authors: M. Focardi, M. Pancrazzi, M. Uslenghi, G. Nicolini, E. Magli, F. Landini, M. Romoli, A. Bemporad, E. Antonucci, S. Fineschi, G. Naletto, P. Nicolosi, D. Spadaro, V. Andretta

Abstract:

METIS is the Multi Element Telescope for Imaging and Spectroscopy, a coronagraph aboard the European Space Agency's Solar Orbiter mission aimed at observing the solar corona via both VIS and UV/EUV narrow-band imaging and spectroscopy. METIS, with its multi-wavelength capabilities, will study in detail the physical processes responsible for coronal heating and the origin and properties of the slow and fast solar wind. The METIS electronics will collect and process scientific data by means of the detectors' proximity electronics, the digital front-end subsystem electronics and the MPPU, the Main Power and Processing Unit, hosting a space-qualified processor, memories and some rad-hard FPGAs acting as digital controllers. This paper reports on the overall METIS electronics architecture and data processing capabilities, conceived to address all the scientific issues as a trade-off solution between requirements and allocated resources, just before the Preliminary Design Review, an ESA milestone, in April 2012.

Keywords: Solar Coronagraph, Data Processing Electronics, VIS and UV/EUV Detectors, LEON Processor, Rad-hard FPGAs

8076 Learning Monte Carlo Data for Circuit Path Length

Authors: Namal A. Senanayake, A. Beg, Withana C. Prasad

Abstract:

This paper analyzes the patterns of Monte Carlo data for a large number of variables and minterms, in order to characterize circuit path length behavior. We propose models determined by a training process on the shortest path lengths derived from a wide range of binary decision diagram (BDD) simulations. The model was created using a feed-forward neural network (NN) modeling methodology. Experimental results for ISCAS benchmark circuits show an RMS error of 0.102 for the shortest path length estimates predicted by the NN model (NNM). Use of such a model can help reduce the time complexity of very large scale integrated (VLSI) circuits and related computer-aided design (CAD) tools that use BDDs.

Keywords: Monte Carlo data, Binary decision diagrams, Neural network modeling, Shortest path length estimation.
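
The NN model (NNM) can be sketched as a small feed-forward regressor mapping circuit features to shortest path length; the feature set and synthetic training data below are assumptions, not the paper's ISCAS setup:

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
X = rng.integers(4, 64, size=(200, 2)).astype(float)   # [num variables, num minterms], stand-in features
y = np.log2(X[:, 0]) + 0.05 * X[:, 1] + rng.normal(0, 0.1, 200)  # fake path lengths

# One hidden layer; the paper's architecture and training regime are not specified here.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)
print(model.predict([[32.0, 20.0]]))   # estimated shortest path length for a new circuit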

8075 Organizational Management Model based on Knowledge Management, Talent Management and Technology Management Framework "Gomak"

Authors: Nieto Bernal W., Luna Amaya C.

Abstract:

This paper aims to present a framework for organizational knowledge management, which seeks to deploy a standardized structure for the integrated management of knowledge: a common language based on domains, processes and global indicators inspired by the COBIT 5 framework (ISACA, 2012), supporting the integration of three technologies, enterprise information architecture (EIA), business process modeling (BPM) and service-oriented architecture (SOA). The Gomak framework is a management platform that seeks to integrate the information technology infrastructure, the application structure, the information infrastructure, and the business logic and business model, to support a sound strategy of organizational knowledge management through a process-based approach and concurrent engineering. Concurrent engineering (CE) is a systematic approach to integrated product development that responds to customer expectations, involving all perspectives in parallel from the beginning of the product life cycle (European Space Agency, 2000).

Keywords: Business Process Modeling, Enterprise Information Architecture, Government and Knowledge Management, Service Oriented Architecture, Process Management.
