Search results for: modal expansion approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5390

4340 New Approaches on Exponential Stability Analysis for Neural Networks with Time-Varying Delays

Authors: Qingqing Wang, Baocheng Chen, Shouming Zhong

Abstract:

In this paper, the Lyapunov functional method is utilized, combined with linear matrix inequality (LMI) techniques and the integral inequality approach (IIA), to study the exponential stability problem for neural networks with discrete and distributed time-varying delays. By constructing a new Lyapunov-Krasovskii functional and dividing the discrete delay interval into multiple segments, some new delay-dependent exponential stability criteria are established in terms of LMIs and can be easily checked. Numerical examples are considered to show that the stability conditions in this paper give much less conservative results than those in the literature.
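For reference, a generic form of the delayed neural network model to which such LMI-based criteria apply (an assumed template; the paper's exact system may differ) is

```latex
\dot{x}(t) = -C x(t) + A f\big(x(t)\big) + B f\big(x(t-\tau(t))\big)
           + D \int_{t-d(t)}^{t} f\big(x(s)\big)\,\mathrm{d}s + u,
\qquad 0 \le \tau(t) \le \bar{\tau}, \quad 0 \le d(t) \le \bar{d},
```

where x(t) is the neuron state vector, C is a positive diagonal matrix, A, B and D are connection weight matrices, f is the activation function, and τ(t) and d(t) are the discrete and distributed time-varying delays; exponential stability means ‖x(t) − x*‖ ≤ M e^{−kt} for some M ≥ 1 and k > 0.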

Keywords: Neural networks, Exponential stability, LMI approach, Time-varying delays.

4339 Analysis of Causality between Defect Causes Using Association Rule Mining

Authors: Sangdeok Lee, Sangwon Han, Changtaek Hyun

Abstract:

Construction defects are a major contributor to negative impacts on project performance, including schedule delays and cost overruns. Since construction defects generally occur when a few associated causes combine, a thorough understanding of defect causality is required in order to prevent construction defects more systematically. To address this issue, this paper uses association rule mining (ARM) to quantify the causality between defect causes, and social network analysis (SNA) to find indirect causality among them. The suggested approach is validated with 350 defect instances from concrete works in 32 projects in Korea. The results show that the interrelationships revealed by the approach reflect the characteristics of the concrete task and the important causes that should be prevented.
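To illustrate the quantification step, below is a minimal sketch (with hypothetical cause labels and thresholds, not the study's actual data) that computes support, confidence and lift for pairs of co-occurring defect causes:

```python
from itertools import combinations
from collections import Counter

# Each defect instance is recorded as the set of causes observed together
# (hypothetical labels; the study used 350 instances from concrete works).
instances = [
    {"poor_curing", "early_formwork_removal"},
    {"poor_curing", "low_cover", "early_formwork_removal"},
    {"low_cover", "honeycombing"},
    {"poor_curing", "early_formwork_removal"},
    {"honeycombing", "segregation"},
]

n = len(instances)
single = Counter(c for inst in instances for c in inst)
pair = Counter(tuple(sorted(p)) for inst in instances for p in combinations(inst, 2))

for (a, b), cnt in pair.items():
    support = cnt / n
    if support < 0.4:                    # hypothetical minimum-support threshold
        continue
    for x, y in ((a, b), (b, a)):        # both rule directions X -> Y
        confidence = cnt / single[x]
        lift = confidence / (single[y] / n)   # lift > 1 suggests positive association
        print(f"{x} -> {y}: support={support:.2f}, confidence={confidence:.2f}, lift={lift:.2f}")
```

Pairs with high lift are the candidate cause-effect links that SNA would then chain into indirect causality.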

Keywords: Causality, defect causes, social network analysis, association rule mining.

4338 Thermomechanical Coupled Analysis of Fiber Reinforced Polymer Composite Square Tube: A Finite Element Study

Authors: M. Ali, K. Alam, E. Ohioma

Abstract:

This paper presents a numerical investigation of the behavior of fiber reinforced polymer (FRP) composite tubes under thermomechanical coupled loading using the finite element software ABAQUS and a special add-on subroutine, CZone. Three cases were explored: pure mechanical loading, pure thermal loading, and coupled thermomechanical loading. The failure index (Tsai-Wu) under all three loading cases was assessed for all plies in the tube walls. The simulation results under pure mechanical loading showed that the composite tube failed at a tensile load of 3.1 kN. However, with the superposition of thermal load on the mechanical load, the failure index of the previously failed plies in the tube walls reduced significantly, causing the tube to fail at 6 kN. This corresponds to a 93% improvement in the load carrying capacity of the composite tube in the present study. The increase in load carrying capacity was attributed to the stress effects of the coefficients of thermal expansion (CTE) on the laminate as well as the inter-laminar stresses induced by the composite stack layup.
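For context, the per-ply Tsai-Wu evaluation can be sketched as follows (generic plane-stress form with illustrative strength values, not the paper's ABAQUS/CZone material data):

```python
import math

def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index; a value >= 1.0 indicates ply failure.
    s1, s2: stresses along/transverse to the fibers; t12: in-plane shear (MPa).
    Xt/Xc, Yt/Yc: tensile/compressive strengths; S: shear strength."""
    F1  = 1.0 / Xt - 1.0 / Xc
    F2  = 1.0 / Yt - 1.0 / Yc
    F11 = 1.0 / (Xt * Xc)
    F22 = 1.0 / (Yt * Yc)
    F66 = 1.0 / S**2
    F12 = -0.5 * math.sqrt(F11 * F22)   # common default for the interaction term
    return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
            + F66 * t12**2 + 2.0 * F12 * s1 * s2)

# Example ply stress state (illustrative numbers only, in MPa)
print(tsai_wu_index(s1=600.0, s2=20.0, t12=30.0,
                    Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=200.0, S=70.0))
```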

Keywords: Thermal, mechanical, composites, square tubes.

4337 Optimal Manufacturing Scheduling for Dependent Details Processing

Authors: Ivan C. Mustakerov, Daniela I. Borissova

Abstract:

The increasing competitiveness in the manufacturing industry is forcing manufacturers to seek effective processing schedules. The paper presents an optimization-based manufacturing scheduling approach for dependent details processing with given processing sequences and times on multiple machines. By defining decision variables as the start and end moments of details processing, it is possible to use straightforward variable restrictions to satisfy different technological requirements and to formulate easy-to-understand and easy-to-solve optimization tasks for any number of details and machines. A case study example is solved for seven base moldings for CNC metalworking machines processed on five different machines with a given processing order among details and machines and known processing time durations. As a result of the linear optimization task solution, the optimal manufacturing schedule minimizing the overall processing time is obtained. The manufacturing schedule defines the moments of moldings delivery, thus minimizing storage costs and ensuring mounting due-time satisfaction. The proposed optimization approach is based on a real manufacturing plant problem. Different processing schedule variants for different technological restrictions were defined and implemented in the practice of the Bulgarian company RAIS Ltd. The proposed approach could be generalized to other job shop scheduling problems for different applications.
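A minimal sketch of the modeling idea follows, with start moments as decision variables, precedence along each detail's technological route, a given processing order on each machine, and the overall processing time as the objective (hypothetical three-detail data and the PuLP library are assumed; the case study itself used seven moldings on five machines):

```python
import pulp

# Processing times for (detail, machine) operations -- illustrative data only
proc = {("d1", "m1"): 4, ("d1", "m2"): 3,
        ("d2", "m1"): 2, ("d2", "m2"): 5,
        ("d3", "m2"): 3, ("d3", "m1"): 2}

# Given technological routes (per detail) and given processing order (per machine)
routes = {"d1": ["m1", "m2"], "d2": ["m1", "m2"], "d3": ["m2", "m1"]}
machine_order = {"m1": ["d1", "d2", "d3"], "m2": ["d3", "d1", "d2"]}

model = pulp.LpProblem("dependent_details_scheduling", pulp.LpMinimize)
start = {op: pulp.LpVariable(f"start_{op[0]}_{op[1]}", lowBound=0) for op in proc}
makespan = pulp.LpVariable("makespan", lowBound=0)
model += makespan                                 # minimize the overall processing time

for d, ms in routes.items():                      # precedence along each detail's route
    for m_prev, m_next in zip(ms, ms[1:]):
        model += start[(d, m_next)] >= start[(d, m_prev)] + proc[(d, m_prev)]

for m, ds in machine_order.items():               # fixed processing order on each machine
    for d_prev, d_next in zip(ds, ds[1:]):
        model += start[(d_next, m)] >= start[(d_prev, m)] + proc[(d_prev, m)]

for op, t in proc.items():                        # makespan covers every operation's end moment
    model += makespan >= start[op] + t

model.solve(pulp.PULP_CBC_CMD(msg=False))
print("minimal overall processing time:", pulp.value(makespan))
```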

Keywords: Optimal manufacturing scheduling, linear programming, metalworking machines production, dependent details processing.

4336 Investigating the Effect of Uncertainty on a LP Model of a Petrochemical Complex: Stability Analysis Approach

Authors: Abdallah Al-Shammari

Abstract:

This study discusses the effect of uncertainty on the production levels of a petrochemical complex. Uncertainty, i.e. variations in some model parameters such as prices and the supply and demand of materials, can affect the optimality or the efficiency of any chemical process. For a petrochemical complex with many plants, there are many sources of uncertainty and frequent variations which require more attention. Many optimization approaches have been proposed in the literature to incorporate uncertainty within the model in order to obtain a robust solution. In this work, a stability analysis approach is applied to a deterministic LP model of a petrochemical complex consisting of ten plants to investigate the effect of such variations on the obtained optimal production levels. The proposed approach can determine the allowable variation ranges of some parameters, mainly objective or RHS coefficients, before the system loses its optimality. Parameters with relatively narrow ranges of variation, i.e. stability limits, are classified as sensitive parameters or constraints that need accurate estimates or intensive monitoring. These stability limits offer easy-to-use information to the decision maker and help in understanding the interaction between some model parameters and in deciding when the system needs to be re-optimized. The study shows that the maximum production of ethylene and the prices of intermediate products are the most sensitive factors affecting the stability of the optimum solution.
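The ranging idea can be sketched on a toy two-product LP (hypothetical coefficients, not the ten-plant complex): perturb one objective coefficient and re-solve until the optimal production mix changes, which marks the stability limit of that parameter.

```python
import numpy as np
from scipy.optimize import linprog

# Toy model: maximize profit c.x subject to resource limits (A_ub x <= b_ub).
# linprog minimizes, so the objective coefficients are negated.
c = np.array([30.0, 20.0])                     # product prices (hypothetical)
A_ub = np.array([[1.0, 1.0], [2.0, 1.0]])      # feedstock and capacity constraints
b_ub = np.array([100.0, 160.0])

def optimal_mix(c_vec):
    res = linprog(-c_vec, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2, method="highs")
    return np.round(res.x, 6)

base = optimal_mix(c)

def stability_limit(index, direction, step=0.5, max_delta=50.0):
    """Smallest change of objective coefficient `index` (in +/- `direction`)
    that alters the optimal production mix; None if not reached."""
    delta = step
    while delta <= max_delta:
        c_pert = c.copy()
        c_pert[index] += direction * delta
        if not np.allclose(optimal_mix(c_pert), base):
            return direction * delta
        delta += step
    return None

print("allowable variation of price 1:", stability_limit(0, -1), "to", stability_limit(0, +1))
```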

Keywords: Linear programming, petrochemicals, stability analysis, uncertainty.

4335 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance

Authors: Rajinder Singh, Ram Valluru

Abstract:

The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium to longer term liabilities. The relative strengths and weaknesses of the various alternative approaches revolve around: stability of the recent loss development pattern, sufficiency and reliability of loss development data, and agreement/disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good tradeoff between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience. Further, BF is based on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their life-time development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period). The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend, but levels varied substantially depending upon the economic and operational conditions during the development period spanning many years. The proposed approach provides the ability to indirectly incorporate such exogenous factors and produces more stable loss forecasts for reserving purposes compared to the traditional CL and BF methods.
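A minimal sketch of the parametrization step is shown below: fit a sigmoidal development curve to one cohort's cumulative reported losses and read off the implied ultimate (illustrative data only; the actual study fits homogeneous MI cohorts with logistic regression and allows for exogenous conditions):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_curve(t, ultimate, k, t0):
    """Cumulative loss at development age t following a sigmoidal trend."""
    return ultimate / (1.0 + np.exp(-k * (t - t0)))

# Development ages (quarters) and cumulative reported losses for one cohort (illustrative)
age = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
cum_loss = np.array([5, 12, 30, 62, 95, 120, 135, 142, 146, 148], dtype=float)

params, _ = curve_fit(logistic_curve, age, cum_loss, p0=[150.0, 1.0, 4.0])
ultimate, k, t0 = params
print(f"fitted ultimate loss: {ultimate:.1f}, implied reserve: {ultimate - cum_loss[-1]:.1f}")
```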

Keywords: Actuarial loss reserving techniques, logistic regression, parametric function, volatility.

4334 An Efficient Data Collection Approach for Wireless Sensor Networks

Authors: Hanieh Alipour, Alireza Nemaney Pour

Abstract:

One of the most important applications of wireless sensor networks is data collection. This paper proposes an efficient approach for data collection in wireless sensor networks by introducing the Member Forward List. This list includes the nodes with the highest priority for forwarding the data. When a node fails or dies, the list is used to select the next node in priority order. The benefit of this list is that it prevents the algorithm from repeating when a node fails or dies. The results show that the Member Forward List decreases power consumption and latency in wireless sensor networks.
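A schematic sketch of the Member Forward List idea follows (hypothetical structure; the paper does not publish code): each node keeps a priority-ordered list of candidate forwarders and simply falls through to the next entry when the current forwarder fails, instead of re-running the selection algorithm.

```python
class MemberForwardList:
    """Priority-ordered list of forwarding candidates for one sensor node."""

    def __init__(self, candidates):
        # candidates: [(node_id, priority)], higher priority = preferred forwarder
        self.members = [n for n, _ in sorted(candidates, key=lambda c: -c[1])]
        self.failed = set()

    def current_forwarder(self):
        for node in self.members:
            if node not in self.failed:
                return node
        return None                      # no live forwarder left

    def mark_failed(self, node):
        # On failure/death, fall through to the next member -- no re-election needed
        self.failed.add(node)

mfl = MemberForwardList([("n7", 0.9), ("n3", 0.7), ("n12", 0.4)])
print(mfl.current_forwarder())           # n7
mfl.mark_failed("n7")
print(mfl.current_forwarder())           # n3
```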

Keywords: Data Collection, Wireless Sensor Network, Sensor Node, Tree-Based.

4333 Comparative Canadian Online News Coverage Analysis of Sex Trafficking Reported Cases in Ontario and Nova Scotia

Authors: Alisha Fisher

Abstract:

Sex trafficking is a worldwide crisis that requires trauma-informed and survivor-centered media attention to accurately disseminate information. Much of the previous literature on sex trafficking tends to focus on the frequency of incidents, intervention, and support strategies for survivors, with few studies examining how the media report sex trafficking cases to the public. Utilizing media reports of sex trafficking cases in the two Canadian provinces with the highest number of cases, Ontario and Nova Scotia, we sought to analyze the similarities and differences in how sex trafficking cases were being reported. A total of 20 articles were examined, 10 from the province of Ontario and the remaining 10 from the province of Nova Scotia. We coded in two passes: first, who the article was about, and second, the framing and content inclusion. The results suggest a heavy use of, and reliance on, voices and images of authority, with men of color being shown as the perpetrators and white women being shown as the survivors. These findings can aid in the expansion of trauma-informed, survivor-centered media literacy around reports of sex trafficking, providing accurate insights and helping to develop robust, intersectional approaches to reporting cases of sex trafficking.

Keywords: Sex trafficking, media coverage, Canada sex trafficking, content analysis.

4332 Multi-VSS Scheme by Shifting Random Grids

Authors: Joy Jo-Yi Chang, Justie Su-Tzu Juan

Abstract:

Visual secret sharing (VSS) was proposed by Naor and Shamir in 1995. Visual secret sharing schemes encode a secret image into two or more share images, and a single share image reveals no information about the secret image. When the shares are superimposed, the secret can be restored by human vision. Traditional VSS, however, suffers from problems such as pixel expansion and the cost of sophisticated encoding, and it can encode only one secret image. Schemes for encrypting multiple secret images into two shares by random grids were proposed by Chen et al. in 2008. However, the restored secret images of those schemes exhibit considerable distortion, which limits decoding; in other words, if there is too much distortion, not much information can be encrypted. If the distortion can be kept very small, more secret images can be encrypted. In this paper, four new algorithms based on the scheme of Chang et al. (2010) are proposed. The first algorithm reduces the distortion to a very small level. The second algorithm distributes the distortion between the two restored secret images. The third algorithm achieves no distortion for special secret images. The fourth algorithm encrypts three secret images, which not only retains the advantage of VSS but also improves on the decoding problems.
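For background, the single-secret building block that such random-grid schemes extend is the (2, 2) random-grid encryption of Kafri and Keren, sketched below (this is not one of the paper's four algorithms):

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_random_grids(secret):
    """secret: binary array, 0 = white, 1 = black. Returns two share grids."""
    share1 = rng.integers(0, 2, size=secret.shape)      # first grid is fully random
    share2 = np.where(secret == 0, share1, 1 - share1)  # equal for white, complementary for black
    return share1, share2

def superimpose(share1, share2):
    # Stacking transparencies corresponds to pixel-wise OR (black wins)
    return np.maximum(share1, share2)

secret = np.array([[1, 0, 1],
                   [0, 1, 0]])
s1, s2 = encode_random_grids(secret)
print(superimpose(s1, s2))   # black secret pixels are always black; white ones stay random
```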

Keywords: Visual cryptography, visual secret sharing, random grids, multiple secret image sharing.

4331 Novel GPU Approach in Predicting the Directional Trend of the S&P 500

Authors: A. J. Regan, F. J. Lidgey, M. Betteridge, P. Georgiou, C. Toumazou, K. Hayatleh, J. R. Dibble

Abstract:

Our goal is the development of an algorithm capable of predicting the directional trend of the Standard and Poor's 500 index (S&P 500). Extensive research has been published attempting to predict different financial markets using historical data, testing on an in-sample and trend basis, with many authors employing excessively complex mathematical techniques. In reviewing and evaluating these in-sample methodologies, it became evident that this approach was unable to achieve sufficiently reliable prediction performance for commercial exploitation. For these reasons, we moved to an out-of-sample strategy based on linear regression analysis of an extensive set of financial data correlated with historical closing prices of the S&P 500. We are pleased to report a directional trend accuracy of greater than 55% for tomorrow (t+1) in predicting the S&P 500.
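A toy sketch of the out-of-sample setup follows (synthetic data; the actual study uses an extensive set of correlated financial series): fit a linear regression on a rolling window of strictly past data, then score the sign of the one-step-ahead prediction.

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 600, 5
X = rng.normal(size=(n, k))                       # stand-in for correlated financial inputs
true_w = np.array([0.4, -0.2, 0.1, 0.0, 0.3])
y = X @ true_w + rng.normal(scale=1.0, size=n)    # stand-in for the next-day index return

window, hits, total = 250, 0, 0
for t in range(window, n):
    Xw, yw = X[t - window:t], y[t - window:t]     # train strictly on past observations
    w, *_ = np.linalg.lstsq(np.c_[np.ones(window), Xw], yw, rcond=None)
    pred = w[0] + X[t] @ w[1:]                    # one-step-ahead forecast
    hits += int(np.sign(pred) == np.sign(y[t]))
    total += 1

print(f"out-of-sample directional accuracy: {hits / total:.1%}")
```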

Keywords: Financial algorithm, GPU, S&P 500, stock market prediction.

4330 Image Thresholding for Weld Defect Extraction in Industrial Radiographic Testing

Authors: Nafaâ Nacereddine, Latifa Hamami, Djemel Ziou

Abstract:

In non-destructive testing by radiography, a perfect knowledge of the weld defect shape is an essential step in appreciating the quality of the weld and making a decision on its acceptability or rejection. Because of the complex nature of the considered images, and in order that the detected defect region represents the real defect as accurately as possible, the choice of thresholding methods must be made judiciously. In this paper, performance criteria are used to conduct a comparative study of thresholding methods based on the gray level histogram, the 2-D histogram and a locally adaptive approach for weld defect extraction in radiographic images.
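As a point of reference for the gray-level-histogram family of methods, a minimal NumPy implementation of Otsu's global threshold is sketched below (the 2-D histogram and locally adaptive variants compared in the paper are more involved):

```python
import numpy as np

def otsu_threshold(image):
    """Return the gray level maximizing the between-class variance of the histogram."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # cumulative class probability (background)
    mu = np.cumsum(prob * np.arange(256))     # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)
    return int(np.argmax(sigma_b2))

img = np.random.default_rng(1).integers(0, 256, size=(64, 64)).astype(np.uint8)
t = otsu_threshold(img)
defect_mask = img > t                          # candidate defect region
print("threshold:", t, "pixels above:", int(defect_mask.sum()))
```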

Keywords: 1D and 2D histogram, locally adaptive approach, performance criteria, radiographic image, thresholding, weld defect.

4329 The Link between Distributed Leadership and Educational Outcomes: An Overview of Research

Authors: Maria Eliophotou Menon

Abstract:

School leadership is commonly considered to have a significant influence on school effectiveness and improvement. Effective school leaders are expected to successfully introduce and support change and innovation at the school unit. Despite an abundance of studies on educational leadership, very few studies have provided evidence on the link between leadership models, and specific educational and school outcomes. This is true of a popular contemporary approach to leadership, namely, distributed leadership. The paper provides an overview of research findings on the effect of distributed leadership on educational outcomes. The theoretical basis for this approach to leadership is presented, with reference to methodological and research limitations. The paper discusses research findings and draws their implications for educational research on school leadership.

Keywords: Distributed leadership, educational outcomes, leadership research.

4328 Creating 3D Models Using Infrared Thermography with Remotely Piloted Aerial Systems

Authors: P. van Tonder, C. C. Kruger

Abstract:

Concrete structures deteriorate over time, and degradation escalates due to various factors. The rate of deterioration can be complex and unpredictable in nature. Such deterioration may be located beneath the surface of the concrete at high elevations. This emphasizes the need for an efficient method of finding such defects in order to assess their severity. Current methods using thermography to find defects require equipment to reach higher elevations. This can become costly and time-consuming, not to mention the risks involved in having personnel on scaffolding or abseiling at such heights. Accordingly, combining the thermal camera needed for thermography with a remotely piloted aerial system (drone/RPAS) could alleviate some of these issues. Images can be translated into a 3D temperature model to aid concrete diagnostics and, with further research, could be related back to the mechanical properties of the structure, although this is not dealt with in this paper. Such diagnostics include finding delamination, similar to finding delamination on concrete decks, which resides beneath the surface of the concrete before spalling can occur. Delamination can be caused by reinforcement corroding and causing expansion beneath the concrete surface. This can lead to spalling, where concrete pieces start breaking off from the main concrete structure.

Keywords: Concrete, diagnostic, infrared thermography, 3D thermal models.

4327 Impact of Landuse Change on Surface Temperature in Ibadan, Nigeria

Authors: Abegunde Linda, Adedeji Oluwatola

Abstract:

It has become increasingly evident that large-scale development influences the climate. There are concerns that rising temperatures over developed areas could have negative impacts and increase living discomfort within city boundaries. Temperature trends in Ibadan city have received little attention, yet the area experienced heavy urban expansion between 1972 and 2014. This research examines the impact of landuse change on surface temperature, given that the built-up environment absorbs and stores solar energy, resulting in the Urban Heat Island (UHI) effect. Landsat imagery was used to examine the landuse change over a period of 42 years (1972-2014). Land Surface Temperature (LST) was obtained by converting the thermal band to a surface temperature map, and zonal statistics analysis was used to examine the relationship between landuse and temperature emission. The results showed that the settlement area increased to a large extent while the area covered by vegetation reduced during the study period. The spatial and temporal trends of surface temperature are related to the gradual change in urban landuse/landcover, and the settlement area has the highest emission. This research provides useful insight into the temporal behavior of the city of Ibadan.
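The thermal-band conversion can be sketched as follows, using the standard Landsat radiance and brightness-temperature equations with placeholder calibration constants (in practice these are read from the scene metadata) and a simple emissivity correction:

```python
import numpy as np

# Calibration constants are taken from the Landsat scene metadata (MTL file);
# the values below are placeholders for illustration only.
ML, AL = 3.342e-4, 0.1            # radiance rescaling gain / offset
K1, K2 = 774.89, 1321.08          # thermal conversion constants
wavelength = 10.895e-6            # band-effective wavelength (m)
rho = 1.438e-2                    # h*c/sigma (m K)

def land_surface_temperature(dn, emissivity):
    """Thermal-band digital numbers -> LST in degrees Celsius."""
    radiance = ML * dn + AL                             # TOA spectral radiance
    t_bright = K2 / np.log(K1 / radiance + 1.0)         # brightness temperature (K)
    lst = t_bright / (1.0 + (wavelength * t_bright / rho) * np.log(emissivity))
    return lst - 273.15

dn = np.array([[22000, 24000], [26000, 28000]], dtype=float)
emissivity = np.array([[0.97, 0.95], [0.93, 0.92]])     # vegetation -> built-up surfaces
print(np.round(land_surface_temperature(dn, emissivity), 2))
```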

Keywords: Landuse, LST, Remote sensing, UHI.

4326 Study on Specific Energy in Grinding of DRACs: A Response Surface Methodology Approach

Authors: Dayananda Pai, Shrikantha S. Rao, Savitha G. Kini

Abstract:

In this study, the effects of machining parameters on the specific energy during surface grinding of 6061Al-SiC35P composites are investigated. The vol% of SiC, feed and depth of cut were chosen as process variables. The power needed for the calculation of the specific energy is measured using the two-wattmeter method. Experiments are conducted using a standard RSM design called central composite design (CCD). A second-order response surface model was developed for the specific energy. The results identify the factors with a significant influence on minimizing the specific energy. The confirmation results demonstrate the practicability and effectiveness of the proposed approach.
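For clarity, the response variable is the specific grinding energy, i.e. the measured power divided by the material removal rate; a small sketch with illustrative numbers (not the study's measurements) is given below:

```python
def specific_energy(w1_watts, w2_watts, table_speed_mm_s, depth_of_cut_mm, width_mm):
    """Specific energy (J/mm^3) from two-wattmeter power and the removal rate."""
    power = w1_watts + w2_watts                          # two-wattmeter method: P = W1 + W2
    mrr = table_speed_mm_s * depth_of_cut_mm * width_mm  # material removal rate (mm^3/s)
    return power / mrr

# Illustrative values only (not the study's measurements)
print(round(specific_energy(w1_watts=420.0, w2_watts=380.0,
                            table_speed_mm_s=80.0, depth_of_cut_mm=0.02, width_mm=10.0), 1))
```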

Keywords: ANOVA, Metal matrix composites, Response surface methodology, Specific energy, Two watt meter method.

4325 Release Management with Continuous Delivery: A Case Study

Authors: A. Maruf Aytekin

Abstract:

We present our approach to using the continuous delivery pattern for release management. One of the key practices of agile and lean teams is the continuous delivery of new features to stakeholders. The main benefits of this approach lie in the ability to release new applications rapidly, which has a real strategic impact on the competitive advantage of an organization. Organizations that successfully implement Continuous Delivery have the ability to evolve rapidly to support innovation, provide stable and reliable software in more efficient ways, decrease the amount of resources needed for maintenance, and lower software delivery time and costs. One of the objectives of this paper is to elaborate a case study in which the IT division of the Central Securities Depository Institution (MKK) of Turkey applies the Continuous Delivery pattern to improve its release management process.

Keywords: Automation, continuous delivery, deployment, release management.

4324 A Type-2 Fuzzy Model for Link Prediction in Social Network

Authors: Mansoureh Naderipour, Susan Bastani, Mohammad Fazel Zarandi

Abstract:

Predicting links that may occur in the future, as well as missing links, in social networks is an attractive problem in social network analysis. Granular computing can help us model the relationships between human-based systems and the social sciences in this field. In this paper, we present a model based on a granular computing approach and Type-2 fuzzy logic to predict links with regard to nodes' activity and the relationship between two nodes. Our model is tested on collaboration networks. It is found that the accuracy of prediction is significantly higher than that of the Type-1 fuzzy and crisp approaches.

Keywords: Social Network, link prediction, granular computing, Type-2 fuzzy sets.

4323 Kaikaku - Radical Improvement in Production

Authors: D. Gåsvaer, J. von Axelson

Abstract:

Considering today's increasing speed of change, radical and innovative improvement - kaikaku - is a necessity parallel to continuous incremental improvement - kaizen - especially for SMEs, in order to attain the competitive edge needed to be profitable. During 2011, a qualitative single case study with the objective of realizing a kaikaku in production was conducted. The case study was run as a one-year project using a collaborative approach including both researchers and company representatives. The case study was conducted with the purpose of gaining further knowledge about kaikaku realization as well as its implications. The empirical results provide insights into the great productivity results achieved by applying a specific kaikaku realization approach. However, they also shed light on the difficulty and contradiction of combining innovation management and production system development.

Keywords: Kaikaku, Radical improvement, manufacturing, innovation capability

4322 Problem-based Learning Approach to Human Computer Interaction

Authors: Oon-Seng Tan

Abstract:

Human Computer Interaction (HCI) is an emerging field that draws in experts from various disciplines to enhance the application of computer programs and the ease of use for computer users. HCI has much to do with learning and cognition, and an emerging approach to learning and problem-solving is problem-based learning (PBL). The processes of PBL involve important cognitive functions at various stages. This paper illustrates how the closely related fields of HCI, PBL and cognitive psychology can benefit from informing each other through analysing various cognitive functions. Several cognitive functions from the cognitive function disc (CFD) are presented and discussed in relation to the human-computer interface. The paper concludes with the implications of bridging the gaps among these disciplines.

Keywords: problem-based learning, human computer interaction, cognitive psychology, Cognitive Function Disc (CFD)

4321 Automated Java Testing: JUnit versus AspectJ

Authors: Manish Jain, Dinesh Gopalani

Abstract:

The growing dependency of mankind on software technology increases the need for thorough testing of software applications and for automated testing techniques that support testing activities. We have outlined our strategy for performing various types of automated testing of Java applications using AspectJ, which has become the de facto standard for Aspect Oriented Programming (AOP). Likewise, JUnit, a unit testing framework, is the most popular Java testing tool. In this paper, we have evaluated our proposed AOP approach for automated testing against JUnit on various parameters. First we describe the similarity between the two approaches, and then we present a detailed comparison of the two testing techniques on factors such as lines of testing code, learning curve, and testing of private members. We establish that our AOP testing approach using AspectJ has several advantages and is thus more effective than JUnit.

Keywords: Aspect oriented programming, AspectJ, Aspects, JUnit, software testing.

4320 A New Approach to Solve Blasius Equation using Parameter Identification of Nonlinear Functions based on the Bees Algorithm (BA)

Authors: E. Assareh, M.A. Behrang, M. Ghalambaz, A.R. Noghrehabadi, A. Ghanbarzadeh

Abstract:

In this paper, a new approach is introduced to solve the Blasius equation using parameter identification of a nonlinear function which is used as the approximation function. The Bees Algorithm (BA) is applied in order to find the adjustable parameters of the approximation function by minimizing a fitness function involving these parameters. The parameters are determined such that the approximation function satisfies the boundary conditions. In order to demonstrate the presented method, the obtained results are compared with another numerical method. The present method can be easily extended to solve a wide range of problems.
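For reference, the Blasius boundary-layer problem being approximated is the third-order nonlinear two-point boundary value problem

```latex
f'''(\eta) + \tfrac{1}{2}\, f(\eta)\, f''(\eta) = 0,
\qquad f(0) = f'(0) = 0, \qquad f'(\eta) \to 1 \ \text{as} \ \eta \to \infty,
```

and one common setup (an assumption here, since the abstract does not give the exact trial function) is to choose a parametric approximation $\tilde f(\eta; a_1, \dots, a_n)$ constrained to satisfy these boundary conditions and let the Bees Algorithm minimize a fitness based on the summed squared residual $\sum_i \big[\tilde f'''(\eta_i) + \tfrac{1}{2}\tilde f(\eta_i)\tilde f''(\eta_i)\big]^2$ over the adjustable parameters.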

Keywords: Bees Algorithm (BA); Approximate Solutions; Blasius Differential Equation.

4319 Towards Developing a Self-Explanatory Scheduling System Based on a Hybrid Approach

Authors: Jian Zheng, Yoshiyasu Takahashi, Yuichi Kobayashi, Tatsuhiro Sato

Abstract:

In this study, we present a conceptual framework for developing a scheduling system that can generate self-explanatory and easy-to-understand schedules. To this end, a user interface is conceived to help planners record factors that are considered crucial in scheduling, as well as internal and external sources relating to such factors. A hybrid approach combining machine learning and constraint programming is developed to generate schedules and the corresponding factors, and accordingly display them on the user interface. The effects of the proposed system on scheduling are discussed, and it is expected that scheduling efficiency and system understandability will be improved compared with previous scheduling systems.

Keywords: Constraint programming, Factors considered in scheduling, machine learning, scheduling system.

4318 The Labeled Classification and its Application

Authors: M. Nemissi, H. Seridi, H. Akdag

Abstract:

This paper presents and evaluates a new classification method that aims to improve classifier performance and speed up the training process. The proposed approach, called labeled classification, seeks to improve the convergence of the BP (backpropagation) algorithm through the addition of an extra feature (a label) to all training examples. To classify a new example, a test is carried out for each label. The simplicity of implementation is the main advantage of this approach, because no modifications are required in the training algorithms; therefore, it can be used with other acceleration and stabilization techniques. In this work, two models of labeled classification are proposed: the LMLP (Labeled Multi Layered Perceptron) and the LNFC (Labeled Neuro Fuzzy Classifier). These models are tested using the Iris, wine, texture and human thigh databases to evaluate their performance.

Keywords: Artificial neural networks, Fusion of neural network and fuzzy systems, Learning theory, Pattern recognition.

4317 Target Detection with Improved Image Texture Feature Coding Method and Support Vector Machine

Authors: R. Xu, X. Zhao, X. Li, C. Kwan, C.-I Chang

Abstract:

An image texture analysis and target recognition approach using an improved image texture feature coding method (TFCM) and a Support Vector Machine (SVM) for target detection is presented. With the proposed target detection framework, targets of interest can be detected accurately. A cascade sliding-window technique was also developed for automated target localization. Application to mammograms showed that over 88% of normal mammograms and 80% of abnormal mammograms can be correctly identified. The approach was also successfully applied to Synthetic Aperture Radar (SAR) and Ground Penetrating Radar (GPR) images for target detection.

Keywords: Image texture analysis, feature extraction, target detection, pattern classification.

4316 Prioritising the TQM Enablers and IT Resources in the ICT Industry: An AHP Approach

Authors: Suby Khanam, Jamshed Siddiqui, Faisal Talib

Abstract:

Total Quality Management (TQM) is a managerial approach that improves the competitiveness of industry, while information technology (IT) is introduced alongside TQM to handle technical issues, supported by quality experts, in order to fulfil customers' requirements. The present paper aims to utilise the AHP (Analytic Hierarchy Process) methodology to prioritise and rank the hierarchy levels of TQM enablers and IT resources together for their successful implementation in the Information and Communication Technology (ICT) industry. A total of 17 TQM enablers (nine) and IT resources (eight) were identified, partitioned into three categories, and prioritised by the AHP approach. The findings indicate that the 17 sub-criteria can be grouped into three main categories, namely organizing, tools and techniques, and culture and people. Further, out of the 17 sub-criteria, three sub-criteria (top management commitment and support, total employee involvement, and continuous improvement) received the highest priority, whereas three sub-criteria (structural equation modelling, culture change, and customer satisfaction) received the lowest priority. The results suggest a hierarchy model for the ICT industry to prioritise the enablers and resources as well as to improve TQM and IT performance. This paper has managerial implications, suggesting that managers in the ICT industry implement TQM and IT together in their organizations to obtain maximum benefits and to make the best use of available resources. Finally, conclusions, limitations and the future scope of the study are presented.
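A minimal sketch of the AHP prioritisation step on a hypothetical 3x3 pairwise comparison matrix for the three main categories is given below (the study's full hierarchy has 17 sub-criteria): priorities come from the principal eigenvector, and the consistency ratio checks the judgments.

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale (hypothetical judgments)
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)                   # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print("priorities:", np.round(weights, 3), " CR =", round(ci / ri, 3))
```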

Keywords: Analytic Hierarchy Process, Information Technology, Information and Communication Technology, Prioritization, Total Quality Management.

4315 Composite Programming for Electric Passenger Car Selection in Multiple Criteria Decision Making

Authors: C. Ardil

Abstract:

This paper discusses the use of the composite programming method to identify the optimum electric passenger automobile in multiple criteria decision making. With the composite programming approach, a set of alternatives is compared using an optimality measure that gauges how far each is from the optimum solution. In this paper, some key factors (range, battery, engine, maximum speed, acceleration) that customers should consider while purchasing an electric passenger car for daily use are discussed. A numerical illustration is provided to demonstrate the validity and applicability of the proximity measure approach.
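A hedged sketch of the distance-to-optimum comparison on hypothetical car data follows (a single-level weighted L_p distance; full composite programming nests such distances over groups of criteria, and entropic weights would replace the assumed weights):

```python
import numpy as np

# Criteria: range (km), battery (kWh), max speed (km/h), 0-100 km/h time (s, lower is better)
# Rows are candidate electric cars (hypothetical figures).
X = np.array([[420.0, 58.0, 160.0, 7.3],
              [350.0, 50.0, 150.0, 8.9],
              [510.0, 77.0, 180.0, 5.6]])
benefit = np.array([True, True, True, False])   # acceleration time is a cost criterion
weights = np.array([0.35, 0.20, 0.15, 0.30])    # assumed weights
p = 2.0                                         # balancing exponent of the L_p metric

# Normalize so that 1 is best and 0 is worst for every criterion
lo, hi = X.min(axis=0), X.max(axis=0)
norm = np.where(benefit, (X - lo) / (hi - lo), (hi - X) / (hi - lo))

# Composite distance from the ideal point (all criteria at their best level)
distance = (weights * (1.0 - norm) ** p).sum(axis=1) ** (1.0 / p)
print("ranking (best first):", np.argsort(distance))
```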

Keywords: Electric passenger car selection, multiple criteria decision making, proximity measure method, composite programming, entropic weight method.

4314 Large Strain Compression-Tension Behavior of AZ31B Rolled Sheet in the Rolling Direction

Authors: A. Yazdanmehr, H. Jahed

Abstract:

Being made of the lightest commercially available industrial metal, magnesium (Mg) alloys are of interest for light-weighting. Expanding their application to different material processing methods requires Mg properties at large strains. Several room-temperature processes, such as shot and laser peening and hole cold expansion, need compressive large-strain data. Two methods have been proposed in the literature to obtain the stress-strain curve at high strains: 1) anti-buckling guides and 2) small cubic samples. In this paper, an anti-buckling fixture is used with the help of digital image correlation (DIC) to obtain the compression-tension (C-T) response of AZ31B-H24 rolled sheet at large strain values of up to 10.5%. The effect of the anti-buckling fixture on the stress-strain curves is evaluated experimentally by comparing the results with those of compression tests on cubic samples. For testing cubic samples, a new fixture has been designed to increase the accuracy of testing cubic samples with DIC strain measurements. Results show a negligible effect of the anti-buckling fixture on the stress-strain curves, specifically at high strain values.

Keywords: Large strain, compression-tension, loading-unloading, Mg alloys.

4313 Detection of Linkages Between Extreme Flow Measures and Climate Indices

Authors: Mohammed Sharif, Donald Burn

Abstract:

Large scale climate signals and their teleconnections can influence hydro-meteorological variables on a local scale. Several extreme flow and timing measures, including high flow and low flow measures, from 62 hydrometric stations in Canada are investigated to detect possible linkages with several large scale climate indices. The streamflow data used in this study are derived from the Canadian Reference Hydrometric Basin Network and are characterized by relatively pristine and stable land-use conditions with a minimum of 40 years of record. A composite analysis approach was used to identify linkages between extreme flow and timing measures and climate indices. The approach involves determining the 10 highest and 10 lowest values of various climate indices from the data record. Extreme flow and timing measures for each station were examined for the years associated with the 10 largest values and the years associated with the 10 smallest values. In each case, a re-sampling approach was applied to determine if the 10 values of extreme flow measures differed significantly from the series mean. Results indicate that several stations are impacted by the large scale climate indices considered in this study. The results allow the determination of any relationship between stations that exhibit a statistically significant trend and stations for which the extreme measures exhibit a linkage with the climate indices.
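A schematic sketch of the composite/re-sampling test for one station and one climate index is given below (synthetic data): the mean of the extreme-flow measure over the 10 highest-index years is compared against a resampled distribution of 10-year means drawn from the full record.

```python
import numpy as np

rng = np.random.default_rng(7)
years = np.arange(1960, 2010)
annual_max_flow = rng.gamma(shape=4.0, scale=50.0, size=years.size)   # synthetic high-flow measure
climate_index = rng.normal(size=years.size)                           # synthetic climate index

top10 = np.argsort(climate_index)[-10:]              # years with the 10 largest index values
composite_mean = annual_max_flow[top10].mean()

# Re-sampling: distribution of the mean of 10 randomly chosen years
null_means = np.array([rng.choice(annual_max_flow, size=10, replace=False).mean()
                       for _ in range(5000)])
p_value = np.mean(null_means >= composite_mean)      # one-sided exceedance probability
print(f"composite mean = {composite_mean:.1f}, p = {p_value:.3f}")
```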

Keywords: flood analysis, low-flow events, climate change, trend analysis, Canada

4312 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources

Authors: Jolly Puri, Shiv Prasad Yadav

Abstract:

Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form. In real life problems, data may be imprecise or fuzzy. Therefore, in this paper, we propose (i) a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) a transformation of the proposed FMC-DEA model into a pair of crisp models using the α-cut approach, (iii) definitions of the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of components as fuzzy numbers, and (iv) a numerical example to validate the proposed approach.
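For reference, assuming triangular fuzzy data (a common assumption with the α-cut approach, not necessarily the paper's exact membership functions), each fuzzy datum $\tilde a = (a^L, a^M, a^U)$ is replaced at level α by the interval

```latex
(\tilde a)_\alpha = \big[\, a^L + \alpha\,(a^M - a^L),\; a^U - \alpha\,(a^U - a^M) \,\big], \qquad \alpha \in [0, 1],
```

whose lower and upper bounds generate the pair of crisp MC-DEA models.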

Keywords: Multi-component DEA, fuzzy multi-component DEA, fuzzy resources.

4311 Stability Analysis of Linear Fractional Order Neutral System with Multiple Delays by Algebraic Approach

Authors: Lianglin Xiong, Yun Zhao, Tao Jiang

Abstract:

In this paper, we study the stability of n-dimensional linear fractional neutral differential equations with time delays. By using the Laplace transform, we introduce a characteristic equation for the above system with multiple time delays. We show that if all roots of the characteristic equation have negative real parts, then the equilibrium of the above linear fractional-order system is Lyapunov globally asymptotically stable, provided the equilibrium exists, which is almost the same as for classical differential equations. An example is provided to show the effectiveness of the approach presented in this paper.
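For concreteness, one standard form of such a system and its characteristic equation (a generic template; the paper's exact matrices and delays may differ) is

```latex
D^{\alpha}\big[x(t) - C\,x(t-\tau)\big] = A\,x(t) + \sum_{i=1}^{m} B_i\, x(t-\tau_i), \qquad 0 < \alpha < 1,
```

whose Laplace transform leads to the characteristic equation

```latex
\det\!\Big( s^{\alpha}\big(I - C e^{-s\tau}\big) - A - \sum_{i=1}^{m} B_i\, e^{-s\tau_i} \Big) = 0,
```

and the stability criterion states that the equilibrium is globally asymptotically stable when every root s has a negative real part.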

Keywords: Fractional neutral differential equation, Laplace transform, characteristic equation.
