Search results for: Image registration techniques
1024 Concepts Extraction from Discharge Notes using Association Rule Mining
Authors: Basak Oguz Yolcular
Abstract:
A large amount of valuable information is available in plain-text clinical reports, and new techniques and technologies are being applied to extract information from these reports. In this study, we developed a domain-based software system to transform 600 Otorhinolaryngology discharge notes into a structured form for extracting clinical data from the discharge notes. In order to decrease the system processing time, the discharge notes were transformed into a data table after preprocessing. Several word lists were constituted to identify common sections in the discharge notes, including patient history, age, problems, and diagnosis. The n-gram method was used for discovering term co-occurrences within each section. Using this method, a dataset of concept candidates was generated for the validation step, and then the Predictive Apriori algorithm for Association Rule Mining (ARM) was applied to validate the candidate concepts.
Keywords: association rule mining, otorhinolaryngology, predictive apriori, text mining
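A minimal sketch of the pipeline this abstract describes, section-wise n-gram candidate generation followed by association rule mining, is given below. The toy notes, section names, and the use of mlxtend's standard Apriori (as a stand-in for the Predictive Apriori used in the study) are illustrative assumptions.

```python
# Minimal sketch: n-gram concept candidates per section, then association rule mining.
# The sample notes and the use of standard Apriori (instead of Predictive Apriori)
# are assumptions for illustration.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

def ngrams(tokens, n):
    """Return all n-grams (as space-joined strings) from a token list."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Hypothetical, already-sectioned discharge notes (one dict per note).
notes = [
    {"problems": "chronic otitis media", "diagnosis": "chronic otitis media"},
    {"problems": "nasal obstruction septal deviation", "diagnosis": "septal deviation"},
    {"problems": "chronic otitis media hearing loss", "diagnosis": "hearing loss"},
]

# 1) Generate concept candidates: uni- and bigrams within each section.
transactions = []
for note in notes:
    candidates = []
    for section, text in note.items():
        tokens = text.lower().split()
        for n in (1, 2):
            candidates += [f"{section}:{g}" for g in ngrams(tokens, n)]
    transactions.append(candidates)

# 2) One-hot encode the transactions and mine rules to validate the candidates.
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)
frequent = apriori(onehot, min_support=0.5, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```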
1023 Studying the Causes and Affecting Factors of Motorcycle Accidents: A Case Study on the Road Accidents in Zanjan Province (IRAN) - 2007
Authors: A. Beheshti, S. Salkhordeh, H. Amini
Abstract:
Based on statistics released by the Islamic Republic of Iran Police (IRIP), of the 9555 motorcycle accidents that happened in 2007, 857 riders died and 11219 were injured. If we also consider the deaths and injuries in other vehicles' accidents resulting from traffic violations by motorcycle riders, then paying attention to motorcycle accidents becomes all the more necessary. Therefore, in this study we investigated the traits and issues related to production, application, and training, along with the causes of motorcycle accidents from the four perspectives of road, human, environment, and vehicle, based on statistical and geographical analysis of accident sheets prepared by the Iran Road Patrol Department (IRPD). Riders' unfamiliarity with regulations and motorcycling techniques, failure to use safety equipment, inadequate road and junction design for safe motorcycle traffic, and finally the lack of sufficient control by the responsible organizations are among the major causes of these accidents.
Keywords: Motorcycle, motorcycle riders, road accidents, statistical analysis of accidents.
1022 Structural Characteristics of Three-Dimensional Random Packing of Aggregates with Wide Size Distribution
Authors: Kasthurirangan Gopalakrishnan, Naga Shashidhar
Abstract:
The mechanical properties of granular solids depend on the flow of stresses from one particle to another through inter-particle contacts. Although some experimental methods have been used to study inter-particle contacts in the past, preliminary work with these techniques indicated that they do not have the resolution needed to distinguish between contacts that transmit load and those that do not, especially for systems with a wide distribution of particle sizes. In this research, computer simulations are used to study the nature and distribution of contacts in a compact with a wide particle size distribution, representative of the aggregate size distribution used in asphalt pavement construction. The packing fraction, the mean number of contacts, and the distribution of contacts were studied for different scenarios. A methodology to distinguish and compute the fraction of load-bearing particles and the fraction of space-filling particles (particles that do not transmit any force) is needed for further investigation.
Keywords: Computer simulation, three-dimensional particle packing, coordination number, asphalt concrete, aggregates.
1021 A Neuro-Fuzzy Approach Based Voting Scheme for Fault Tolerant Systems Using Artificial Bee Colony Training
Authors: D. Uma Devi, P. Seetha Ramaiah
Abstract:
Voting algorithms are extensively used to make decisions in fault-tolerant systems where the redundant modules give inconsistent outputs. Popular voting algorithms include majority voting, weighted voting, and inexact majority voters. Each of these techniques suffers from scenarios where no agreement exists for the given voter inputs. This has been successfully overcome in the literature using fuzzy theory. Our previous work concentrated on a neuro-fuzzy algorithm in which training of the neural component substantially improved the prediction result of the voting system; however, the weight training of the neural network was sub-optimal. This study proposes to optimize the weights of the neural network using the Artificial Bee Colony algorithm. Experimental results show that the proposed system improves the decision making of the voting algorithms.
Keywords: Voting algorithms, fault tolerance, fault masking, Neuro-Fuzzy System (NFS), Artificial Bee Colony (ABC)
1020 Modelling Medieval Vaults: Digital Simulation of the North Transept Vault of St Mary, Nantwich, England
Authors: N. Webb, A. Buchanan
Abstract:
Digital and virtual heritage is often associated with the recreation of lost artefacts and architecture; however, we can also investigate works that were never completed, using digital tools and techniques. Here we explore physical evidence of a fourteenth-century Gothic vault located in the north transept of St Mary’s church in Nantwich, Cheshire, using existing springer stones that are built into the walls as a starting point. Digital surveying tools are used to document the architecture, followed by an analysis process to hypothesise and simulate possible design solutions, had the vault been completed. A number of options, in both two and three dimensions, are discussed based on comparison with examples of other contemporary vaults, thus adding another specimen to the corpus of vault designs. Dissemination methods such as digital models and 3D prints are also explored as possible resources for demonstrating what the finished vault might have looked like for heritage interpretation and other purposes.
Keywords: Digital simulation, heritage interpretation, medieval vaults, virtual heritage, 3D scanning.
1019 Automatic Inspection of Percussion Caps by Means of Combined 2D and 3D Machine Vision Techniques
Authors: A. Tellaeche, R. Arana, I. Maurtua
Abstract:
Exhaustive quality control is becoming more and more important when commercializing competitive products in the world's globalized market, and it is critical in sectors that must meet the strictest quality requirements. One example is the mass production of percussion caps, a critical element assembled in firearm ammunition. These elements, built in great quantities at very high speed, must be fabricated within a minimum tolerance deviation, owing to their vital role in firing the piece of ammunition into which they are built. This paper outlines a machine vision development for the 100% inspection of percussion caps using simultaneously acquired 2D and 3D images. The acquisition speed and precision of these images of a reflective metallic piece such as a percussion cap, the accuracy of the measurements taken from them, and the multiple fabrication errors detected constitute the main findings of this work.
Keywords: critical tolerance, high-speed decision making, simultaneous 2D/3D machine vision.
1018 Cloud Monitoring and Performance Optimization Ensuring High Availability and Security
Authors: Inayat Ur Rehman, Georgia Sakellari
Abstract:
Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.
Keywords: Cloud computing, cloud monitoring, performance optimization, high availability.
1017 Time Series Simulation by Conditional Generative Adversarial Net
Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto
Abstract:
Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
Keywords: Conditional Generative Adversarial Net, market and credit risk management, neural network, time series.
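A minimal sketch of a conditional GAN for fixed-length time-series windows is given below. The toy conditional data, network sizes, and training settings are illustrative assumptions and not the authors' configuration.

```python
# Minimal CGAN sketch for fixed-length time-series windows conditioned on a label.
# The toy data (regime-dependent volatility), network sizes and training settings
# are illustrative assumptions.
import torch
import torch.nn as nn

SEQ_LEN, NOISE_DIM, COND_DIM = 32, 16, 1

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + COND_DIM, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, SEQ_LEN))
    def forward(self, z, c):
        return self.net(torch.cat([z, c], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(SEQ_LEN + COND_DIM, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1))
    def forward(self, x, c):
        return self.net(torch.cat([x, c], dim=1))

def sample_real(batch):
    """Toy conditional data: higher label -> higher volatility."""
    c = torch.randint(0, 2, (batch, 1)).float()
    sigma = 0.5 + 1.5 * c                      # regime-dependent scale
    return torch.randn(batch, SEQ_LEN) * sigma, c

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    x, c = sample_real(64)
    fake = G(torch.randn(64, NOISE_DIM), c)
    # Discriminator: real vs. generated windows under the same condition.
    loss_d = bce(D(x, c), torch.ones(64, 1)) + bce(D(fake.detach(), c), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator.
    loss_g = bce(D(fake, c), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Simulate new scenarios for a chosen condition (e.g., the high-volatility regime).
with torch.no_grad():
    scenarios = G(torch.randn(1000, NOISE_DIM), torch.ones(1000, 1))
print(scenarios.std().item())
```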
1016 Modelling Customer's Attitude Towards E-Government Services
Authors: Norazah Mohd Suki, T. Ramayah
Abstract:
E-Government structures permit the government to operate in a more transparent and accountable manner and increase the power of the individual in relation to that of the government. This paper identifies the factors that determine customers' attitude towards e-Government services using a theoretical model based on the Technology Acceptance Model. Data relating to the constructs were collected from 200 respondents. The research model was tested using Structural Equation Modeling (SEM) techniques via the Analysis of Moment Structures (AMOS 16) software. SEM is a comprehensive approach to testing hypotheses about relations among observed and latent variables. The proposed model fits the data well. The results demonstrated that acceptance of e-Government services can be explained in terms of compatibility and attitude towards e-Government services. The e-Government services will be compatible with the way users work, and users are more likely to adopt them owing to their familiarity with the Internet for various official, personal, and recreational uses. In addition, managerial implications for government policy makers, government agencies, and system developers are also discussed.
Keywords: E-government, structural equation modelling, attitude, service.
1015 Using Automated Database Reverse Engineering for Database Integration
Authors: M. R. Abbasifard, M. Rahgozar, A. Bayati, P. Pournemati
Abstract:
One important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and a lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard we need to extract the data structures from the legacy systems and integrate them with the new-technology systems. In legacy systems, huge amounts of data are stored in legacy databases. They require particular attention since they need more effort to be normalized, reformatted, and moved to a modern database environment. Designing the new integrated (global) database architecture and applying reverse engineering requires data normalization. This paper proposes the use of database reverse engineering in order to integrate legacy and modern databases in organizations. The suggested approach consists of methods and techniques for generating the data transformation rules needed for data structure normalization.
Keywords: Reverse engineering, database integration, system integration, data structure normalization
1014 An Experimental Study of a Self-Supervised Classifier Ensemble
Authors: Neamat El Gayar
Abstract:
Learning using labeled and unlabelled data has received a considerable amount of attention in the machine learning community due to its potential to reduce the need for expensive labeled data. In this work we present a new method for combining labeled and unlabeled data based on classifier ensembles. The model we propose assumes that each classifier in the ensemble observes the input using a different set of features. The classifiers are initially trained using some labeled samples. The trained classifiers then learn further by labeling the unknown patterns using a teaching signal generated from the decision of the classifier ensemble, i.e. the classifiers self-supervise each other. Experiments on a set of object images are presented. Our experiments investigate different classifier models, different fusion techniques, different training sizes, and different input features. Experimental results reveal that the proposed self-supervised ensemble learning approach reduces the classification error compared with a single classifier and with the traditional ensemble classifier approach.
Keywords: Multiple classifier systems, classifier ensembles, learning using labeled and unlabelled data, k-nearest neighbor classifier, Bayes classifier.
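A minimal sketch of the self-supervised ensemble idea is given below: each member observes a different feature subset, is trained on a small labeled pool, and is then retrained on pseudo-labels produced by the ensemble's majority vote. The dataset, feature split, and classifier choices are illustrative assumptions.

```python
# Minimal sketch: each ensemble member sees a different feature subset, is trained on
# a small labeled pool, then retrained on the ensemble's majority-vote pseudo-labels.
# Dataset, feature split and classifier choices are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

X, y = load_digits(return_X_y=True)
X_lab, X_unlab, y_lab, y_unlab = train_test_split(X, y, train_size=0.05, random_state=0)

# Each ensemble member observes a different (here random, disjoint) feature subset.
rng = np.random.default_rng(0)
views = np.array_split(rng.permutation(X.shape[1]), 3)
members = [KNeighborsClassifier(3), KNeighborsClassifier(7), GaussianNB()]

def fit_all(Xd, yd):
    for clf, view in zip(members, views):
        clf.fit(Xd[:, view], yd)

def vote(Xd):
    preds = np.stack([clf.predict(Xd[:, view]) for clf, view in zip(members, views)])
    # Majority vote across members (the teaching signal).
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

fit_all(X_lab, y_lab)
print("before self-supervision:", (vote(X_unlab) == y_unlab).mean())

# Self-supervision round: label the unknown patterns with the ensemble decision
# and retrain every member on labeled + pseudo-labeled data.
pseudo = vote(X_unlab)
fit_all(np.vstack([X_lab, X_unlab]), np.concatenate([y_lab, pseudo]))
print("after self-supervision:", (vote(X_unlab) == y_unlab).mean())
```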
1013 Analysis of Boiling in Rectangular Micro Channel Heat Sink
Authors: Ahmed Jassim Shkarah, Mohd Yusoff Bin Sulaiman, Md Razali bin Hj Ayob
Abstract:
A 3D conjugate numerical investigation was conducted to predict the heat transfer characteristics in a rectangular cross-section micro-channel employing simultaneously developing two-phase flows. The purpose of analyzing two-phase flow heat transfer in a rectangular micro-channel is to pinpoint the different factors affecting this phenomenon. Different methods and techniques have been used to analyze the equations governing the flow of heat from the gas phase to the liquid phase and vice versa. Different models of micro-channels have been identified and analyzed, and the effect of micro-channel geometry, i.e. circular versus non-circular cross-sections, has also been reviewed. To study the results, the average Nusselt number plotted against the Reynolds number was considered in order to examine the average heat exchange in micro-channels against the applied heat flux. High heat fluxes up to 140 W/cm2 were applied to investigate the micro-channel thermal characteristics.
Keywords: Two-phase flow, micro-channel, VOF.
1012 Design and Implementation of a Hybrid Fuzzy Controller for a High-Performance Induction Motor
Authors: M. Zerikat, S. Chekroun
Abstract:
This paper proposes an effective hybrid control algorithm combining fuzzy logic and conventional control techniques for controlling the speed of an induction motor assumed to operate in a high-performance drives environment. Introducing fuzzy logic into the control system helps to achieve good dynamic response, disturbance rejection, and low sensitivity to parameter variations and external influences. Some fundamentals of fuzzy logic control are first illustrated. The developed control algorithm is robust, efficient, and simple. It also assures precise trajectory tracking with the prescribed dynamics. Experimental results have shown excellent tracking performance of the proposed control system, and have convincingly demonstrated the validity and usefulness of the hybrid fuzzy controller in high-performance drives with parameter and load uncertainties. Satisfactory performance was observed for most reference tracks.
Keywords: Fuzzy controller, high-performance, induction motor, intelligent control, robustness.
1011 Fuzzy Time Series Forecasting Using Percentage Change as the Universe of Discourse
Authors: Meredith Stevenson, John E. Porter
Abstract:
Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory methods. These methods have used either enrollment numbers or differences of enrollments as the universe of discourse. We propose using the year-to-year percentage change as the universe of discourse. In this communication, the approach of Jilani, Burney, and Ardil is modified by using the year-to-year percentage change as the universe of discourse. We use enrollment figures for the University of Alabama to illustrate our proposed method. The proposed method results in better forecasting accuracy than existing models.
Keywords: Fuzzy forecasting, fuzzy time series, fuzzified enrollments, time-invariant model
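A minimal sketch of a first-order fuzzy time series forecast using the year-to-year percentage change as the universe of discourse is given below. A simple Chen-style model is used as a stand-in for the modified Jilani-Burney-Ardil approach, and the enrollment series is a synthetic placeholder rather than the actual Alabama data.

```python
# Minimal first-order fuzzy time series sketch (Chen-style model as a simple stand-in)
# with the year-to-year percentage change as the universe of discourse.
# The enrollment series below is a synthetic placeholder.
import numpy as np

enroll = np.array([13000, 13500, 13900, 14700, 15500, 15300, 15600, 15900, 16800, 16900], float)
pct = 100.0 * np.diff(enroll) / enroll[:-1]          # universe of discourse: % change

# Partition the universe into equal-width intervals (fuzzy sets A1..Ak).
k = 5
edges = np.linspace(pct.min() - 1e-9, pct.max() + 1e-9, k + 1)
mids = 0.5 * (edges[:-1] + edges[1:])
fuzzify = lambda v: int(np.clip(np.searchsorted(edges, v) - 1, 0, k - 1))
states = [fuzzify(v) for v in pct]

# First-order fuzzy logical relationships: A_i -> {A_j, ...}
flr = {}
for a, b in zip(states[:-1], states[1:]):
    flr.setdefault(a, set()).add(b)

# Forecast: defuzzify as the mean midpoint of the consequent sets, then convert the
# predicted % change back to an enrollment level.
last_state = states[-1]
pred_pct = np.mean([mids[j] for j in flr.get(last_state, {last_state})])
print("forecast enrollment:", enroll[-1] * (1 + pred_pct / 100.0))
```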
1010 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems
Authors: Nermin Sökmen
Abstract:
An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques for measuring the functional complexity of a computer system and investigates its impact on system development effort. It then examines the effects of technical difficulty and design team capability factors in order to construct the best effort estimation model. Using a traditional regression analysis technique, the study develops a system development effort estimation model which takes functional complexity, technical difficulty, and design team capability factors as input parameters. Finally, the assumptions of the model are tested.
Keywords: Functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis.
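A minimal sketch of the regression step is given below: development effort regressed on functional complexity, technical difficulty, and design team capability. The project records are synthetic placeholders, not data from the study.

```python
# Minimal sketch of an effort model of the form described above: development effort
# regressed on functional complexity, technical difficulty and design team capability.
# The project records here are synthetic placeholders, not data from the study.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: functional complexity, technical difficulty, team capability (higher = stronger)
X = np.array([
    [120, 2, 4], [300, 3, 3], [80, 1, 5], [450, 4, 2],
    [200, 2, 3], [350, 5, 4], [150, 3, 5], [500, 4, 3],
], dtype=float)
effort = np.array([8, 26, 4, 55, 15, 38, 10, 52], dtype=float)  # person-months (synthetic)

model = LinearRegression().fit(X, effort)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R^2 on training data:", model.score(X, effort))

# Estimate effort for a new system (hypothetical inputs).
print("predicted effort:", model.predict([[250, 3, 4]])[0])
```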
1009 Heuristic Search Algorithms for Tuning PUMA 560 Fuzzy PID Controller
Authors: Sufian Ashraf Mazhari, Surendra Kumar
Abstract:
This paper compares heuristic global search techniques, namely the genetic algorithm, particle swarm optimization, simulated annealing, generalized pattern search, and the genetic algorithm hybridized with Nelder–Mead and with generalized pattern search, for tuning a fuzzy PID controller for the PUMA 560. Since the actual control is in joint space, inverse kinematics is used to generate the joint angles corresponding to the desired Cartesian-space trajectory. Efficient dynamics and kinematics are modeled in MATLAB, which takes very little simulation time. The performance of all the tuning methods, with and without disturbance, is compared in terms of ITSE in joint space and ISE in Cartesian space for spiral trajectory tracking. The genetic algorithm hybridized with generalized pattern search shows the best performance.
Keywords: Controller tuning, fuzzy control, genetic algorithm, heuristic search, robot control.
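A minimal sketch of heuristic tuning by minimizing ITSE is given below. A plain random search over conventional PID gains on a toy second-order plant stands in for the paper's global search techniques, fuzzy PID structure, and PUMA 560 dynamics.

```python
# Minimal sketch of heuristic controller tuning by minimizing ITSE. A plain random
# search over PID gains on a toy second-order plant stands in for the paper's global
# search techniques, fuzzy PID structure and PUMA 560 dynamics.
import numpy as np

def itse_of_gains(kp, ki, kd, dt=0.001, t_end=2.0):
    """Simulate a unit-step response of x'' + 2x' + x = u under PID control, return ITSE."""
    x = v = integ = prev_e = 0.0
    itse = 0.0
    for k in range(int(t_end / dt)):
        t = k * dt
        e = 1.0 - x                       # unit-step reference
        integ += e * dt
        deriv = (e - prev_e) / dt
        u = kp * e + ki * integ + kd * deriv
        a = u - 2.0 * v - x               # plant acceleration (simple Euler integration)
        v += a * dt
        x += v * dt
        itse += t * e * e * dt
        prev_e = e
    return itse

rng = np.random.default_rng(1)
best_gains, best_cost = None, np.inf
for _ in range(300):                      # random search as the heuristic optimizer
    kp, ki, kd = rng.uniform(0, 50), rng.uniform(0, 20), rng.uniform(0, 5)
    cost = itse_of_gains(kp, ki, kd)
    if cost < best_cost:
        best_gains, best_cost = (kp, ki, kd), cost
print("best (Kp, Ki, Kd):", best_gains, "ITSE:", best_cost)
```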
1008 Genetic and Simulated Annealing Based Algorithms for Solving the Flow Assignment Problem in Computer Networks
Authors: Tarek M. Mahmoud
Abstract:
Selecting the routes and assigning the link flows in a computer communication network are extremely complex combinatorial optimization problems. Metaheuristics, such as genetic or simulated annealing algorithms, are widely applicable heuristic optimization strategies that have shown encouraging results for a large number of difficult combinatorial optimization problems. This paper considers the route selection and hence the flow assignment problem. A genetic algorithm and a simulated annealing algorithm are used to solve this problem. A new hybrid algorithm combining the genetic and simulated annealing algorithms is introduced, and a modification of the genetic algorithm is also introduced. Computational experiments with sample networks are reported. The results show that the proposed modified genetic algorithm is efficient in finding good solutions of the flow assignment problem compared with other techniques.
Keywords: Genetic algorithms, flow assignment, routing, computer networks, simulated annealing.
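A minimal simulated annealing sketch for the flow assignment problem is given below: each demand is assigned to one of its candidate paths so that a quadratic link congestion cost is minimized. The toy network, demands, and candidate paths are illustrative assumptions.

```python
# Minimal simulated annealing sketch for flow assignment: each demand is assigned to
# one of its candidate paths to minimize a quadratic link congestion cost.
# The toy network, demands and candidate paths are illustrative assumptions.
import math
import random

N_LINKS = 6
# candidate_paths[d] = candidate paths for demand d, each path given as a set of link ids
candidate_paths = [
    [{0, 1}, {2, 3}],
    [{1, 4}, {0, 5}],
    [{2, 5}, {3, 4}],
    [{0, 3}, {1, 2, 5}],
]
demand_flow = [4.0, 2.0, 3.0, 5.0]

def cost(assignment):
    """Sum of squared link loads (penalizes congested links)."""
    load = [0.0] * N_LINKS
    for d, path_idx in enumerate(assignment):
        for link in candidate_paths[d][path_idx]:
            load[link] += demand_flow[d]
    return sum(l * l for l in load)

random.seed(0)
current = [random.randrange(len(p)) for p in candidate_paths]
cur_cost = cost(current)
best, best_cost, temp = list(current), cur_cost, 50.0

for step in range(5000):
    # Neighbor: reroute one randomly chosen demand onto another candidate path.
    cand = list(current)
    d = random.randrange(len(candidate_paths))
    cand[d] = random.randrange(len(candidate_paths[d]))
    c = cost(cand)
    if c < cur_cost or random.random() < math.exp((cur_cost - c) / temp):
        current, cur_cost = cand, c
        if c < best_cost:
            best, best_cost = list(cand), c
    temp *= 0.999                          # geometric cooling schedule
print("best assignment:", best, "cost:", best_cost)
```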
1007 An Efficient Technique for Extracting Fuzzy Rules from Neural Networks
Authors: Besa Muslimi, Miriam A. M. Capretz, Jagath Samarabandu
Abstract:
Artificial neural networks (ANNs) have the ability to model input-output relationships from processing raw data. This characteristic makes them invaluable in industry domains where such knowledge is scarce at best. In recent decades, in order to overcome the black-box characteristic of ANNs, researchers have attempted to extract the knowledge embedded within ANNs in the form of rules that can be used in inference systems. This paper presents a new technique that is able to extract a small set of rules from a two-layer ANN. The extracted rules yield high classification accuracy when implemented within a fuzzy inference system. The technique targets industry domains with less complex problems for which no expert knowledge exists and for which a simpler solution is preferred to a complex one. The proposed technique is more efficient, simpler, and more applicable than most of the previously proposed techniques.
Keywords: fuzzy rule extraction, fuzzy systems, knowledge acquisition, pattern recognition, artificial neural networks.
1006 IIR Filter Design with Craziness-Based Particle Swarm Optimization Technique
Authors: Suman Kumar Saha, Rajib Kar, Durbadal Mandal, S. P. Ghoshal
Abstract:
This paper demonstrates the application of the craziness-based particle swarm optimization (CRPSO) technique for designing an 8th-order low-pass Infinite Impulse Response (IIR) filter. CRPSO, a much-improved version of PSO, is a population-based global heuristic search algorithm which finds a near-optimal solution in terms of a set of filter coefficients. The effectiveness of this algorithm is justified with a comparative study of some well-established algorithms, namely the real-coded genetic algorithm (RGA) and particle swarm optimization (PSO). Simulation results affirm that the proposed algorithm, CRPSO, outperforms its counterparts not only in terms of output quality, i.e. sharpness at cut-off, pass-band ripple, stop-band ripple, and stop-band attenuation, but also in convergence speed, with assured stability.
Keywords: IIR Filter, RGA, PSO, CRPSO, Evolutionary Optimization Techniques, Low Pass (LP) Filter, Magnitude Response, Pole-Zero Plot, Stability.
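A minimal sketch of PSO-based IIR design is given below, with standard PSO used as a stand-in for CRPSO: an 8th-order low-pass magnitude response is fitted to an ideal brick-wall target, with a penalty for unstable pole locations. Swarm size, cut-off frequency, and penalty weights are illustrative assumptions.

```python
# Minimal PSO sketch (standard PSO as a stand-in for CRPSO) fitting an 8th-order IIR
# low-pass magnitude response to an ideal brick-wall target, with a stability penalty.
# Swarm size, cut-off frequency and penalty weights are illustrative assumptions.
import numpy as np
from scipy.signal import freqz

ORDER, N_FREQ, CUTOFF = 8, 128, 0.25          # cut-off as a fraction of Nyquist
w = np.linspace(0, np.pi, N_FREQ)
target = (w <= CUTOFF * np.pi).astype(float)  # ideal low-pass magnitude

def fitness(x):
    b, a = x[:ORDER + 1], np.concatenate(([1.0], x[ORDER + 1:]))
    _, h = freqz(b, a, worN=w)
    err = np.sum((np.abs(h) - target) ** 2)
    # Penalize unstable filters (any pole on or outside the unit circle).
    if np.any(np.abs(np.roots(a)) >= 1.0):
        err += 1e3
    return err

rng = np.random.default_rng(0)
n_part, dim = 40, 2 * ORDER + 1
pos = rng.uniform(-0.5, 0.5, (n_part, dim))
vel = np.zeros((n_part, dim))
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for it in range(300):
    r1, r2 = rng.random((n_part, dim)), rng.random((n_part, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best magnitude-response error:", pbest_val.min())
```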
1005 1 kW Power Factor Correction Soft Switching Boost Converter with an Active Snubber Cell
Authors: Yakup Sahin, Naim Suleyman Ting, Ismail Aksoy
Abstract:
A 1 kW power factor correction boost converter with an active snubber cell is presented in this paper. In the converter, the main switch turns on under zero-voltage transition (ZVT) and turns off under zero-current transition (ZCT) without any additional voltage or current stress. The auxiliary switch turns on and off under zero-current switching (ZCS). Besides, the main diode turns on under ZVS and turns off under ZCS. The output current and voltage are controlled by the PFC converter over a wide line and load range. The simulation results of the converter are obtained for 1 kW and 100 kHz. One of the most important features of the converter is that it provides direct power transfer as well as excellent soft-switching behaviour. The converter also achieves a power factor of 0.99 with a sinusoidal input current shape.
Keywords: Power factor correction, direct power transfer, zero-voltage transition, zero-current transition, soft switching.
1004 Application of Wavelet Neural Networks in Optimization of Skeletal Buildings under Frequency Constraints
Authors: Mohammad Reza Ghasemi, Amin Ghorbani
Abstract:
The main goal of the present work is to decrease the computational burden of the optimum design of steel frames with frequency constraints using a new type of neural network called the Wavelet Neural Network (WNN). A suitable neural network is trained to perform the frequency approximation in place of the analysis program. The combination of wavelet theory and neural networks has led to the development of wavelet neural networks. Wavelet neural networks are feed-forward networks using wavelets as activation functions. Wavelets are mathematical functions with suitable inner parameters, which help them to approximate arbitrary functions. The WNN was used to predict the frequency of the structures. In the WNN, a RAtional function with Second-order Poles (RASP) wavelet was used as the transfer function. It is shown that the convergence speed was faster than that of other neural networks. Comparisons of the WNN with the embedded Artificial Neural Network (ANN), with approximate techniques, and with analytical solutions are also available in the literature.
Keywords: Weight minimization, frequency constraints, steel frames, ANN, WNN, RASP function.
1003 A Context-Aware Supplier Selection Model
Authors: Mohammadreza Razzazi, Maryam Bayat
Abstract:
Selection of the best possible set of suppliers has a significant impact on the overall profitability and success of any business. For this reason, it is usually necessary to optimize all business processes and to make use of cost-effective alternatives for additional savings. This paper proposes a new efficient context-aware supplier selection model that takes into account possible changes of the environment while significantly reducing selection costs. The proposed model is based on data clustering techniques while drawing on certain principles of online algorithms for an optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch on the whole environment for every decision-making sub-period, our model considers only the changes and superimposes them on the previously defined best set of suppliers to obtain a new best set. Therefore, any recomputation of unchanged elements of the environment is avoided and selection costs are consequently reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it is a more optimal solution compared with common static selection models in this field.
Keywords: Supplier selection, context-awareness, online algorithms, data clustering.
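A minimal sketch of the clustering step is given below: suppliers are grouped on cost/quality/delivery features with k-means and the best-scoring supplier per cluster is retained; when a few suppliers change, only those points are re-assigned to the existing centroids instead of re-running the whole selection. The data and scoring rule are synthetic assumptions.

```python
# Minimal sketch: cluster suppliers with k-means, keep the best-scoring supplier per
# cluster, and on an environment change re-assign only the changed suppliers to the
# existing centroids. Data and the scoring rule are synthetic assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Columns: unit cost (lower is better), quality score, on-time delivery rate
suppliers = rng.uniform([10, 0.5, 0.6], [50, 1.0, 1.0], size=(60, 3))

def score(rows):
    """Illustrative preference score: reward quality and delivery, penalize cost."""
    return rows[:, 1] + rows[:, 2] - rows[:, 0] / 50.0

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(suppliers)
best_per_cluster = {}
for c in range(4):
    members = np.where(km.labels_ == c)[0]
    best_per_cluster[c] = int(members[score(suppliers[members]).argmax()])
print("initial best set:", best_per_cluster)

# Environment change: a handful of suppliers update their terms. Re-assign only the
# changed rows to the existing centroids and update only the affected clusters' picks.
changed = [3, 17, 42]
old_labels = km.labels_[changed].copy()
suppliers[changed] = rng.uniform([10, 0.5, 0.6], [50, 1.0, 1.0], size=(3, 3))
new_labels = km.predict(suppliers[changed])
km.labels_[changed] = new_labels
for c in set(old_labels) | set(new_labels):
    members = np.where(km.labels_ == c)[0]
    best_per_cluster[int(c)] = int(members[score(suppliers[members]).argmax()])
print("updated best set:", best_per_cluster)
```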
1002 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis
Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya
Abstract:
In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, the Laws' texture filter, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III, or IV, was 14. Approximately 45% of the patients had adenocarcinoma (ADC) and approximately 55% had squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used for automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with the SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with the one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and for automatic classification of tumor stage and subtype.
Keywords: Cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis.
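A minimal sketch of GLCM feature extraction followed by k-NN and SVM classification is given below. Synthetic 2D texture patches stand in for the PET slices, and the FOS, GLRLM, and Laws' filter features as well as the SFS step are omitted for brevity.

```python
# Minimal sketch: GLCM texture features extracted from 2D patches and fed to k-NN and
# SVM classifiers. Synthetic texture patches stand in for the PET slices; FOS, GLRLM,
# Laws' filters and sequential forward selection are omitted for brevity.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_patch(label, size=32):
    """Two synthetic texture classes: smooth (label 0) vs. noisy (label 1)."""
    base = rng.normal(0.5, 0.05 if label == 0 else 0.25, (size, size))
    return np.clip(base, 0, 1)

def glcm_features(patch, levels=32):
    img = (patch * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

labels = np.array([i % 2 for i in range(80)])
X = np.array([glcm_features(make_patch(l)) for l in labels])

for name, clf in [("k-NN (k=3)", KNeighborsClassifier(3)),
                  ("SVM (RBF)", SVC(kernel="rbf", C=1.0))]:
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```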
1001 Thresholding Approach for Automatic Detection of Pseudomonas aeruginosa Biofilms from Fluorescence in situ Hybridization Images
Authors: Zonglin Yang, Tatsuya Akiyama, Kerry S. Williamson, Michael J. Franklin, Thiruvarangan Ramaraj
Abstract:
Pseudomonas aeruginosa is an opportunistic pathogen that forms surface-associated microbial communities (biofilms) on artificial implant devices and on human tissue. Biofilm infections are difficult to treat with antibiotics, in part because the bacteria in biofilms are physiologically heterogeneous. One measure of biological heterogeneity in a population of cells is the cellular concentration of ribosomes, which can be probed with fluorescently labeled nucleic acids. The fluorescent signal intensity following fluorescence in situ hybridization (FISH) analysis correlates with the cellular level of ribosomes. The goals here are to provide computationally and statistically robust approaches to automatically quantify cellular heterogeneity in biofilms from a large library of epifluorescence microscopy FISH images. In this work, initial steps toward these goals were taken by developing an automated biofilm detection approach for use with FISH images. The approach allows rapid identification of biofilm regions from FISH images that are counterstained with fluorescent dyes. This methodology provides advances over other computational methods by allowing the subtraction of spurious signals and non-biological fluorescent substrata. It is a robust and user-friendly approach that enables users to semi-automatically detect biofilm boundaries and extract intensity values from fluorescent images for quantitative analysis of biofilm heterogeneity.
Keywords: Image informatics, Pseudomonas aeruginosa, biofilm, FISH, computer vision, data visualization.
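A minimal sketch of the automated detection step is given below: smoothing, Otsu thresholding, and removal of small spurious objects on the counterstain channel, followed by per-region intensity statistics from the FISH channel. The synthetic two-channel image stands in for real microscopy data, and Otsu's method is one plausible choice of threshold, not necessarily the one used in the study.

```python
# Minimal sketch of automated biofilm-region detection from a counterstained
# epifluorescence image: smoothing, Otsu thresholding, removal of small spurious
# objects, then per-region intensity statistics from the FISH channel.
# The synthetic two-channel image below stands in for real microscopy data.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label, regionprops
from skimage.morphology import remove_small_objects

rng = np.random.default_rng(0)
counterstain = rng.normal(0.1, 0.02, (256, 256))   # background
fish = rng.normal(0.05, 0.01, (256, 256))
yy, xx = np.mgrid[:256, :256]
blob = (yy - 90) ** 2 + (xx - 140) ** 2 < 40 ** 2   # one synthetic biofilm region
counterstain[blob] += 0.5
fish[blob] += rng.normal(0.4, 0.1, blob.sum())      # heterogeneous ribosome signal

# 1) Segment biofilm regions on the counterstain channel.
smooth = gaussian(counterstain, sigma=2)
mask = smooth > threshold_otsu(smooth)
mask = remove_small_objects(mask, min_size=200)      # drop spurious specks
mask = ndi.binary_fill_holes(mask)

# 2) Extract per-region FISH intensity for downstream heterogeneity analysis.
for r in regionprops(label(mask), intensity_image=fish):
    print(f"region {r.label}: area={r.area}, mean FISH intensity={r.mean_intensity:.3f}")
```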
1000 Gate Tunnel Current Calculation for NMOSFET Based on Deep Sub-Micron Effects
Authors: Ashwani K. Rana, Narottam Chand, Vinod Kapoor
Abstract:
Aggressive scaling of MOS devices requires the use of ultra-thin gate oxides to maintain a reasonable short-channel effect and to take advantage of higher density, higher speed, lower cost, etc. Such thin oxides give rise to high electric fields, resulting in considerable gate tunneling current through the gate oxide in the nano regime. Consequently, accurate analysis of the gate tunneling current is very important, especially in the context of low-power applications. In this paper, a simple and efficient analytical model has been developed for the channel and source/drain overlap region gate tunneling current through the ultra-thin gate oxide of an n-channel MOSFET with the inevitable deep sub-micron effect (DSME). The results obtained have been verified against simulated and reported experimental results for the purpose of validation. It is shown that the calculated tunnel current fits the measured one well over the entire oxide thickness range. The proposed model is simple enough to be used in circuit simulators. It is observed that neglecting the deep sub-micron effect may lead to large errors in the calculated gate tunneling current. It is found that temperature has an almost negligible effect on the gate tunneling current. It is also reported that the gate tunneling current reduces with increasing gate oxide thickness. The impact of the source/drain overlap length on the gate tunneling current is also assessed.
Keywords: Gate tunneling current, analytical model, gate dielectrics, non uniform poly gate doping, MOSFET, fringing field effect and image charges.
999 The Current Awareness of Just-In-Time Techniques within the Libyan Textile Private Industry: A Case Study
Authors: Rajab Abdullah Hokoma
Abstract:
Almost all Libyan industries (both private and public) have struggled with many difficulties during the past three decades. These difficulties have had a strongly negative impact on the productivity and utilization of many companies within Libya. This paper studies the current awareness and implementation levels of Just-In-Time (JIT) within the Libyan textile private industry. A survey was carried out in this study using an intensive, detailed questionnaire. Based on the analysis of the survey responses, the results show that the management bodies of the surveyed companies have only a modest strategy towards most of the areas that are considered crucial to any successful implementation of JIT. The results also show variation in the implementation levels of the JIT elements, which range between low and acceptable. The paper has also identified limitations within the investigated areas of this industry, and has pointed to areas where senior managers within the Libyan textile industry should take immediate action in order to achieve effective implementation of JIT within their companies.
Keywords: Industry, questionnaire, JIT, textile.
998 Optimum Design of Trusses by Cuckoo Search
Authors: M. Saravanan, J. Raja Murugadoss, V. Jayanthi
Abstract:
Optimal design of structures plays a major role in reducing material usage, which leads to a reduction in the final cost of construction projects. Evolutionary approaches have been found to be more successful techniques for solving size and shape structural optimization problems since they use a stochastic random search instead of a gradient search. A review of the recent literature identified weight optimization as the central problem. A new meta-heuristic algorithm called Cuckoo Search (CS) is used here for minimizing the total weight of truss structures. This paper uses the 10-bar and 25-bar trusses for testing purposes. The main objective of this work is to reduce the number of iterations, the weight, and the total computation time. In order to demonstrate the effectiveness of the present method, minimum-weight design of truss structures is performed and the results of CS are compared with those of other algorithms.
Keywords: Cuckoo search algorithm, Lévy flight, meta-heuristic, optimal weight.
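A minimal cuckoo search sketch with Mantegna-style Lévy flights is given below. A simple benchmark objective (the sphere function) stands in for the truss-weight objective, which would require a structural analysis of the 10-bar and 25-bar trusses together with their stress and displacement constraints.

```python
# Minimal cuckoo search sketch with Mantegna-style Levy flights. The sphere function
# stands in for the truss-weight objective, which would require a structural analysis
# of the 10- and 25-bar trusses with their constraints.
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, rng, beta=1.5):
    """Mantegna's algorithm for Levy-distributed step lengths."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def objective(x):       # stand-in objective (replace with truss weight + penalties)
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
n_nests, dim, pa, alpha = 25, 10, 0.25, 0.01
nests = rng.uniform(-5, 5, (n_nests, dim))
fit = np.array([objective(n) for n in nests])

for it in range(1000):
    best = nests[fit.argmin()]
    # Generate new solutions via Levy flights around the current nests.
    for i in range(n_nests):
        cand = nests[i] + alpha * levy_step(dim, rng) * (nests[i] - best)
        f = objective(cand)
        j = rng.integers(n_nests)          # compare with a randomly chosen nest
        if f < fit[j]:
            nests[j], fit[j] = cand, f
    # Abandon a fraction pa of the worst nests and build new random ones.
    n_abandon = int(pa * n_nests)
    worst = np.argsort(fit)[-n_abandon:]
    nests[worst] = rng.uniform(-5, 5, (n_abandon, dim))
    fit[worst] = [objective(n) for n in nests[worst]]

print("best objective value:", fit.min())
```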
997 Discovering Complex Regularities by Adaptive Self Organizing Classification
Authors: A. Faro, D. Giordano, F. Maiorana
Abstract:
Data mining uses a variety of techniques, each of which is useful for some particular task. It is important to have a deep understanding of each technique and to be able to perform sophisticated analysis. In this article we describe a tool built to simulate a variation of the Kohonen network to perform unsupervised clustering and support the entire data mining process up to results visualization. A graphical representation helps the user find a strategy to optimize classification by adding, moving, or deleting a neuron in order to change the number of classes. The tool is also able to automatically suggest a strategy for optimizing the number of classes. The tool is used to classify macroeconomic data that report the imports and exports of the most developed countries. It is possible to classify the countries based on their economic behaviour and to use an ad hoc tool to characterize the commercial behaviour of a country in a selected class from the analysis of the positive and negative features that contribute to class formation.
Keywords: Unsupervised classification, Kohonen networks, macroeconomics, Visual data mining, cluster interpretation.
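A minimal pure-NumPy Kohonen SOM sketch is given below, clustering country import/export feature vectors. The data are synthetic placeholders, and the tool's interactive add/move/delete-neuron refinement is not reproduced.

```python
# Minimal pure-NumPy Kohonen SOM sketch for unsupervised clustering of country
# import/export feature vectors. The data are synthetic placeholders; the tool's
# interactive add/move/delete-neuron refinement is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic standardized features per country, e.g. [import volume, export volume]
countries = rng.normal(0, 1, (100, 2))

rows, cols, dim, n_iter = 4, 4, countries.shape[1], 3000
weights = rng.normal(0, 0.1, (rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

for t in range(n_iter):
    lr = 0.5 * (1 - t / n_iter)                     # decaying learning rate
    radius = max(0.5, 2.0 * (1 - t / n_iter))       # shrinking neighborhood
    x = countries[rng.integers(len(countries))]
    # Best matching unit (BMU): neuron whose weight vector is closest to the sample.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(dists.argmin(), dists.shape)
    # Update the BMU and its grid neighbors toward the sample.
    grid_dist = np.linalg.norm(grid - np.array(bmu), axis=-1)
    h = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))[..., None]
    weights += lr * h * (x - weights)

# Assign each country to its BMU; each occupied neuron acts as a class.
labels = [np.unravel_index(np.linalg.norm(weights - c, axis=-1).argmin(), (rows, cols))
          for c in countries]
print("countries per neuron:", {n: labels.count(n) for n in set(labels)})
```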
996 Different Tillage Possibilities for Second Crop in Green Bean Farming
Authors: Yilmaz Bayhan, Emin Güzel, Ömer Barış Özlüoymak, Ahmet İnce, Abdullah Sessiz
Abstract:
In this study, the aim was to determine reduced tillage techniques for green bean farming as a second crop after the wheat harvest. To this aim, four different soil tillage methods, namely heavy-duty disc harrow (HD), rotary tiller (ROT), heavy-duty disc harrow plus rotary tiller (HD+ROT), and no-tillage (NT) (seeding by direct drill), were examined. Experiments were arranged in a randomized block design with three replications. The highest green bean yields were obtained with HD+ROT and NT, at 5,862.1 and 5,829.3 Mg/ha, respectively. The lowest green bean yield was found with HD, at 3,076.7 Mg/ha. The highest fuel consumption was measured as 30.60 L ha-1 for HD+ROT, whereas the lowest value was found to be 7.50 L ha-1 for NT. The no-tillage method gave the best results for fuel consumption and effective power requirement. It is concluded that the no-tillage method can be used for second-crop green bean in the Thrace Region on account of economic and erosion considerations.
Keywords: Soil tillage, green bean, vegetative, generative, yield.
995 Compiler-Based Architecture for Context Aware Frameworks
Authors: Hossein Nejati, Seyed H. Mirisaee, Gholam H. Dastghaibifard
Abstract:
Computers are being integrated into the various aspects of human everyday life in different shapes and with different abilities. This fact has intensified the requirement for software development technologies that are 1) portable, 2) adaptable, and 3) simple to develop. This is also known as the Pervasive Computing Problem (PCP), which can be addressed in different ways, each with its own pros and cons; Context-Oriented Programming (COP) is one of the methods to address the PCP. In this paper a design for a COP framework, a context-aware framework, is presented which eliminates the weak points of a previous design based on interpreted languages, while introducing the power of compiled languages in implementing these frameworks. The key point of this improvement is combining COP and Dependency Injection (DI) techniques. Both the old and new frameworks are analyzed to show their advantages and disadvantages. Finally a simulation of both designs is presented, indicating that the practical results agree with the theoretical analysis while the new design runs almost 8 times faster.
Keywords: Dependency Injection, compiler-based architecture, Context-Oriented Programming, COP, Pervasive Computing Problem