Search results for: segmentation quality
9959 Markov Random Field-Based Segmentation Algorithm for Detection of Land Cover Changes Using Uninhabited Aerial Vehicle Synthetic Aperture Radar Polarimetric Images
Authors: Mehrnoosh Omati, Mahmod Reza Sahebi
Abstract:
Information on land use/land cover change plays an essential role in environmental assessment, planning, and management in regional development. Remotely sensed imagery is widely used to provide information in many change detection applications. Polarimetric synthetic aperture radar (PolSAR) imagery, with its capability to discriminate between different scattering mechanisms, is a powerful tool for environmental monitoring. This paper proposes a new boundary-based segmentation algorithm as a fundamental step for land cover change detection. In this method, two PolSAR images are first segmented by integrating the marker-controlled watershed algorithm with a coupled Markov random field (MRF). Object-based classification is then performed to label image objects as changed or unchanged. Compared with a pixel-based support vector machine (SVM) classifier, this segmentation algorithm significantly reduces the speckle effect in PolSAR images and improves the accuracy of binary classification at the object-based level. Experimental results on Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) polarimetric images show a 3% and 6% improvement in overall accuracy and kappa coefficient, respectively. The proposed method also correctly distinguishes homogeneous image parcels. Keywords: coupled Markov random field (MRF), environment, object-based analysis, polarimetric SAR (PolSAR) images
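For readers who want to experiment with the first step described above, the sketch below shows a generic marker-controlled watershed segmentation of a single intensity channel using scikit-image. The quantile-based marker seeding and the synthetic test image are assumptions for illustration; the coupled MRF refinement and the PolSAR-specific speckle model from the abstract are not reproduced.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed

def marker_controlled_watershed(intensity, low_q=10, high_q=90):
    """Marker-controlled watershed on a single intensity channel.

    Markers are seeded from low/high intensity quantiles -- a simplification;
    the paper derives markers and then refines the labels with a coupled MRF,
    which is not reproduced here.
    """
    denoised = ndi.median_filter(intensity, size=3)          # mild speckle suppression
    elevation = sobel(denoised)                              # gradient map guides the flooding
    markers = np.zeros(intensity.shape, dtype=np.int32)
    markers[denoised < np.percentile(denoised, low_q)] = 1   # confident "background" seeds
    markers[denoised > np.percentile(denoised, high_q)] = 2  # confident "bright" seeds
    labels = watershed(elevation, markers)                   # every pixel gets label 1 or 2
    objects, n_objects = ndi.label(labels == 2)              # bright class -> image objects
    return labels, objects, n_objects

# usage on a synthetic "PolSAR-like" intensity image with a brighter changed patch
rng = np.random.default_rng(0)
img = rng.gamma(shape=2.0, scale=1.0, size=(128, 128))
img[40:90, 40:90] += 4.0
labels, objects, n = marker_controlled_watershed(img)
print("image objects found:", n)
```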
Procedia PDF Downloads 219
9958 Measurements of Service Quality vs Customer Satisfaction in Government Owned Retail Store at Kochi
Authors: N. S. Ajisha
Abstract:
In today’s competitive world, the quality of the service you deliver is one of the important factors that determine customer satisfaction. Service quality is considered an important determinant in evaluating customer satisfaction, and the relationship between service quality and customer satisfaction is regarded as the foundation of research on customer satisfaction. This research conducts a gap analysis between the perception and expectation of the services delivered and examines the relationship between service quality and customer satisfaction. Service quality is measured here using the SERVQUAL model, and the study identifies which dimension of service quality is most important for measuring customer satisfaction. The dimensions measured using SERVQUAL are tangibles, reliability, responsiveness, assurance, and empathy. The study involves primary data collection through a market survey. Keywords: customer satisfaction, service quality, retail service quality, Kochi
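As a concrete illustration of the SERVQUAL gap analysis mentioned above, the following sketch computes perception-minus-expectation gap scores per dimension. The item names, Likert ranges, and random responses are hypothetical placeholders, not the study's survey data.

```python
import numpy as np
import pandas as pd

# Hypothetical 1-7 Likert responses: rows = respondents, columns = SERVQUAL items.
dimensions = {
    "tangibles": ["T1", "T2"], "reliability": ["R1", "R2"],
    "responsiveness": ["RS1", "RS2"], "assurance": ["A1", "A2"],
    "empathy": ["E1", "E2"],
}
items = [i for cols in dimensions.values() for i in cols]
rng = np.random.default_rng(1)
expectation = pd.DataFrame(rng.integers(4, 8, size=(50, len(items))), columns=items)
perception  = pd.DataFrame(rng.integers(3, 8, size=(50, len(items))), columns=items)

# SERVQUAL gap score: perception minus expectation, averaged per dimension.
gap = perception - expectation
dimension_gaps = {d: gap[cols].to_numpy().mean() for d, cols in dimensions.items()}
for d, g in sorted(dimension_gaps.items(), key=lambda kv: kv[1]):
    print(f"{d:>14}: mean gap = {g:+.2f}")   # most negative gap = largest shortfall
```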
Procedia PDF Downloads 556
9957 Improving Library Service Quality in Local City of Indonesia
Authors: Prima Fithri, Afri Adnan, Verra Syahmer
Abstract:
As a public service, a library should be able to provide excellent, high-quality service. The criteria a library should meet include a collection that is relevant, current, and reliable; qualified and professional employees; a prompt and appropriate delivery system; and proper supporting infrastructure. The aim of this study is to assess performance as an effort to provide services appropriate to the needs and desires of users. The research therefore calculates the gap between users' perceptions and expectations of the library's services. The Servqual and QFD methods are used in this study: Servqual to measure the gap in each dimension of service quality, and QFD to determine the priority improvements needed to raise service quality in those dimensions. From 97 questionnaires, the largest Servqual gap, 27.7%, occurs in the responsiveness dimension, showing how far user expectations are not met by the quality of existing services. According to the QFD, the construction of the library and library standards are the priority improvements needed to raise service quality. Keywords: library, service quality, QFD
Procedia PDF Downloads 579
9956 Computational Cell Segmentation in Immunohistochemical Images of Meningioma Tumor Using Fuzzy C-Means and Adaptive Vector Directional Filter
Authors: Vahid Anari, Leila Shahmohammadi
Abstract:
Manually diagnosing and interpreting a large cohort dataset of immunohistochemically stained tumor tissue under an optical microscope is subjective and tedious for specialist pathologists. Moreover, digital pathology today represents more of an evolution than a revolution in pathology. In this paper, we develop and test an unsupervised algorithm that automatically enhances the IHC image of a meningioma tumor and classifies cells into positive (proliferative) and negative (normal) cells. A dataset of 150 images is used to test the scheme. In addition, a new adaptive color image enhancement method is proposed based on a vector directional filter (VDF) and the statistical properties of the filtering window. Since the cells are distinguishable by the human eye, the accuracy and stability of the algorithm are quantitatively assessed through application to a wide variety of real images. Keywords: digital pathology, cell segmentation, immunohistochemistry, noise reduction
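The sketch below shows a plain NumPy fuzzy c-means clustering of pixel intensities into two groups, which is one way to realize the positive/negative cell split described above. The adaptive vector directional filter and the real IHC data are not reproduced, and the two-population intensity model is an assumption.

```python
import numpy as np

def fuzzy_cmeans(x, n_clusters=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means on a 1-D feature vector (e.g. stain intensity per pixel)."""
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)                                  # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)             # fuzzy-weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        new_u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=0))
        if np.max(np.abs(new_u - u)) < tol:
            u = new_u
            break
        u = new_u
    return centers, u

# hypothetical IHC image: positively stained nuclei darker than the background
rng = np.random.default_rng(2)
pixels = np.concatenate([rng.normal(60, 10, 2000), rng.normal(180, 15, 8000)])
centers, u = fuzzy_cmeans(pixels, n_clusters=2)
labels = u.argmax(axis=0)                               # hard labels from fuzzy memberships
print("cluster centers:", np.sort(centers).round(1))
```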
Procedia PDF Downloads 67
9955 A U-Net Based Architecture for Fast and Accurate Diagram Extraction
Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal
Abstract:
In the context of educational data mining, the use case of extracting information from images containing both text and diagrams is of high importance. Hence, document analysis requires the extraction of diagrams from such images and processes the text and diagrams separately. To the author’s best knowledge, none among plenty of approaches for extracting tables, figures, etc., suffice the need for real-time processing with high accuracy as needed in multiple applications. In the education domain, diagrams can be of varied characteristics viz. line-based i.e. geometric diagrams, chemical bonds, mathematical formulas, etc. There are two broad categories of approaches that try to solve similar problems viz. traditional computer vision based approaches and deep learning approaches. The traditional computer vision based approaches mainly leverage connected components and distance transform based processing and hence perform well in very limited scenarios. The existing deep learning approaches either leverage YOLO or faster-RCNN architectures. These approaches suffer from a performance-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates the diagram extraction as a segmentation problem. The proposed method provides similar accuracy with a much faster extraction time as compared to the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.Keywords: computer vision, deep-learning, educational data mining, faster-RCNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO
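To make the segmentation formulation concrete, here is a compact U-Net-style encoder-decoder in PyTorch that outputs a per-pixel diagram mask. The depth, channel counts, and input size are illustrative assumptions rather than the paper's configuration.

```python
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    """Two-level U-Net producing a 1-channel diagram mask (illustrative sizes)."""
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up2 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec2 = conv_block(64, 32)
        self.up1 = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)
        self.head = nn.Conv2d(16, 1, 1)                  # per-pixel diagram logit

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)

page = torch.randn(1, 1, 256, 256)                       # grayscale document page
mask_logits = TinyUNet()(page)
print(mask_logits.shape)                                 # torch.Size([1, 1, 256, 256])
# Thresholding sigmoid(mask_logits) and taking connected components yields
# diagram regions, including irregular shapes.
```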
Procedia PDF Downloads 140
9954 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique
Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu
Abstract:
Recently, medical imaging, and specifically medical image processing, has become one of the most dynamically developing areas of medical science. It has led to the emergence of new approaches to the prevention, diagnosis, and treatment of various diseases. In diagnosing lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to incorrect diagnosis or sampling of lung tissue. Identifying and demarcating masses when detecting cancer within lung tissue are critical challenges in diagnosis. In this work, an image-processing segmentation system has been applied for detection purposes. In particular, a novel lung cancer detection algorithm is presented and validated through simulation, employing CT images and multilevel thresholding. The proposed technique consists of segmentation, feature extraction, and feature selection and classification. In more detail, the features carrying useful information are selected after feature extraction. Eventually, the output image of lung cancer is obtained with 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve employing the current feature extraction method to achieve more accurate resulting images, including further details available to machine vision systems for recognising objects in lung CT scan images. Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing
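A minimal sketch of the multilevel-thresholding segmentation step is shown below using scikit-image's multi-Otsu thresholds on a synthetic slice. The class count, the choice of the brightest class as lesion candidates, and the toy data are assumptions; the feature extraction, selection, and classification stages of the paper are omitted.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_multiotsu

def multilevel_threshold_segment(ct_slice, classes=3):
    """Split a CT slice into `classes` intensity regions via multilevel Otsu,
    then keep connected components of the brightest class as candidate masses."""
    thresholds = threshold_multiotsu(ct_slice, classes=classes)
    regions = np.digitize(ct_slice, bins=thresholds)      # 0..classes-1 label map
    candidate = regions == classes - 1                    # brightest region only
    labeled, n = ndi.label(candidate)
    sizes = ndi.sum(candidate, labeled, index=np.arange(1, n + 1))
    return regions, labeled, sizes

# synthetic slice standing in for a lung CT image with one bright "lesion"
rng = np.random.default_rng(3)
ct = rng.normal(0.2, 0.05, (256, 256))
ct[100:130, 90:120] = rng.normal(0.8, 0.05, (30, 30))
regions, labeled, sizes = multilevel_threshold_segment(ct)
print("candidate components:", len(sizes), "largest size:", int(sizes.max()))
```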
Procedia PDF Downloads 101
9953 Method for Improving ICESAT-2 ATL13 Altimetry Data Utility on Rivers
Authors: Yun Chen, Qihang Liu, Catherine Ticehurst, Chandrama Sarker, Fazlul Karim, Dave Penton, Ashmita Sengupta
Abstract:
The application of ICESAT-2 altimetry data in river hydrology critically depends on the accuracy of the mean water surface elevation (WSE) at a virtual station (VS) where satellite observations intersect with water. The ICESAT-2 track generates multiple VSs as it crosses the different water bodies. The difficulties are particularly pronounced in large river basins where there are many tributaries and meanders often adjacent to each other. One challenge is to split photon segments along a beam to accurately partition them to extract only the true representative water height for individual elements. As far as we can establish, there is no automated procedure to make this distinction. Earlier studies have relied on human intervention or river masks. Both approaches are unsatisfactory solutions where the number of intersections is large, and river width/extent changes over time. We describe here an automated approach called “auto-segmentation”. The accuracy of our method was assessed by comparison with river water level observations at 10 different stations on 37 different dates along the Lower Murray River, Australia. The congruence is very high and without detectable bias. In addition, we compared different outlier removal methods on the mean WSE calculation at VSs post the auto-segmentation process. All four outlier removal methods perform almost equally well with the same R2 value (0.998) and only subtle variations in RMSE (0.181–0.189m) and MAE (0.130–0.142m). Overall, the auto-segmentation method developed here is an effective and efficient approach to deriving accurate mean WSE at river VSs. It provides a much better way of facilitating the application of ICESAT-2 ATL13 altimetry to rivers compared to previously reported studies. Therefore, the findings of our study will make a significant contribution towards the retrieval of hydraulic parameters, such as water surface slope along the river, water depth at cross sections, and river channel bathymetry for calculating flow velocity and discharge from remotely sensed imagery at large spatial scales.Keywords: lidar sensor, virtual station, cross section, mean water surface elevation, beam/track segmentation
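The sketch below illustrates the kind of post-segmentation workflow described above: filter photon heights at a virtual station with an outlier rule, average them to a mean WSE, and score agreement against gauge observations with R2, RMSE, and MAE. The MAD-based filter, its threshold, and the synthetic data are assumptions; the four outlier-removal methods compared in the paper are not reproduced here.

```python
import numpy as np

def mad_filter(heights, k=3.0):
    """Drop photon heights more than k scaled-MADs from the median (one possible
    outlier-removal rule; the paper compares four such rules)."""
    med = np.median(heights)
    mad = 1.4826 * np.median(np.abs(heights - med))
    return heights[np.abs(heights - med) <= k * mad]

def agreement_metrics(estimated, observed):
    """R2, RMSE and MAE between estimated mean WSE and gauge observations."""
    est, obs = np.asarray(estimated), np.asarray(observed)
    err = est - obs
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2)
    return r2, rmse, mae

# hypothetical example: mean WSE at several virtual stations vs gauge readings
rng = np.random.default_rng(4)
gauge = np.linspace(1.0, 6.0, 37)                          # observed water levels (m)
wse = [np.mean(mad_filter(g + rng.normal(0, 0.15, 200))) for g in gauge]
r2, rmse, mae = agreement_metrics(wse, gauge)
print(f"R2={r2:.3f}  RMSE={rmse:.3f} m  MAE={mae:.3f} m")
```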
Procedia PDF Downloads 62
9952 Empirical Exploration of Correlations between Software Design Measures: A Replication Study
Authors: Jehad Al Dallal
Abstract:
Software engineers apply different measures to quantify the quality of software design. These measures consider artifacts developed at low or high level software design phases. The results are used to point to design weaknesses and to indicate design points that have to be restructured. Understanding the relationship among the quality measures and among the design quality aspects considered by these measures is important to interpreting the impact of a measure for a quality aspect on other potentially related aspects. In addition, exploring the relationship between quality measures helps to explain the impact of different quality measures on external quality aspects, such as reliability and maintainability. In this paper, we report a replication study that empirically explores the correlation between six well known and commonly applied design quality measures. These measures consider several quality aspects, including complexity, cohesion, coupling, and inheritance. The results indicate that inheritance measures are weakly correlated to other measures, whereas complexity, coupling, and cohesion measures are mostly strongly correlated.Keywords: quality attribute, quality measure, software design quality, Spearman correlation
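A minimal sketch of the correlation analysis is given below: assemble per-class design measurements and compute the pairwise Spearman correlation matrix with SciPy. The measure names and synthetic values are illustrative assumptions, not the six measures or data used in the study.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical per-class design measurements (names illustrative only).
rng = np.random.default_rng(5)
n = 200
complexity = rng.poisson(10, n)
coupling = complexity + rng.poisson(3, n)                # correlated by construction
cohesion = 1.0 / (1.0 + coupling) + rng.normal(0, 0.02, n)
inheritance_depth = rng.integers(0, 5, n)                # largely independent

measures = pd.DataFrame({
    "complexity": complexity, "coupling": coupling,
    "cohesion": cohesion, "inheritance_depth": inheritance_depth,
})
rho, pval = spearmanr(measures)                          # pairwise Spearman matrix
print(pd.DataFrame(rho, index=measures.columns, columns=measures.columns).round(2))
```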
Procedia PDF Downloads 300
9951 Market Segmentation of Cruise Ship Passengers: Implications for Marketing of Local Products and Services at Destination Points
Authors: Gunnar Oskarsson, Irena Georgsdottir
Abstract:
Tourism has been growing incredibly fast during the past years, including the cruise industry, which is gaining increasing popularity among various groups of travelers. It is a challenging task for companies serving cruise ship passengers with local products and services at the point of destination to reach them in due time with information about their offerings, as well learning how to adapt their offerings and messages to the type of customers arriving on each particular occasion. Although some research has been conducted in this sphere, there is still limited knowledge about many specifics within this sector of the tourist industry. The objective of this research is to examine one of these, with the main goal of studying the segmentation of cruise passengers and to learn about marketing practices directed towards them. A qualitative research method, based on in-depth interviews, was used, as this provides an opportunity to gain insight into the participants’ perspectives. Interviews were conducted with 10 respondents from different companies in the tourist industry in Iceland, who interact with cruise passengers on a regular basis in their work environment. The main objective was to gain an understanding of what distinguishes different customer groups, or segments, in this industry, and of the marketing approaches directed towards them. The main findings reveal that participants note the strongest difference between cruise passengers of different nationalities, passengers coming on different ships (size and type), and passengers arriving at different times of the year. A drastic difference was noticed between nationalities in four main segments, American, British, Other European, and Asian customers, although some of these segments could be divided into even further sub-segments. Other important differencing factors were size and type of ships, quality or number of stars on the ship, and travelling time of the year. Companies serving cruise ship passengers, as well as the customers themselves, could benefit if the offerings of services were designed specifically for particular segments within the industry. Concerning marketing towards cruise passengers, the results indicate that it is carried out almost exclusively through the Internet using; a reliable website and, search engine optimization. Marketing is also by word-of-mouth. This research can assist practitioners by offering a deeper understanding of the approaches that may be effective in marketing local products and services to cruise ship passengers, based on their segmentation and by identifying effective ways to reach them. The research, furthermore, provides a valuable contribution to marketing knowledge for the benefit of an increasingly important market segment in a fast growing tourist industry.Keywords: capabilities, global integration, internationalisation, SMEs
Procedia PDF Downloads 401
9950 Geographic Information System and Dynamic Segmentation of Very High Resolution Images for the Semi-Automatic Extraction of Sandy Accumulation
Authors: A. Bensaid, T. Mostephaoui, R. Nedjai
Abstract:
A considerable area of Algerian lands is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of increases in the irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has particularly accentuated. The extent of degradation in the arid region of the Algerian Mecheria department generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, as well as sand encroachment on urban development zones. In this study, we attempt to investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cords based on the numerical processing of LANDSAT images (5, 7, and 8) of three scenes 197/37, 198/36 and 198/37 for the year 2020. As a second step, we prospect the use of geospatial techniques to monitor the progression of sand dunes on developed (urban) lands as well as on the formation of sandy accumulations (dune, dunes fields, nebkha, barkhane, etc.). For this purpose, this study made use of the semi-automatic processing method for the dynamic segmentation of images with very high spatial resolution (SENTINEL-2 and Google Earth). This study was able to demonstrate that urban lands under current conditions are located in sand transit zones that are mobilized by the winds from the northwest and southwest directions.Keywords: land development, GIS, segmentation, remote sensing
Procedia PDF Downloads 155
9949 Optical Coherence Tomography in Parkinson’s Disease: A Potential in-vivo Retinal α-Synuclein Biomarker in Parkinson’s Disease
Authors: Jessica Chorostecki, Aashka Shah, Fen Bao, Ginny Bao, Edwin George, Navid Seraji-Bozorgzad, Veronica Gorden, Christina Caon, Elliot Frohman
Abstract:
Background: Parkinson’s disease (PD) is a neurodegenerative disorder associated with the loss of dopaminergic cells and the presence of α-synuclein (AS) aggregation in Lewy bodies. Both dopaminergic cells and AS are found in the retina. Optical coherence tomography (OCT) allows high-resolution in-vivo examination of retinal structural injury in neurodegenerative disorders, including PD. Methods: We performed a cross-sectional OCT study in patients with definite PD and healthy controls (HC) using a spectral-domain OCT (SD-OCT) platform to measure the peripapillary retinal nerve fiber layer (pRNFL) thickness and total macular volume (TMV). We performed intra-retinal segmentation with fully automated segmentation software to measure the volume of the RNFL, ganglion cell layer (GCL), inner plexiform layer (IPL), inner nuclear layer (INL), outer plexiform layer (OPL), and the outer nuclear layer (ONL). Segmentation was performed blinded to the clinical status of the study participants. Results: 101 eyes from 52 PD patients (mean age 65.8 years) and 46 eyes from 24 HC subjects (mean age 64.1 years) were included in the study. The mean pRNFL thickness was not significantly different (96.95 μm vs 94.42 μm, p=0.07), but the TMV was significantly lower in PD compared to HC (8.33 mm3 vs 8.58 mm3, p=0.0002). Intra-retinal segmentation showed no significant difference in RNFL volume between the PD and HC groups (0.95 mm3 vs 0.92 mm3, p=0.454). However, GCL, IPL, INL, and ONL volumes were significantly reduced in PD compared to HC. In contrast, the volume of the OPL was significantly increased in PD compared to HC. Conclusions: Our finding of an enlarged OPL corresponds with mRNA expression studies showing localization of AS in the OPL across vertebrate species and with autopsy studies demonstrating AS aggregation in the deeper layers of the retina in PD. We propose that the enlargement of the OPL may represent a potential biomarker of AS aggregation in PD. Longitudinal studies in larger cohorts are warranted to confirm our observations, which may have significant implications for disease monitoring and therapeutic development. Keywords: optical coherence tomography, biomarker, Parkinson's disease, alpha-synuclein, retina
Procedia PDF Downloads 438
9948 Measuring Human Perception and Negative Elements of Public Space Quality Using Deep Learning: A Case Study of Area within the Inner Road of Tianjin City
Authors: Jiaxin Shi, Kaifeng Hao, Qingfan An, Zeng Peng
Abstract:
Due to a lack of data sources and data processing techniques, it has always been difficult to quantify public space quality, which includes urban construction quality and how it is perceived by people, especially in large urban areas. This study proposes a quantitative research method based on the consideration of emotional health and physical health of the built environment. It highlights the low quality of public areas in Tianjin, China, where there are many negative elements. Deep learning technology is then used to measure how effectively people perceive urban areas. First, this work suggests a deep learning model that might simulate how people can perceive the quality of urban construction. Second, we perform semantic segmentation on street images to identify visual elements influencing scene perception. Finally, this study correlated the scene perception score with the proportion of visual elements to determine the surrounding environmental elements that influence scene perception. Using a small-scale labeled Tianjin street view data set based on transfer learning, this study trains five negative spatial discriminant models in order to explore the negative space distribution and quality improvement of urban streets. Then it uses all Tianjin street-level imagery to make predictions and calculate the proportion of negative space. Visualizing the spatial distribution of negative space along the Tianjin Inner Ring Road reveals that the negative elements are mainly found close to the five key districts. The map of Tianjin was combined with the experimental data to perform the visual analysis. Based on the emotional assessment, the distribution of negative materials, and the direction of street guidelines, we suggest guidance content and design strategy points of the negative phenomena in Tianjin street space in the two dimensions of perception and substance. This work demonstrates the utilization of deep learning techniques to understand how people appreciate high-quality urban construction, and it complements both theory and practice in urban planning. It illustrates the connection between human perception and the actual physical public space environment, allowing researchers to make urban interventions.Keywords: human perception, public space quality, deep learning, negative elements, street images
Procedia PDF Downloads 117
9947 Investigations of Effective Marketing Metric Strategies: The Case of St. George Brewery Factory, Ethiopia
Authors: Mekdes Getu Chekol, Biniam Tedros Kahsay, Rahwa Berihu Haile
Abstract:
The main objective of this study is to investigate the marketing strategy practice in the Case of St. George Brewery Factory in Addis Ababa. One of the core activities in a Business Company to stay in business is having a well-developed marketing strategy. It assessed how the marketing strategies were practiced in the company to achieve its goals aligned with segmentation, target market, positioning, and the marketing mix elements to satisfy customer requirements. Using primary and secondary data, the study is conducted by using both qualitative and quantitative approaches. The primary data was collected through open and closed-ended questionnaires. Considering the size of the population is small, the selection of the respondents was carried out by using a census. The finding shows that the company used all the 4 Ps of the marketing mix elements in its marketing strategies and provided quality products at affordable prices by promoting its products by using high and effective advertising mechanisms. The product availability and accessibility are admirable with the practices of both direct and indirect distribution channels. On the other hand, the company has identified its target customers, and the company’s market segmentation practice is geographical location. Communication effectiveness between the marketing department and other departments is very good. The adjusted R2 model explains 61.6% of the marketing strategy practice variance by product, price, promotion, and place. The remaining 38.4% of variation in the dependent variable was explained by other factors not included in this study. The result reveals that all four independent variables, product, price, promotion, and place, have a positive beta sign, proving that predictor variables have a positive effect on that of the predicting dependent variable marketing strategy practice. Even though the marketing strategies of the company are effectively practiced, there are some problems that the company faces while implementing them. These are infrastructure problems, economic problems, intensive competition in the market, shortage of raw materials, seasonality of consumption, socio-cultural problems, and the time and cost of awareness creation for the customers. Finally, the authors suggest that the company better develop a long-range view and try to implement a more structured approach to attain information about potential customers, competitor’s actions, and market intelligence within the industry. In addition, we recommend conducting the study by increasing the sample size and including different marketing factors.Keywords: marketing strategy, market segmentation, target marketing, market positioning, marketing mix
Procedia PDF Downloads 61
9946 Logistics Model for Improving Quality in Railway Transport
Authors: Eva Nedeliakova, Juraj Camaj, Jaroslav Masek
Abstract:
This contribution focuses on a methodology for identifying levels of quality and improving quality through a new logistics model in railway transport. It is oriented toward the application of dynamic quality models, which represent an innovative method for evaluating service quality. Through this conception, the time factor and the expected and perceived quality at each moment of the transportation process within the logistics chain can be taken into account. Various models describe the improvement of quality while emphasizing the time factor throughout the whole transportation logistics chain. The quality of services in railway transport can be determined from the existing level of service quality by detecting the causes of dissatisfaction among employees as well as customers, in order to uncover strengths and weaknesses. The new logistics model is able to recognize critical processes in the logistics chain. It includes a service quality rating that must respect the specific properties of services, namely unrepeatability, intangibility, consumption at the time they are provided, and, particularly, changeability, which is a significant factor in the conditions of rail transport as well. These peculiarities influence the quality of service in the face of constantly increasing requirements and lead to new, progressive approaches to service quality rating. Keywords: logistics model, quality, railway transport
Procedia PDF Downloads 571
9945 Analysis, Design, and Implementation of Quality Management System for KSA Software Company
Authors: Omar Said Almushyt
Abstract:
Quality management has recently become necessary in countries all over the world for companies to face competitive challenges. Software companies in KSA suffer from two problems, namely low customer satisfaction and low product quality. Implementing quality management in a software company can solve these problems by improving product quality and enhancing customer satisfaction, which makes the company competitive. These goals can be achieved by introducing a quality management system through system analysis, followed by system design, and finally implementation of that system. The results of the present work showed that the proposed method can increase product quality by 10% and customer satisfaction by 20%. Keywords: quality, management, software, information engineering
Procedia PDF Downloads 440
9944 Robust Segmentation of Salient Features in Automatic Breast Ultrasound (ABUS) Images
Authors: Lamees Nasser, Yago Diez, Robert Martí, Joan Martí, Ibrahim Sadek
Abstract:
Automated 3D breast ultrasound (ABUS) screening is a novel modality in medical imaging because of its common characteristics shared with other ultrasound modalities in addition to the three orthogonal planes (i.e., axial, sagittal, and coronal) that are useful in analysis of tumors. In the literature, few automatic approaches exist for typical tasks such as segmentation or registration. In this work, we deal with two problems concerning ABUS images: nipple and rib detection. Nipple and ribs are the most visible and salient features in ABUS images. Determining the nipple position plays a key role in some applications for example evaluation of registration results or lesion follow-up. We present a nipple detection algorithm based on color and shape of the nipple, besides an automatic approach to detect the ribs. In point of fact, rib detection is considered as one of the main stages in chest wall segmentation. This approach consists of four steps. First, images are normalized in order to minimize the intensity variability for a given set of regions within the same image or a set of images. Second, the normalized images are smoothed by using anisotropic diffusion filter. Next, the ribs are detected in each slice by analyzing the eigenvalues of the 3D Hessian matrix. Finally, a breast mask and a probability map of regions detected as ribs are used to remove false positives (FP). Qualitative and quantitative evaluation obtained from a total of 22 cases is performed. For all cases, the average and standard deviation of the root mean square error (RMSE) between manually annotated points placed on the rib surface and detected points on rib borders are 15.1188 mm and 14.7184 mm respectively.Keywords: Automated 3D Breast Ultrasound, Eigenvalues of Hessian matrix, Nipple detection, Rib detection
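The sketch below illustrates the Hessian-eigenvalue step of the rib-detection approach: build the 3D Hessian from Gaussian second derivatives and flag voxels whose leading eigenvalues are strongly negative (bright, elongated structures). The smoothing scale, eigenvalue threshold, and synthetic volume are assumptions; the normalization, breast mask, and false-positive removal stages are not reproduced.

```python
import numpy as np
from scipy import ndimage as ndi

def hessian_eigenvalues(volume, sigma=2.0):
    """Eigenvalues of the 3D Hessian at every voxel via Gaussian second derivatives."""
    pairs = [(0, 0), (0, 1), (0, 2), (1, 1), (1, 2), (2, 2)]
    second = {}
    for i, j in pairs:
        order = [0, 0, 0]
        order[i] += 1
        order[j] += 1
        second[(i, j)] = ndi.gaussian_filter(volume, sigma=sigma, order=order)
    H = np.zeros(volume.shape + (3, 3))
    for (i, j), d in second.items():
        H[..., i, j] = d
        H[..., j, i] = d
    return np.linalg.eigvalsh(H)                         # sorted ascending per voxel

def rib_candidates(volume, sigma=2.0, lam_thresh=-0.01):
    """Voxels where the two most negative eigenvalues indicate a bright, elongated
    structure -- a rough stand-in for the paper's rib criterion."""
    lam = hessian_eigenvalues(volume, sigma)
    return (lam[..., 0] < lam_thresh) & (lam[..., 1] < lam_thresh)

# synthetic volume with one bright "rib-like" bar
vol = np.zeros((40, 80, 80), dtype=float)
vol[18:22, 20:60, 38:42] = 1.0
mask = rib_candidates(ndi.gaussian_filter(vol, 1.0))
print("candidate voxels:", int(mask.sum()))
```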
Procedia PDF Downloads 331
9943 A Combined Feature Extraction and Thresholding Technique for Silence Removal in Percussive Sounds
Authors: B. Kishore Kumar, Pogula Rakesh, T. Kishore Kumar
Abstract:
Music analysis is a part of audio content analysis in which music is analyzed using different features of the audio signal. The first step in music analysis is to divide the music signal into different sections based on the feature profiles of the signal. In this paper, we present a music segmentation technique that effectively segments the signal, together with a thresholding technique to remove silence from the percussive sounds produced by percussive instruments, using two features of the music signal, namely signal energy and spectral centroid. The proposed method imposes thresholds on both features, and these thresholds vary depending on the music signal. Depending on the thresholds, the silent parts are removed and the segmentation is performed. The effectiveness of the proposed method is analyzed using MATLAB. Keywords: percussive sounds, spectral centroid, spectral energy, silence removal, feature extraction
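The two features and the thresholding step can be sketched as follows (in Python rather than the MATLAB used in the paper): compute short-time energy and spectral centroid per frame and keep frames exceeding signal-dependent thresholds. The frame sizes, threshold fractions, and toy signal are assumptions, not the paper's adaptive thresholds.

```python
import numpy as np

def frame_features(signal, sr, frame_len=0.02, hop=0.01):
    """Short-time signal energy and spectral centroid for each frame."""
    n, h = int(frame_len * sr), int(hop * sr)
    energies, centroids = [], []
    for start in range(0, len(signal) - n, h):
        frame = signal[start:start + n] * np.hanning(n)
        energies.append(float(np.sum(frame ** 2)))
        mag = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(n, d=1.0 / sr)
        centroids.append(float((freqs * mag).sum() / (mag.sum() + 1e-12)))
    return np.array(energies), np.array(centroids)

def remove_silence(signal, sr, e_frac=0.05, c_frac=0.5, hop=0.01):
    """Keep frames whose energy AND spectral centroid exceed signal-dependent
    thresholds (fractions of max energy / mean centroid here; the paper derives
    adaptive thresholds from the two feature profiles)."""
    e, c = frame_features(signal, sr, hop=hop)
    keep = (e > e_frac * e.max()) & (c > c_frac * c.mean())
    h = int(hop * sr)
    mask = np.zeros(len(signal), dtype=bool)
    for i, k in enumerate(keep):
        if k:
            mask[i * h:i * h + h] = True
    return signal[mask]

# toy percussive-like signal: two bursts of noise separated by near-silence
sr = 16000
rng = np.random.default_rng(6)
sig = np.concatenate([rng.normal(0, 1.0, sr // 4), np.zeros(sr // 2),
                      rng.normal(0, 1.0, sr // 4)])
print(len(sig), "->", len(remove_silence(sig, sr)), "samples after silence removal")
```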
Procedia PDF Downloads 594
9942 Segmentation of Piecewise Polynomial Regression Model by Using Reversible Jump MCMC Algorithm
Authors: Suparman
Abstract:
The piecewise polynomial regression model is a very flexible model for modeling data. When a piecewise polynomial regression model is fitted to data, its parameters are generally unknown. This paper studies the parameter estimation problem for the piecewise polynomial regression model. The method used to estimate the parameters is Bayesian. Unfortunately, the Bayes estimator cannot be found analytically, so a reversible jump MCMC algorithm is proposed to solve this problem. The reversible jump MCMC algorithm generates a Markov chain whose limit distribution is the posterior distribution of the piecewise polynomial regression model parameters. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise polynomial regression model. Keywords: piecewise regression, Bayesian, reversible jump MCMC, segmentation
Procedia PDF Downloads 373
9941 Image Segmentation with Deep Learning of Prostate Cancer Bone Metastases on Computed Tomography
Authors: Joseph M. Rich, Vinay A. Duddalwar, Assad A. Oberai
Abstract:
Prostate adenocarcinoma is the most common cancer in males, with osseous metastases as the commonest site of metastatic prostate carcinoma (mPC). Treatment monitoring is based on the evaluation and characterization of lesions on multiple imaging studies, including Computed Tomography (CT). Monitoring of the osseous disease burden, including follow-up of lesions and identification and characterization of new lesions, is a laborious task for radiologists. Deep learning algorithms are increasingly used to perform tasks such as identification and segmentation for osseous metastatic disease and provide accurate information regarding metastatic burden. Here, nnUNet was used to produce a model which can segment CT scan images of prostate adenocarcinoma vertebral bone metastatic lesions. nnUNet is an open-source Python package that adds optimizations to deep learning-based UNet architecture but has not been extensively combined with transfer learning techniques due to the absence of a readily available functionality of this method. The IRB-approved study data set includes imaging studies from patients with mPC who were enrolled in clinical trials at the University of Southern California (USC) Health Science Campus and Los Angeles County (LAC)/USC medical center. Manual segmentation of metastatic lesions was completed by an expert radiologist Dr. Vinay Duddalwar (20+ years in radiology and oncologic imaging), to serve as ground truths for the automated segmentation. Despite nnUNet’s success on some medical segmentation tasks, it only produced an average Dice Similarity Coefficient (DSC) of 0.31 on the USC dataset. DSC results fell in a bimodal distribution, with most scores falling either over 0.66 (reasonably accurate) or at 0 (no lesion detected). Applying more aggressive data augmentation techniques dropped the DSC to 0.15, and reducing the number of epochs reduced the DSC to below 0.1. Datasets have been identified for transfer learning, which involve balancing between size and similarity of the dataset. Identified datasets include the Pancreas data from the Medical Segmentation Decathlon, Pelvic Reference Data, and CT volumes with multiple organ segmentations (CT-ORG). Some of the challenges of producing an accurate model from the USC dataset include small dataset size (115 images), 2D data (as nnUNet generally performs better on 3D data), and the limited amount of public data capturing annotated CT images of bone lesions. Optimizations and improvements will be made by applying transfer learning and generative methods, including incorporating generative adversarial networks and diffusion models in order to augment the dataset. Performance with different libraries, including MONAI and custom architectures with Pytorch, will be compared. In the future, molecular correlations will be tracked with radiologic features for the purpose of multimodal composite biomarker identification. Once validated, these models will be incorporated into evaluation workflows to optimize radiologist evaluation. Our work demonstrates the challenges of applying automated image segmentation to small medical datasets and lays a foundation for techniques to improve performance. As machine learning models become increasingly incorporated into the workflow of radiologists, these findings will help improve the speed and accuracy of vertebral metastatic lesions detection.Keywords: deep learning, image segmentation, medicine, nnUNet, prostate carcinoma, radiomics
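Since the evaluation above is driven by the Dice Similarity Coefficient, a small helper for computing it between a predicted and a ground-truth binary mask is sketched below; the toy masks are illustrative only.

```python
import numpy as np

def dice_coefficient(pred, truth, eps=1e-8):
    """Dice Similarity Coefficient between two binary masks:
    DSC = 2|P ∩ T| / (|P| + |T|). Returns 1.0 when both masks are empty."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 1.0 if denom == 0 else (2.0 * intersection + eps) / (denom + eps)

# toy example: a predicted lesion mask shifted by a few pixels from ground truth
truth = np.zeros((64, 64), dtype=bool); truth[20:40, 20:40] = True
pred = np.zeros((64, 64), dtype=bool);  pred[24:44, 22:42] = True
print(f"DSC = {dice_coefficient(pred, truth):.3f}")   # partial overlap, 0 < DSC < 1
```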
Procedia PDF Downloads 97
9940 The Effect of Total Quality Management on Firm's Innovation Performance: A Literature Review
Authors: Omer Akkaya, Nurullah Ekmekcı, Muammer Zerenler
Abstract:
For businesses, innovation means a new product or service and sometimes a new implementation. Total Quality Management is a management philosophy that focuses on the customer, processes, and systems. There is a definite relationship between the principles of Total Quality Management and innovation performance. The main aim of this study is to show how the implementation and principles of Total Quality Management (TQM) affect a firm's innovation performance. The paper also discusses the positive and negative effects of Total Quality Management on innovation performance and presents some examples. Keywords: innovation, innovation types, total quality management, principles of total quality management
Procedia PDF Downloads 631
9939 Some Observations on the Analysis of Four Performances of the Allemande from J.S. Bach's Partita for Solo Flute (BWV 1013) in Terms of Zipf's Law
Authors: Douglas W. Scott
Abstract:
The Allemande from J. S. Bach's Partita for solo flute (BWV 1013) presents many unique challenges for any flautist, especially in terms of segmentation analysis required to select breathing places in the first half. Without claiming to identify a 'correct' solution to this problem, this paper analyzes the section in terms of a set of techniques based around a statistical property commonly (if not ubiquitously) found in music, namely Zipf’s law. Specifically, the paper considers violations of this expected profile at various levels of analysis, an approach which has yielded interesting insights in previous studies. The investigation is then grounded by considering four actual solutions to the problem found in recordings made by different flautists, which opens up the possibility of expanding Zipfian analysis to include a consideration of inter-onset-intervals (IOIs). It is found that significant deviations from the expected Zipfian distributions can reveal and highlight stylistic choices made by different performers.Keywords: inter-onset-interval, Partita for solo flute, BWV 1013, segmentation analysis, Zipf’s law
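A minimal sketch of the kind of Zipfian check discussed above is given below: quantize inter-onset intervals, build the rank-frequency profile, and fit its log-log slope, with large residuals flagging deviations from the expected profile. The binning, the log-normal toy IOIs, and the RMS-deviation summary are assumptions, not the paper's analysis.

```python
import numpy as np
from collections import Counter

def zipf_slope(values, n_bins=20):
    """Fit log(frequency) ~ log(rank) for quantized values; a slope near -1
    suggests a Zipf-like distribution, and large residuals flag deviations."""
    quantized = np.digitize(values, np.histogram_bin_edges(values, bins=n_bins))
    freqs = np.array(sorted(Counter(quantized).values(), reverse=True), dtype=float)
    ranks = np.arange(1, len(freqs) + 1)
    slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
    residuals = np.log(freqs) - (slope * np.log(ranks) + intercept)
    return slope, float(np.sqrt(np.mean(residuals ** 2)))

# hypothetical inter-onset intervals (seconds) from one recorded performance
rng = np.random.default_rng(7)
iois = rng.lognormal(mean=-1.2, sigma=0.5, size=400)
slope, rms_dev = zipf_slope(iois)
print(f"rank-frequency slope = {slope:.2f}, RMS deviation = {rms_dev:.2f}")
```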
Procedia PDF Downloads 184
9938 The Contemporary Issues of Quality Management: Relationship between Total Quality Management and Knowledge Management
Authors: Mehrnoosh Askarizadeh
Abstract:
To meet the challenges of the new global environment, companies have started paying great attention to quality management as an integral part of their strategic business plans. The purpose of this article is to investigate the relationship between total quality management (TQM) and knowledge management (KM). Successful total quality management implementation throughout an organization requires major changes in the four main aspects of knowledge management, namely creation, storage, sharing, and application. Skill, knowledge, and productivity are important factors in an organization's success and play an important role; the TQM management system therefore pays special attention to them. Moreover, knowledge as a resource is essential for an organization's survival. Our study points out how quality management and knowledge management are incorporated into each other in developing a quality culture within the organization. Keywords: knowledge management (KM), total quality management (TQM), organizational performance (OP), Deming cycle
Procedia PDF Downloads 481
9937 The Quality Health Services and Patient Satisfaction in Hospital
Authors: Nadia Fatima Zahra Malki
Abstract:
Quality is one of the most important modern management principles that organizations seek to achieve in all areas and sectors in order to meet the needs and desires of customers and to ensure survival and continuity, as it constitutes a competitive advantage for the organization. Among the organizations for which quality is most essential are health organizations, since they deal with the most valuable element of all, a person and his or her health, where any error threatens life and may lead to death. Health organizations must therefore provide high-quality health services to achieve the highest degree of patient satisfaction. This research aims to study the quality of health services and the extent of their impact on patient satisfaction through an applied study that measured the level of health service quality at a university hospital center in Algeria and its impact on patient satisfaction according to the dimensions of health service quality. We conclude that the determinants of health service quality affect patient satisfaction, which necessitates developing health services according to patients' requirements and improving their quality in order to achieve patient satisfaction. Keywords: health service, health quality, quality determinants, patient satisfaction
Procedia PDF Downloads 64
9936 A Methodology Based on Image Processing and Deep Learning for Automatic Characterization of Graphene Oxide
Authors: Rafael do Amaral Teodoro, Leandro Augusto da Silva
Abstract:
Originated from graphite, graphene is a two-dimensional (2D) material that promises to revolutionize technology in many different areas, such as energy, telecommunications, civil construction, aviation, textile, and medicine. This is possible because its structure, formed by carbon bonds, provides desirable optical, thermal, and mechanical characteristics that are interesting to multiple areas of the market. Thus, several research and development centers are studying different manufacturing methods and material applications of graphene, which are often compromised by the scarcity of more agile and accurate methodologies to characterize the material – that is to determine its composition, shape, size, and the number of layers and crystals. To engage in this search, this study proposes a computational methodology that applies deep learning to identify graphene oxide crystals in order to characterize samples by crystal sizes. To achieve this, a fully convolutional neural network called U-net has been trained to segment SEM graphene oxide images. The segmentation generated by the U-net is fine-tuned with a standard deviation technique by classes, which allows crystals to be distinguished with different labels through an object delimitation algorithm. As a next step, the characteristics of the position, area, perimeter, and lateral measures of each detected crystal are extracted from the images. This information generates a database with the dimensions of the crystals that compose the samples. Finally, graphs are automatically created showing the frequency distributions by area size and perimeter of the crystals. This methodological process resulted in a high capacity of segmentation of graphene oxide crystals, presenting accuracy and F-score equal to 95% and 94%, respectively, over the test set. Such performance demonstrates a high generalization capacity of the method in crystal segmentation, since its performance considers significant changes in image extraction quality. The measurement of non-overlapping crystals presented an average error of 6% for the different measurement metrics, thus suggesting that the model provides a high-performance measurement for non-overlapping segmentations. For overlapping crystals, however, a limitation of the model was identified. To overcome this limitation, it is important to ensure that the samples to be analyzed are properly prepared. This will minimize crystal overlap in the SEM image acquisition and guarantee a lower error in the measurements without greater efforts for data handling. All in all, the method developed is a time optimizer with a high measurement value, considering that it is capable of measuring hundreds of graphene oxide crystals in seconds, saving weeks of manual work.Keywords: characterization, graphene oxide, nanomaterials, U-net, deep learning
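Once a segmentation mask is available, the per-crystal measurements described above can be sketched with scikit-image's region properties, as below. The binary mask, pixel size, and the omission of the standard-deviation-based crystal separation step are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from skimage.measure import label, regionprops_table

def crystal_statistics(mask, pixel_size_nm=1.0):
    """Per-crystal position, area and perimeter from a binary segmentation mask
    (illustrative; the paper additionally separates crystals with a per-class
    standard-deviation step before object delimitation)."""
    labeled = label(mask)
    props = regionprops_table(
        labeled, properties=("label", "centroid", "area", "perimeter", "bbox"))
    df = pd.DataFrame(props)
    df["area_nm2"] = df["area"] * pixel_size_nm ** 2
    df["perimeter_nm"] = df["perimeter"] * pixel_size_nm
    return df

# toy mask standing in for a U-net output with three crystals
mask = np.zeros((200, 200), dtype=bool)
mask[20:60, 30:80] = True
mask[100:140, 120:190] = True
mask[150:170, 20:50] = True
stats = crystal_statistics(mask, pixel_size_nm=2.5)
print(stats[["label", "area_nm2", "perimeter_nm"]])
# Frequency distributions by area/perimeter follow directly, e.g.:
# stats["area_nm2"].plot(kind="hist")
```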
Procedia PDF Downloads 160
9935 Implementation of Total Quality Management in Public Sector: Case of Tunisia
Authors: Rafla Hchaichi
Abstract:
Public administration is currently experiencing unprecedented effervescence in the field of quality. In an increasingly competitive, globalized world, public services are confronted with the need to improve their performance, which pushes public organizations to implement quality approaches. These quality approaches have taken diverse forms, such as service commitments, labels, certifications, and the Common Assessment Framework. This paper provides an overview of the strategy for administrative development in Tunisia from the Carthaginian civilization until today. It outlines the evolution of quality management in the Tunisian public context while focusing on the National Referential of Quality of Administrative Services. Keywords: quality approach, the common assessment framework, service commitment, label, certification, quality of public service, performance of public service, Tunisian Public Service
Procedia PDF Downloads 558
9934 Object Oriented Classification Based on Feature Extraction Approach for Change Detection in Coastal Ecosystem across Kochi Region
Authors: Mohit Modi, Rajiv Kumar, Manojraj Saxena, G. Ravi Shankar
Abstract:
Change detection of coastal ecosystem plays a vital role in monitoring and managing natural resources along the coastal regions. The present study mainly focuses on the decadal change in Kochi islands connecting the urban flatland areas and the coastal regions where sand deposits have taken place. With this, in view, the change detection has been monitored in the Kochi area to apprehend the urban growth and industrialization leading to decrease in the wetland ecosystem. The region lies between 76°11'19.134"E to 76°25'42.193"E and 9°52'35.719"N to 10°5'51.575"N in the south-western coast of India. The IRS LISS-IV satellite image has been processed using a rule-based algorithm to classify the LULC and to interpret the changes between 2005 & 2015. The approach takes two steps, i.e. extracting features as a single GIS vector layer using different parametric values and to dissolve them. The multi-resolution segmentation has been carried out on the scale ranging from 10-30. The different classes like aquaculture, agricultural land, built-up, wetlands etc. were extracted using parameters like NDVI, mean layer values, the texture-based feature with corresponding threshold values using a rule set algorithm. The objects obtained in the segmentation process were visualized to be overlaying the satellite image at a scale of 15. This layer was further segmented using the spectral difference segmentation rule between the objects. These individual class layers were dissolved in the basic segmented layer of the image and were interpreted in vector-based GIS programme to achieve higher accuracy. The result shows a rapid increase in an industrial area of 40% based on industrial area statistics of 2005. There is a decrease in wetlands area which has been converted into built-up. New roads have been constructed which are connecting the islands to urban areas as well as highways. The increase in coastal region has been visualized due to sand depositions. The outcome is well supported by quantitative assessments which will empower rich understanding of land use land cover change for appropriate policy intervention and further monitoring.Keywords: land use land cover, multiresolution segmentation, NDVI, object based classification
Procedia PDF Downloads 187
9933 The Importance of Country-of-Origin Information and Perceived Product Quality in Uzbekistan
Authors: Begzod Nishanov, Farhod Karimov
Abstract:
Globalization and the internet have completely changed the way businesses operate and have equipped customers with endless potential. Today, consumers' product choice is affected not only by branding, price, and quality of the product, but also by country-of-origin information. In particular, the 'Made in' label is considered one of the driving factors that directly influence consumers' preferences. Generally, products manufactured in less developed countries are considered to be of lower quality and riskier compared to products made in developed countries. It is worth noting that this phenomenon has mainly been documented for western developed countries, and there is a lack of empirical research on the influence of the country-of-origin phenomenon in emerging economies such as Uzbekistan. Today, the Uzbek market is dominated by a growing number of foreign-made products. Uzbek manufacturers face intense competition not only from local producers but also from foreign goods suppliers. Consequently, consumers are given a wider choice of products than ever before. In this regard, it is important to assess the importance of country-of-origin information in order to understand Uzbek consumers' preferences. The methodology of the research is based on the methodology of previous papers. A total of 527 online questionnaires were completed. Data analysis was conducted using factor analysis and analysis of variance (ANOVA). The findings support the view that Uzbek consumers attach great importance to the country-of-origin information of products. In particular, Uzbek people judge product quality by its 'Made in...' label, especially when buying high-involvement goods such as a car or a refrigerator. Further findings show that products manufactured in developed countries, including Germany, Japan, and the USA, are perceived to be of high quality, while products manufactured in less developed countries are considered to be of lower quality. Marketers can use this information for segmentation purposes: for example, products manufactured in less developed countries can be targeted at low-to-middle income families, while goods manufactured in developed countries can be targeted at higher income families. In conclusion, the perceived quality of products made in Uzbekistan has slightly increased over the past 18 years. This implies that products under the 'Made in Uzbekistan' label are increasingly available to consumers in foreign markets, especially among Commonwealth of Independent States (CIS) countries. Therefore, conducting further research to explore the phenomenon of country-of-origin information and perceived product quality in emerging markets is of paramount importance. Keywords: country-of-origin, consumer behavior, product evaluation, perceived quality
Procedia PDF Downloads 263
9932 Evaluation of the Surface Water Quality Using the Water Quality Index and Discriminant Analysis Method
Authors: Lazhar Belkhiri, Ammar Tiri, Lotfi Mouni
Abstract:
The protection and management of water quality pose a very important problem for water resources worldwide, given the complexity of water quality data sets. In this study, the water quality index (WQI) and irrigation water quality index (IWQI) were calculated in order to evaluate the surface water quality for drinking and irrigation purposes based on nine hydrochemical parameters. Discriminant analysis (DA) was applied in order to separate the variables most responsible for the spatial differentiation. The results show that the surface water is of poor to very poor quality for drinking based on the WQI values; however, the IWQI values indicate that this water is acceptable for irrigation, with a restriction for sensitive plants. The discriminant analysis (DA) showed that the parameters pH, potassium, chloride, sulfate, and bicarbonate significantly discriminate between the different stations in terms of the spatial variation of surface water quality. The results obtained in this study therefore provide very useful information to decision-makers. Keywords: surface water quality, drinking and irrigation purposes, water quality index, discriminant analysis
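As an illustration of the index calculation mentioned above, the sketch below computes a weighted arithmetic water quality index from a few hydrochemical parameters. The permissible limits, weights, pH ideal value, and sample values are placeholders, not the standards or data used in the study.

```python
import pandas as pd

# Placeholder permissible limits (mg/L except pH); NOT the study's standards.
standards = {"pH": 8.5, "K": 12.0, "Cl": 250.0, "SO4": 250.0, "HCO3": 350.0}
weights = {p: 1.0 / s for p, s in standards.items()}     # relative weight ∝ 1/standard
w_sum = sum(weights.values())

def weighted_arithmetic_wqi(sample):
    """Weighted-arithmetic WQI: quality rating qi = 100*Ci/Si (pH rated against the
    ideal value 7), aggregated as sum(wi*qi)/sum(wi)."""
    total = 0.0
    for p, limit in standards.items():
        ci = sample[p]
        qi = 100.0 * abs(ci - 7.0) / (limit - 7.0) if p == "pH" else 100.0 * ci / limit
        total += weights[p] * qi
    return total / w_sum

samples = pd.DataFrame(
    {"pH": [7.4, 8.2], "K": [6.0, 15.0], "Cl": [180.0, 420.0],
     "SO4": [120.0, 300.0], "HCO3": [220.0, 380.0]},
    index=["station_A", "station_B"])
samples["WQI"] = samples.apply(weighted_arithmetic_wqi, axis=1)
print(samples["WQI"].round(1))       # higher WQI = poorer quality in this convention
```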
Procedia PDF Downloads 89
9931 Evaluating 8D Reports Using Text-Mining
Authors: Benjamin Kuester, Bjoern Eilert, Malte Stonis, Ludger Overmeyer
Abstract:
Increasing quality requirements make reliable and effective quality management indispensable. This includes the complaint handling in which the 8D method is widely used. The 8D report as a written documentation of the 8D method is one of the key quality documents as it internally secures the quality standards and acts as a communication medium to the customer. In practice, however, the 8D report is mostly faulty and of poor quality. There is no quality control of 8D reports today. This paper describes the use of natural language processing for the automated evaluation of 8D reports. Based on semantic analysis and text-mining algorithms the presented system is able to uncover content and formal quality deficiencies and thus increases the quality of the complaint processing in the long term.Keywords: 8D report, complaint management, evaluation system, text-mining
Procedia PDF Downloads 316
9930 Evaluating the Quality of Private University Websites in Malaysia
Authors: Rubijesmin Abdul Latif
Abstract:
This paper focuses on evaluating the quality components of university websites in Malaysia, especially those of private universities. It is believed that websites that prioritize quality will serve their intended users satisfactorily. From a compiled analysis of other studies, quality components were identified and tested among 30 randomly selected respondents. Four Malaysian private university websites were compared, and the main outcome was a better understanding of what users want from a quality university website. Keywords: website evaluation, criteria, quality, usability, user experience, university website
Procedia PDF Downloads 373