Search results for: GLCM texture features
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4236

3276 Image Processing of Scanning Electron Microscope Micrograph of Ferrite and Pearlite Steel for Recognition of Micro-Constituents

Authors: Subir Gupta, Subhas Ganguly

Abstract:

In this paper, we demonstrate a new application of image processing to metallurgical images, opening further opportunities for structure-property-correlation-based approaches to alloy design. The present exercise focuses on the development of image processing tools suitable for phase segmentation, grain boundary detection, and recognition of micro-constituents in SEM micrographs of ferrite and pearlite steels. A comprehensive pool of micrographs has been developed experimentally, encompassing variations in ferrite and pearlite volume fractions, with images taken at different magnifications (500X, 1000X, 1500X, 2000X, 3000X and 5000X) under a scanning electron microscope. The variation in volume fraction was achieved using four plain carbon steels containing 0.1, 0.22, 0.35 and 0.48 wt% C, heat treated under annealing and normalizing treatments. The pool of micrographs was randomly divided into two parts to form training and testing sets. Statistical recognition features for the ferrite and pearlite constituents were learned from the training set and then applied to the test set. Analysis of the results shows that the developed strategy can successfully detect the micro-constituents across the wide range of magnifications and constituent volume fractions with an accuracy of about +/- 5%.
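The abstract does not name its statistical recognition features; given this page's search topic (GLCM texture features), a minimal pure-NumPy sketch of grey-level co-occurrence texture features, of the kind often used to separate smooth ferrite from lamellar pearlite regions, might look like this (the feature choice and the synthetic patches are illustrative assumptions, not the authors' method):

```python
import numpy as np

def glcm(image, levels=8, dx=1, dy=0):
    """Grey-level co-occurrence matrix for one (dx, dy) offset,
    normalised into a joint probability table."""
    g = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            g[image[y, x], image[y + dy, x + dx]] += 1
    return g / g.sum()

def texture_features(image, levels=8):
    """Contrast and homogeneity -- two classic GLCM texture features."""
    p = glcm(image, levels)
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    return contrast, homogeneity

# Synthetic "micrograph" patches quantised to 8 grey levels: a smooth
# (ferrite-like) patch and a striped (pearlite-like, lamellar) patch.
rng = np.random.default_rng(0)
smooth = rng.integers(3, 5, size=(32, 32))
striped = np.tile(np.array([0, 7]), (32, 16))

c_smooth, _ = texture_features(smooth)
c_striped, _ = texture_features(striped)
# The lamellar patch shows far higher GLCM contrast than the smooth one.
```

In a real pipeline such features would be computed per window and fed to a classifier trained on the labelled micrographs.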

Keywords: SEM micrograph, metallurgical image processing, ferrite pearlite steel, microstructure

Procedia PDF Downloads 190
3275 Mapping of Arenga Pinnata Tree Using Remote Sensing

Authors: Zulkiflee Abd Latif, Sitinor Atikah Nordin, Alawi Sulaiman

Abstract:

Different tree species offer different benefits. The Arenga Pinnata tree species has several potential uses that are valuable to the economy and the country. Mapping vegetation using remote sensing involves various processes, techniques, and considerations. Satellite imagery enables access to otherwise inaccessible areas, and the availability of a near-infrared band makes it useful for vegetation analysis, especially in identifying tree species. Both pixel-based and object-based classification techniques are used in this study. The pixel-based technique is divided into unsupervised and supervised classification. Object-based classification has become a popular alternative: by using spectral, texture, colour, and other information to classify the target, it is a promising classification technique. The classification of Arenga Pinnata trees is overlaid with elevation, slope and aspect, soil, river, and several other data layers to give information about the tree's character and living environment. This paper presents the utilization of remote sensing techniques to map the Arenga Pinnata tree species.
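As an illustration of the pixel-based route described above, unsupervised classification can be sketched by clustering per-pixel spectral vectors; the two-band synthetic data, the class names, and the use of k-means here are assumptions for demonstration, not the study's actual workflow:

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic 2-band "imagery": band 0 = red, band 1 = near-infrared (NIR).
# Vegetation pixels have high NIR and low red; bare soil the opposite.
rng = np.random.default_rng(42)
veg = rng.normal(loc=[0.1, 0.6], scale=0.02, size=(200, 2))
soil = rng.normal(loc=[0.4, 0.3], scale=0.02, size=(200, 2))
pixels = np.vstack([veg, soil])

# Unsupervised pixel-based classification: cluster each pixel's
# spectral vector without any training labels.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)

# NDVI = (NIR - red) / (NIR + red) exploits the near-infrared band and
# can be used to name the clusters after the fact.
ndvi = (pixels[:, 1] - pixels[:, 0]) / (pixels[:, 1] + pixels[:, 0])
```

Supervised pixel-based classification would replace the clustering step with a classifier fitted on labelled training pixels; object-based classification would first segment the image and classify segment-level statistics instead.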

Keywords: Arenga Pinnata, pixel-based classification, object-based classification, remote sensing

Procedia PDF Downloads 359
3274 Reduction of False Positives in Head-Shoulder Detection Based on Multi-Part Color Segmentation

Authors: Lae-Jeong Park

Abstract:

The paper presents a method that utilizes figure-ground color segmentation to extract an effective global feature for reducing false positives in head-shoulder detection. Conventional detectors that rely on local features such as HOG, chosen for real-time operation, suffer from false positives. The color cue in an input image provides salient information on a global characteristic, which is necessary to alleviate the false positives of local-feature-based detectors. An effective approach using figure-ground color segmentation has previously been presented to reduce false positives in object detection. In this paper, an extended version of that approach is presented that adopts separate multi-part foregrounds instead of a single prior foreground and performs figure-ground color segmentation with each of them. The multi-part foregrounds include the parts of the head-shoulder shape as well as additional auxiliary foregrounds optimized by a search algorithm. A classifier is constructed with a feature consisting of the set of resulting segmentations. Experimental results show that the presented method can reject more false positives than both the single-prior-shape-based classifier and detectors using local features. The improvement is possible because the presented approach can reduce the false positives that have the same colors in the head and shoulder foregrounds.
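The figure-ground colour cue can be sketched as a histogram-dissimilarity score between an assumed prior foreground mask and its background; this toy single-channel version is illustrative only and is much simpler than the multi-part segmentation the paper proposes:

```python
import numpy as np

def figure_ground_score(window, fg_mask, bins=8):
    """Total-variation distance between the colour histograms of an
    assumed prior foreground region and its background inside one
    detection window (single channel here for brevity)."""
    def norm_hist(pixels):
        h, _ = np.histogram(pixels, bins=bins, range=(0, 256))
        return h / max(h.sum(), 1)
    fg = norm_hist(window[fg_mask])
    bg = norm_hist(window[~fg_mask])
    return 0.5 * np.abs(fg - bg).sum()   # 0 = identical, 1 = disjoint

# A bright window with a dark "figure" region vs. a uniform window.
mask = np.zeros((16, 16), dtype=bool)
mask[4:12, 4:12] = True                  # assumed prior foreground shape
uniform = np.full((16, 16), 200)
figure = uniform.copy()
figure[mask] = 40

score_tp = figure_ground_score(figure, mask)    # distinct figure/ground
score_fp = figure_ground_score(uniform, mask)   # same colours everywhere
```

A candidate window whose figure and ground colours do not separate (like the uniform window) scores low and can be rejected, which is the intuition behind using this global cue alongside local features.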

Keywords: pedestrian detection, color segmentation, false positive, feature extraction

Procedia PDF Downloads 269
3273 Social Media Marketing in Russia

Authors: J. A. Ageeva, Z. S. Zavyalova

Abstract:

The article considers social media as a tool for business promotion. We analyze and compare SMM experience in Western countries and Russia. A short review of Russian social networks is given, including their peculiar features, and the main problems and prospects of Russian SMM are described.

Keywords: social media, social networks, marketing, SMM

Procedia PDF Downloads 539
3272 Development of a Biomaterial from Naturally Occurring Chloroapatite Mineral for Biomedical Applications

Authors: H. K. G. K. D. K. Hapuhinna, R. D. Gunaratne, H. M. J. C. Pitawala

Abstract:

Hydroxyapatite is a bioceramic that can be used for applications in orthopedics and dentistry due to its structural similarity with the mineral phase of mammalian bones and teeth. In this study, it was synthesized by chemically modifying natural Eppawala chloroapatite mineral into a value-added product. A sol-gel approach and solid-state sintering were used to synthesize products using diluted nitric acid, ethanol, and calcium hydroxide under different conditions. The synthesized Eppawala hydroxyapatite powder was characterized using X-ray fluorescence (XRF), X-ray powder diffraction (XRD), Fourier-transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM), thermogravimetric analysis (TGA), and differential scanning calorimetry (DSC) to determine its composition, crystallinity, functional groups and bonding type, surface morphology, microstructural features, and thermal behaviour and stability, respectively. The XRD results reflected the formation of the hexagonal crystal structure of hydroxyapatite. The elemental composition and microstructural features of the products are discussed based on the XRF and SEM results of the synthesized hydroxyapatite powder. TGA and DSC results of the synthesized products showed high thermal stability and good material stability. FTIR spectroscopy results also confirmed the formation of hydroxyapatite from apatite via the presence of hydroxyl groups; these results coincided with the FTIR results for mammalian bones, including human bone. The study concludes that there is a possibility of producing hydroxyapatite from commercially available Eppawala chloroapatite in Sri Lanka.

Keywords: dentistry, Eppawala chloroapatite, hydroxyapatite, orthopedics

Procedia PDF Downloads 229
3271 Convolutional Neural Networks versus Radiomic Analysis for Classification of Breast Mammogram

Authors: Mehwish Asghar

Abstract:

Breast cancer (BC) is a common type of cancer among women. Its screening is usually performed using different imaging modalities such as magnetic resonance imaging, mammography, X-ray, CT, etc. Among these modalities, the mammogram is considered a powerful tool for the diagnosis and screening of breast cancer. Sophisticated machine learning approaches have shown promising results in complementing human diagnosis. Generally, machine learning methods can be divided into two major classes: radiomics analysis (RA), where image features are extracted manually, and convolutional neural networks (CNN), in which the computer learns to recognize image features on its own. This research aims to improve the rate of early detection, thus reducing the mortality caused by breast cancer, through the latest advancements in computer science in general and machine learning in particular. It also aims to ease the burden on doctors by improving and automating the process of breast cancer detection. This research presents a comparative analysis of different techniques and models for detecting and classifying breast cancer, with the main goal of providing a detailed view of the results and performance of the different techniques. The purpose of this paper is to explore the potential of a convolutional neural network (CNN) as both a feature extractor and a classifier. A radiomics module is also included so its results can be compared with the deep learning techniques.

Keywords: breast cancer (BC), machine learning (ML), convolutional neural network (CNN), radiomics, magnetic resonance imaging, artificial intelligence

Procedia PDF Downloads 208
3270 Comprehensive Feature Extraction for Optimized Condition Assessment of Fuel Pumps

Authors: Ugochukwu Ejike Akpudo, Jank-Wook Hur

Abstract:

The increasing demand for improved productivity, maintainability, and reliability has prompted rapidly growing research on the emerging condition-based maintenance concept: prognostics and health management (PHM). Varieties of fuel pumps serve critical functions in several hydraulic systems; hence, their failure can have daunting effects on productivity, safety, etc. The need for condition monitoring and assessment of these pumps cannot be overemphasized, and this has led to a surge of research on standard feature extraction techniques for their optimized condition assessment. By extracting time-based, frequency-based, and the more robust time-frequency-based features from vibration signals, a more comprehensive feature assessment (and selection) can be achieved for a more accurate and reliable condition assessment of these pumps. With the aid of manifold learning algorithms such as locally linear embedding (LLE), we propose a method for the comprehensive condition assessment of electromagnetic fuel pumps (EMFPs). Results show that LLE, as a comprehensive feature extraction technique, yields better feature fusion/dimensionality reduction for condition assessment of EMFPs than the use of single features. Also, unlike other feature fusion techniques, its capabilities as a fault classification technique were explored, and the results show an acceptable accuracy level using standard performance metrics.
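A hedged sketch of the LLE-based fusion step, using scikit-learn's LocallyLinearEmbedding on a synthetic stand-in for the extracted vibration features (the data, feature dimension, and neighbour count are assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

# Stand-in for a feature matrix of vibration signals: each row holds
# time-, frequency- and time-frequency-domain features for one sample,
# with "healthy" and "faulty" pump conditions occupying different
# regions of the feature space.
rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 0.1, size=(60, 12))
faulty = rng.normal(1.0, 0.1, size=(60, 12))
features = np.vstack([healthy, faulty])

# LLE fuses the 12-D feature set into a 2-D embedding that preserves
# local neighbourhood structure; the embedding can then be fed to a
# condition classifier or inspected directly.
lle = LocallyLinearEmbedding(n_components=2, n_neighbors=10, random_state=0)
embedded = lle.fit_transform(features)
```

Compared with picking a single hand-chosen feature, the embedding summarises all extracted features at once, which is the "comprehensive" aspect the abstract emphasises.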

Keywords: electromagnetic fuel pumps, comprehensive feature extraction, condition assessment, locally linear embedding, feature fusion

Procedia PDF Downloads 108
3269 Text Localization in Fixed-Layout Documents Using Convolutional Networks in a Coarse-to-Fine Manner

Authors: Beier Zhu, Rui Zhang, Qi Song

Abstract:

Text contained within fixed-layout documents such as ID cards, invoices, cheques, and passports can be of great semantic value and so requires high localization accuracy. Recently, algorithms based on deep convolutional networks have achieved high performance on text detection tasks. However, for text localization in fixed-layout documents, such algorithms detect word bounding boxes individually, ignoring the layout information. This paper presents a novel architecture built on convolutional neural networks (CNNs). A global text localization network and a regional bounding-box regression network are introduced to tackle the problem in a coarse-to-fine manner. The text localization network locates word bounding points simultaneously, which takes the layout information into account. The bounding-box regression network takes features pooled from arbitrarily sized RoIs as input and refines the localizations. The two networks share their convolutional features and are trained jointly. ID cards, a typical type of fixed-layout document, are selected to evaluate the effectiveness of the proposed system. The networks are trained on data cropped from natural scene images and on synthetic data produced by a synthetic text generation engine. Experiments show that our approach locates word bounding boxes with high accuracy and achieves state-of-the-art performance.

Keywords: bounding box regression, convolutional networks, fixed-layout documents, text localization

Procedia PDF Downloads 181
3268 Study on Construction of 3D Topography by UAV-Based Images

Authors: Yun-Yao Chi, Chieh-Kai Tsai, Dai-Ling Li

Abstract:

In this paper, a method of fast 3D topography modeling using high-resolution camera images is studied, based on the characteristics of unmanned aerial vehicle (UAV) systems for low-altitude aerial photogrammetry and the need for three-dimensional (3D) urban landscape modeling. Firstly, the auto-flying paths of the UAV are reconstructed and analyzed to design overlapping image acquisition for the existing high-resolution digital camera; the self-calibration function is improved in software to achieve high-precision imaging, further increasing the resolution of the imaging system. Secondly, multi-angle images, including vertical and oblique images obtained by the UAV system, are used for detailed measurement of urban land surfaces and for texture extraction. Finally, aerial photography and 3D topography construction are carried out both on the campus of Chang-Jung University and in the Guerin district of Tainan, Taiwan, providing validation models for the construction of 3D topography from combined UAV-based camera images. The results demonstrate that the UAV system for low-altitude aerial photogrammetry can be used for 3D topography production, and the technical solution offered in this paper provides a new, fast plan for the 3D expression, fine modeling, and visualization of the city landscape.

Keywords: 3D, topography, UAV, images

Procedia PDF Downloads 292
3267 Configuration of Water-Based Features in Islamic Heritage Complexes and Vernacular Architecture: An Analysis into Interactions of Morphology, Form, and Climatic Performance

Authors: Mustaffa Kamal Bashar Mohd Fauzi, Puteri Shireen Jahn Kassim, Nurul Syala Abdul Latip

Abstract:

It is increasingly realized that sustainability includes a response to both the climatic and the cultural context of a place. To assess the cultural context, a morphological analysis of urban patterns from heritage legacies is necessary; while climatic form is derived from an analysis of meteorological data, cultural patterns and forms must be abstracted from typological and morphological study. This study analyzes the morphological and formal elements of the water-based architectural and urban design of past Islamic vernacular complexes in hot arid regions, and how a vast utilization of water was shaped and sited to act as a cooling device for an entire complex. Apart from its pleasant coolness, water can be used aesthetically: emphasizing visual axes, vividly enhancing the visual quality of the surrounding environment, and symbolically portraying the act of purity in the design. By comparing two case studies based on an analysis of how water features interact with the form, planning, and morphology of two Islamic heritage complexes, Fatehpur Sikri (India) and Lahore Fort (Pakistan), with a focus on the Shish Mahal of Lahore Fort, in terms of their mass, architecture, and urban planning, it is evident that water plays an integral role in their climatic amelioration via different methods of water conveyance. Both sites are known for their substantial historical value and are prominent for their sustainable vernacular buildings; for example, the courtyard of the Shish Mahal in Lahore Fort is designed to provide continuous coolness through various miniature water channels that run underneath the paved courtyard. One of the most remarkable features of this system is that all water was made dregs-free before being inducted into these underlying channels. In Fatehpur Sikri, the method of conveyance differed from that of Lahore Fort, as supplying water to the ridge on which Fatehpur Sikri is situated was the major challenge. It was solved by placing inhabitable water buildings within the two supply systems for raising water; the water could be raised either mechanically or by labour inside the enclosed wells and water-raising houses. The study analyzes and abstracts the water supply forms, patterns, and flows in three dimensions through the actions of evaporative cooling and wind-induced ventilation under arid climates. Through analytical abstraction and descriptive relational morphology of the spatial configurations, the study suggests an idealized spatial system for urban design and complexes, which can serve as a methodological abstraction tool of sustainability suited to the modern contemporary world.

Keywords: heritage site, Islamic vernacular architecture, water features, morphology, urban design

Procedia PDF Downloads 365
3266 Optimized Deep Learning-Based Facial Emotion Recognition System

Authors: Erick C. Valverde, Wansu Lim

Abstract:

Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health illnesses (e.g., depression and stress) as well as better human social interaction with smart technologies. The FER system involves two steps: 1) a face detection task and 2) a facial emotion recognition task. It classifies the human expression into categories such as angry, disgust, fear, happy, sad, surprise, and neutral. Such a system requires intensive research to address issues with human diversity, unique human expressions, and the variety of human facial features due to age differences; these issues generally limit the ability of FER systems to detect human emotions with high accuracy. Early FER systems used simple supervised classification algorithms like K-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy because they cannot extract significant features of several human emotions. To increase accuracy, deep learning (DL)-based methods, like convolutional neural networks (CNN), have been proposed; these methods can find more complex features in the human face by means of the deeper connections within their architectures. However, the inference speed and computational cost of a DL-based FER system are often disregarded in exchange for higher accuracy. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods: specifically, network pruning and quantization are used to lower computational costs and reduce memory usage, respectively.
To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to incorporate advanced optimization techniques into the Xception model to improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the objectives of the network optimization methods used. As a result, the proposed approach can be used to create an efficient, real-time FER system.
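The two optimization methods named above can be illustrated on a bare weight matrix; this NumPy sketch of magnitude pruning and symmetric int8 quantization is a generic illustration of the techniques, not the paper's Xception pipeline:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights
    (fewer effective multiplications -> lower computational cost)."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def quantize_int8(weights):
    """Symmetric linear quantization of float weights to int8
    (1 byte per weight instead of 4 -> lower memory usage)."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale  # dequantize with q * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)

w_pruned = magnitude_prune(w, sparsity=0.5)   # ~50% zeros
q, scale = quantize_int8(w_pruned)
w_restored = q.astype(np.float32) * scale     # small reconstruction error
```

In practice these steps are applied layer by layer with fine-tuning in between; a DL compiler can then fuse and schedule the resulting sparse, low-precision kernels for faster inference.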

Keywords: deep learning, face detection, facial emotion recognition, network optimization methods

Procedia PDF Downloads 112
3265 Effect of Using Different Packaging Materials on Quality of Minimally Processed (Fresh-Cut) Banana (Musa acuminata balbisiana) Cultivar 'Nipah'

Authors: Nur Allisha Othman, Rosnah Shamsudin, Zaulia Othman, Siti Hajar Othman

Abstract:

Minimal processing, also known as fresh-cutting, mitigates the short storage life of fruit such as banana and can help meet growing demand, especially in Southeast Asian countries. The effect of different types of packaging material on fresh-cut 'Nipah' banana (Musa acuminata balbisiana) was studied. Fresh-cut banana cultivar (cv) Nipah was packed in polypropylene (PP) plastic, low-density polyethylene (LDPE) plastic, polymer plastic film (shrink wrap), and a polypropylene container as control, and stored for 12 days at low temperature (4°C). Physical and chemical quality attributes such as colour, texture, pH, titratable acidity (TA), total soluble solids (TSS), and vitamin C were examined at 2-day intervals over the 12 days at 4°C. The results show that PP is the most suitable packaging for banana cv Nipah because it reduces respiration and physicochemical quality changes. The type of packaging significantly affected the quality of fresh-cut banana cv Nipah, and the PP bag was the most suitable packaging to maintain quality and prolong the storage life of fresh-cut banana cv Nipah for 12 days at 4°C.

Keywords: physicochemical, PP, LDPE, shrink wrap, browning, respiration

Procedia PDF Downloads 216
3264 A Multi-Release Software Reliability Growth Model Incorporating Imperfect Debugging and Change-Point under a Simulated Testing Environment and Software Release Time

Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar

Abstract:

The testing process during software development is a crucial step, as it makes the software more efficient and dependable. To estimate software reliability through the mean value function, many software reliability growth models (SRGMs) were developed under the assumption that the operating and testing environments are the same. In practice this is not true, because the reliability differs when the software works in its natural field environment. This article discusses an SRGM comprising a change-point and imperfect debugging in a simulated testing environment, and then extends it in a multi-release direction. Initially, software is released to the market with few features; according to market demand, the software company upgrades the current version by adding new features as time passes. We therefore propose a generalized multi-release SRGM in which the change-point and imperfect debugging concepts are addressed in a simulated testing environment. The failure-increasing-rate concept is adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets, and the results demonstrate that the proposed model fits the datasets better. We also discuss the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
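The abstract does not state its exact mean value function; a common NHPP sketch with a single change point in the fault-detection rate (here grafted onto the Goel-Okumoto form as an illustrative assumption) conveys the idea:

```python
import math

def mean_value_goel_okumoto(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a * (1 - e^(-b*t)),
    where a is the expected total fault content and b the detection rate."""
    return a * (1.0 - math.exp(-b * t))

def mean_value_change_point(t, a, b1, b2, tau):
    """Mean value function whose fault-detection rate shifts from b1 to
    b2 at the change point tau, continuing continuously from m(tau).
    An illustrative assumption, not the paper's exact model."""
    if t <= tau:
        return a * (1.0 - math.exp(-b1 * t))
    m_tau = a * (1.0 - math.exp(-b1 * tau))
    return a - (a - m_tau) * math.exp(-b2 * (t - tau))

# Expected cumulative faults detected by t = 10 with a rate change at tau = 5.
m10 = mean_value_change_point(10.0, a=100.0, b1=0.1, b2=0.3, tau=5.0)
```

A multi-release extension would fit one such curve per release, with each release's change point estimated from its own failure data; the release-time decision then trades the testing/debugging cost against the residual fault count a - m(t).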

Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors

Procedia PDF Downloads 65
3263 An Approach for Association Rules Ranking

Authors: Rihab Idoudi, Karim Saheb Ettabaa, Basel Solaiman, Kamel Hamrouni

Abstract:

Medical association rule induction is used to discover useful correlations between pertinent concepts in large medical databases. Nevertheless, AR algorithms produce a huge number of rules and do not guarantee the usefulness and interestingness of the generated knowledge. To overcome this drawback, we propose an ontology-based interestingness measure for ranking ARs. According to domain experts, the goal of using ARs is to discover implicit relationships between items of different categories, such as 'clinical features and disorders' or 'clinical features and radiological observations'; that is to say, itemsets composed of 'similar' items are uninteresting. Therefore, the dissimilarity between a rule's items can be used to judge its interestingness: the more different the items, the more interesting the rule. In this paper, we design a distinct approach for ranking semantically interesting association rules involving the use of an ontology knowledge mining approach. The basic idea is to organize the ontology's concepts into a hierarchical structure of conceptual clusters of targeted subjects, where each cluster encapsulates 'similar' concepts suggesting a specific category of the domain knowledge. The interestingness of an association rule is then defined as the dissimilarity between the corresponding clusters: the further apart the clusters of the rule's items, the more interesting the rule. We apply the method in our domain of interest, the mammographic domain, using an existing mammographic ontology called Mammo, with the goal of deriving interesting rules from past experience and discovering implicit relationships between the concepts modeling the domain.
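The cluster-dissimilarity idea can be sketched with a toy concept-to-cluster mapping; the pairwise "different-cluster fraction" measure below is a simplified stand-in for the paper's ontology-distance definition (the concept names and clusters are invented for illustration):

```python
# Toy ontology: each concept belongs to one conceptual cluster (category).
concept_cluster = {
    "mass_shape": "clinical_features",
    "mass_margin": "clinical_features",
    "calcification": "radiological_observations",
    "birads_5": "disorders",
}

def interestingness(rule_items):
    """Fraction of item pairs whose concepts fall in *different*
    clusters: rules mixing categories rank higher, rules whose items
    all share one cluster rank lowest."""
    pairs = [(a, b) for i, a in enumerate(rule_items)
             for b in rule_items[i + 1:]]
    if not pairs:
        return 0.0
    cross = sum(concept_cluster[a] != concept_cluster[b] for a, b in pairs)
    return cross / len(pairs)

rules = [
    ["mass_shape", "mass_margin"],               # same cluster -> dull
    ["mass_shape", "calcification", "birads_5"], # mixed clusters -> interesting
]
ranked = sorted(rules, key=interestingness, reverse=True)
```

The paper's measure additionally exploits the hierarchy, so two clusters that share a near ancestor count as less dissimilar than clusters in distant branches; this sketch flattens that to a binary same/different test.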

Keywords: association rule, conceptual clusters, interestingness measures, ontology knowledge mining, ranking

Procedia PDF Downloads 315
3262 Clinical Features of Acute Aortic Dissection Patients Initially Diagnosed with ST-Segment Elevation Myocardial Infarction

Authors: Min Jee Lee, Young Sun Park, Shin Ahn, Chang Hwan Sohn, Dong Woo Seo, Jae Ho Lee, Yoon Seon Lee, Kyung Soo Lim, Won Young Kim

Abstract:

Background: Acute myocardial infarction (AMI) concomitant with acute aortic syndrome (AAS) is rare, but prompt recognition of concomitant AAS is crucial, especially in patients with ST-segment elevation myocardial infarction (STEMI), because misdiagnosis with early thrombolytic or anticoagulant treatment may result in catastrophic consequences. Objectives: This study investigated the clinical features of patients with STEMI concomitant with AAS that may provide a diagnostic clue. Method: Between 1 January 2010 and 31 December 2014, 22 patients with an initial diagnosis of both acute coronary syndrome (AMI or unstable angina) and AAS (aortic dissection, intramural hematoma, or ruptured thoracic aneurysm) in our emergency department were reviewed. Among these, we excluded 10 patients who were transferred from other hospitals and 4 patients with non-STEMI, leaving a total of 8 patients with STEMI concomitant with AAS for analysis. Result: The mean age of the study patients was 57.5±16.31 years; five patients had Stanford type A and three had type B aortic dissection. Six patients had ST-segment elevation in anterior leads and two in inferior leads. Most patients had acute-onset, severe chest pain, but none had chest pain of a tearing, dissecting nature. Serum troponin I was elevated in three patients, but all patients had D-dimer elevation. Aortic regurgitation or a regional wall motion abnormality was found in four patients. However, a widened mediastinum was seen in all study patients. Conclusion: When patients with STEMI have elevated D-dimer and a widened mediastinum, concomitant AAS should be suspected.

Keywords: aortic dissection, myocardial infarction, ST-segment, d-dimer

Procedia PDF Downloads 386
3261 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon

Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn

Abstract:

The dynamic Land Use and Land Cover (LULC) changes in Yangon have generally resulted in improved human welfare and economic development over the last twenty years. Mapping LULC is crucially important for the sustainable development of the environment. However, exact data on how environmental factors influence the LULC situation at various scales are difficult to obtain, because the natural environment is composed of non-homogeneous surface features, so the satellite data also contain mixed pixels. The main objective of this study is the calculation of accuracy based on change detection of LULC changes by support vector machines (SVMs). The main data for this work were satellite images from 1996, 2006, and 2015. Change detection statistics were computed to compile a detailed tabulation of changes between pairs of classification images, and the SVM process was applied with a soft approach at the allocation as well as the testing stage to achieve higher accuracy. The results show that vegetation and cultivated area decreased (by a total of about 29% from 1996 to 2015) because of conversion to built-up area, which more than doubled (increasing by a total of about 30% from 1996 to 2015). The error matrix and confidence limits led to the validation of the result for LULC mapping.
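A minimal sketch of SVM classification followed by change-detection cross-tabulation, using scikit-learn on synthetic spectral samples (the bands, classes, and simulated second-date labels are assumptions, not the study's Yangon data):

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic spectral samples for two LULC classes, 4 "bands" per pixel.
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0.2, 0.05, size=(100, 4)),   # vegetation
               rng.normal(0.6, 0.05, size=(100, 4))])  # built-up
y = np.array([0] * 100 + [1] * 100)

clf = SVC(kernel="rbf", C=1.0).fit(X, y)

# Change-detection statistics: cross-tabulate the classes predicted for
# the same pixels at two dates to count from->to transitions.
labels_t1 = clf.predict(X)
labels_t2 = labels_t1.copy()
labels_t2[:30] = 1                        # simulate 30 pixels converted
change_matrix = np.zeros((2, 2), dtype=int)
for a, b in zip(labels_t1, labels_t2):
    change_matrix[a, b] += 1              # rows: date 1, columns: date 2
```

The off-diagonal cells of `change_matrix` are the class transitions (here vegetation to built-up), which is exactly the tabulation the change detection statistics summarise; an error matrix against reference data would then validate the per-date classifications.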

Keywords: land use and land cover change, change detection, image processing, support vector machines

Procedia PDF Downloads 120
3260 Exclusive Value Adding by iCenter Analytics on Transient Condition

Authors: Zhu Weimin, Allegorico Carmine, Ruggiero Gionata

Abstract:

During decades of Baker Hughes (BH) iCenter experience, it has been demonstrated that, in addition to conventional insights on equipment under steady operating conditions, insights on transient conditions can add significant and exclusive value for anomaly detection, downtime savings, and predictive maintenance. Our work draws on examples from the BH iCenter experience to introduce the advantages and features of transient condition analytics: (i) operation under critical engine conditions, e.g., high levels or high rates of change of temperature, pressure, flow, vibration, etc., that would not be reached in normal operation; (ii) management of dedicated sub-systems or components, many of which are bottlenecks for reliability and maintenance; (iii) indirect detection of anomalies in the absence of instrumentation; (iv) repetitive sequences: if the data are properly processed, the engineering features of transients provide not only anomaly detection but also problem characterization and prognostic indicators for predictive maintenance; (v) engine variables accounting for fatigue analysis. iCenter has been developing and deploying a series of analytics based on transient conditions that contribute exclusive added value in the following areas: (i) reliability improvement, (ii) startup reliability improvement, (iii) predictive maintenance, and (iv) repair/overhaul cost reduction. Illustrative examples for each of these areas are presented in our study, focusing on challenges and adopted techniques ranging from purely statistical approaches to the implementation of machine learning algorithms. The results demonstrate how value is obtained using transient condition analytics in the BH iCenter experience.

Keywords: analytics, diagnostics, monitoring, turbomachinery

Procedia PDF Downloads 62
3259 Production of Gluten-Free Bread Using Emulsifying Salts and Rennet Casein

Authors: A. Morina, S. Ö. Muti, M. Öztürk

Abstract:

Celiac disease is a chronic intestinal disease observed in individuals with gluten intolerance. In this study, our aim was to create a protein matrix that mimics the functional properties of gluten. For this purpose, rennet casein and four emulsifying salts (disodium phosphate (DSP), tetrasodium pyrophosphate (TSPP), sodium acid pyrophosphate (SAPP), and sodium hexametaphosphate (SHMP)) were investigated in gluten-free bread manufacture. The compositional, textural, and visual properties of the gluten-free bread dough and breads were investigated using a two-level factorial experimental design with two star points (α = 1.414) and two replicates of the center point. Manufacturing gluten-free bread with rennet casein and SHMP significantly increased the bread volume (P < 0.0001, R² = 97.8). In general, utilization of rennet casein with DSP and SAPP increased bread hardness, while no difference was observed in samples manufactured with TSPP and SHMP. Except for TSPP, bread color was improved by the utilization of rennet casein with DSP, SAPP, and SHMP. In conclusion, it is possible to manufacture gluten-free bread with acceptable texture and color using rennet casein and SHMP.

Keywords: celiac disease, gluten-free bread, emulsifying salts, rennet casein, rice flour

Procedia PDF Downloads 153
3258 Developing Pandi-Tekki to Tourism Destination in Tanglang, Billiri Local Government Area, Gombe State, Nigeria

Authors: Sanusi Abubakar Sadiq

Abstract:

Despite the significance of tourism as a key revenue earner and employment generator, it is still disregarded in many areas, and the prospects of existing resources to boost development in communities and regions are under-utilized. This study was carried out with a view to developing Pandi-Tekki in Tanglang, Billiri Local Government Area, as a tourism destination. It was primarily aimed at identifying features of Pandi-Tekki that could be developed into a tourism attraction, suggesting ways of developing the prospective site into a tourism destination, and exploring its possible contribution to the tourism sector in Gombe State. Literature was reviewed based on relevant published materials. Data were collected through qualitative and quantitative methods, including personal observation and a structured questionnaire, and were analyzed using the Statistical Package for the Social Sciences (SPSS) software. The results show that Pandi-Tekki has potential that can be developed into an attraction. They also show that the local community perceives tourism as a good development that will open them up to the wider world and generate revenue to stimulate their economy. Conclusions were drawn from these findings: Pandi-Tekki can be developed as a tourism destination, and such development has a strong chance of achieving its aims and objectives. Recommendations were therefore made on creating awareness of the need to develop Pandi-Tekki as a tourism destination and on the need for government, since the site is a public outfit, to provide tourism facilities at the destination.

Keywords: attraction, destination, developing, features

Procedia PDF Downloads 272
3257 Virtual 3D Environments for Image-Based Navigation Algorithms

Authors: V. B. Bastos, M. P. Lima, P. R. G. Kurka

Abstract:

This paper addresses the creation of virtual 3D environments for the study and development of mobile robot image-based navigation algorithms and techniques, which need to operate robustly and efficiently. These algorithms can be tested physically, by conducting experiments on a prototype, or by numerical simulation. Current simulation platforms for robotic applications lack flexible and up-to-date models for image rendering and are unable to reproduce complex light effects and materials. It is therefore necessary to create a test platform that integrates sophisticated simulated renderings of real navigation environments with data and image processing. This work proposes the development of a high-level platform for building 3D environment models and testing image-based navigation algorithms for mobile robots. Texture and lighting effects were applied so that the rendered images accurately represent their real-world counterparts. The application will integrate image processing scripts, trajectory control, dynamic modeling, and simulation techniques for physics representation and picture rendering with the open-source 3D creation suite Blender.
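Rendered test images are only useful for navigation experiments if the virtual camera reproduces real projection geometry. As a minimal sketch of that geometry (not the platform's actual rendering code, and with an invented focal length and principal point), an ideal pinhole projection can be written as:

```python
import numpy as np

def project_points(points_3d, focal, cx, cy):
    """Project 3D points given in camera coordinates onto the image
    plane of an ideal pinhole camera (no lens distortion)."""
    pts = np.asarray(points_3d, dtype=float)
    u = focal * pts[:, 0] / pts[:, 2] + cx  # horizontal pixel coordinate
    v = focal * pts[:, 1] / pts[:, 2] + cy  # vertical pixel coordinate
    return np.stack([u, v], axis=1)

# A point 2 m in front of the camera and 0.5 m to its right,
# with a hypothetical 800 px focal length and a 640x480 image center.
px = project_points([[0.5, 0.0, 2.0]], focal=800.0, cx=320.0, cy=240.0)
```

Matching such projections between the simulated camera and the rendered frames is one way to verify that a virtual environment is geometrically faithful before running navigation algorithms on it.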

Keywords: simulation, visual navigation, mobile robot, data visualization

Procedia PDF Downloads 245
3256 Site Analysis’ Importance as a Valid Factor in Building Design

Authors: Mekwa Eme, Anya Chukwuma

Abstract:

Architectural site analysis is the act of evaluating a particular site physically and socially in order to create a good design solution that addresses both the physical and the interior environment of the location. This essay describes site analysis as a useful design component. According to the introduction and supporting research, site evaluation and analysis are crucial to good design in terms of topography, orientation, site size, accessibility, rainfall, wind direction, and times of sunrise and sunset. Methodology: Both quantitative and qualitative analyses are used in this paper, drawing on primary and secondary data. The data were gathered via the case study approach, already published literature, journals, the internet, a local poll, oral interviews, inquiries, and in-person interviews. The purpose is to clarify the benefits of site analysis for the design process and its implications for the working or building stage. Results: Each site's criteria are unique in terms of factors such as soil, plants, trees, accessibility, topography, and security. These make it easier for the architect and environmentalist to decide on the concept, shape, and supporting structures of the design. Site analysis is crucial because, before any design work is done, the nature of the target location is determined through site visits and research, covering the location, contours, site features, and accessibility, among other topics. It is also a key component of architectural education, enabling students and working architects to understand the nature of the site they will work on. The building's orientation, the site's circulation, and the sustainability of the site may all be determined through thorough study of the site's features.

Keywords: analysis, climate, statistics, design

Procedia PDF Downloads 233
3255 Analysis of Efficiency Production of Grass Black Jelly (Mesona palustris) in Double Scale

Authors: Irvan Adhin Cholilie, Susinggih Wijana, Yusron Sugiarto

Abstract:

The aim of this research is to compare black grass jelly produced at laboratory scale and at double scale. At laboratory scale, 1 kg of black grass jelly is processed with 5 liters of water, while the double scale uses 5 kg of black grass jelly and 75 liters of water. Organoleptic tests performed by 30 (general) panelists on the black grass jelly powder samples produced at both scales found no significant differences in color, odor, flavor, or texture. Proximate tests conducted on both powders likewise showed no significant differences in any parameter. The black grass jelly powder from the double scale contains 12.25% water, 43.7% carbohydrate, and 5.89% crude fiber, with a yield of 16.28%. The energy efficiencies of the boiling, draining, evaporation, drying, and milling processes are 85.11%, 76.97%, 99.64%, 99.99%, and 99.39%, respectively. The utility needs per batch are 0.1 m³ of water costing Rp 220.50, 20.01 kWh of electricity costing Rp 18,569.28, and 30 kg of LPG costing Rp 234,000.00, so the total utility cost per batch is Rp 252,789.78.
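The per-batch utility total quoted above can be cross-checked with a few lines of arithmetic over the three cost items:

```python
def batch_utility_cost(water_cost, electricity_cost, lpg_cost):
    """Total per-batch utility cost (Rp) as the sum of the three items."""
    return water_cost + electricity_cost + lpg_cost

# Figures quoted in the abstract (Rp per batch): water, electricity, LPG.
total = batch_utility_cost(220.50, 18569.28, 234000.00)
```

Summing the three items reproduces the reported Rp 252,789.78 per batch.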

Keywords: black grass jelly, powder, mass balance, energy balance, cost

Procedia PDF Downloads 376
3254 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy

Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay

Abstract:

The paper highlights the significance of a Digital Manufacturing (DM) strategy in supporting and achieving the business strategy and goals of any manufacturing organization. Towards this end, DM initiatives are given a process perspective, without undermining their technological significance, with a view to linking their benefits directly to the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model is proposed to categorize digitally enabled organizational processes into synergistic groups that adopt and use digital tools with similar characteristics and functionalities. This opens future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort is made to apply the "what" and "how" features of the Quality Function Deployment (QFD) framework to establish the relationship between customer needs, for both external and internal customers, and the features of the various digital processes that support the achievement of these expectations. The paper concludes that in the present highly competitive environment, business organizations cannot sustain themselves unless they understand the significance of a digital strategy and integrate it with their business strategy through a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders appreciate its value propositions and its direct link to an organization's competitiveness.
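The "what" vs. "how" mapping of QFD described above amounts to weighting each process feature by its relationship to customer needs. A minimal sketch follows; the needs, the two digital processes, and all scores are purely illustrative, not taken from the paper:

```python
def qfd_priorities(importance, relationships):
    """Score each 'how' (column) by summing its relationship strengths
    against customer-need importance (rows), as in a QFD matrix."""
    n_hows = len(relationships[0])
    return [
        sum(importance[i] * relationships[i][j] for i in range(len(importance)))
        for j in range(n_hows)
    ]

# Hypothetical: 3 customer needs (importance 1-5) against 2 digital
# processes, with relationship strengths on the usual 0/1/3/9 QFD scale.
importance = [5, 3, 4]
rel = [[9, 1],
       [3, 9],
       [0, 3]]
scores = qfd_priorities(importance, rel)
```

The resulting column totals rank the "hows" by how strongly they serve the weighted "whats", which is the prioritization step the framework provides.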

Keywords: digital manufacturing, business strategy, quality function deployment, process approach

Procedia PDF Downloads 302
3253 Virulence Phenotypes Among Multi-Drug Resistant Uropathogenic Bacteria

Authors: V. V. Lakshmi, Y. V. S. Annapurna

Abstract:

Urinary tract infection (UTI) is one of the most common infectious diseases seen in the community. Susceptible individuals experience multiple episodes and progress to acute pyelonephritis or urosepsis, or develop asymptomatic bacteriuria (ABU). The ability to cause extraintestinal infections depends on several virulence factors required for survival at extraintestinal sites. The presence of virulence phenotypes enhances the pathogenicity of these otherwise commensal organisms and thus augments their ability to cause extraintestinal infections, most frequently urinary tract infections. The present study focuses on detecting the virulence characters exhibited by uropathogenic organisms and the most common factors exhibited by the local pathogens. A total of 700 isolates of E. coli and Klebsiella spp. were included in the study. These were isolated over a period of three years from patients in local hospitals reported to be suffering from UTI. Isolation and identification were done based on Gram character and IMViC reactions. Antibiotic sensitivity profiling was carried out by the disc diffusion method, and multi-drug resistant strains with a MAR index of 0.7 were selected for further study. The virulence features examined included the ability to produce exopolysaccharides, protease and gelatinase production, hemolysin production, haemagglutination, and hydrophobicity. Exopolysaccharide production, checked by the Congo red method, was the most predominant virulence feature among the isolates. Biofilm production, examined in microtitre plates using an ELISA reader, was confirmed as the major factor contributing to the virulence of the pathogens, followed by hemolysin production.

Keywords: Escherichia coli, Klebsiella spp., uropathogens, virulence features

Procedia PDF Downloads 412
3252 Historical Development of Negative Emotive Intensifiers in Hungarian

Authors: Martina Katalin Szabó, Bernadett Lipóczi, Csenge Guba, István Uveges

Abstract:

In this study, an exhaustive analysis of the historical development of negative emotive intensifiers in the Hungarian language was carried out via NLP methods. Intensifiers are linguistic elements that modify or reinforce a variable character in the lexical unit they apply to. Intensifiers therefore appear with other lexical items, such as adverbs, adjectives, verbs, and, infrequently, nouns. Due to the complexity of this phenomenon (a set of sociolinguistic, semantic, and historical aspects), many lexical items can operate as intensifiers, and intensifiers are admittedly among the most rapidly changing elements in the language. From a linguistic point of view, a special group of intensifiers is particularly interesting: the so-called negative emotive intensifiers, which on their own, without context, have semantic content associated with negative emotion, but in particular cases may function as intensifiers (e.g., borzasztóan jó 'awfully good', which means 'excellent'). Despite their special semantic features, negative emotive intensifiers are scarcely examined in the literature on large historical corpora via NLP methods. In order to become better acquainted with trends over time concerning these intensifiers, we exhaustively analysed a specific historical corpus, namely the Magyar Történeti Szövegtár (Hungarian Historical Corpus). This corpus (containing 3 million text words) is a collection of texts of various genres and styles produced between 1772 and 2010. Since the corpus consists of raw texts and does not contain any additional information about the language features of the data (such as stemming or morphological analysis), a large amount of manual work was required to process it. Thus, based on a lexicon of negative emotive intensifiers compiled in a previous phase of the research, every occurrence of each intensifier was queried, and the results were stored in a separate data frame. Then, basic linguistic processing (POS-tagging, lemmatization, etc.) was carried out automatically with the 'magyarlanc' NLP toolkit. Finally, the frequency and collocation features of all the negative emotive words were automatically analyzed in the corpus. The outcomes of the research reveal in detail how these words have proceeded through grammaticalization over time, i.e., they change from lexical elements to grammatical ones and slowly go through a delexicalization process (their negative content diminishes over time). What is more, it was also pointed out which negative emotive intensifiers are at the same stage of this process in the same time period. Taking a closer look at the different domains of the analysed corpus, it also became clear that during this process the importance of the pragmatic role increases: the newer use expresses the speaker's subjective, evaluative opinion at a certain level.
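The frequency analysis over time described above can be sketched as a per-decade count of lexicon hits. The lemmas, years, and decade bucketing below are illustrative only, not the study's actual query code:

```python
from collections import Counter

def decade_frequencies(occurrences):
    """Count intensifier hits per (lemma, decade) from (lemma, year)
    pairs, the kind of table the corpus queries above would produce."""
    counts = Counter()
    for lemma, year in occurrences:
        counts[(lemma, year - year % 10)] += 1  # bucket year into its decade
    return counts

# Hypothetical query results: (intensifier lemma, year of attestation).
hits = [("borzasztóan", 1891), ("borzasztóan", 1895),
        ("rettenetesen", 1893), ("borzasztóan", 1904)]
freq = decade_frequencies(hits)
```

Tracking how such per-decade counts shift for each lemma is one simple way to surface the grammaticalization trends the study reports.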

Keywords: historical corpus analysis, historical linguistics, negative emotive intensifiers, semantic changes over time

Procedia PDF Downloads 217
3251 Iris Cancer Detection System Using Image Processing and Neural Classifier

Authors: Abdulkader Helwan

Abstract:

Iris cancer, also called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer. Image processing techniques improve diagnosis by enhancing the quality of the images so that physicians can diagnose properly, while neural networks can help decide whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris and sclera circles and the cancer. Principal component analysis is used to reduce the image size and to extract features. Those features are then used as inputs to a neural network, which is capable of deciding whether the eye is cancerous, through experience acquired over many training iterations on normal and abnormal eye images during the training phase. Normal images are obtained from a public database available on the internet, "Mile Research", while the abnormal ones are obtained from another database, "eyecancer". The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
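The Prewitt step of the pipeline above can be sketched as follows. This is a minimal NumPy illustration of the operator itself (valid-region convolution on a toy image), not the authors' implementation:

```python
import numpy as np

# Prewitt kernels: horizontal and vertical intensity gradients.
PREWITT_X = np.array([[-1, 0, 1],
                      [-1, 0, 1],
                      [-1, 0, 1]])
PREWITT_Y = PREWITT_X.T

def prewitt_edges(gray):
    """Gradient magnitude of a grayscale image using the Prewitt
    operator (valid region only, no border padding)."""
    h, w = gray.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = gray[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * PREWITT_X)
            gy[i, j] = np.sum(patch * PREWITT_Y)
    return np.hypot(gx, gy)

# A vertical step edge: left half dark, right half bright.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
edges = prewitt_edges(img)
```

High values in the resulting magnitude map trace intensity boundaries such as the iris and sclera circles, which the segmentation stage then isolates.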

Keywords: iris cancer, intraocular melanoma, cancerous, prewitt edge detection algorithm, sclera

Procedia PDF Downloads 489
3250 Skin Care through Ayurveda

Authors: K. L. Virupaksha Gupta

Abstract:

Ayurveda offers a holistic outlook on skin care. The first step in Ayurveda is to identify the skin type and care for it accordingly, which is highly personalized. Dermatology offers various skin type classifications, such as the Baumann skin types, based on four parameters: (i) oily vs. dry, (ii) sensitive vs. resistant, (iii) pigmented vs. non-pigmented, and (iv) wrinkled vs. tight (unwrinkled). Skin typing in Ayurveda, however, is mainly determined by the prakriti (constitution) of the individual as well as the status of the Doshas (humors), which are of three types: Vata, Pitta, and Kapha. The difference between them is mainly attributed to the qualities of each dosha, and all of the above skin types can be incorporated under these three. The skin care modalities for each constitution vary greatly. The skin of an individual of Vata constitution would be lustreless, with a rough texture and cracks due to dryness, and should thus be given warm and unctuous therapies, oil massage for lubrication, and natural moisturizers for hydration. The skin of an individual of Pitta constitution would look more vascular (pinkish), delicate, and sensitive, with a fair complexion, unctuousness, and a tendency toward wrinkles and greying of hair at an early age; it should therefore be given cooling and nurturing therapies, and tanning treatments should be avoided. Individuals of Kapha constitution have oily, delicate skin that looks beautiful and radiant, and they mainly require therapies to combat oily skin. Hence, skin typing and skin care in Ayurveda are highly rational and scientific.

Keywords: Ayurveda, dermatology, Dosha, skin types

Procedia PDF Downloads 397
3249 Effect of Citric Acid and Clove on Cured Smoked Meat: A Traditional Meat Product

Authors: Esther Eduzor, Charles A. Negbenebor, Helen O. Agu

Abstract:

Smoking meat enhances its taste and appearance, increases its longevity, and helps preserve it by slowing the spoilage of fat and the growth of bacteria. Lean meat from the forequarter of a beef carcass was obtained from the Maiduguri abattoir. The meat was cut into four portions with weights ranging from 525-545 g, then into bits measuring about 8 cm in length and 3.5 cm in thickness and weighing 64.5 g. Meat samples were washed, cured with various concentrations of sodium chloride, sodium nitrate, citric acid, and clove for 30 min, drained, and smoked in a smoking kiln at a temperature range of 55-60 °C for 8 hr a day for 3 days. The products were stored at ambient temperature and evaluated microbiologically and organoleptically. During processing and storage there were increases in pH and free fatty acid content and a decrease in the water-holding capacity and microbial count of the cured smoked meat. The panelists rated control samples significantly (p < 0.05) higher in terms of colour, texture, taste, and overall acceptability. The following organisms were isolated and identified during storage: Bacillus species, Bacillus subtilis, Streptococcus, Pseudomonas, Aspergillus niger, Candida, and Penicillium species. The study forms a basis for new product development in the meat industry.

Keywords: citric acid, cloves, smoked meat, bioengineering

Procedia PDF Downloads 437
3247 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources

Authors: Mustafa Alhamdi

Abstract:

An industrial application to classify gamma-ray and neutron events is investigated in this study using deep machine learning. Identification using convolutional and recursive neural networks has shown significant improvements in prediction accuracy in a variety of applications. The ability to identify isotope type and activity from spectral information depends on the feature extraction method, followed by classification. The features extracted from the spectrum profiles seek patterns and relationships that represent the actual spectrum energy in a low-dimensional space; increasing the separation between classes in feature space improves the achievable classification accuracy. Feature extraction by neural networks is nonlinear in nature and involves a variety of transformations and mathematical optimizations, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information is preprocessed by finding its frequency components over time and using them as the training dataset. The Fourier transform implementation used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4, and the readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, combining the votes of many models, managed to improve the classification accuracy of the neural networks. The ability to discriminate gamma and neutron events in a single prediction approach using deep machine learning has shown high accuracy. The paper's findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep machine learning models by hyperparameter optimization of the neural networks enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
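The spectrogram preprocessing step described above, a windowed short-time Fourier transform, can be sketched as follows. The frame length, hop size, and Hann window are illustrative choices, not the study's actual parameters:

```python
import numpy as np

def spectrogram(signal, frame_len=64, hop=32):
    """Magnitude spectrogram via a windowed short-time FFT; the window
    function suppresses spectral leakage between frequency bins."""
    window = np.hanning(frame_len)
    frames = [
        signal[start:start + frame_len] * window
        for start in range(0, len(signal) - frame_len + 1, hop)
    ]
    # One row per time frame, one column per (non-negative) frequency bin.
    return np.abs(np.fft.rfft(frames, axis=1))

# A 50 Hz test tone sampled at 1 kHz stands in for a detector waveform.
fs, f0 = 1000, 50.0
t = np.arange(1024) / fs
spec = spectrogram(np.sin(2 * np.pi * f0 * t))
```

Each row of such a time-frequency map becomes a training sample, so the classifier sees how the frequency content evolves rather than a single integrated spectrum.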

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 138