Search results for: hyperspectral image classification using tree search algorithm
7449 Estimation of Asphalt Pavement Surfaces Using Image Analysis Technique
Authors: Mohammad A. Khasawneh
Abstract:
Asphalt concrete pavements gradually lose their skid resistance, causing safety problems, especially under wet conditions and at high driving speeds. To replicate the field polishing and wearing process of asphalt pavement surfaces in a laboratory setting, several laboratory-scale accelerated polishing devices were developed by different agencies. To mimic the actual process, friction and texture measuring devices are needed to quantify surface deterioration at different polishing intervals that reflect different stages of the pavement life. The test could still be considered lengthy and, to some extent, labor-intensive. Therefore, there is a need for another method that can assist in investigating bituminous pavement surface characteristics in a practical and time-efficient test procedure. The purpose of this paper is to utilize a well-developed image analysis technique to characterize asphalt pavement surfaces without the need for conventional friction and texture measuring devices, in an attempt to shorten and simplify the polishing procedure in the lab. Promising findings showed the possibility of using image analysis in lieu of the labor-intensive and inherently variable friction and texture measurements. It was found that the exposed aggregate surface area of asphalt specimens made from limestone and gravel aggregates produced solid evidence of the validity of this method in describing asphalt pavement surfaces. Image analysis results correlated well with British Pendulum Numbers (BPN), Polish Values (PV) and Mean Texture Depth (MTD) values.
Keywords: friction, image analysis, polishing, statistical analysis, texture
Procedia PDF Downloads 309
7448 Application of Fuzzy Clustering on Classification Agile Supply Chain
Authors: Hamidreza Fallah Lajimi, Elham Karami, Fatemeh Ali nasab, Mostafa Mahdavikia
Abstract:
Being responsive is an increasingly important skill for firms in today’s global economy; thus, firms must be agile. Naturally, it follows that an organization’s agility depends on its supply chain being agile. However, achieving supply chain agility is a function of other abilities within the organization. This paper analyses results from a survey of 71 Iranian manufacturing companies in order to identify some of the factors for agile organizations in managing their supply chains. We then classify these companies into four clusters with the fuzzy c-means technique and use four validation functions to determine the optimal number of clusters automatically.
Keywords: agile supply chain, clustering, fuzzy clustering
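A minimal numpy sketch of the fuzzy c-means step described in this abstract; the survey data, the number of features and the cluster count here are illustrative stand-ins, not the paper's data:

```python
import numpy as np

def fuzzy_c_means(X, c=4, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: X is (n_samples, n_features), c clusters, fuzzifier m."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # membership rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))         # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Illustrative agility scores for 71 firms (random stand-in data).
X = np.random.rand(71, 5)
centers, memberships = fuzzy_c_means(X, c=4)
print(memberships.argmax(axis=1))              # hard cluster label per firm
```

In practice, the cluster validity functions mentioned in the abstract would be evaluated for several values of c and the best-scoring one kept.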
Procedia PDF Downloads 479
7447 Automatic Segmentation of 3D Tomographic Images Contours at Radiotherapy Planning in Low Cost Solution
Authors: D. F. Carvalho, A. O. Uscamayta, J. C. Guerrero, H. F. Oliveira, P. M. Azevedo-Marques
Abstract:
The creation of vector contour slices (ROIs) on body silhouettes of oncologic patients is an important step during radiotherapy planning in clinics and hospitals to ensure the accuracy of oncologic treatment. The radiotherapy planning of patients is performed by complex software focused on the analysis of tumor regions, protection of organs at risk (OARs) and calculation of radiation doses for anomalies (tumors). This software is supplied by a few manufacturers and runs on sophisticated workstations with vector processing, at a cost of approximately twenty thousand dollars. The Brazilian project SIPRAD (Radiotherapy Planning System) presents a proposal adapted to the reality of emerging countries, which generally do not have the monetary conditions to acquire radiotherapy planning workstations, resulting in waiting queues for the treatment of new patients. The SIPRAD project is composed of a set of integrated and interoperable software modules that are able to execute all stages of radiotherapy planning on simple personal computers (PCs) in place of the workstations. The goal of this work is to present an image processing technique, computationally feasible, that is able to perform automatic contour delineation of patient body silhouettes (SIPRAD-Body). The SIPRAD-Body technique is applied to grayscale tomography slices and extended to three dimensions with a greedy algorithm. SIPRAD-Body creates an irregular polyhedron with an adapted Canny edge algorithm without the use of preprocessing filters such as contrast and brightness adjustment. In addition, comparing SIPRAD-Body with existing solutions, a contour similarity of at least 78% is reached. Four criteria are used for this comparison: contour area, contour length, difference between the mass centers, and the Jaccard index. SIPRAD-Body was tested on a set of oncologic exams provided by the Clinical Hospital of the University of Sao Paulo (HCRP-USP). The exams came from patients with different ethnicities, ages, tumor severities and body regions. Even for services that already have workstations, it is possible to run SIPRAD alongside them on PCs, because the interoperability of communication between both systems through the DICOM protocol provides an increase in workflow. Therefore, the conclusion is that the SIPRAD-Body technique is feasible because of its degree of similarity, both for new radiotherapy planning services and for existing ones.
Keywords: radiotherapy, image processing, DICOM RT, Treatment Planning System (TPS)
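A small sketch of two of the ingredients named above: an edge-based body mask (OpenCV's Canny used here as a stand-in for the adapted Canny algorithm) and the Jaccard index used as one of the four comparison criteria. The thresholds and the synthetic slice are illustrative assumptions.

```python
import numpy as np
import cv2  # OpenCV, used here only as a stand-in edge detector

def body_mask(slice_gray):
    """Rough body silhouette: Canny edges, then the largest external contour filled."""
    edges = cv2.Canny(slice_gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    mask = np.zeros_like(slice_gray)
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        cv2.drawContours(mask, [biggest], -1, 255, thickness=-1)  # fill the contour
    return mask > 0

def jaccard(mask_a, mask_b):
    """Jaccard index between two binary masks (intersection over union)."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 1.0

# Example with a synthetic slice; real use would load a DICOM CT slice.
slice_gray = (np.random.rand(256, 256) * 255).astype(np.uint8)
reference = body_mask(slice_gray)
print(jaccard(body_mask(slice_gray), reference))
```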
Procedia PDF Downloads 299
7446 Investigation of Interlayer Shear Effects in Asphalt Overlay on Existing Rigid Airfield Pavement Using Digital Image Correlation
Authors: Yuechao Lei, Lei Zhang
Abstract:
Interface shear between the asphalt overlay and the existing rigid airport pavement occurs due to differences in the mechanical properties of the materials subjected to aircraft loading. Interlayer contact directly influences the mechanical characteristics of the asphalt overlay. However, accurately obtaining the effective interlayer relative displacement with the displacement sensors of the loading apparatus remains challenging. This study aims to utilize digital image correlation technology to enhance the accuracy of interfacial contact parameters by obtaining effective interlayer relative displacements. Composite structure specimens were prepared, and fixtures for interlayer shear tests were designed and fabricated. Subsequently, a digital image recognition scheme for the required markers was designed and optimized. Effective interlayer relative displacement values were obtained through image recognition and calculation of surface markers on the specimens. Finite element simulations validated the mechanical response of composite specimens under interlayer shearing. Results indicated that an optimized marking approach, using a wall mending agent for surface application and color coding, enhanced the image recognition quality of marking points on the specimen surface. Further image extraction provided effective interlayer relative displacement values during interlayer shear, thereby improving the accuracy of interface contact parameters. For composite structure specimens utilizing Styrene-Butadiene-Styrene (SBS) modified asphalt as the tack coat, the corresponding maximum interlayer shear strength was 0.6 MPa, and the fracture energy was 2917 J/m². This research provides valuable insights for investigating the impact of interlayer contact in composite pavement structures on the mechanical characteristics of asphalt overlays.
Keywords: interlayer contact, effective relative displacement, digital image correlation technology, composite pavement structure, asphalt overlay
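A toy illustration of how an effective interlayer relative displacement can be derived once marker positions have been tracked by image correlation: subtract the slab-side marker motion from the overlay-side marker motion. The marker tracks, pixel calibration and noise levels below are assumed, not taken from the paper.

```python
import numpy as np

# Illustrative marker tracks (pixel coordinates per frame) recovered by image
# correlation: one marker row on the asphalt overlay, one on the rigid slab.
frames = 200
overlay_y = 120.0 + np.cumsum(np.random.normal(0.02, 0.005, frames))   # drifts under shear
slab_y = 180.0 + np.random.normal(0.0, 0.002, frames)                  # nearly fixed

pixel_size_mm = 0.05                     # assumed calibration from a reference target
relative_disp_mm = (overlay_y - overlay_y[0] - (slab_y - slab_y[0])) * pixel_size_mm

print("effective interlayer relative displacement at the last frame: "
      f"{relative_disp_mm[-1]:.3f} mm")
```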
Procedia PDF Downloads 51
7445 Deep Neural Networks for Restoration of Sky Images Affected by Static and Anisotropic Aberrations
Authors: Constanza A. Barriga, Rafael Bernardi, Amokrane Berdja, Christian D. Guzman
Abstract:
Most image restoration methods in astronomy rely upon probabilistic tools that infer the best solution for a deconvolution problem. They achieve good performance when the point spread function (PSF) is spatially invariant in the image plane. However, this condition is not always satisfied with real optical systems. PSF angular variations cannot be evaluated directly from the observations, nor can they be corrected at pixel resolution. We have developed a method for the restoration of images affected by static and anisotropic aberrations using deep neural networks that can be applied directly to sky images. The network is trained using simulated sky images corresponding to the T-80 telescope optical system, an 80 cm survey imager at Cerro Tololo (Chile), which are synthesized using a Zernike polynomial representation of the optical system. Once trained, the network can be used directly on sky images, outputting a corrected version of the image with a constant and known PSF across its field of view. The method was tested with the T-80 telescope, achieving better results than PSF deconvolution techniques. We present the method and results on this telescope.
Keywords: aberrations, deep neural networks, image restoration, variable point spread function, wide field images
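A sketch of how one (aberrated, target) training pair for such a network can be synthesized: convolve a clean frame with a spatially varying, anisotropic PSF for the input and with the desired constant PSF for the target. The Gaussian PSF here is only a stand-in for the Zernike-based optical model used in the paper, and the star-field frame is random.

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(size, sigma_x, sigma_y, theta):
    """Toy anisotropic PSF standing in for the Zernike-based optical model."""
    y, x = np.mgrid[-size // 2: size // 2, -size // 2: size // 2]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    psf = np.exp(-0.5 * ((xr / sigma_x) ** 2 + (yr / sigma_y) ** 2))
    return psf / psf.sum()

# One training pair: the same clean frame seen through the aberrated optics
# (network input) and through the constant, known PSF (network target).
clean = np.random.poisson(5.0, (256, 256)).astype(float)   # synthetic star-field stand-in
aberrated = fftconvolve(clean, gaussian_psf(33, 1.2, 2.8, 0.6), mode="same")
target = fftconvolve(clean, gaussian_psf(33, 1.5, 1.5, 0.0), mode="same")
print(aberrated.shape, target.shape)
```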
Procedia PDF Downloads 139
7444 Laser Ultrasonic Imaging Based on Synthetic Aperture Focusing Technique Algorithm
Authors: Sundara Subramanian Karuppasamy, Che Hua Yang
Abstract:
In this work, the laser ultrasound technique has been used for analyzing and imaging inner defects in metal blocks. To detect defects in blocks, researchers have traditionally used piezoelectric transducers for the generation and reception of ultrasonic signals. These transducers can be configured into sparse and phased arrays, but both configurations have drawbacks, including the need for many transducers, time-consuming calculations, limited bandwidth, and limited image resolution. Here, we focus on a non-contact method for generating and receiving ultrasound to examine inner defects in aluminum blocks. A Q-switched pulsed laser has been used for generation, and reception is done using a Laser Doppler Vibrometer (LDV). Based on the Doppler effect, the LDV provides a rapid and high-spatial-resolution way of sensing ultrasonic waves. From the LDV, a series of scanning points are selected, which serve as the phased array elements. A side-drilled hole of 10 mm diameter at a depth of 25 mm has been introduced, and the defect is interrogated by the linear array of scanning points obtained from the LDV. With the aid of the Synthetic Aperture Focusing Technique (SAFT) algorithm, based on the time-shifting principle, the inspected images are generated from the A-scan data acquired from the 1-D linear phased array elements. Thus the defect can be precisely detected with good resolution.
Keywords: laser ultrasonics, linear phased array, nondestructive testing, synthetic aperture focusing technique, ultrasonic imaging
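A minimal delay-and-sum SAFT sketch of the time-shifting principle referred to above: for every image pixel, each A-scan is sampled at the two-way travel time from the scan point to that pixel and the contributions are summed. The geometry, sampling rate and wave speed are illustrative assumptions, and the A-scans are random stand-ins.

```python
import numpy as np

def saft_image(ascans, xs, zs, element_xs, c, fs):
    """Delay-and-sum SAFT: ascans is (n_elements, n_samples) from the LDV scan line."""
    image = np.zeros((len(zs), len(xs)))
    n_samples = ascans.shape[1]
    for iz, z in enumerate(zs):
        for ix, x in enumerate(xs):
            # two-way travel time from each scan point to the image pixel and back
            t = 2.0 * np.sqrt((element_xs - x) ** 2 + z ** 2) / c
            idx = np.round(t * fs).astype(int)
            valid = idx < n_samples
            image[iz, ix] = ascans[np.arange(len(element_xs))[valid], idx[valid]].sum()
    return image

# Illustrative geometry: 64 scan points, ~6240 m/s longitudinal velocity in aluminium.
fs, c = 100e6, 6240.0
element_xs = np.linspace(0.0, 0.063, 64)
ascans = np.random.randn(64, 4000) * 0.01          # stand-in for measured A-scans
img = saft_image(ascans, np.linspace(0, 0.063, 128), np.linspace(0.005, 0.05, 128),
                 element_xs, c, fs)
print(img.shape)
```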
Procedia PDF Downloads 136
7443 Bioinformatic Study of Follicle Stimulating Hormone Receptor (FSHR) Gene in Different Buffalo Breeds
Authors: Hamid Mustafa, Adeela Ajmal, Kim EuiSoo, Noor-ul-Ain
Abstract:
Worldwide, buffalo production is considered a most important component of the food industry. Efficient buffalo production is related to the reproductive performance of this species. Lack of knowledge of reproductive efficiency and its related genes in buffalo is a major constraint for sustainable buffalo production. In this study, we performed bioinformatics analyses on the Follicle Stimulating Hormone Receptor (FSHR) gene and explored the possible relationship of this gene among different buffalo breeds and with other farm animals. We also determined the evolutionary pattern of this gene among these species. We investigated CDS lengths, stop codon variation, homology, signal peptides, isoelectric point, tertiary structure, motifs and the phylogenetic tree. The results of this study indicate 4 different motifs in this gene, which are Activin-recp, GS motif, STYKc protein kinase and transmembrane. The results also indicate that this gene has a very close relationship with cattle, bison, sheep and goat. Multiple alignment (MA) showed high conservation of motifs, which indicates the constancy of this gene during evolution. The results of this study can be applied for a better understanding of this gene and better characterization of the Follicle Stimulating Hormone Receptor (FSHR) gene structure in different farm animals, which would be helpful for efficient breeding plans for animal production.
Keywords: buffalo, FSHR gene, bioinformatics, production
Procedia PDF Downloads 536
7442 Heart Murmurs and Heart Sounds Extraction Using an Algorithm Process Separation
Authors: Fatima Mokeddem
Abstract:
The phonocardiogram (PCG) is a physiological signal that reflects the mechanical activity of the heart and is a promising tool for researchers in this field because it is full of indications and useful information for medical diagnosis. PCG segmentation is a basic step in benefiting from this signal. Therefore, this paper presents an algorithm that separates heart sounds and heart murmurs, in case they exist, in order to use them in several applications and in heart sound analysis. The separation process presented here is founded on three essential steps: filtering, envelope detection, and heart sound segmentation. The algorithm separates the PCG signal into S1 and S2 and extracts cardiac murmurs.
Keywords: phonocardiogram signal, filtering, envelope detection, murmurs, heart sounds
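A small sketch of the first two steps named above (filtering and envelope detection), using a band-pass filter and a Hilbert envelope; the pass band, smoothing window and relative threshold are common choices assumed for illustration, not the paper's parameters, and the signal is a random stand-in.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def pcg_envelope(pcg, fs):
    """Band-pass filter the PCG, then take a smoothed Hilbert envelope."""
    b, a = butter(4, [25 / (fs / 2), 400 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, pcg)
    envelope = np.abs(hilbert(filtered))
    win = int(0.02 * fs)                                 # 20 ms moving average
    return np.convolve(envelope, np.ones(win) / win, mode="same"), filtered

def segment_candidates(envelope, rel_threshold=0.3):
    """Mark samples above a relative threshold as candidate S1/S2 events."""
    return envelope > rel_threshold * envelope.max()

fs = 2000
pcg = np.random.randn(3 * fs) * 0.05                    # stand-in for a recorded PCG
env, filtered = pcg_envelope(pcg, fs)
print(segment_candidates(env).sum(), "samples flagged as heart-sound candidates")
```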
Procedia PDF Downloads 144
7441 Association Rules Mining and NOSQL Oriented Document in Big Data
Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub
Abstract:
Big Data refers to recent technology for manipulating voluminous and unstructured data sets over multiple sources, and NoSQL has appeared to handle the problem of unstructured data. Association rules mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases. The algorithm for finding association dependencies is well suited to MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce and a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
Keywords: Apriori, association rules mining, Big Data, data mining, Hadoop, MapReduce, MongoDB, NoSQL
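An in-memory sketch of the map/reduce view of one Apriori level (counting candidate pairs): the map phase emits (itemset, 1) per transaction and the reduce phase aggregates the counts. In the paper this counting runs over a MongoDB collection with MapReduce/Hadoop; the toy transactions and the support threshold below are invented for illustration.

```python
from collections import Counter
from itertools import combinations

transactions = [
    {"bread", "milk"}, {"bread", "beer", "eggs"},
    {"milk", "beer", "bread"}, {"milk", "beer"}, {"bread", "milk", "beer"},
]
min_support = 3

def map_phase(txn, k):
    # emit (itemset, 1) for every k-itemset contained in one transaction
    return [(frozenset(c), 1) for c in combinations(sorted(txn), k)]

def reduce_phase(pairs):
    counts = Counter()
    for itemset, one in pairs:
        counts[itemset] += one
    return counts

# Apriori level 2: count candidate pairs and keep the frequent ones.
emitted = [kv for txn in transactions for kv in map_phase(txn, 2)]
frequent_pairs = {s: c for s, c in reduce_phase(emitted).items() if c >= min_support}
print(frequent_pairs)
```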
Procedia PDF Downloads 166
7440 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores
Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay
Abstract:
Automated product recognition in retail stores is an important real-world application in the domain of computer vision and pattern recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online hard negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
Keywords: retail stores, Faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition
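A framework-agnostic sketch of the triplet loss with in-batch hard negative mining mentioned above: the hardest (closest) different-class embedding is used as the negative. Picking the farthest positive as well is an extra batch-hard choice made here for brevity, not something stated in the abstract; the embeddings and labels are random stand-ins.

```python
import numpy as np

def triplet_loss_hard_negative(embeddings, labels, margin=0.2):
    """Average hinge loss max(0, d_pos - d_neg + margin) with the hardest
    in-batch negative (and, here, the hardest positive) per anchor."""
    d = np.linalg.norm(embeddings[:, None, :] - embeddings[None, :, :], axis=2)
    losses = []
    for i, label in enumerate(labels):
        pos = np.where((labels == label) & (np.arange(len(labels)) != i))[0]
        neg = np.where(labels != label)[0]
        if len(pos) == 0 or len(neg) == 0:
            continue
        d_pos = d[i, pos].max()            # hardest positive: farthest same-class sample
        d_neg = d[i, neg].min()            # hardest negative: closest other-class sample
        losses.append(max(0.0, d_pos - d_neg + margin))
    return float(np.mean(losses))

# Stand-in batch: 8 gallery crops embedded in 128-D by the product encoder.
emb = np.random.randn(8, 128)
lbl = np.array([0, 0, 1, 1, 2, 2, 3, 3])
print(triplet_loss_hard_negative(emb, lbl))
```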
Procedia PDF Downloads 162
7439 A Convolution Neural Network Approach to Predict Pes-Planus Using Plantar Pressure Mapping Images
Authors: Adel Khorramrouz, Monireh Ahmadi Bani, Ehsan Norouzi, Morvarid Lalenoor
Abstract:
Background: Plantar pressure distribution measurement has long been used to assess foot disorders. Plantar pressure is an important component affecting foot and ankle function, and changes in plantar pressure distribution can indicate various foot and ankle disorders. Morphologic and mechanical properties of the foot may be important factors affecting the plantar pressure distribution. Accurate and early measurement may help to reduce the prevalence of pes planus. With recent developments in technology, new techniques such as machine learning have been used to assist clinicians in identifying patients with foot disorders. Significance of the study: This study proposes a neural-network-based flat foot classification methodology using static foot pressure distribution. Methodologies: Data were collected from 895 patients who were referred to a foot clinic due to foot disorders. Patients with pes planus were labeled by an experienced physician based on clinical examination. All subjects (with and without pes planus) were then evaluated for static plantar pressure distribution. Patients who were diagnosed with flat foot in both feet were included in the study. In the next step, the leg length was normalized and the network was trained on plantar pressure mapping images. Findings: From a total of 895 image data, 581 were labeled as pes planus. A convolutional neural network (CNN) was run to evaluate the performance of the proposed model. The prediction accuracy of the basic CNN-based model was assessed and the prediction model was derived through the proposed methodology. In the basic CNN model, the training accuracy was 79.14%, and the test accuracy was 72.09%. Conclusion: This model can be easily used by patients with pes planus and by doctors to predict the classification of pes planus and to prescreen for possible musculoskeletal disorders related to this condition. However, more models need to be considered and compared for higher accuracy.
Keywords: foot disorder, machine learning, neural network, pes planus
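A minimal sketch of a basic binary CNN of the kind described, written with tf.keras as an assumed framework; the layer sizes, image resolution and random data are illustrative and not the authors' architecture or dataset.

```python
import numpy as np
import tensorflow as tf

# Stand-in data: 64x64 single-channel plantar pressure maps, label 1 = pes planus.
x = np.random.rand(200, 64, 64, 1).astype("float32")
y = np.random.randint(0, 2, 200)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# 80/20 train-test split, mirroring the train/test accuracy figures reported above.
model.fit(x[:160], y[:160], epochs=5, batch_size=16,
          validation_data=(x[160:], y[160:]), verbose=0)
print(model.evaluate(x[160:], y[160:], verbose=0))
```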
Procedia PDF Downloads 368
7438 Hydrographic Mapping Based on the Concept of Fluvial-Geomorphological Auto-Classification
Authors: Jesús Horacio, Alfredo Ollero, Víctor Bouzas-Blanco, Augusto Pérez-Alberti
Abstract:
Rivers have traditionally been classified, assessed and managed in terms of hydrological, chemical and/or biological criteria. Geomorphological classifications had a secondary role in the past, although proposals like the River Styles Framework, the Catchment Baseline Survey or the Stroud Rural Sustainable Drainage Project did incorporate geomorphology into management decision-making. In recent years many studies have turned to the geomorphological component. The geomorphological processes and their associated forms determine the structure of a river system. Understanding these processes and forms is a critical component of the sustainable rehabilitation of aquatic ecosystems. The fluvial auto-classification approach suggests that a river is a self-built natural system, with processes and forms designed to effectively preserve its ecological function (hydrologic, sedimentological and biological regime). Fluvial systems are formed by a wide range of elements with multiple non-linear interactions on different spatial and temporal scales. Besides, the fluvial auto-classification concept is built using data from the river itself, so that each classification developed is peculiar to the river studied. The variables used in the classification are specific stream power and mean grain size. A discriminant analysis showed that these variables best characterize the processes and forms. The statistical technique applied yields an individual discriminant equation for each geomorphological type. The geomorphological classification was developed using sites with high naturalness. Each site is a control point of high ecological and geomorphological quality. Changes in the conditions of the control points will be quickly recognizable, making it easy to apply the right management measures to recover the geomorphological type. The study focused on Galicia (NW Spain), and the mapping was made by analyzing 122 control points (sites) distributed over eight river basins. In sum, this study provides a method for fluvial geomorphological classification that works as an open and flexible tool underlying the fluvial auto-classification concept. The hydrographic mapping is the visual expression of the results, such that each river has a particular map according to its geomorphological characteristics. Each geomorphological type is represented by a particular type of hydraulic geometry (channel width, width-depth ratio, hydraulic radius, etc.). An alteration of this geometry is indicative of a geomorphological disturbance (whether natural or anthropogenic). Hydrographic mapping is also dynamic, because its meaning changes if there is a modification in the specific stream power and/or the mean grain size, that is, in the value of their equations. The researcher has to check some of the control points annually. This procedure allows the geomorphological quality of the rivers to be monitored and any alterations to be detected. The maps are useful to researchers and managers, especially for conservation work and river restoration.
Keywords: fluvial auto-classification concept, mapping, geomorphology, river
Procedia PDF Downloads 368
7437 Influence of Species and Harvesting Height on Chemical Composition, Buffer Nitrogen Solubility and in vitro Ruminal Fermentation of Browse Tree Leaves
Authors: Thabiso M. Sebolai, Victor Mlambo, Solomon Tefera, Othusitse R. Madibela
Abstract:
In some tree species, sustained herbivory can induce changes in biosynthetic pathways, resulting in overproduction of anti-nutritional secondary plant compounds. This inductive mechanism, which has not been demonstrated in the semi-arid rangelands of South Africa, may result in browse leaves of lower nutritive value. In this study we investigate the interactive effect of browsing pressure and tree species on the chemical composition, buffer nitrogen solubility index (NSI), in vitro ruminal dry matter degradability (IVDMD) and in vitro ruminal N degradability (IVND) of leaves. Leaves from Maytenus capitata, Olea africana, Coddia rudis, Carissa macrocarpa, Rhus refracta, Ziziphus mucronata, Boscia oliedes, Grewia robusta, Phyllanthus vessucosus and Ehretia rigida trees growing in a communal grazing area were harvested at two heights: browsable (< 1.5 m) and non-browsable (> 1.5 m), representing high and low browsing pressure, respectively. The animals utilizing the communal rangeland include cattle at 1 livestock unit (450 kg) per 12 to 15 hectares and goats at 1 livestock unit per 4 ha. Harvested leaves were dried, milled and analysed for proximate components, soluble phenolics, condensed tannins, minerals and in vitro ruminal fermentation. A significant plant species and harvesting height interaction effect (P < 0.05) was observed for total nitrogen (N) and soluble phenolics concentration. Tree species and harvesting height affected (P < 0.05) condensed tannin (CT) content, where samples harvested at the non-browsable height had higher levels (0.61 AU550 nm/200 mg) than those harvested at the browsable height (0.55 AU550 nm/200 mg), while their interaction had no effect. Macro- and micro-minerals were influenced (P < 0.05) by browse species but not by harvesting height. Species and harvesting height interacted (P < 0.05) to influence IVDMD and IVND of leaves at 12, 24 and 36 hours of incubation. The different browse leaves contained moderate to high protein and moderate levels of phenolics and minerals, suggesting that they have the potential to provide supplementary nutrients for ruminants during the dry season.
Keywords: browse plants, chemical composition, harvesting heights, phenolics
Procedia PDF Downloads 148
7436 Stochastic Programming and C-Somga: Animal Ration Formulation
Authors: Pratiksha Saxena, Dipti Singh, Neha Khanna
Abstract:
A self-organizing migrating genetic algorithm (C-SOMGA) is developed for animal diet formulation. This paper presents animal diet formulation using stochastic programming and a genetic algorithm. Tri-objective models for cost minimization and shelf life maximization are developed. These objectives are achieved by a combination of stochastic programming and C-SOMGA. Stochastic programming is used to introduce nutrient variability into the animal diet. The self-organizing migrating genetic algorithm provides exact and quick solutions and presents an innovative approach towards the successful application of a soft computing technique in the area of animal diet formulation.
Keywords: animal feed ration, feed formulation, linear programming, stochastic programming, self-migrating genetic algorithm, C-SOMGA technique, shelf life maximization, cost minimization, nutrient maximization
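To make the underlying ration problem concrete, here is a deterministic, single-objective sketch of the cost-minimization core as a linear program; the feed costs, nutrient contents and requirements are invented for illustration, and the stochastic nutrient bounds, the shelf-life objective and the C-SOMGA solver from the paper are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([0.30, 0.45])                 # cost per kg of feed 1 and feed 2
nutrient = np.array([[0.09, 0.18],            # crude protein fraction per feed
                     [2.4, 3.1]])             # energy (Mcal/kg) per feed
requirement = np.array([0.14, 2.8])           # minimum protein fraction and energy

# linprog minimises c @ x subject to A_ub @ x <= b_ub, so flip signs for >= constraints.
res = linprog(cost, A_ub=-nutrient, b_ub=-requirement,
              A_eq=[[1.0, 1.0]], b_eq=[1.0],  # feed fractions of the ration sum to 1
              bounds=[(0, 1), (0, 1)])
print(res.x, res.fun)                          # optimal feed fractions and ration cost
```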
Procedia PDF Downloads 447
7435 Lesson of Moral Teaching of the Sokoto Caliphate in the Quest for Genuine National Development in Nigeria
Authors: Murtala Marafa
Abstract:
It has been 50 years now since we began the desperate search for genuine, all-round development as a nation. Painfully, like a wild goose chase, the search for that promised land has remained elusive. In this piece, recourse is made to the sound administrative qualities of the 19th century Sokoto Caliphate leaders. These enabled them to administer the vast entity on the basis of mutual peace and justice. They also guaranteed a just political order built on a sound and viable economy. The paper is of the view that if Nigerian society can allow for a replication of such moral virtues as exemplified by the founding fathers of the Caliphate, Nigeria could transform into the politically coherent and economically viable nation aspired to by all.
Keywords: administration, religion, Sokoto Caliphate, moral teachings
Procedia PDF Downloads 278
7434 EcoTeka, an Open-Source Software for Urban Ecosystem Restoration through Technology
Authors: Manon Frédout, Laëtitia Bucari, Mathias Aloui, Gaëtan Duhamel, Olivier Rovellotti, Javier Blanco
Abstract:
Ecosystems must be resilient to ensure cleaner air, better water and soil quality, and thus healthier citizens. Technology can be an excellent tool to support urban ecosystem restoration projects, especially when based on open source and promoting open data. This is the goal of the ecoTeka application: a single digital tool for tree management which allows decision-makers to improve their urban forestry practices, enabling more responsible urban planning and climate change adaptation. EcoTeka provides city councils with three main functionalities tackling three of their challenges: easier biodiversity inventories, better green space management, and more efficient planning. To answer the cities’ need for reliable tree inventories, the application was first built with open data coming from the websites OpenStreetMap and OpenTrees, but it will also soon include the possibility of creating new data. To achieve this, a multi-source algorithm will be elaborated, based on the existing artificial intelligence Deep Forest, integrating open-source satellite images, 3D representations from LiDAR, and street views from Mapillary. This data processing will permit identifying individual trees' position, height, crown diameter, and taxonomic genus. To support urban forestry management, ecoTeka offers a dashboard for monitoring the city’s tree inventory and triggers alerts to inform about upcoming interventions. This tool was co-constructed with the green space departments of the French cities of Alès, Marseille, and Rouen. The third functionality of the application is a decision-making tool for urban planning, promoting biodiversity and landscape connectivity metrics to drive the ecosystem restoration roadmap. Based on landscape graph theory, we are currently experimenting with new methodological approaches to scale down regional ecological connectivity principles to local biodiversity conservation and urban planning policies. This methodological framework will couple a graph-theoretic approach with biological data, mainly biodiversity occurrence (presence/absence) data available on international (e.g., GBIF), national (e.g., Système d’Information Nature et Paysage) and local (e.g., Atlas de la Biodiversité Communale) biodiversity data sharing platforms, in order to support decisions for ecological network conservation and restoration in urban areas. An experiment on this subject is currently ongoing with Montpellier Mediterranee Metropole. These projects and studies have shown that only 26% of tree inventory data is currently geo-localized in France - the rest is still kept on paper or in Excel sheets. It seems that technology is not yet used enough to enrich the knowledge city councils have about biodiversity in their city, and that existing biodiversity open data (e.g., occurrences, telemetry, or genetic data), species distribution models and landscape graph connectivity metrics are still underexploited for making rational decisions in landscape and urban planning projects. This is the goal of ecoTeka: to support easier inventories of urban biodiversity and better management of urban spaces through rational planning and decisions relying on open databases.
Future studies and projects will focus on the development of tools for reducing the artificialization of soils, selecting plant species adapted to climate change, and highlighting the need for ecosystem and biodiversity services in cities.
Keywords: digital software, ecological design of urban landscapes, sustainable urban development, urban ecological corridor, urban forestry, urban planning
Procedia PDF Downloads 78
7433 Comparative Isotherms Studies on Adsorptive Removal of Methyl Orange from Wastewater by Watermelon Rinds and Neem-Tree Leaves
Authors: Sadiq Sani, Muhammad B. Ibrahim
Abstract:
Watermelon rind powder (WRP) and neem-tree leaf powder (NLP) were used as adsorbents in equilibrium adsorption isotherm studies for the detoxification of methyl orange dye (MO) from simulated wastewater. The applicability of the process to various isotherm models was tested. All isotherms from the experimental data showed excellent linear reliability (R²: 0.9487-0.9992), but adsorption onto WRP was more reliable (R²: 0.9724-0.9992) than onto NLP (R²: 0.9487-0.9989), except for Temkin’s isotherm, where reliability was better onto NLP (R²: 0.9937) than onto WRP (R²: 0.9935). Dubinin-Radushkevich’s monolayer adsorption capacities for both WRP and NLP (qD: 20.72 mg/g, 23.09 mg/g) were better than Langmuir’s (qm: 18.62 mg/g, 21.23 mg/g), with both capacities higher for adsorption onto NLP (qD: 23.09 mg/g; qm: 21.23 mg/g) than onto WRP (qD: 20.72 mg/g; qm: 18.62 mg/g). While the values of Langmuir’s separation factor (RL) for both adsorbents suggested unfavourable adsorption processes (RL: -0.0461, -0.0250), the Freundlich constant (nF) indicated a favourable process onto both WRP (nF: 3.78) and NLP (nF: 5.47). Adsorption onto NLP had a higher Dubinin-Radushkevich mean free energy of adsorption (E: 0.13 kJ/mol) than WRP (E: 0.08 kJ/mol), and Temkin’s heat of adsorption (bT) was better onto NLP (bT: -0.54 kJ/mol) than onto WRP (bT: -0.95 kJ/mol), all of which suggested physical adsorption.
Keywords: adsorption isotherms, methyl orange, neem leaves, watermelon rinds
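For reference, these are the standard linearised isotherm forms to which the reported constants commonly refer (the paper does not list its equations here, so these are the usual textbook definitions, with q_e the amount adsorbed at equilibrium, C_e the equilibrium concentration and C_0 the initial concentration):

```latex
\begin{align}
  \text{Langmuir:}\quad & \frac{C_e}{q_e} = \frac{1}{q_m K_L} + \frac{C_e}{q_m},
      \qquad R_L = \frac{1}{1 + K_L C_0} \\
  \text{Freundlich:}\quad & \ln q_e = \ln K_F + \frac{1}{n_F}\ln C_e \\
  \text{Dubinin--Radushkevich:}\quad & \ln q_e = \ln q_D - \beta\varepsilon^{2},
      \qquad E = \frac{1}{\sqrt{2\beta}} \\
  \text{Temkin:}\quad & q_e = \frac{RT}{b_T}\ln\!\left(A_T C_e\right)
\end{align}
```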
Procedia PDF Downloads 277
7432 Facial Emotion Recognition with Convolutional Neural Network Based Architecture
Authors: Koray U. Erbas
Abstract:
Neural networks are appealing for many applications since they are able to learn complex non-linear relationships between input and output data. As the number of neurons and layers in a neural network increases, it is possible to represent more complex relationships with automatically extracted features. Nowadays, Deep Neural Networks (DNNs) are widely used in computer vision problems such as classification, object detection, segmentation, and image editing. In this work, the facial emotion recognition task is performed by the proposed Convolutional Neural Network (CNN)-based DNN architecture using the FER2013 dataset. Moreover, the effects of different hyperparameters (activation function, kernel size, initializer, batch size and network size) are investigated, and ablation study results for the pooling layer, dropout and batch normalization are presented.
Keywords: convolutional neural network, deep learning, deep learning based FER, facial emotion recognition
Procedia PDF Downloads 280
7431 Unraveling the Threads of Madness: Henry Russell’s 'The Maniac' as an Advocate for Deinstitutionalization in the Nineteenth Century
Authors: T. J. Laws-Nicola
Abstract:
Henry Russell was best known as a composer of more than 300 songs. Many of his compositions were popular both for their sentimental texts, as in ‘The Old Armchair,’ and for those of a more political nature, such as ‘Woodsman, Spare That Tree!’ Indeed, Russell had written songs of advocacy associated with abolitionism (‘The Slave Ship’) and environmentalism (‘Woodsman, Spare That Tree!’). ‘The Maniac’ is his only composition addressing the issue of institutionalization. The text is borrowed and adapted from the monodrama The Captive by M.G. ‘Monk’ Lewis. Through an analysis of form, harmony, melody, text, and thematic development, and of the interactions between text and music, we can approach a clearer understanding of ‘The Maniac’ and how its text and music interact. Select periodicals, such as The London Times, provide contemporary critical reviews of ‘The Maniac.’ Additional nineteenth-century songs whose texts focus on madness and/or institutionalization will assist in building a stylistic and cultural context for ‘The Maniac.’ Through comparative analyses of ‘The Maniac’ with a body of songs that focus on similar topics, we can approach a clear understanding of the song as a vehicle for deinstitutionalization.
Keywords: 19th century song, institutionalization, M. G. Lewis, Henry Russell
Procedia PDF Downloads 539
7430 Electronic Nose Based on Metal Oxide Semiconductor Sensors as an Alternative Technique for the Spoilage Classification of Oat Milk
Authors: A. Deswal, N. S. Deora, H. N. Mishra
Abstract:
The aim of the present study was to develop a rapid electronic nose method for the online quality control of oat milk. Electronic nose analysis and bacteriological measurements were performed to analyse the spoilage kinetics of oat milk samples stored at room temperature and under refrigerated conditions for up to 15 days. Principal component analysis (PCA), discriminant factorial analysis (DFA) and soft independent modelling by class analogy (SIMCA) classification techniques were used to differentiate the oat milk samples at different days. The total plate count (bacteriological method) was selected as the reference method to consistently train the electronic nose system. The e-nose was able to differentiate between oat milk samples of varying microbial load. The results obtained from the bacterial total viable counts showed that the shelf life of oat milk stored at room temperature and under refrigerated conditions was 20 hours and 13 days, respectively. The models built classified oat milk samples based on the total microbial population into “unspoiled” and “spoiled”.
Keywords: electronic-nose, bacteriological, shelf-life, classification
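A minimal sketch of the first of the three chemometric steps named above, PCA on the sensor response matrix; the number of sensors, sample count and storage-day labels are invented stand-ins, and scikit-learn is an assumed tool, not one named in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in e-nose readings: 40 oat-milk samples x 10 MOS sensor responses,
# labelled by storage day (the real reference labels come from total plate counts).
responses = np.random.rand(40, 10)
storage_day = np.repeat(np.arange(8), 5)

pca = PCA(n_components=2)
scores = pca.fit_transform(responses)
print("explained variance ratio:", pca.explained_variance_ratio_)
# Plotting the scores coloured by storage day reproduces the kind of PCA map
# on which the spoiled/unspoiled separation is then judged (DFA/SIMCA follow).
```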
Procedia PDF Downloads 261
7429 Quality Assurances for an On-Board Imaging System of a Linear Accelerator: Five Months Data Analysis
Authors: Liyun Chang, Cheng-Hsiang Tsai
Abstract:
To ensure that radiation is delivered precisely to the target in cancer patients, linear accelerators are equipped with pretreatment on-board imaging systems, through which the patient setup is verified before each daily treatment. New-generation radiotherapy using beam-intensity modulation, usually associated with steep dose gradients, is claimed to achieve both a higher degree of dose conformation in the targets and a further reduction of toxicity in normal tissues. However, this benefit is lost if the beam is delivered imprecisely. To avoid irradiating critical organs or normal tissues rather than the target, it is very important to carry out quality assurance (QA) of this on-board imaging system. The QA of the On-Board Imager® (OBI) system of one Varian Clinac-iX linear accelerator was performed through our procedures, modified from a relevant report and AAPM TG-142. Two image modalities of the OBI system, 2D radiography and 3D cone-beam computed tomography (CBCT), were examined. Daily and monthly QA was executed for five months in the categories of safety, geometrical accuracy and image quality. A marker phantom and a blade calibration plate were used for the QA of geometrical accuracy, while the Leeds phantom and the Catphan 504 phantom were used for the QA of radiographic and CBCT image quality, respectively. The reference images were generated with a GE LightSpeed CT simulator and an ADAC Pinnacle treatment planning system. Finally, the image quality was analyzed with an OsiriX medical imaging system. For the geometrical accuracy test, the average deviations of the OBI isocenter in each direction are less than 0.6 mm with uncertainties less than 0.2 mm, while all the other items show displacements of less than 1 mm. For radiographic image quality, the spatial resolution is 1.6 lp/cm with contrast less than 2.2%. The spatial resolution, low contrast, and HU homogeneity of CBCT are larger than 6 lp/cm, less than 1% and within 20 HU, respectively. All tests are within the criteria, except the HU value of Teflon measured with the full-fan mode, which exceeds the suggested value; this could be due to its inherently high HU value and needs to be rechecked. The OBI system in our facility was thus demonstrated to be reliable, with stable image quality. QA of the OBI system is truly necessary to achieve the best treatment for a patient.
Keywords: CBCT, image quality, quality assurance, OBI
Procedia PDF Downloads 303
7428 Agile Software Effort Estimation Using Regression Techniques
Authors: Mikiyas Adugna
Abstract:
Effort estimation is among the activities carried out in software development processes, and an accurate estimation model leads to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still conducting studies on agile effort estimation to enhance prediction accuracy. For these reasons, we investigated and proposed a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has the following major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed on the dataset, setting the training set at 80% and the testing set at 20%. Following the train-test split, the two algorithms (Elastic Net and LASSO regression) are trained in two phases. In the first phase, the two algorithms are trained using their default parameters and evaluated on the testing data. In the second phase, the grid search technique (a grid is used to tune and select optimum parameters) and 5-fold cross-validation are applied to obtain the final trained model. Finally, the final trained model is evaluated using the testing set. The experimental work is applied to an agile story point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared approaches. Of the proposed algorithms, LASSO regression achieved the better predictive performance, with PRED(8%) and PRED(25%) results of 100.0, MMRE of 0.0491, MMER of 0.0551, MdMRE of 0.0593, MdMER of 0.063, and MSE of 0.0007. The results imply that the trained LASSO regression model is the most acceptable and shows higher estimation performance than models in the literature.
Keywords: agile software development, effort estimation, elastic net regression, LASSO
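A minimal sketch of the normalization, 80/20 split, grid search and 5-fold cross-validation pipeline described above, using scikit-learn as an assumed implementation; the story-point features, targets and parameter grids are invented stand-ins, not the paper's dataset or settings.

```python
import numpy as np
from sklearn.linear_model import Lasso, ElasticNet
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import MinMaxScaler

# Stand-in agile story-point data: 6 numeric features per user story, effort target.
X = np.random.rand(210, 6)
y = 2.0 * X[:, 0] + 1.5 * X[:, 3] + np.random.normal(0, 0.1, 210)

X = MinMaxScaler().fit_transform(X)                       # normalization step
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

for name, est, grid in [
    ("LASSO", Lasso(max_iter=10000), {"alpha": [0.001, 0.01, 0.1, 1.0]}),
    ("ElasticNet", ElasticNet(max_iter=10000),
     {"alpha": [0.001, 0.01, 0.1, 1.0], "l1_ratio": [0.2, 0.5, 0.8]}),
]:
    search = GridSearchCV(est, grid, cv=5, scoring="neg_mean_squared_error")
    search.fit(X_tr, y_tr)                                 # tuning with 5-fold CV
    print(name, search.best_params_,
          "held-out score (neg MSE):", search.score(X_te, y_te))
```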
Procedia PDF Downloads 75
7427 Poster: Incident Signals Estimation Based on a Modified MCA Learning Algorithm
Authors: Rashid Ahmed, John N. Avaritsiotis
Abstract:
Many signal subspace-based approaches have already been proposed for determining the fixed Direction of Arrival (DOA) of plane waves impinging on an array of sensors. Two procedures for DOA estimation based on neural networks are presented. First, Principal Component Analysis (PCA) is employed to extract the maximum eigenvalue and its eigenvector from the signal subspace to estimate the DOA. Second, Minor Component Analysis (MCA) is a statistical method for extracting the eigenvector associated with the smallest eigenvalue of the covariance matrix. In this paper, we modify a Minor Component Analysis (MCA(R)) learning algorithm to enhance convergence, since convergence is essential for the MCA algorithm in practical applications. The learning rate parameter is also discussed: it must ensure fast convergence of the algorithm, because it directly affects the convergence of the weight vector, and the error level depends on its value. MCA is performed to determine the estimated DOA. Preliminary results are furnished to illustrate the convergence results achieved.
Keywords: direction of arrival, neural networks, principal component analysis, minor component analysis
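A generic sketch of a minor-component learning iteration (gradient descent on the Rayleigh quotient of the covariance matrix, renormalised at every step); this is a stabilised textbook-style rule for illustration, not the paper's MCA(R) modification, and the array data below are simulated stand-ins.

```python
import numpy as np

def mca_minor_eigvec(C, eta=0.01, n_iter=5000, seed=0):
    """Iteratively move the weight vector toward the eigenvector of the smallest
    eigenvalue of C by stepping against the Rayleigh-quotient gradient."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(C.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        w = w - eta * (C @ w - (w @ C @ w) * w)   # eta is the learning rate parameter
        w /= np.linalg.norm(w)                    # renormalise to keep the update stable
    return w

# Sample covariance of simulated 4-sensor array snapshots (stand-in data).
X = np.random.randn(1000, 4) @ np.diag([3.0, 2.0, 1.0, 0.2])
C = np.cov(X, rowvar=False)
w = mca_minor_eigvec(C)
print("estimated minor eigenvalue:", float(w @ C @ w))
```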
Procedia PDF Downloads 454
7426 Designing a Corpus Database to Enhance the Learning of Old English Language
Authors: Raquel Mateo Mendaza, Carmen Novo Urraca
Abstract:
The current paper presents the elaboration of a corpus database that aligns two different corpora in order to simplify the search for information both for researchers and for students of Old English. This database comprises the information contained in two main reference corpora, namely the Dictionary of Old English Corpus (DOEC), compiled at the University of Toronto, and the York-Toronto-Helsinki Parsed Corpus of Old English (YCOE). The first provides information on all surviving texts written in the Old English language. The latter offers the syntactic and morphological annotation of several texts included in the DOEC. Although both corpora are closely related, as the YCOE includes the DOE source text identifier, the main problem detected is that there is no alignment of texts that allows for the search of whole fragments to be further analysed in terms of morphology and syntax. The database proposed in this paper gathers all this information and presents it in a simple, more accessible, visual, and educational way. The alignment of fragments has been done in an automated way. However, some problems emerged during the creation process, particularly related to the lack of correspondence in the division of fragments. For this reason, it has been necessary to revise all the entries manually to obtain a reliable, high-quality product and to carefully indicate the gaps encountered in these corpora. All in all, this database contains more than 60,000 entries corresponding to the DOE fragments annotated in the YCOE. The main strength of the resulting product is its research and teaching implications for the study of Old English. The use of this database will help researchers and students in the study of different aspects of the language, such as inflectional morphology, the syntactic behaviour of given words, or translation studies, among others. By means of a search for words or fragments, the annotated information on morphology and syntax will be automatically displayed, automating and speeding up the search for data.
Keywords: alignment, corpus database, morphosyntactic analysis, Old English
Procedia PDF Downloads 136
7425 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks
Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam
Abstract:
In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but oftentimes only structured data is available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. In applying a single neural network for analyzing multimodal data, e.g., both structured and unstructured information, significant advantages in terms of time complexity and energy efficiency can be achieved. Converting structured data into images and merging them with existing visual material offers a promising solution for applying CNNs to multimodal datasets, as they often occur in a medical context. By employing suitable preprocessing techniques, structured data is transformed into image representations, where the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. This final image is then analyzed using a CNN.
Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion
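A naive sketch of the tabular-to-image transformation idea described above: scale a feature vector, pad it and reshape it into a small grayscale tile that can then be stacked or fused with an existing image channel. The layout is the simplest possible choice for illustration; the paper's actual preprocessing (and more elaborate schemes such as DeepInsight-style layouts) is not reproduced.

```python
import numpy as np

def tabular_to_image(row, side=8):
    """Min-max scale the feature vector, pad it to side*side values and reshape
    it into a grayscale tile that a CNN can consume as an extra channel."""
    v = (row - row.min()) / (row.max() - row.min() + 1e-12)
    padded = np.zeros(side * side)
    n = min(len(v), side * side)
    padded[:n] = v[:n]
    return (padded.reshape(side, side) * 255).astype(np.uint8)

# One patient record with 20 structured features becomes an 8x8 image tile.
record = np.random.rand(20)
tile = tabular_to_image(record)
print(tile.shape, tile.dtype)
```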
Procedia PDF Downloads 126
7424 Finding Bicluster on Gene Expression Data of Lymphoma Based on Singular Value Decomposition and Hierarchical Clustering
Authors: Alhadi Bustaman, Soeganda Formalidin, Titin Siswantining
Abstract:
DNA microarray technology is used to analyze the expression of thousands of genes simultaneously and is very important for drug development and testing, function annotation, and cancer diagnosis. Various clustering methods have been used for analyzing gene expression data. However, when analyzing very large and heterogeneous collections of gene expression data, conventional clustering methods often cannot produce a satisfactory solution. Biclustering algorithms have been used as an alternative approach to identifying structures from gene expression data. In this paper, we introduce a transform technique based on singular value decomposition to identify a normalized matrix of gene expression data, followed by a Mixed-Clustering algorithm and the Lift algorithm, inspired by the node-deletion and node-addition phases proposed by Cheng and Church, based on Agglomerative Hierarchical Clustering (AHC). An experimental study on standard datasets demonstrated the effectiveness of the algorithm on gene expression data.
Keywords: agglomerative hierarchical clustering (AHC), biclustering, gene expression data, lymphoma, singular value decomposition (SVD)
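A small sketch of two of the building blocks named above, an SVD-based normalization (projecting out the dominant singular component) followed by agglomerative hierarchical clustering of genes and samples; this is not the Mixed-Clustering or Lift algorithm itself, and the expression matrix is a random stand-in.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Stand-in expression matrix: 100 genes x 40 lymphoma samples.
expr = np.random.randn(100, 40)

# SVD-based normalization: remove the dominant singular component so the
# residual matrix highlights local (bicluster-like) structure.
U, s, Vt = np.linalg.svd(expr, full_matrices=False)
residual = expr - s[0] * np.outer(U[:, 0], Vt[0, :])

# Agglomerative hierarchical clustering on genes and on samples of the residual.
gene_labels = fcluster(linkage(residual, method="average"), t=4, criterion="maxclust")
sample_labels = fcluster(linkage(residual.T, method="average"), t=3, criterion="maxclust")

# A candidate bicluster: genes of one gene cluster restricted to one sample cluster.
sub = residual[np.ix_(gene_labels == 1, sample_labels == 1)]
print(sub.shape)
```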
Procedia PDF Downloads 282
7423 Optimization of Steel Moment Frame Structures Using Genetic Algorithm
Authors: Mohammad Befkin, Alireza Momtaz
Abstract:
Structural design is a challenging aspect of every project due to limitations in dimensions, the functionality of the structure, and, more importantly, the budget allocated for construction. This research study aims to investigate the optimized design of three steel moment frame buildings with different numbers of stories using a genetic algorithm code. The number and length of spans and the height of each floor were constant in all three buildings. The design of the structures is carried out according to the AISC code within the provisions of plastic design with allowable stress values. The genetic optimization code is produced in MATLAB, while the buildings are modeled in OpenSees and connected to the MATLAB code to perform iterations in the optimization steps. In the end, the designs resulting from the genetic algorithm code were compared with the analysis of the buildings in ETABS. The results demonstrated that the structural elements suggested by the code utilize their full capacity, indicating the desirable efficiency of the produced code.
Keywords: genetic algorithm, structural analysis, steel moment frame, structural design
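A toy genetic-algorithm loop of the kind described, sizing member sections with selection, one-point crossover and mutation; the fitness function here uses a crude penalty as a stand-in for the OpenSees analysis and AISC checks of the paper, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, n_sections, pop_size, n_gen = 12, 16, 40, 60
section_weight = np.linspace(20.0, 120.0, n_sections)      # kg/m per section index

def fitness(ind):
    weight = section_weight[ind].sum()
    demand_capacity = 1.4 - 0.05 * ind                      # crude stand-in for analysis
    penalty = 1e3 * np.clip(demand_capacity - 1.0, 0, None).sum()
    return weight + penalty                                  # minimise weight + violations

pop = rng.integers(0, n_sections, (pop_size, n_members))
for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[: pop_size // 2]]       # truncation selection
    cut = rng.integers(1, n_members, pop_size // 2)
    children = np.array([np.concatenate((parents[i % len(parents)][:c],
                                         parents[(i + 1) % len(parents)][c:]))
                         for i, c in enumerate(cut)])        # one-point crossover
    mutate = rng.random(children.shape) < 0.05
    children[mutate] = rng.integers(0, n_sections, mutate.sum())
    pop = np.vstack((parents, children))

best = pop[np.argmin([fitness(ind) for ind in pop])]
print(best, fitness(best))
```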
Procedia PDF Downloads 124
7422 Complete Enumeration Approach for Calculation of Residual Entropy for Diluted Spin Ice
Authors: Yuriy A. Shevchenko, Konstantin V. Nefedev
Abstract:
We consider antiferromagnetic systems of Ising spins located at the sites of hexagonal, triangular and pyrochlore lattices. Such systems can be diluted to a certain concentration level by randomly replacing the magnetic spins with nonmagnetic ones. Quite recently, we calculated the density of states (DOS) by the Wang-Landau method. Based on the obtained data, we calculated the dependence of the residual entropy (the entropy at a temperature tending to zero) on the dilution concentration for quite large systems (more than 2000 spins). In the current study, we obtained the same data for small systems (fewer than 20 spins) by a complete enumeration of all possible magnetic configurations and compared the result with the result for large systems. The shape of the curve remains unchanged in both cases, but the specific values of the residual entropy differ because of the finite-size effect.
Keywords: entropy, pyrochlore, spin ice, Wang-Landau algorithm
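A minimal sketch of the complete-enumeration idea for a tiny system: list all 2^N Ising states, find the ground-state degeneracy g and take S0 = ln(g)/N (with k_B = 1). The bond list below is a single antiferromagnetic triangle chosen for illustration, not one of the paper's diluted lattices.

```python
import numpy as np
from itertools import product

def residual_entropy(n_spins, bonds):
    """Brute-force residual entropy per spin for antiferromagnetic Ising bonds (J = +1)."""
    energies = []
    for state in product((-1, 1), repeat=n_spins):
        e = sum(state[i] * state[j] for i, j in bonds)
        energies.append(e)
    energies = np.array(energies)
    g = int((energies == energies.min()).sum())   # ground-state degeneracy
    return np.log(g) / n_spins

# Single antiferromagnetic triangle: 6 of the 8 states are ground states,
# so the residual entropy per spin is ln(6)/3 ≈ 0.597.
print(residual_entropy(3, [(0, 1), (1, 2), (0, 2)]))
```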
Procedia PDF Downloads 270
7421 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data
Authors: Qiuxiao Chen, Yan Hou, Ning Wu
Abstract:
Because of deficiencies of the classic Douglas-Peucker Algorithm (DPA), such as a high risk of mistakenly deleting key nodes, high complexity, time consumption and relatively slow execution speed, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data was proposed. For each curve, the basic element of linear vector data, the deletion costs of all its middle nodes were calculated, and the minimum deletion cost was compared with a pre-defined threshold. If the former was greater than or equal to the latter, all remaining nodes were retained and the curve's compression process was finished. Otherwise, the node with the minimal deletion cost was deleted, its two neighbors' deletion costs were updated, and the same loop was repeated on the compressed curve until termination. In several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The experimental results showed that DCA outperformed DPA in both compression accuracy and execution efficiency.
Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost
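A compact sketch of the deletion loop just described. The deletion cost used here (a node's perpendicular distance to the chord joining its neighbours) is one plausible choice assumed for illustration, not necessarily the paper's definition, and for brevity all costs are recomputed each pass instead of updating only the two neighbours of the deleted node.

```python
import numpy as np

def deletion_cost(prev_pt, pt, next_pt):
    """Cost of removing a middle node: its distance to the chord of its neighbours."""
    a, b, p = map(lambda q: np.asarray(q, dtype=float), (prev_pt, next_pt, pt))
    if np.allclose(a, b):
        return float(np.linalg.norm(p - a))
    ab, ap = b - a, p - a
    return float(abs(ab[0] * ap[1] - ab[1] * ap[0]) / np.linalg.norm(ab))

def dca_compress(points, threshold):
    """Delete the cheapest middle node until the minimum cost reaches the threshold."""
    pts = [np.asarray(p, dtype=float) for p in points]
    while len(pts) > 2:
        costs = [deletion_cost(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        i_min = int(np.argmin(costs))
        if costs[i_min] >= threshold:
            break                       # every remaining node is worth keeping
        del pts[i_min + 1]              # +1 because costs start at the second node
    return pts

curve = [(0, 0), (1, 0.05), (2, -0.02), (3, 1.5), (4, 1.48), (5, 3.0)]
print(dca_compress(curve, threshold=0.1))
```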
Procedia PDF Downloads 254
7420 Design to Cryogenic System for Dilution Refrigerator with Cavity and Superconducting Magnet
Authors: Ki Woong Lee
Abstract:
The Center for Axion and Precision Physics Research is searching for dark matter using 12 tesla superconducting magnets. A dilution refrigerator is used for the search experiments, together with superconducting magnets and superconducting cavities. The dilution refrigerator requires a stable cryogenic environment using liquid helium. Accordingly, a cryogenic system for a stable supply of liquid helium is to be established. This cryogenic system includes the liquefaction, supply, storage, and purification of liquid helium. This article presents the basic design, construction, and operation plans for building the cryogenic system.
Keywords: cryogenic system, dilution refrigerator, superconducting magnet, helium recovery system
Procedia PDF Downloads 127