Search results for: threshold method
19308 Socio-Economic Baselining of Selected ICRMP Sites in Southwestern Cebu, Central Philippines
Authors: Rachel Luz P. Vivas-rica, Gloria G. Delan, Christine M. Corrales, Alfonso S. Piquero, Irene A. Monte
Abstract:
Selected Integrated Coastal Resource Management Program (ICRMP) sites in Southwestern Cebu were studied employing a stratified proportional sampling method using semi-structured questionnaires. Four hundred sixteen (416) respondents from five barangays with Marine Protected Areas (MPAs) and four barangays without marine sanctuaries were considered in the study. Results showed similar socio-economic characteristics: on average, the majority of respondents were middle-aged and married. Households were male-dominated and had low educational attainment in both MPA and non-MPA areas. In terms of occupation, the majority in both areas engaged in full-time fishing, with part-time jobs as carpenter, construction worker, driver, or farmer serving as additional income sources. Most of the households were nuclear families with an average family size of five for both MPA and non-MPA areas. Fishing experience ranged from less than 1 year to more than 50 years. Fishing grounds were within a 15-kilometer radius of each considered site. Although the respondents were largely dependent on fishing as a major source of income, their income remained well below the poverty threshold in both the MPA and non-MPA areas. This is further explained by the marginality of their fishing implements: the majority use gill nets, hook and line, spear, and paddle boats. Their volume of catch from an average 6-hour fishing expedition ranges from half a kilo to a maximum of 4 kilos. The majority are not members of fishing groups or organizations. Keywords: integrated coastal resource management program, marine protected areas, socio-economic, poverty threshold
Procedia PDF Downloads 518
19307 Improving the Detection of Depression in Sri Lanka: Cross-Sectional Study Evaluating the Efficacy of a 2-Question Screen for Depression
Authors: Prasad Urvashi, Wynn Yezarni, Williams Shehan, Ravindran Arun
Abstract:
Introduction: Primary health services are often the first point of contact that patients with mental illness have with the healthcare system. A number of tools have been developed to increase detection of depression in the context of primary care. However, one challenge amongst many is utilizing these tools within the limited primary care consultation timeframe. Therefore, short questionnaires that screen for depression and are just as effective as more comprehensive diagnostic tools may be beneficial in improving detection rates of patients visiting a primary care setting. Objective: To develop and determine the sensitivity and specificity of a 2-Question Questionnaire (2-QQ) to screen for depression in a suburban primary care clinic in Ragama, Sri Lanka. The purpose is to develop a short, culturally adapted screening tool for depression in order to increase the detection of depression in the Sri Lankan patient population. Methods: This was a cross-sectional study involving two steps. Step one: verbal administration of the 2-QQ to patients by their primary care physician. Step two: completion of the Peradeniya Depression Scale (PDS), a validated diagnostic tool for depression, by the patient after their consultation with the primary care physician. The results from the PDS were then correlated with the results from the 2-QQ for each patient to determine the sensitivity and specificity of the 2-QQ. Results: A score of 1 or above on the 2-QQ was most sensitive but least specific. Thus, setting the threshold at this level is effective for correctly identifying depressed patients, but also inaccurately captures patients who are not depressed. A score of 6 on the 2-QQ was most specific but least sensitive. Setting the threshold at this level is effective for correctly identifying patients without depression, but not very effective at capturing patients with depression. Discussion: In the context of primary care, it may be worthwhile setting the 2-QQ screen at a lower threshold for positivity (such as a score of 1 or above). This would generate a high test sensitivity and thus capture the majority of patients that have depression. On the other hand, by setting a low threshold for positivity, patients who do not have depression but score 1 or higher on the 2-QQ will also be falsely identified as testing positive for depression. However, the benefits of identifying patients who present with depression may outweigh the harms of falsely identifying a non-depressed patient. It is our hope that the 2-QQ will serve as a quick primary screen for depression in the primary care setting and as a catalyst to identify and treat individuals with depression. Keywords: depression, primary care, screening tool, Sri Lanka
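For illustration, a minimal Python sketch of how sensitivity and specificity can be computed at the two candidate cut-offs discussed above; the scores and reference labels below are hypothetical, not study data.

```python
# Hedged sketch: sensitivity/specificity of a screening score at a cut-off,
# using made-up 2-QQ scores and made-up PDS reference outcomes.

def sensitivity_specificity(scores, depressed, cutoff):
    """Classify score >= cutoff as screen-positive and compare with the reference diagnosis."""
    tp = sum(s >= cutoff and d for s, d in zip(scores, depressed))
    fn = sum(s < cutoff and d for s, d in zip(scores, depressed))
    tn = sum(s < cutoff and not d for s, d in zip(scores, depressed))
    fp = sum(s >= cutoff and not d for s, d in zip(scores, depressed))
    return tp / (tp + fn), tn / (tn + fp)

scores = [0, 1, 2, 5, 6, 3, 0, 4, 1, 6]                # hypothetical 2-QQ scores
depressed = [False, True, True, True, True,
             False, False, True, False, True]           # hypothetical PDS outcomes

for cutoff in (1, 6):
    sens, spec = sensitivity_specificity(scores, depressed, cutoff)
    print(f"cutoff {cutoff}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```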
Procedia PDF Downloads 257
19306 Anti-Inflammatory and Analgesic Effects of Methanol Extract of Rhizophora racemosa Leaf in Albino Rats
Authors: Angalabiri-Owei E. Bekekeme, Brambaifa Nelson
Abstract:
In view of the peculiar environment of the Niger Delta, access to modern health care is limited; hence the inhabitants, especially those in the swampy areas, resort to sourcing alternative cures for their ailments using plants commonly found in the area, without scientific evaluation. Rhizophora racemosa, G. F. Meyer (Rhizophoraceae) is the most abundant mangrove plant in the Niger Delta area of Nigeria. The plant has been observed to be used for relief of toothache and dysmenorrhoea among some Ijaw communities in the region. This work has revealed the likely potential of the plant in drug discovery and development. The crude methanol extract at doses of 300 mg/kg and 600 mg/kg (intraperitoneal) was tested for analgesic effect using fresh egg albumin-induced inflammatory pain, with the Randall–Selitto method used to assess the pain threshold. The anti-inflammatory effect was also evaluated with the extract at doses of 300 mg/kg and 600 mg/kg (intraperitoneal) using an acute inflammatory model, fresh egg albumin-induced paw oedema, assessed with a plethysmometer in rats. The methanol extract at 300 mg/kg and 600 mg/kg exhibited a significant (P < 0.001) and dose-dependent analgesic activity compared with the negative control and the standard drug diclofenac, using ANOVA with a Least Significant Difference post hoc test, as evidenced by an increased pain threshold. The extract also significantly (P < 0.001) reduced the rat paw oedema induced by the subplantar injection of fresh egg albumin when compared with the negative control and standard diclofenac using the above statistical methods. This study revealed that the plant possesses analgesic and anti-inflammatory activities and hence provides a scientific basis for its use as medicine. Keywords: analgesic, anti-inflammatory, plethysmometer, Rhizophora racemosa
Procedia PDF Downloads 357
19305 Particle Size Distribution Estimation of a Mixture of Regular and Irregular Sized Particles Using Acoustic Emissions
Authors: Ejay Nsugbe, Andrew Starr, Ian Jennions, Cristobal Ruiz-Carcel
Abstract:
This work investigates the possibility of using Acoustic Emissions (AE) to estimate the Particle Size Distribution (PSD) of a mixture comprising particles of different densities and geometries. The experiments carried out involved a mixture of glass and polyethylene particles that ranged from 150-212 microns and 150-250 microns, respectively, and an experimental rig that allowed the free fall of a continuous stream of particles onto a target plate on which the AE sensor was placed. By using a time-domain-based multiple threshold method, it was observed that the PSD of the particles in the mixture could be estimated. Keywords: acoustic emissions, particle sizing, process monitoring, signal processing
Procedia PDF Downloads 352
19304 Assessment of Chromium Concentration and Human Health Risk in the Steelpoort River Sub-Catchment of the Olifants River Basin, South Africa
Authors: Abraham Addo-Bediako
Abstract:
Many freshwater ecosystems are facing immense pressure from anthropogenic activities such as agriculture, industry and mining. Trace metal pollution in freshwater ecosystems has become an issue of public health concern due to its toxicity and persistence in the environment. Trace elements pose a serious risk not only to the environment and aquatic biota but also to humans. Chromium is one such trace element, and its pollution in surface waters and groundwaters represents a serious environmental problem. In South Africa, agriculture, mining, industrial and domestic wastes are the main contributors to chromium discharge in rivers. The common forms of chromium are chromium (III) and chromium (VI). The latter is the most toxic because it can cause damage to human health. The aim of the study was to assess chromium contamination in the water and sediments of two rivers in the Steelpoort River sub-catchment of the Olifants River Basin, South Africa, and the associated human health risk. The concentration of Cr was analyzed using inductively coupled plasma-optical emission spectrometry (ICP-OES). The concentration of the metal was found to exceed the threshold limit, mainly in areas of high human activity. The hazard quotient through ingestion exposure did not exceed the threshold limit of 1 for adults and children, and the computed cancer risk for adults and children did not exceed the threshold limit of 10⁻⁴. Thus, there is no potential health risk from chromium through ingestion of drinking water for now. However, with increasing human activities, especially mining, the concentration could increase and become harmful to humans who depend on the rivers for drinking water. It is recommended that proper management strategies be taken to minimize the impact of chromium on the rivers and that water from the rivers be properly treated before domestic use. Keywords: land use, health risk, metal pollution, water quality
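A minimal sketch of the standard ingestion-exposure screening arithmetic behind a hazard quotient (HQ) and a cancer-risk estimate; all input values below are assumed for illustration and are not the study's exposure parameters.

```python
# Hedged sketch: chronic daily intake from drinking water, HQ, and cancer risk.

def average_daily_dose(c_water, ingestion_rate, exposure_freq,
                       exposure_duration, body_weight, averaging_time):
    """Chronic daily intake in mg/kg/day from drinking water."""
    return (c_water * ingestion_rate * exposure_freq * exposure_duration) / (
        body_weight * averaging_time)

# Hypothetical adult scenario (all numbers assumed).
add = average_daily_dose(c_water=0.005,        # mg/L Cr, assumed concentration
                         ingestion_rate=2.0,   # L/day
                         exposure_freq=365,    # days/year
                         exposure_duration=30, # years
                         body_weight=70,       # kg
                         averaging_time=30 * 365)  # days

rfd_cr = 0.003       # mg/kg/day, oral reference dose often used for Cr(VI)
slope_factor = 0.5   # (mg/kg/day)^-1, assumed oral slope factor

hq = add / rfd_cr                  # HQ > 1 indicates potential non-cancer risk
cancer_risk = add * slope_factor   # compared against the 1e-4 threshold
print(f"ADD={add:.2e} mg/kg/day, HQ={hq:.3f}, cancer risk={cancer_risk:.2e}")
```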
Procedia PDF Downloads 87
19303 Design and Implementation of Low-code Model-building Methods
Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu
Abstract:
This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. With an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. After training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency but also enhances the flexibility of algorithm integration and simplifies the model deployment process. The core strength of this method lies in its ease of use and efficiency. Users do not need a deep programming background and can complete the design and implementation of complex models with a simple drag-and-drop operation. This feature greatly expands the scope of AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the model. In the experimental part, we performed several performance tests on the method. The results show that, compared with traditional model construction methods, this method makes more efficient use of computing resources and greatly shortens the model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability and can adapt to the needs of different application scenarios. Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment
Procedia PDF Downloads 29
19302 Dynamic Degradation Mechanism of SiC VDMOS under Proton Irradiation
Authors: Junhong Feng, Wenyu Lu, Xinhong Cheng, Li Zheng, Yuehui Yu
Abstract:
The effects of proton irradiation on the properties of the gate oxide were evaluated by monitoring the static parameters (such as threshold voltage and on-resistance) and dynamic parameters (Miller plateau time) of a 1700 V SiC VDMOS before and after proton irradiation. The incident proton energy was 3 MeV, and the doses were 5 × 10¹² p/cm² and 1 × 10¹³ p/cm². The results show that the threshold voltage of the MOS exhibits a negative drift under proton irradiation: near-interface traps in the gate oxide layer are occupied by holes generated by the ionization effect of irradiation, thus forming more positive charges. The Miller plateau time (TMiller) is read from the change in Vgs as the interval during which Vds just begins to rise until it reaches a stable value. The degradation of the Miller plateau time at turn-off confirms that the capacitance Cgd becomes larger, reflecting that traps are introduced into the gate oxide layer by the displacement effect caused by proton irradiation and that the interface state deteriorates. As the gate oxide layer is a particularly sensitive area during irradiation, its parameters (such as thickness and type) will be optimized in subsequent studies. Keywords: SiC VDMOS, proton radiation, Miller time, gate oxide
Procedia PDF Downloads 90
19301 Experimental Demonstration of an Ultra-Low Power Vertical-Cavity Surface-Emitting Laser for Optical Power Generation
Authors: S. Nazhan, Hassan K. Al-Musawi, Khalid A. Humood
Abstract:
This paper reports on an experimental investigation into the influence of current modulation on the properties of a vertical-cavity surface-emitting laser (VCSEL) under direct square wave modulation. The optical output power response, as a function of the pumping current, modulation frequency, and amplitude, is measured for an 850 nm VCSEL. We demonstrate that modulation frequency and amplitude play important roles in reducing the VCSEL's power consumption for optical generation. Indeed, even when the biasing current is below the static threshold, the VCSEL emits optical power under square wave modulation. The power consumed by the device to generate light is reduced significantly, by more than 50%, when biased below the threshold current, in response to both the modulation frequency and amplitude. Operating a VCSEL device at low power is very desirable for reduced thermal effects, which is essential for a high-speed modulation bandwidth. Keywords: vertical-cavity surface-emitting lasers, VCSELs, optical power generation, power consumption, square wave modulation
Procedia PDF Downloads 165
19300 Early Warning for Financial Stress Events: A Credit-Regime Switching Approach
Abstract:
We propose a new early warning model for predicting financial stress events at a given future time. In this model, we examine whether credit conditions play an important role as a nonlinear propagator of shocks when predicting the likelihood of occurrence of financial stress events at a given future time. This propagation takes the form of a threshold regression in which a regime change occurs if credit conditions cross a critical threshold. Given the new early warning model for financial stress events, we evaluate the performance of this model and of currently available alternatives, such as a signal-extraction model and a linear regression model. In-sample forecasting results indicate that the three types of models are useful tools for predicting financial stress events, although none of them outperforms the others across all criteria considered. The out-of-sample forecasting results suggest that the credit-regime switching model performs better than the other two across all criteria and all forecasting horizons considered. Keywords: cut-off probability, early warning model, financial crisis, financial stress, regime-switching model, forecasting horizons
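A simplified sketch (not the authors' model) of the core threshold-regression idea: fit separate regimes on either side of a candidate credit-condition threshold and pick the threshold that minimizes the total sum of squared residuals; the data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
credit = rng.normal(size=n)                 # credit-condition index
x = rng.normal(size=n)                      # predictor of financial stress
true_tau = 0.5
y = np.where(credit <= true_tau, 0.2 * x, 1.5 * x) + 0.3 * rng.normal(size=n)

def ssr_for_threshold(tau):
    """Fit separate OLS slopes in each regime and return the total SSR."""
    ssr = 0.0
    for mask in (credit <= tau, credit > tau):
        X = np.column_stack([np.ones(mask.sum()), x[mask]])
        beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
        ssr += np.sum((y[mask] - X @ beta) ** 2)
    return ssr

# Grid search over candidate thresholds, trimmed so both regimes stay populated.
candidates = np.quantile(credit, np.linspace(0.15, 0.85, 71))
tau_hat = min(candidates, key=ssr_for_threshold)
print(f"estimated credit threshold: {tau_hat:.2f}")
```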
Procedia PDF Downloads 435
19299 A Comparative Analysis on QRS Peak Detection Using BIOPAC and MATLAB Software
Authors: Chandra Mukherjee
Abstract:
This paper presents work in the field of ECG signal analysis using the MATLAB 7.1 platform. An accurate and simple ECG feature extraction algorithm is presented, and the developed algorithm is validated using BIOPAC software. To detect the QRS peak, the ECG signal is processed through the following stages: first derivative, second derivative, and then squaring of the second derivative. The efficiency of the developed algorithm is tested on ECG samples from different databases and on real-time ECG signals acquired using the BIOPAC system. First, a lead-wise threshold value is specified; the samples above that value are marked, and the points in the original signal where these marked samples undergo a change of slope are spotted as R-peaks. On the left and right sides of the R-peak, the points of slope change are identified as the Q and S peaks, respectively. The built-in detection algorithm of the BIOPAC software is then run on the same samples and both outputs are compared. ECG baseline modulation correction is done after detecting the characteristic points. The efficiency of the algorithm is tested using validation parameters such as sensitivity and positive predictivity, and satisfactory values of these parameters were obtained. Keywords: first derivative, variable threshold, slope reversal, baseline modulation correction
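A minimal Python sketch of the derivative-squaring-threshold idea described above, applied to a synthetic spike train standing in for an ECG; it is not the validated BIOPAC/MATLAB implementation, and the threshold rule and grouping window are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 360                                    # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
ecg = np.zeros_like(t)
ecg[::fs] = 1.0                             # crude 1 Hz spike train standing in for R-peaks
ecg += 0.05 * rng.normal(size=t.size)       # measurement noise

d1 = np.diff(ecg)                           # first derivative
d2 = np.diff(d1)                            # second derivative
feature = d2 ** 2                           # squaring emphasises the QRS slopes

threshold = 4 * feature.std()               # lead-wise threshold (assumed rule)
marked = np.where(feature > threshold)[0]

# Group adjacent marked samples and take the local maximum of the original
# signal within each group as the R-peak location.
r_peaks, group = [], [marked[0]]
for idx in marked[1:]:
    if idx - group[-1] <= int(0.1 * fs):
        group.append(idx)
    else:
        r_peaks.append(group[0] + int(np.argmax(ecg[group[0]:group[-1] + 3])))
        group = [idx]
r_peaks.append(group[0] + int(np.argmax(ecg[group[0]:group[-1] + 3])))
print(f"detected {len(r_peaks)} R-peaks in 10 s of synthetic ECG")
```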
Procedia PDF Downloads 411
19298 Assessment of the Electrical, Mechanical, and Thermal Nociceptive Thresholds for Stimulation and Pain Measurements at the Bovine Hind Limb
Authors: Samaneh Yavari, Christiane Pferrer, Elisabeth Engelke, Alexander Starke, Juergen Rehage
Abstract:
Background: Thermal, electrical, and mechanical nociceptive thresholds are commonly used to evaluate local anesthesia in many species, for instance, cows, horses, cats, dogs, and rabbits. Given the lack of investigations evaluating and/or validating these nociceptive thresholds, our plan was to compare two foot local anesthesia methods: Intravenous Regional Anesthesia (IVRA) and our modified four-point Nerve Block Anesthesia (NBA). Materials and Methods: Eight healthy, nonpregnant, non-dairy Holstein Friesian cows were selected for this cross-over study. All cows were divided into two groups to receive the two local anesthesia techniques, IVRA and our modified four-point NBA. Thermal and electrical stimuli, mechanical force, and pinpricks were applied to evaluate the quality of the local anesthesia methods before and after local anesthesia application. Results: The statistical evaluation demonstrated that our four-point NBA qualifies for selection as a standard foot local anesthesia. However, the recorded results revealed no significant difference between the two local anesthesia techniques of IVRA and modified four-point NBA in the quality and duration of anesthesia assessed by electrical, mechanical and thermal nociceptive stimuli. Conclusion and discussion: All three nociceptive stimuli, electrical, mechanical and heat, can be applied to measure and evaluate the efficacy of foot local anesthesia in dairy cows. However, our study revealed no superiority among these three nociceptive methods for evaluating the duration and quality of bovine foot local anesthesia methods. Veterinarians can use any of the heat, mechanical, and electrical methods to investigate the duration and quality of their selected anesthesia method. Keywords: mechanical, thermal, electrical threshold, IVRA, NBA, hind limb, dairy cow
Procedia PDF Downloads 245
19297 Electrocardiogram Signal Denoising Using a Hybrid Technique
Authors: R. Latif, W. Jenkal, A. Toumanari, A. Hatim
Abstract:
This paper presents an efficient method of electrocardiogram signal denoising based on a hybrid approach. Two techniques are brought together to create an efficient denoising process. The first is an Adaptive Dual Threshold Filter (ADTF) and the second is the Discrete Wavelet Transform (DWT). The presented approach is based on three denoising steps: the DWT decomposition, the ADTF step, and the highest-peaks correction step. The paper presents applications of the approach to electrocardiogram signals from the MIT-BIH database. The results of these applications are promising compared with other recently published techniques. Keywords: hybrid technique, ADTF, DWT, thresholding, ECG signal
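A rough sketch, assuming the PyWavelets package, of combining a DWT decomposition with threshold-based filtering on a toy signal; the plain soft-threshold step here is a stand-in for the paper's ADTF stage, not the ADTF itself, and the highest-peaks correction step is only indicated.

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.linspace(0, 2, 1440)
clean = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 8 * t)  # toy "ECG"
noisy = clean + 0.2 * rng.normal(size=t.size)

# Step 1: DWT decomposition.
coeffs = pywt.wavedec(noisy, "db4", level=4)

# Step 2: threshold the detail coefficients (universal threshold used here
# as a simple proxy for the adaptive dual-threshold filter).
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(noisy.size))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]

# Step 3: reconstruct; a highest-peaks correction step would follow here.
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]
print("residual RMS:", np.sqrt(np.mean((denoised - clean) ** 2)))
```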
Procedia PDF Downloads 322
19296 Algorithm for Path Recognition in-between Tree Rows for Agricultural Wheeled-Mobile Robots
Authors: Anderson Rocha, Pedro Miguel de Figueiredo Dinis Oliveira Gaspar
Abstract:
Machine vision has been widely used in agriculture in recent years as a tool to promote the automation of processes and increase levels of productivity. The aim of this work is the development of a path recognition algorithm based on image processing to guide a terrestrial robot in between tree rows. The proposed algorithm was developed using the software MATLAB, and it uses several image processing operations, such as threshold detection, morphological erosion, histogram equalization and the Hough transform, to find edge lines along tree rows in an image and to create a path to be followed by a mobile robot. To develop the algorithm, a set of images of different types of orchards was used, which made it possible to construct a method capable of identifying paths between trees of different heights and aspects. The algorithm was evaluated using several images of different quality, and the results showed that the proposed method can successfully detect a path in different types of environments. Keywords: agricultural mobile robot, image processing, path recognition, Hough transform
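An illustrative OpenCV pipeline loosely following the steps named above (thresholding, erosion, histogram equalization, Hough transform); the file name and parameter values are assumptions, not the authors' MATLAB settings.

```python
import cv2
import numpy as np

img = cv2.imread("orchard_row.jpg")                   # hypothetical input image
if img is None:
    raise SystemExit("image not found")

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
gray = cv2.equalizeHist(gray)                         # histogram equalization
_, binary = cv2.threshold(gray, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)   # threshold detection
binary = cv2.erode(binary, np.ones((5, 5), np.uint8))            # morphological erosion

edges = cv2.Canny(binary, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=20)         # Hough transform

if lines is not None:
    # Approximate the centre path as the mean horizontal position of the detected row lines.
    xs = [(x1 + x2) / 2 for [[x1, y1, x2, y2]] in lines]
    print(f"{len(lines)} candidate row lines, centre path near x = {np.mean(xs):.0f} px")
```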
Procedia PDF Downloads 146
19295 A Combined Feature Extraction and Thresholding Technique for Silence Removal in Percussive Sounds
Authors: B. Kishore Kumar, Pogula Rakesh, T. Kishore Kumar
Abstract:
Music analysis is a part of audio content analysis used to analyze music by means of different features of the audio signal. In music analysis, the first step is to divide the music signal into sections based on the feature profiles of the signal. In this paper, we present a music segmentation technique that effectively segments the signal, together with a thresholding technique to remove silence from the percussive sounds produced by percussive instruments, using two features of music, namely signal energy and spectral centroid. The proposed method imposes thresholds on both features, which vary depending on the music signal. Depending on the threshold, the silent parts are removed and the segmentation is done. The effectiveness of the proposed method is analyzed using MATLAB. Keywords: percussive sounds, spectral centroid, spectral energy, silence removal, feature extraction
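A compact sketch of frame-wise signal energy and spectral centroid with simple signal-dependent thresholds, in the spirit of the method described above; frame sizes and the threshold rule are assumptions.

```python
import numpy as np

def frame_features(x, fs, frame_len=1024, hop=512):
    """Per-frame signal energy and spectral centroid."""
    energies, centroids = [], []
    freqs = np.fft.rfftfreq(frame_len, 1 / fs)
    for start in range(0, len(x) - frame_len, hop):
        frame = x[start:start + frame_len]
        mag = np.abs(np.fft.rfft(frame * np.hanning(frame_len)))
        energies.append(np.sum(frame ** 2))
        centroids.append(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))
    return np.array(energies), np.array(centroids)

def keep_mask(x, fs):
    """Keep frames whose energy and centroid exceed signal-dependent thresholds."""
    energy, centroid = frame_features(x, fs)
    e_thr = 0.5 * np.median(energy)        # assumed threshold rule
    c_thr = 0.5 * np.median(centroid)
    return (energy > e_thr) & (centroid > c_thr)

# Example: noise bursts separated by silence.
fs = 16000
x = np.concatenate([np.zeros(4000), np.random.randn(4000),
                    np.zeros(4000), np.random.randn(4000)])
print("frames kept:", keep_mask(x, fs))
```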
Procedia PDF Downloads 593
19294 Nonlinear Relationship between Globalization and Control of Corruption along with Economic Growth
Authors: Elnaz Entezar, Reza Ezzati
Abstract:
In recent decades, flows of trade, capital, workforce, technology and information across international borders have increased, and globalization has become an undeniable process in international economics. Meanwhile, despite the positive aspects of globalization, its critics opine that the risks and costs of globalization for vulnerable developing economies and the world's impoverished people are high and significant. In this regard, this study uses data from the KOF Economic Institute and the World Bank for 113 countries over the period 2002-2012 and, taking advantage of panel smooth transition regression with gross domestic product as the transition variable, examines the nonlinear relationship between the research variables. The results reveal that globalization has a negative impact in the low regime (countries with low GDP), whereas it has a positive impact in the high regime (countries with high GDP). Although control of corruption has a positive impact on economic growth in the early stages of growth, after a threshold it has a negative impact on economic growth. Keywords: globalization, corruption, panel smooth transition model, economic growth, threshold, economic convergence
Procedia PDF Downloads 290
19293 Dielectrophoretic Characterization of Tin Oxide Nanowires for Biotechnology Application
Authors: Ahmad Sabry Mohamad, Kai F. Hoettges, Michael Pycraft Hughes
Abstract:
This study investigates nanowires using dielectrophoresis (DEP) in a non-aqueous suspension of tin (IV) oxide (SnO2) nanoparticles dispersed in N,N-dimethylformamide (DMF). The self-assembly of nanowires under DEP can be characterized by impedance spectroscopy. In this work, the dielectrophoretic method was used on non-organic molecules to estimate the permittivity and conductivity characteristics of the nanowires. In aqueous media such as salt solutions, which dominate the transport of SnO2, the wire growth threshold depends on the applied voltage, while the DEP assembly of nanowires depends on the applied frequency; the dielectrophoretic collection is measured using impedance spectroscopy. Keywords: dielectrophoresis, impedance spectroscopy, nanowires, N,N-dimethylformamide, SnO2
Procedia PDF Downloads 659
19292 Plant Leaf Recognition Using Deep Learning
Authors: Aadhya Kaul, Gautam Manocha, Preeti Nagrath
Abstract:
Our environment comprises a wide variety of plants that are similar to each other, and sometimes this similarity makes the identification process tedious, increasing the workload of botanists all over the world. Botanists cannot be available at all times for such laborious plant identification; therefore, there is a need for a quick classification model. Along with the identification of plants, it is also necessary to classify a plant as healthy or not, since a good lifestyle requires good food and this food comes from healthy plants. A large number of techniques have been applied to classify plants as healthy or diseased in order to provide a solution. This paper proposes one such method, anomaly detection using autoencoders, on a collection of leaf images. In this method, an autoencoder model is built using Keras, the original leaf images are reconstructed, and a threshold loss is found in order to classify the plant leaves as healthy or diseased. A dataset of plant leaves is used to judge the reconstruction performance of the convolutional autoencoder, and an average accuracy of 71.55% is obtained for the purpose. Keywords: convolutional autoencoder, anomaly detection, web application, FLASK
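A minimal Keras sketch of the idea described above: train a convolutional autoencoder on healthy leaf images only, then flag images whose reconstruction error exceeds a threshold. The shapes, epochs, placeholder data, and threshold rule are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(64, 64, 3))
x = layers.Conv2D(16, 3, activation="relu", padding="same", strides=2)(inputs)
x = layers.Conv2D(8, 3, activation="relu", padding="same", strides=2)(x)
x = layers.Conv2DTranspose(8, 3, activation="relu", padding="same", strides=2)(x)
x = layers.Conv2DTranspose(16, 3, activation="relu", padding="same", strides=2)(x)
outputs = layers.Conv2D(3, 3, activation="sigmoid", padding="same")(x)
autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

healthy = np.random.rand(200, 64, 64, 3)      # placeholder for healthy leaf images
autoencoder.fit(healthy, healthy, epochs=5, batch_size=32, verbose=0)

# Threshold on reconstruction loss derived from the healthy training set.
errors = np.mean((autoencoder.predict(healthy, verbose=0) - healthy) ** 2, axis=(1, 2, 3))
threshold = errors.mean() + 3 * errors.std()

def is_diseased(image):
    """Flag an image whose reconstruction error exceeds the learned threshold."""
    err = np.mean((autoencoder.predict(image[None], verbose=0)[0] - image) ** 2)
    return err > threshold
```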
Procedia PDF Downloads 163
19291 PVMODREL© Development Based on Reliability Evaluation of a PV Module Using Accelerated Degradation Testing
Authors: Abderafi Charki, David Bigaud
Abstract:
The aim of this talk is to present PVMODREL© (PhotoVoltaic MODule RELiability), new software developed at the University of Angers. This new tool permits the evaluation of the lifetime and reliability of a PV module whatever its geographical location and environmental conditions. The electrical power output of a PV module decreases with time, mainly as a result of the effects of corrosion, encapsulation discoloration, and solder bond failure. The failure of a PV module is defined as the point where the electrical power degradation reaches a given threshold value. Accelerated life tests (ALTs) are commonly used to assess the reliability of a PV module. However, ALTs provide limited data on the failure of a module, and these tests are expensive to carry out. One possible solution is to conduct accelerated degradation tests. The Wiener process in conjunction with the accelerated failure time model makes it possible to carry out numerous simulations and thus to determine the failure time distribution based on the aforementioned threshold value. By this means, the failure time distribution and the lifetime (mean and uncertainty) can be evaluated. An example using the damp heat test is shown to demonstrate the usefulness of PVMODREL©. Keywords: lifetime, reliability, PV module, accelerated life testing, accelerated degradation testing
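A small Monte Carlo sketch of the idea behind degradation-based lifetime estimation: simulate Wiener-process power-degradation paths and record when each path crosses a failure threshold. The drift, diffusion, and 20% degradation threshold are assumed values, not PVMODREL© parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
years, dt, n_paths = 40, 0.05, 5000
drift, diffusion = 0.8, 0.6          # % power loss per year, assumed
threshold = 20.0                     # failure = 20% power degradation, assumed

steps = int(years / dt)
increments = drift * dt + diffusion * np.sqrt(dt) * rng.normal(size=(n_paths, steps))
degradation = np.cumsum(increments, axis=1)

# First crossing time of the threshold for each simulated module.
crossed = degradation >= threshold
failure_time = np.where(crossed.any(axis=1),
                        (crossed.argmax(axis=1) + 1) * dt, np.nan)
valid = failure_time[~np.isnan(failure_time)]
print(f"mean lifetime ~ {valid.mean():.1f} years "
      f"({len(valid)}/{n_paths} paths failed within {years} years)")
```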
Procedia PDF Downloads 574
19290 Short Association Bundle Atlas for Lateralization Studies from dMRI Data
Authors: C. Román, M. Guevara, P. Salas, D. Duclap, J. Houenou, C. Poupon, J. F. Mangin, P. Guevara
Abstract:
Diffusion Magnetic Resonance Imaging (dMRI) allows the non-invasive study of human brain white matter. From diffusion data, it is possible to reconstruct fiber trajectories using tractography algorithms. Our previous work consists of an automatic method for the identification of short association bundles of the superficial white matter (SWM), based on a whole-brain inter-subject hierarchical clustering applied to a HARDI database. The method finds representative clusters of similar fibers, belonging to a group of subjects, according to a distance measure between fibers, using a non-linear registration (DTI-TK). The algorithm performs an automatic labeling based on the anatomy, defined by a cortex mesh parcellated with the FreeSurfer software. The clustering was applied to two independent groups of 37 subjects. The clusters resulting from both groups were compared using a restrictive threshold on the mean distance between each pair of bundles from different groups, in order to keep reproducible connections. In the left hemisphere, 48 reproducible bundles were found, while 43 bundles were found in the right hemisphere. An inter-hemispheric bundle correspondence was then applied. The symmetric horizontal reflection of the right bundles was calculated in order to obtain their position in the left hemisphere. Next, the intersection between similar bundles was calculated. Pairs of bundles with a fiber intersection percentage higher than 50% were considered similar. The similar bundles between both hemispheres were fused and symmetrized. We obtained 30 common bundles between hemispheres. An atlas was created with the resulting bundles and used to segment 78 new subjects from another HARDI database, using a distance threshold of 6-8 mm according to the bundle length. Finally, a laterality index was calculated based on the bundle volume. Seven bundles of the atlas presented right laterality (IP_SP_1i, LO_LO_1i, Op_Tr_0i, PoC_PoC_0i, PoC_PreC_2i, PreC_SM_0i, and RoMF_RoMF_0i) and one presented left laterality (IP_SP_2i); there was no tendency of lateralization according to brain region. Many factors can affect the results, such as tractography artifacts, subject registration, and bundle segmentation. Further studies are necessary in order to establish the influence of these factors and evaluate SWM laterality. Keywords: dMRI, hierarchical clustering, lateralization index, tractography
Procedia PDF Downloads 331
19289 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and has become a very encouraging research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management and environmental services. Unfortunately, to the best of our knowledge, object detection under illumination with shadow consideration has not been well solved yet. Furthermore, this problem is one of the major hurdles keeping object detection methods from practical applications. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination conditions. Depending on the image capturing conditions, the algorithms need to consider a variety of possible environmental factors, as colour information, lighting and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies and colour ranges. To overcome these limitations, we propose an object detection algorithm, with pre-processing methods, to reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show promising results for the proposed approach in comparison with existing methods. Keywords: image processing, illumination equalization, shadow filtering, object detection
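An illustrative OpenCV sketch of the kind of pipeline described above: convert to YCrCb, derive a threshold from the image statistics rather than fixed ranges, clean the mask with erosion and dilation, and extract object contours. The file name, the statistic-based rule, and the area cut-off are assumptions, not the paper's parameters.

```python
import cv2
import numpy as np

img = cv2.imread("wood_stack.jpg")                    # hypothetical input image
if img is None:
    raise SystemExit("image not found")

ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
y, cr, cb = cv2.split(ycrcb)

# Image-dependent threshold on a chrominance channel instead of fixed ranges.
thr = cr.mean() + 0.5 * cr.std()
mask = (cr > thr).astype(np.uint8) * 255

kernel = np.ones((5, 5), np.uint8)
mask = cv2.erode(mask, kernel, iterations=1)          # remove shadow speckle
mask = cv2.dilate(mask, kernel, iterations=2)         # restore object regions

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
objects = [c for c in contours if cv2.contourArea(c) > 500]
print(f"{len(objects)} candidate object regions detected")
```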
Procedia PDF Downloads 216
19288 A Guide to User-Friendly Bash Prompt: Adding Natural Language Processing Plus Bash Explanation to the Command Interface
Authors: Teh Kean Kheng, Low Soon Yee, Burra Venkata Durga Kumar
Abstract:
In 2022, as the world becomes increasingly computer-related, more individuals are attempting to study coding on their own or in school, because they have discovered the value of learning to code and the benefits it will provide them. But learning coding is difficult for most people; even senior programmers with a decade of experience still need help from online sources while coding. The reason is that coding is not like talking to other people: it has a specific syntax to make the computer understand what we want it to do, so coding will be hard for people who have had no prior contact with this field. Coding is hard. If a user wants to learn Bash at the bash prompt, it will be harder still, because the bash prompt is just an empty box waiting for the user to tell the computer what to do; without referring to the internet, a new user will not know what can be done with the prompt. From this, we can conclude that the bash prompt is not user-friendly for new users who are learning Bash. Our goal in writing this paper is to give an idea for implementing a user-friendly Bash prompt in Ubuntu OS using Artificial Intelligence (AI), to lower the threshold of learning Bash and to let users apply their own words and concepts to write and learn Bash code. Keywords: user-friendly, bash code, artificial intelligence, threshold, semantic similarity, lexical similarity
Procedia PDF Downloads 142
19287 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data
Authors: M. Kharrat, G. Moreau, Z. Aboura
Abstract:
The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with a 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the exerted solicitations. The AE technique is a well-established tool for discriminating between damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE features are then extracted from the recorded signals, followed by data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation was set up in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, Digital Image Correlation, Acoustic Emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for discriminating between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. The latter is useful for constructing a supervised classifier that can be used for automatic recognition of AE signals. Several materials with different ingredients were tested under various solicitations in order to feed and enrich the learning database. The methodology presented in this work was useful for refining the damage threshold for the new-generation materials. The damage mechanisms around this threshold were highlighted, and the obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damage without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of solicitation, the identified classes are reproducible and little disturbed. The supervised classifier constructed from the learning database was able to predict the labels of the classified signals. Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition
Procedia PDF Downloads 155
19286 Flood Monitoring in the Vietnamese Mekong Delta Using Sentinel-1 SAR with Global Flood Mapper
Authors: Ahmed S. Afifi, Ahmed Magdy
Abstract:
Satellite monitoring is an essential tool to study, understand, and map large-scale environmental changes that affect humans, climate, and biodiversity. The Sentinel-1 Synthetic Aperture Radar (SAR) instrument provides a rich collection of data with all-weather capability, a short revisit time, and high spatial resolution that can be used effectively in flood management. Floods occur when an overflow of water submerges dry land, and such flooded areas need to be distinguished from non-flooded ones. In this study, we use the Global Flood Mapper (GFM), a new Google Earth Engine application that allows users to quickly map floods using Sentinel-1 SAR. The GFM enables users to manually adjust the flood map parameters, e.g., the Z-value thresholds for the VV and VH bands and the elevation and slope mask thresholds. The composite R:G:B image obtained by coupling the Sentinel-1 bands (VH:VV:VH) reduces false classification to a large extent compared with using a single band (e.g., the VH polarization band). The flood mapping algorithm in the GFM and Otsu thresholding are compared with Sentinel-2 optical data, and the results show that the GFM algorithm can overcome the misclassification of a flooded area in An Giang, Vietnam. Keywords: SAR backscattering, Sentinel-1, flood mapping, disaster
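A simplified sketch of Otsu thresholding applied to a SAR backscatter band to separate low-backscatter (flooded) from high-backscatter (land) pixels, the baseline against which the GFM algorithm is compared above. The array here is simulated; real use would read a calibrated Sentinel-1 VH band in dB.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the threshold maximising the between-class variance of a histogram."""
    hist, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    total = hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = hist[:i].sum() / total, hist[i:].sum() / total
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / hist[:i].sum()
        m1 = (hist[i:] * centers[i:]).sum() / hist[i:].sum()
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

rng = np.random.default_rng(3)
vh_db = np.concatenate([rng.normal(-22, 1.5, 4000),   # open water / flood pixels
                        rng.normal(-12, 2.0, 6000)])  # land pixels
t = otsu_threshold(vh_db)
flooded_fraction = np.mean(vh_db < t)
print(f"Otsu threshold = {t:.1f} dB, flooded fraction = {flooded_fraction:.2f}")
```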
Procedia PDF Downloads 105
19285 Comparison of Methodologies to Compute the Probabilistic Seismic Hazard Involving Faults and Associated Uncertainties
Authors: Aude Gounelle, Gloria Senfaute, Ludivine Saint-Mard, Thomas Chartier
Abstract:
The long-term deformation rates of faults are not fully captured by Probabilistic Seismic Hazard Assessment (PSHA). PSHA that uses catalogues to develop area or smoothed-seismicity sources is limited by the data available to constrain future earthquake activity rates. The integration of faults in PSHA can at least partially address the long-term deformation. However, careful treatment of fault sources is required, particularly in low-strain-rate regions, where estimated seismic hazard levels are highly sensitive to assumptions concerning fault geometry, segmentation and slip rate. When integrating faults in PSHA, various constraints on earthquake rates from geologic and seismologic data have to be satisfied; for low-strain-rate regions, where such data is scarce, this is especially challenging. Faults in PSHA require conversion of the geologic and seismologic data into fault geometries and slip rates, and then into earthquake activity rates. Several approaches exist for translating slip rates into earthquake activity rates. In the most frequently used approach, the background earthquakes are handled using a truncated approach, in which earthquakes with a magnitude lower than or equal to a threshold magnitude (Mw) occur in the background zone, with a rate defined by the rate in the earthquake catalogue, whereas magnitudes higher than the threshold are located on the fault with a rate defined using the average slip rate of the fault. As highlighted by several studies, seismic events with magnitudes stronger than the selected magnitude threshold may potentially occur in the background and not only on the fault, especially in regions of slow tectonic deformation. It is also known that several sections of a fault, or several faults, could rupture during a single fault-to-fault rupture. It is therefore essential to apply a consistent modelling procedure that allows a large set of possible fault-to-fault ruptures to occur aleatorily in the hazard model while reflecting the individual slip rate of each section of the fault. In 2019, a tool named SHERIFS (Seismic Hazard and Earthquake Rates in Fault Systems) was published. The tool uses a methodology to calculate the earthquake rates in a fault system in which the slip-rate budget of each fault is converted into rupture rates for all possible single-fault and fault-to-fault ruptures. The objective of this paper is to compare the SHERIFS method with another frequently used model, to analyse the impact on the seismic hazard and, through sensitivity studies, to better understand the influence of key parameters and assumptions. For this application, a simplified but realistic case study was selected, located in an area of moderate to high seismicity (southeast France) and where the fault is assumed to have a low strain rate. Keywords: deformation rates, faults, probabilistic seismic hazard, PSHA
Procedia PDF Downloads 66
19284 Real-Time Detection of Space Manipulator Self-Collision
Authors: Zhang Xiaodong, Tang Zixin, Liu Xin
Abstract:
In order to avoid self-collision of space manipulators during the operation process, a real-time detection method is proposed in this paper. The manipulator is fitted with cylindrical enveloping surfaces, and the detection algorithm for collision between cylinders is then analyzed. Collisions between the manipulator's own links can be detected with this algorithm in real time during the operation process. To ensure the safety of the operation, a safety threshold is designed. Simulation and experimental results verify the effectiveness of the proposed algorithm for a 7-DOF space manipulator. Keywords: space manipulator, collision detection, self-collision, real-time collision detection
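A minimal sketch of the geometric core of cylinder-based self-collision checking: compute the shortest distance between the axes (line segments) of two enveloping cylinders and compare it with the sum of their radii plus a safety threshold. The distance here is found by brute-force sampling rather than a closed-form solution, and the link geometry values are hypothetical.

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    """Approximate shortest distance between segments p1-q1 and p2-q2 by sampling."""
    t = np.linspace(0.0, 1.0, 50)
    a = p1[None, :] + t[:, None] * (q1 - p1)      # points along axis 1
    b = p2[None, :] + t[:, None] * (q2 - p2)      # points along axis 2
    return np.min(np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2))

def self_collision(axis1, axis2, r1, r2, safety=0.05):
    """Collision risk if axis distance < sum of cylinder radii + safety threshold."""
    d = segment_distance(*axis1, *axis2)
    return d < (r1 + r2 + safety)

# Hypothetical link axes (metres) of a 7-DOF manipulator in its base frame.
link3 = (np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.8, 1.0]))
link6 = (np.array([0.1, 0.6, 0.9]), np.array([0.1, 0.6, 1.6]))
print("collision risk:", self_collision(link3, link6, r1=0.08, r2=0.06))
```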
Procedia PDF Downloads 469
19283 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as the Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method and confirmed that it provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring algorithm as an iteration method to obtain the maximum likelihood estimates (MLE) of the regression parameters. Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
Procedia PDF Downloads 33
19282 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering
Authors: Emiel Caron
Abstract:
Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g. to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references: they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Due to the scoring, different rules can be combined to join scientific references, i.e. the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that are above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set with highly cited papers, shows on average a 99% precision and a 95% recall. The method is therefore accurate but careful, i.e. it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g. in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus. Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics
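A toy sketch of the rule-based scoring idea: score pairs of raw reference strings with simple metadata rules plus string similarity, keep pairs above a threshold, and form connected components (single-linkage clusters) with a union-find structure. The rules, weights, threshold, and example strings are illustrative, not the method's actual configuration.

```python
import re
from difflib import SequenceMatcher

refs = [
    "Smith J, Nature 415, 2002, Gene expression in yeast",
    "J. Smith (2002) Gene expression in yeast. Nature, vol. 415",
    "Jones A, Science 299, 2003, Protein folding dynamics",
]

def extract_year(s):
    m = re.search(r"\b(19|20)\d{2}\b", s)
    return m.group(0) if m else None

def score(a, b):
    """Combine a metadata rule (same year) with overall string similarity."""
    s = 0.0
    if extract_year(a) and extract_year(a) == extract_year(b):
        s += 2.0
    if SequenceMatcher(None, a.lower(), b.lower()).ratio() > 0.5:
        s += 3.0
    return s

THRESHOLD = 4.0
parent = list(range(len(refs)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

for i in range(len(refs)):
    for j in range(i + 1, len(refs)):
        if score(refs[i], refs[j]) >= THRESHOLD:
            parent[find(i)] = find(j)             # single-linkage merge

clusters = {}
for i in range(len(refs)):
    clusters.setdefault(find(i), []).append(refs[i])
print(list(clusters.values()))
```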
Procedia PDF Downloads 194
19281 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%. Keywords: anomaly detection, autoencoder, data centers, deep learning
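A compact sketch of the reconstruction idea outlined above: one LSTM autoencoder per sensor trained on normal sequences, with the per-sensor reconstruction error used as features for a random forest classifier. Window length, layer sizes, placeholder data, and labels are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.ensemble import RandomForestClassifier

window = 32

def build_autoencoder():
    model = keras.Sequential([
        layers.Input(shape=(window, 1)),
        layers.LSTM(16),
        layers.RepeatVector(window),
        layers.LSTM(16, return_sequences=True),
        layers.TimeDistributed(layers.Dense(1)),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

sensors = ["temperature", "humidity", "power"]
normal = {s: np.random.rand(500, window, 1) for s in sensors}   # placeholder sequences
autoencoders = {}
for s in sensors:
    autoencoders[s] = build_autoencoder()
    autoencoders[s].fit(normal[s], normal[s], epochs=2, verbose=0)

def error_features(windows_by_sensor):
    """Mean reconstruction error per sensor, used as classifier features."""
    return np.column_stack([
        np.mean((autoencoders[s].predict(windows_by_sensor[s], verbose=0)
                 - windows_by_sensor[s]) ** 2, axis=(1, 2))
        for s in sensors])

X = error_features(normal)
y = np.random.randint(0, 2, len(X))       # placeholder labels from maintenance history
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
```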
Procedia PDF Downloads 194
19280 Statistical Inferences for GQARCH-Itô-Jumps Model Based on the Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: one is the continuous-time jump-diffusion used to model high-frequency data, and the other is the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the 'GQARCH-Itô-Jumps model.' We adopt the realized range-based threshold estimation for high-frequency financial data rather than the realized return-based volatility estimators, which entail the loss of intra-day information about the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimate. The asymptotic theory is mainly established for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approach can be practically used on financial data. Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
Procedia PDF Downloads 155
19279 Thermal Image Segmentation Method for Stratification of Freezing Temperatures
Authors: Azam Fazelpour, Saeed R. Dehghani, Vlastimil Masek, Yuri S. Muzychka
Abstract:
The study uses an image analysis technique employing thermal imaging to measure the percentage of areas with various temperatures on a freezing surface. An image segmentation method using threshold values is applied to a sequence of images recording the freezing process. The phenomenon is transient, and temperatures vary quickly as the surface reaches the freezing point and completes the freezing process. Freezing salt water is subject to salt rejection, which makes the freezing point dynamic and dependent on the salinity at the phase interface. For a specific freezing area, nucleation starts from one side and ends at the other, which causes a dynamic and transient temperature in that area. Thermal cameras are able to reveal differences in temperature due to their sensitivity to infrared radiance. Using an experimental setup, a video is recorded by a thermal camera to monitor radiance and temperatures during the freezing process. Image processing techniques are applied to all frames to detect and classify temperatures on the surface. An image segmentation method is used to find contours with the same temperature on the icing surface. Each segment is obtained using the temperature range appearing in the image and the corresponding pixel values. Using the contours extracted from the image and the camera parameters, stratified areas with different temperatures are calculated. To observe temperature contours on the icing surface using the thermal camera, the salt water sample is dropped on a cold surface with a temperature of -20°C. A thermal video is recorded for 2 minutes to observe the temperature field. Examining the results obtained by the method against the experimental observations verifies the accuracy and applicability of the method. Keywords: ice contour boundary, image processing, image segmentation, salt ice, thermal image
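A schematic sketch of segmenting a thermal frame into temperature bands and computing the percentage area of each band, in the spirit of the stratification described above; the temperature array and band edges are synthetic, not measured data.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic 200x200 "thermal frame" (deg C): a cold plate with a warmer droplet.
frame = np.full((200, 200), -18.0) + rng.normal(0, 0.3, (200, 200))
yy, xx = np.mgrid[:200, :200]
droplet = (yy - 100) ** 2 + (xx - 100) ** 2 < 40 ** 2
frame[droplet] += 16.0                       # droplet still near its freezing point

bands = [(-25, -15), (-15, -5), (-5, 0), (0, 5)]   # assumed temperature ranges (deg C)
pixel_count = frame.size
for lo, hi in bands:
    segment = (frame >= lo) & (frame < hi)   # threshold-based segmentation of one band
    pct = 100.0 * segment.sum() / pixel_count
    print(f"{lo:>4} to {hi:>3} deg C: {pct:5.1f}% of the surface")
```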
Procedia PDF Downloads 320