Search results for: sensory processing sensitivity
1884 Geographical Data Visualization Using Video Games Technologies
Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava
Abstract:
In this paper, we present the advances corresponding to the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the Mexican National Institute of Statistics and Geography (INEGI). We select a place of interest to visualize from the Landsat platform and apply some processing to the image (rotations, atmospheric correction, and enhancement). The resulting image becomes our grayscale color map, to be fused with the LIDAR data, which was selected using the same coordinates as the Landsat image. The LIDAR data is translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They will download the software and the images corresponding to a geological place of interest to a smartphone and can virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material
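The fusion step described above reduces to pairing each 8-bit elevation sample with a grayscale shade at the same coordinates. Below is a minimal numpy sketch of that idea, assuming the Landsat grayscale image and the 8-bit LIDAR raster are already co-registered arrays of equal size; the file names in the usage comment are hypothetical.

```python
import numpy as np

def fuse_landsat_lidar(gray_u8: np.ndarray, lidar_u8: np.ndarray) -> np.ndarray:
    """Pair each 8-bit LIDAR elevation sample with a grayscale shade.

    gray_u8  : 2D uint8 array, the processed Landsat image (color map)
    lidar_u8 : 2D uint8 array, elevation translated to 8-bit raw data
    Returns an (H, W, 2) array: channel 0 = height, channel 1 = shade,
    i.e. a heightmap plus texture ready for an engine such as Unity.
    """
    if gray_u8.shape != lidar_u8.shape:
        raise ValueError("rasters must share the same coordinates and size")
    return np.stack([lidar_u8, gray_u8], axis=-1)

# Hypothetical usage with pre-registered 8-bit rasters:
# fused = fuse_landsat_lidar(np.load("landsat_gray.npy"), np.load("lidar_8bit.npy"))
```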
Procedia PDF Downloads 246
1883 Narratives of Self-Renewal: Looking for A Middle Earth In-Between Psychoanalysis and the Search for Consciousness
Authors: Marilena Fatigante
Abstract:
Contemporary psychoanalysis is increasingly acknowledging the existential demands of clients in psychotherapy. A significant aspect of the personal crises that patients face today is often rooted in the difficulty of finding meaning in their own existence, even after working through or resolving traumatic memories and experiences. Tracing back to the correspondence between Freud and Romain Rolland (1927), psychoanalysis could not ignore that investigation of the psyche also encompasses the encounter with deep, psycho-sensory experiences, which involve a sense of "being one with the external world as a whole": the well-known "oceanic feeling", as Rolland put it. Despite the recognition of Non-ordinary States of Consciousness (NSC) as catalysts for transformation in clinical practice, highlighted by neuroscience and by results from psychedelic-assisted therapies, there is little research on how psychoanalytic knowledge can integrate with other treatment traditions. These traditions, commonly rooted in non-Western, unconventional, and non-formal psychological knowledge, emphasize the individual's innate tendency toward existential integrity and transcendence of self-boundaries. Inspired by an autobiographical account, this paper examines the narratives of 12 individuals who engaged in psychoanalytic therapy and also underwent treatment involving a non-formal helping relationship with an expert guide in consciousness, which included experiences of this nature. The guide relies on 35 years of experience in psychological and multidisciplinary studies in the Human Sciences and Art, and demonstrates knowledge of many wisdom traditions, ranging from Eastern to Western philosophy, including psychoanalysis and its development in cultural perspective (e.g., ethnopsychiatry). Analyses focused primarily on two dimensions that research has identified as central in assessing the degree of treatment "success" in patients' narrative accounts of their therapies: agency and coherence, defined respectively as the increase, expressed in language, of the client's perceived ability to manage his or her own challenges, and the capacity, inherent in "narrative" itself as a resource for meaning making (Bruner, 1990), to provide the subject with a sense of unity, endowing his or her life experience with temporal and logical sequentiality. The present study reports that, in all narratives from the participants, agency and coherence are described differently than in "common" psychotherapy narratives. Although the participants consistently identified themselves as responsible, agentic subjects, the sense of agency derived from the non-conventional guidance pathway is never reduced to a personal, individual accomplishment. Rather, the more a new, fuller sense of "Life" (more than "Self") develops out of the guidance pathway they engage in with the expert guide, the more they "surrender" their own sense of autonomy and self-containment, something Safran (2016) also identified when discussing the sense of surrender and "grace" in psychoanalytic sessions. Secondly, the narratives of individuals engaging with the expert guide describe coherence not as repairing or enforcing continuity but as enhancing their ability to navigate dramatic discontinuities, falls, abrupt leaps, and passages marked by feelings of loss and bereavement. The paper ultimately explores whether valid criteria can be established to analyze experiences of non-conventional paths of self-evolution.
These paths are not opposed or alternative to conventional ones, and should not be simplistically dismissed as exotic or magical.
Keywords: oceanic feeling, non-conventional guidance, consciousness, narratives, treatment outcomes
Procedia PDF Downloads 38
1882 Human Computer Interaction Using Computer Vision and Speech Processing
Authors: Shreyansh Jain Jeetmal, Shobith P. Chadaga, Shreyas H. Srinivas
Abstract:
Internet of Things (IoT) is seen as the next major step in the ongoing revolution in the Information Age. It is predicted that in the near future billions of embedded devices will be communicating with each other to perform a plethora of tasks with or without human intervention. One of the major ongoing hotbeds of research activity in IoT is Human Computer Interaction (HCI). HCI is used to facilitate communication between an intelligent system and a user. An intelligent system typically consists of various sensors, actuators, and embedded controllers which communicate with each other to monitor data collected from the environment. Communication from the user to the system is typically done using voice. One of the major ongoing applications of HCI is in home automation as a personal assistant. The prime objective of our project is to implement a use case of HCI for home automation. Our system is designed to detect and recognize the users and personalize the appliances in the house according to their individual preferences. Our HCI system is also capable of speaking with the user when certain commands are spoken, such as searching the web for information and controlling appliances. Our system can also monitor the environment in the house, such as air quality and gas leakages, for added safety.
Keywords: human computer interaction, internet of things, computer vision, sensor networks, speech to text, text to speech, android
Procedia PDF Downloads 362
1881 Evaluating the Feasibility of Chemical Dermal Exposure Assessment Model
Authors: P. S. Hsi, Y. F. Wang, Y. F. Ho, P. C. Hung
Abstract:
The aim of the present study was to explore dermal exposure assessment models for chemicals that have been developed abroad and to evaluate the feasibility of a chemical dermal exposure assessment model for the manufacturing industry in Taiwan. We examined and analyzed six semi-quantitative risk management tools: UK – Control of Substances Hazardous to Health (COSHH), Europe – Risk Assessment of Occupational Dermal Exposure (RISKOFDERM), Netherlands – Dose-Related Effect Assessment Model (DREAM), Netherlands – Stoffenmanager (STOFFEN), Nicaragua – Dermal Exposure Ranking Method (DERM), and USA/Canada – Public Health Engineering Department (PHED). Five types of manufacturing industry were selected for evaluation. Monte Carlo simulation was used to analyze the sensitivity of each factor, and the correlation between the assessment results of each semi-quantitative model and the exposure factors used in the model was analyzed to identify the important evaluation indicators of the dermal exposure assessment models. To assess the effectiveness of the semi-quantitative assessment models, this study also produced quantitative dermal exposure results using a prediction model and verified the correlation via Pearson's test. Results show that COSHH was unable to determine the strength of its decision factors because the results for all evaluated industries fell into the same risk level. In the DERM model, the transmission process, the exposed area, and the clothing protection factor are all positively correlated. In the STOFFEN model, the fugitive emission, the operation, the near-field and far-field concentrations, and the operating time and frequency have a positive correlation. There is a positive correlation between skin exposure, relative working time, and working environment in the DREAM model. In the RISKOFDERM model, the actual exposure situation and exposure time have a positive correlation. We also found high correlation with the DERM and RISKOFDERM models, with correlation coefficients of 0.92 and 0.93 (p<0.05), respectively. The STOFFEN and DREAM models have poor correlation; the coefficients are 0.24 and 0.29 (p>0.05), respectively. According to the results, both the DERM and RISKOFDERM models are suitable for use in these selected manufacturing industries. However, considering the small sample size evaluated in this study, more categories of industries should be evaluated in the future to reduce uncertainty and enhance applicability.
Keywords: dermal exposure, risk management, quantitative estimation, feasibility evaluation
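A Monte Carlo sensitivity check of the kind described above can be illustrated with a toy model: sample the exposure factors, compute a semi-quantitative score, and correlate each factor with the score. The sketch below uses an invented multiplicative score and invented factor ranges purely for illustration; it is not the formula of COSHH, DREAM, or any of the six tools.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical input factors of a semi-quantitative dermal exposure model
exposure_time = rng.uniform(0.5, 8.0, n)   # hours per shift
exposed_area  = rng.uniform(50, 2000, n)   # cm^2 of skin
protection    = rng.uniform(0.1, 1.0, n)   # clothing penetration factor

# Illustrative multiplicative score, not any published model's formula
score = exposure_time * exposed_area * protection

# Pearson correlation between each factor and the score flags the drivers
for name, factor in [("time", exposure_time), ("area", exposed_area),
                     ("protection", protection)]:
    r, p = stats.pearsonr(factor, score)
    print(f"{name:10s} r = {r:+.2f} (p = {p:.3g})")
```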
Procedia PDF Downloads 169
1880 HcDD: The Hybrid Combination of Disk Drives in Active Storage Systems
Authors: Shu Yin, Zhiyang Ding, Jianzhong Huang, Xiaojun Ruan, Xiaomin Zhu, Xiao Qin
Abstract:
Since large-scale and data-intensive applications have been widely deployed, there is a growing demand for high-performance storage systems to support data-intensive applications. Compared with traditional storage systems, next-generation systems will embrace dedicated processors to reduce the computational load of host machines and will have hybrid combinations of different storage devices. The advent of the flash-memory-based solid state disk has played a critical role in revolutionizing the storage world. However, instead of simply replacing the traditional magnetic hard disk with the solid state disk, it is believed that finding a complementary approach to incorporate both of them is more challenging and attractive. This paper explores the idea of active storage, an emerging new storage configuration, in terms of the architecture and design, the parallel processing capability, the cooperation of other machines in a cluster computing environment, and a disk configuration, namely the hybrid combination of different types of disk drives. Experimental results indicate that the proposed HcDD achieves better I/O performance and a longer storage system lifespan.
Keywords: parallel storage system, hybrid storage system, data intensive, solid state disks, reliability
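The hybrid idea, serving hot data from flash while keeping cold data on magnetic disks, can be sketched with a toy placement policy. The threshold-based promotion below is our own illustrative simplification, not the actual HcDD algorithm.

```python
from collections import Counter

class HybridStore:
    """Toy placement policy: blocks read often are promoted to the SSD tier,
    cold blocks stay on the HDD tier, echoing the hybrid idea behind HcDD."""

    def __init__(self, hot_threshold: int = 3):
        self.reads = Counter()
        self.hot_threshold = hot_threshold
        self.ssd, self.hdd = set(), set()

    def write(self, block_id: str) -> None:
        self.hdd.add(block_id)            # new data lands on cheap capacity

    def read(self, block_id: str) -> str:
        self.reads[block_id] += 1
        if self.reads[block_id] >= self.hot_threshold and block_id in self.hdd:
            self.hdd.remove(block_id)     # promote a hot block to flash
            self.ssd.add(block_id)
        return "ssd" if block_id in self.ssd else "hdd"

store = HybridStore()
store.write("b1")
print([store.read("b1") for _ in range(4)])  # ['hdd', 'hdd', 'ssd', 'ssd']
```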
Procedia PDF Downloads 448
1879 The Impact of the General Data Protection Regulation on Human Resources Management in Schools
Authors: Alexandra Aslanidou
Abstract:
The General Data Protection Regulation (GDPR), concerning the protection of natural persons within the European Union with regard to the processing of personal data and the free movement of such data, became applicable in the European Union (EU) on 25 May 2018 and transformed the way personal data were treated under the Data Protection Directive (DPD) regime, generating sweeping organizational changes in both the public sector and business. A social practice that is considerably influenced in its day-to-day operations is Human Resource (HR) management, for which the importance of the GDPR cannot be underestimated. That is because HR processes personal data coming in all shapes and sizes from many different systems and sources. The significance of the proper functioning of an HR department, specifically in human-centered, service-oriented environments such as the education field, is decisive, since HR operations in schools, conducted effectively, determine the quality of the provided services and consequently have a considerable impact on the success of the educational system. The purpose of this paper is to analyze the decisive role that the GDPR plays in HR departments that operate in schools. In order to practically evaluate the aftermath of the Regulation during the first months of its applicability, a comparative use-case analysis in five highly dynamic schools across three EU Member States was attempted.
Keywords: general data protection regulation, human resource management, educational system
Procedia PDF Downloads 100
1878 Automatic Tagging and Accuracy in Assamese Text Data
Authors: Chayanika Hazarika Bordoloi
Abstract:
This paper is an attempt to work on a highly inflectional language called Assamese. This is one of the national languages of India, and very little has been achieved in terms of computational research on it. Building a language processing tool for a natural language is not very smooth, as the standard and language representation change at various levels. This paper presents inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. Conditional random fields (the CRF tool) were used to automatically tag and train the text data; accuracy improved after linguistic features were fed into the training data. Assamese is a highly inflectional language; hence, it is challenging to standardize its morphology. Inflectional suffixes are used as a feature of the text data. In order to analyze the inflections of Assamese word forms, a list of suffixes is prepared. This list comprises all possible suffixes that the various categories can take. Assamese words can be classified into inflected classes (noun, pronoun, adjective, and verb) and un-inflected classes (adverb and particle). The corpus used for this morphological analysis contains a large number of tokens. The corpus is a mixed corpus, and it has given satisfactory accuracy. The accuracy rate of the tagger gradually improved with the modified training data.
Keywords: CRF, morphology, tagging, tagset
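Feeding suffix features into a CRF, as described above, looks roughly like the following sketch using the sklearn-crfsuite package (an assumption; the paper only says a CRF tool was used). The romanized Assamese tokens, tags, and the suffix list are hypothetical placeholders, not the paper's actual suffix inventory.

```python
import sklearn_crfsuite  # illustrative CRF library; the paper's exact tool is unstated

SUFFIXES = ["loi", "ot", "or", "k"]  # hypothetical romanized inflectional suffixes

def word_features(sent, i):
    word = sent[i]
    feats = {"word": word, "is_first": i == 0}
    for suf in SUFFIXES:                       # linguistic feature: suffix match
        feats[f"suffix_{suf}"] = word.endswith(suf)
    if i > 0:
        feats["prev_word"] = sent[i - 1]       # small context window
    return feats

def sent2features(sent):
    return [word_features(sent, i) for i in range(len(sent))]

# Hypothetical one-sentence training set: romanized tokens and their POS tags
X = [sent2features(["moi", "ghoroloi", "jaun"])]
y = [["PRON", "NOUN", "VERB"]]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, y)
print(crf.predict(X))
```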
Procedia PDF Downloads 194
1877 Prevalence of Hepatitis B Virus Infection and Its Determinants among Pregnant Women in East Africa: Systematic Review and Meta-Analysis
Authors: Bantie Getnet Yirsaw, Muluken Chanie Agimas, Gebrie Getu Alemu, Tigabu Kidie Tesfie, Nebiyu Mekonnen Derseh, Habtamu Wagnew Abuhay, Meron Asmamaw Alemayehu, Getaneh Awoke Yismaw
Abstract:
Introduction: Hepatitis B virus (HBV) is one of the major public health problems globally and needs an urgent response. It is one of the leading causes of mortality among the five hepatitis viruses, and it affects almost every class of individuals. Thus, the main objective of this study was to determine the pooled prevalence of HBV and its determinants among pregnant women in East Africa. Methods: We searched for studies using PubMed, Scopus, Embase, ScienceDirect, Google Scholar, and grey literature published between January 1, 2020 and January 30, 2024. The studies were assessed using the Newcastle-Ottawa Scale (NOS) quality assessment scale. The random-effects (DerSimonian) model was used to determine the pooled prevalence and associated factors of HBV among pregnant women. Heterogeneity was assessed by the I² statistic, sub-group analysis, and sensitivity analysis. Publication bias was assessed by the Egger test, and the analysis was done using STATA version 17. Results: A total of 45 studies with 35,639 pregnant women were included in this systematic review and meta-analysis. The overall pooled prevalence of HBV among pregnant women in East Africa was 6.0% (95% CI: 6.0%−7.0%, I² = 89.7%). The highest prevalence of 8% ((95% CI: 6%, 10%), I² = 91.08%) was seen in 2021, and the lowest prevalence of 5% ((95% CI: 4%, 6%), I² = 52.52%) was observed in 2022. The pooled meta-analysis showed that history of a surgical procedure (OR = 2.14 (95% CI: 1.27, 3.61)), having multiple sexual partners (OR = 3.87 (95% CI: 2.52, 5.95)), history of body tattooing (OR = 2.55 (95% CI: 1.62, 4.01)), history of tooth extraction (OR = 2.09 (95% CI: 1.29, 3.39)), abortion history (OR = 2.20 (95% CI: 1.38, 3.50)), history of sharing sharp materials (OR = 1.88 (95% CI: 1.07, 3.31)), blood transfusion (OR = 2.41 (95% CI: 1.62, 3.57)), family history of HBV (OR = 4.87 (95% CI: 2.95, 8.05)), and history of needle injury (OR = 2.62 (95% CI: 1.20, 5.72)) were significant risk factors associated with HBV infection among pregnant women. Conclusions: The pooled prevalence of HBV infection among pregnant women in East Africa was at an intermediate level and differed across countries, ranging from 1.5% to 22.2%. This pooled prevalence is an indication of the need for screening, prevention, and control of HBV infection among pregnant women in the region. Therefore, early identification of risk factors, awareness creation regarding the mode of transmission of HBV, and implementation of preventive measures are essential in reducing the burden of HBV infection among pregnant women.
Keywords: hepatitis B virus, prevalence, determinants, pregnant women, meta-analysis, East Africa
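The random-effects pooling and the I² heterogeneity statistic used above can be computed directly. The sketch below implements the standard DerSimonian-Laird estimator; the five study effect sizes (proportions) and variances are invented for illustration.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects (DerSimonian-Laird) pooling of study effect sizes."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                              # fixed-effect weights
    theta_f = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_f) ** 2)         # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = 1.0 / (variances + tau2)                # random-effects weights
    theta_r = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = (max(0.0, (q - df) / q) * 100) if q > 0 else 0.0
    return theta_r, (theta_r - 1.96 * se, theta_r + 1.96 * se), i2

# Hypothetical prevalence proportions and their variances from 5 studies
pooled, ci, i2 = dersimonian_laird([0.05, 0.08, 0.04, 0.07, 0.06],
                                   [4e-5, 9e-5, 3e-5, 6e-5, 5e-5])
print(f"pooled = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f}), I2 = {i2:.1f}%")
```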
Procedia PDF Downloads 39
1876 FPGA Implementation of a Marginalized Particle Filter for Delineation of P and T Waves of ECG Signal
Authors: Jugal Bhandari, K. Hari Priya
Abstract:
The ECG signal provides important clinical information which can be used to predict diseases related to the heart. Accordingly, delineation of the ECG signal is an important task, and delineation of the P and T waves is a complex one. This paper deals with the study of the ECG signal and its analysis by means of efficient filters designed in Verilog and the MATLAB tool. It includes the generation and simulation of the ECG signal from real-time ECG data, and ECG signal filtering and processing through the analysis of different algorithms and techniques. In this paper, we design a basic particle filter which generates a dynamic model depending on the present and past input samples and then produces the desired output. Afterwards, the output is processed in MATLAB to get the actual shape and accurate values of the ranges of the P-wave and T-wave of the ECG signal. In this work, QuestaSim, a Mentor Graphics tool, is used for simulation and functional verification. The same design is verified again using Xilinx ISE, which is also used for synthesis, mapping, and bit file generation. A Xilinx FPGA board is used for implementation of the system. The final FPGA results are verified with ChipScope Pro, where the output data can be observed.
Keywords: ECG, MATLAB, Bayesian filtering, particle filter, Verilog hardware descriptive language
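A basic (bootstrap) particle filter of the kind designed above maintains a cloud of weighted state hypotheses that are propagated, reweighted by the observation likelihood, and resampled. The sketch below is a generic Python illustration with a random-walk state model and a synthetic noisy signal, not the authors' Verilog design.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=0.1, obs_std=0.2):
    """Minimal bootstrap particle filter: random-walk state model and a
    Gaussian likelihood; returns the filtered (weighted-mean) signal."""
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for z in observations:
        particles += rng.normal(0.0, process_std, n_particles)   # predict
        weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()                                 # update
        estimates.append(np.sum(weights * particles))
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]                               # resample
    return np.array(estimates)

# Hypothetical noisy "ECG-like" observation sequence
t = np.linspace(0, 1, 200)
noisy = np.sin(2 * np.pi * 5 * t) + rng.normal(0, 0.2, t.size)
filtered = bootstrap_particle_filter(noisy)
print(filtered[:5])
```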
Procedia PDF Downloads 367
1875 Clinical Risk Score for Mortality and Predictors of Severe Disease in Adult Patients with Dengue
Authors: Siddharth Jain, Abhenil Mittal, Surendra Kumar Sharma
Abstract:
Background: With its recent emergence and re-emergence, dengue has become a major international public health concern, imposing a significant financial burden, especially in developing countries. Despite aggressive control measures in place, India experienced one of its largest outbreaks in 2015, with Delhi being most severely affected. There is a lack of reliable predictors of disease severity and mortality in dengue. The present study was carried out to identify these predictors during the 2015 outbreak. Methods: This prospective observational study, conducted at an apex tertiary care center in Delhi, India, included confirmed adult dengue patients admitted between August and November 2015. Patient demographics, clinical details, and laboratory findings were recorded in a predesigned proforma. Appropriate statistical tests were used to summarize and compare the clinical and laboratory characteristics and derive predictors of mortality and severe disease, while developing a clinical risk score for mortality. Serotype analysis was also done for 75 representative samples to identify the dominant serotypes. Results: Data of 369 patients were analyzed (mean age 30.9 years; 67% males). Of these, 198 (54%) patients had dengue fever, 125 (34%) had dengue hemorrhagic fever (DHF Grades 1-2), and 46 (12%) developed dengue shock syndrome (DSS). Twenty-two (6%) patients died. Late presentation to the hospital (≥5 days after onset) and dyspnoea at rest were identified as independent predictors of severe disease. Age ≥ 24 years, dyspnoea at rest, and altered sensorium were identified as independent predictors of mortality. A clinical risk score was developed (12*age + 14*sensorium + 10*dyspnoea) which, if ≥ 22, predicted mortality with high sensitivity (81.8%) and specificity (79.2%). The predominant serotypes in Delhi (2015) were DENV-2 and DENV-4. Conclusion: Age ≥ 24 years, dyspnoea at rest, and altered sensorium were identified as independent predictors of mortality. Platelet counts did not determine the outcome in dengue patients. Timely referral/access to health care is important. Development and use of validated predictors of disease severity and simple clinical risk scores, which can be applied in all healthcare settings, can help minimize mortality and morbidity, especially in resource-limited settings.
Keywords: dengue, mortality, predictors, severity
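Reading the reported score as a sum of binary indicators (age ≥ 24 years, altered sensorium present, dyspnoea at rest present), which is our interpretation of the formula rather than a detail the abstract spells out, the bedside calculation is trivial to encode:

```python
def dengue_mortality_risk(age_years: int, altered_sensorium: bool,
                          dyspnoea_at_rest: bool):
    """Clinical risk score from the abstract: 12*age + 14*sensorium + 10*dyspnoea,
    with each term treated as a binary indicator (our assumption); a score
    >= 22 flags predicted mortality."""
    score = (12 * int(age_years >= 24)
             + 14 * int(altered_sensorium)
             + 10 * int(dyspnoea_at_rest))
    return score, score >= 22

print(dengue_mortality_risk(30, False, True))   # (22, True)
print(dengue_mortality_risk(20, True, False))   # (14, False)
```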
Procedia PDF Downloads 307
1874 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language
Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot
Abstract:
The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, with arduous challenges still present in preparing such systems. This paper presents an improved version of the Nagaoka Tigrinya Corpus dataset for a Parts-of-Speech (POS) classification system in the Tigrinya language. The size of the initial Nagaoka dataset was incremented, bringing the new tagged corpus to 118K tokens, annotated with the 12 basic POS tags used previously. The additional content was annotated manually in a stringent manner, following rules similar to those of the former dataset, and was formatted in the CoNLL format. The system made use of a novel approach to NLP tasks, employing the monolingually pre-trained TiELECTRA, TiBERT, and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpassed previous systems by a significant margin. The system will prove useful in the progress of NLP-related tasks for Tigrinya and similarly related low-resource languages, with room for cross-referencing higher-resource languages.
Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields
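The headline metric, weighted F1, averages per-class F1 scores weighted by class support, which matters for an imbalanced 12-tag corpus. A minimal evaluation sketch with scikit-learn, using invented gold and predicted tags:

```python
from sklearn.metrics import classification_report, f1_score

# Hypothetical gold and predicted POS tags for a handful of tokens
y_true = ["NOUN", "VERB", "ADJ", "NOUN", "PUNCT", "VERB"]
y_pred = ["NOUN", "VERB", "NOUN", "NOUN", "PUNCT", "VERB"]

# Weighted F1 = per-class F1 averaged with class-support weights
print(f"weighted F1 = {f1_score(y_true, y_pred, average='weighted'):.3f}")
print(classification_report(y_true, y_pred, zero_division=0))
```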
Procedia PDF Downloads 103
1873 Effect of Citric Acid and Clove on Cured Smoked Meat: A Traditional Meat Product
Authors: Esther Eduzor, Charles A. Negbenebor, Helen O. Agu
Abstract:
Smoking of meat enhances the taste and look of meat; it also increases its longevity and helps preserve the meat by slowing down the spoilage of fat and the growth of bacteria. Lean meat from the forequarter of a beef carcass was obtained from the Maiduguri abattoir. The meat was cut into four portions with weights ranging from 525-545 g, then cut into bits measuring about 8 cm in length and 3.5 cm in thickness and weighing 64.5 g. Meat samples were washed, cured with various concentrations of sodium chloride, sodium nitrate, citric acid, and clove for 30 min, drained, and smoked in a smoking kiln at a temperature range of 55-60°C for 8 hr a day for 3 days. The products were stored at ambient temperature and evaluated microbiologically and organoleptically. During processing and storage there were increases in pH and free fatty acid content, and a decrease in water holding capacity and microbial count of the cured smoked meat. The panelists rated control samples significantly (p < 0.05) higher in terms of colour, texture, taste, and overall acceptability. The following organisms were isolated and identified during storage: Bacillus species, Bacillus subtilis, Streptococcus, Pseudomonas, Aspergillus niger, Candida, and Penicillium species. The study forms a basis for new product development for the meat industry.
Keywords: citric acid, cloves, smoked meat, bioengineering
Procedia PDF Downloads 445
1872 Study of Aerosol Deposition and Shielding Effects on Fluorescent Imaging Quantitative Evaluation in Protective Equipment Validation
Authors: Shinhao Yang, Hsiao-Chien Huang, Chin-Hsiang Luo
Abstract:
The leakage of protective clothing is an important issue in the occupational health field, and there is no quantitative method for measuring the leakage of personal protective equipment. This work aims to measure the quantitative leakage of personal protective equipment by using a fluorochrome aerosol tracer. The fluorescent aerosols were employed as airborne particulates in a controlled chamber with ultraviolet (UV) light-detectable stickers. After an exposure-and-leakage test, the protective equipment was removed and photographed with UV scanning to evaluate the areas, color depth ratio, and aerosol deposition and shielding effects of the areas where fluorescent aerosols had adhered to the body through the protective equipment. Thus, this work built calculation software for the quantitative leakage ratio of protective clothing based on the fluorescent illumination depth/aerosol concentration ratio, the illumination/Fa ratio, aerosol deposition and shielding effects, and the leakage area ratio on the segmentation. The results indicated that the two-repetition total leakage rates of the X, Y, and Z type protective clothing for subject T were about 3.05, 4.21, and 3.52 (mg/m2). For five repetitions, the leakage rates for T were about 4.12, 4.52, and 5.11 (mg/m2).
Keywords: fluorochrome, deposition, shielding effects, digital image processing, leakage ratio, personal protective equipment
Procedia PDF Downloads 323
1871 Statistical Tools for SFRA Diagnosis in Power Transformers
Authors: Rahul Srivastava, Priti Pundir, Y. R. Sood, Rajnish Shrivastava
Abstract:
For the interpretation of sweep frequency response analysis (SFRA) signatures of transformers, different statistical techniques serve as effective tools for either phase-to-phase comparison or sister-unit comparison. In this paper, alongside a discussion of SFRA, several statistical techniques, such as the cross correlation coefficient (CCF), root square error (RSQ), comparative standard deviation (CSD), absolute difference, mean square error (MSE), and min-max ratio (MM), are presented through several case studies. These methods require the sample data size and the spot frequencies of the SFRA signatures being compared. The techniques used are based on power signal processing tools that can simplify the results, and limits can be created for the severity of faults occurring in the transformer due to short-circuit forces or ageing. The advantages of using statistical techniques for analyzing SFRA results are indicated through several case studies, and the results obtained determine the state of the transformer.
Keywords: absolute difference (DABS), cross correlation coefficient (CCF), mean square error (MSE), min-max ratio (MM-ratio), root square error (RSQ), standard deviation (CSD), sweep frequency response analysis (SFRA)
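Several of the indicators above are one-liners once the two signatures are sampled at the same spot frequencies. The sketch below computes CCF, MSE, absolute difference, and one common definition of the min-max ratio (sum of point-wise minima over sum of point-wise maxima; our assumption, since the abstract does not give the formula). The two magnitude traces are synthetic.

```python
import numpy as np

def sfra_indicators(sig_a: np.ndarray, sig_b: np.ndarray) -> dict:
    """Compare two SFRA signatures sampled at the same spot frequencies."""
    ccf = np.corrcoef(sig_a, sig_b)[0, 1]        # cross correlation coefficient
    mse = np.mean((sig_a - sig_b) ** 2)          # mean square error
    dabs = np.sum(np.abs(sig_a - sig_b))         # absolute difference
    mm = np.minimum(sig_a, sig_b).sum() / np.maximum(sig_a, sig_b).sum()
    return {"CCF": ccf, "MSE": mse, "DABS": dabs, "MM-ratio": mm}

# Synthetic magnitude traces (attenuation in dB) for two transformer phases
freq = np.logspace(1, 6, 200)                    # 10 Hz to 1 MHz spot frequencies
phase_a = 20 * np.log10(1 + freq / 1e4)
phase_b = phase_a + np.random.default_rng(1).normal(0, 0.3, freq.size)
print(sfra_indicators(phase_a, phase_b))
```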
Procedia PDF Downloads 697
1870 Application of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise and Multipoint Optimal Minimum Entropy Deconvolution in Railway Bearings Fault Diagnosis
Authors: Yao Cheng, Weihua Zhang
Abstract:
Although the measured vibration signal contains rich information on machine health conditions, white noise interferences and the discrete harmonics coming from blade, shaft, and mesh make the fault diagnosis of rolling element bearings difficult. In order to overcome the interference of useless signals, a new fault diagnosis method combining Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and Multipoint Optimal Minimum Entropy Deconvolution (MOMED) is proposed for the fault diagnosis of high-speed train bearings. Firstly, the CEEMDAN technique is applied to adaptively decompose the raw vibration signal into a series of finite intrinsic mode functions (IMFs) and a residue. Compared with Ensemble Empirical Mode Decomposition (EEMD), CEEMDAN can provide an exact reconstruction of the original signal and a better spectral separation of the modes, which improves the accuracy of fault diagnosis. An effective sensitivity index based on the Pearson correlation coefficients between the IMFs and the raw signal is adopted to select sensitive IMFs that contain bearing fault information. The composite signal of the sensitive IMFs is used for further fault identification. Next, for the purpose of identifying the fault information precisely, MOMED is utilized to enhance the periodic impulses in the composite signal. As a non-iterative method, MOMED has better deconvolution performance than classical deconvolution methods such as Minimum Entropy Deconvolution (MED) and Maximum Correlated Kurtosis Deconvolution (MCKD). Third, envelope spectrum analysis is applied to detect the existence of a bearing fault. Simulated bearing fault signals with white noise and discrete harmonic interferences are used to validate the effectiveness of the proposed method. Finally, the superiority of the proposed method is further demonstrated on high-speed train bearing fault datasets measured from a test rig. The analysis results indicate that the proposed method has strong practicability.
Keywords: bearing, complete ensemble empirical mode decomposition with adaptive noise, fault diagnosis, multipoint optimal minimum entropy deconvolution
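The CEEMDAN decomposition and the Pearson-based sensitivity index can be sketched as follows, assuming the PyEMD ("EMD-signal") package provides the CEEMDAN class; the simulated impulse-plus-harmonic signal and the mean-correlation selection threshold are illustrative choices, and the MOMED stage is omitted.

```python
import numpy as np
from PyEMD import CEEMDAN  # assumption: the PyEMD ("EMD-signal") package is installed

rng = np.random.default_rng(0)
fs = 1000
t = np.arange(0, 1.0, 1 / fs)

# Simulated bearing signal: periodic fault impulses + a discrete harmonic + noise
impulses = np.zeros(t.size)
impulses[::125] = 4.0                        # 8 impulses/s, a toy fault frequency
harmonic = 0.5 * np.sin(2 * np.pi * 50 * t)  # shaft-type harmonic interference
signal = impulses + harmonic + rng.normal(0, 0.3, t.size)

imfs = CEEMDAN()(signal)  # adaptive decomposition into IMFs plus residue

# Sensitivity index: |Pearson correlation| between each IMF and the raw signal
corr = np.array([abs(np.corrcoef(imf, signal)[0, 1]) for imf in imfs])
sensitive = imfs[corr > corr.mean()]         # keep the fault-carrying IMFs
composite = sensitive.sum(axis=0)            # composite signal for the MOMED stage
print(f"{len(imfs)} IMFs decomposed, {len(sensitive)} selected")
```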
Procedia PDF Downloads 374
1869 Damage Identification Using Experimental Modal Analysis
Authors: Niladri Sekhar Barma, Satish Dhandole
Abstract:
Damage identification, in the context of safety, has nowadays become a fundamental research interest in the fields of mechanical, civil, and aerospace engineering structures. The following research aims to identify damage in a mechanical beam structure, quantify the severity or extent of damage in terms of loss of stiffness, and obtain an updated analytical Finite Element (FE) model. An FE model is used for analysis, and the location of damage for single and multiple damage cases is identified numerically using the modal strain energy method and the mode shape curvature method. Experimental data were acquired with the help of an accelerometer. The Fast Fourier Transform (FFT) algorithm is applied to the measured signal, and subsequently, post-processing is done in MEscopeVES software. The two sets of data, from the numerical FE model and the experimental results, are compared to locate the damage accurately. The extent of the damage is identified via modal frequencies using a mixed numerical-experimental technique. Mode shape comparison is performed using the Modal Assurance Criterion (MAC). The analytical FE model is adjusted by the direct method of model updating. The same study has been extended to some real-life structures such as plate and GARTEUR structures.
Keywords: damage identification, damage quantification, damage detection using modal analysis, structural damage identification
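The MAC used for mode shape comparison is a normalized squared inner product between mode shape vectors, close to 1 for consistent shapes and near 0 where shapes disagree. A short sketch with two hypothetical beam mode shapes:

```python
import numpy as np

def mac(phi_a: np.ndarray, phi_e: np.ndarray) -> np.ndarray:
    """Modal Assurance Criterion between analytical (phi_a) and experimental
    (phi_e) mode shape matrices; columns are modes, rows are measurement
    points. Entry (i, j) compares analytical mode i with experimental mode j."""
    num = np.abs(phi_a.T @ phi_e) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0), np.sum(phi_e * phi_e, axis=0))
    return num / den

# Hypothetical 2 mode shapes measured at 4 points on a beam
phi_fe  = np.array([[0.3, 0.8], [0.6, 0.4], [0.8, -0.4], [1.0, -0.9]])
phi_exp = phi_fe + np.random.default_rng(3).normal(0, 0.05, phi_fe.shape)
print(np.round(mac(phi_fe, phi_exp), 3))   # near-identity matrix expected
```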
Procedia PDF Downloads 116
1868 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval
Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje
Abstract:
Multimedia Indexing and Retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computation intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph-projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph-traversals due to a simpler processing model and a high level of parallelization. In consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for Multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
Keywords: indexing, retrieval, multimedia, graph algorithm, graph code
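One way to realize the 2D-projection idea, not the authors' exact encoding, is to index a fixed matrix by a shared feature vocabulary, mark node presence on the diagonal and edges off-diagonal, and compare codes cell-wise instead of traversing graphs:

```python
import numpy as np

def graph_code(nodes, edges, vocabulary):
    """Project a feature graph onto a fixed 2D matrix: diagonal cells mark
    node presence, off-diagonal cells mark edges, indexed by a shared
    feature vocabulary (a sketch of the Graph Code idea)."""
    idx = {term: i for i, term in enumerate(vocabulary)}
    m = np.zeros((len(vocabulary), len(vocabulary)), dtype=np.uint8)
    for n in nodes:
        m[idx[n], idx[n]] = 1
    for a, b in edges:
        m[idx[a], idx[b]] = 1
    return m

def similarity(code_a, code_b):
    """Cell-wise overlap; no graph traversal needed and trivially parallel."""
    return np.sum((code_a == 1) & (code_b == 1)) / max(1, np.sum(code_a == 1))

vocab = ["person", "dog", "ball", "holds", "near"]
img1 = graph_code(["person", "dog"], {("person", "dog")}, vocab)
img2 = graph_code(["person", "ball"], {("person", "ball")}, vocab)
print(similarity(img1, img2))   # 1 shared cell of 3 -> ~0.33
```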
Procedia PDF Downloads 161
1867 Physical Properties and Elastic Studies of Fluoroaluminate Glasses Based on Alkali
Authors: C. Benhamideche
Abstract:
Fluoroaluminate glasses have been reported as the earliest heavy metal fluoride glasses. By comparison with fluorozirconate glasses, they offer a set of similar optical features, but also some differences in their elastic and chemical properties. In practice they have been less developed because their stability against devitrification is smaller than that of the most stable fluorozirconates. The purpose of this study was to investigate glass formation in the systems AlF3-YF3-PbF2-MgF2-MF2 (M = Li, Na, K). Synthesis was implemented in room atmosphere using ammonium fluoride processing. After fining, the liquid was cast into a preheated brass mold, then annealed below the glass transition temperature for several hours. The samples were polished for optical measurements. Glass formation was investigated in a systematic way, using pseudo-ternary systems in order to allow several parameters to vary at the same time. We have chosen the most stable glass compositions for the determination of the physical properties, including characteristic temperatures, density, and elastic properties. Glass stability increases in multicomponent glasses. Bulk samples were prepared for physical characterization. These glasses have a potential interest for passive optical fibers because they are less sensitive to water attack than ZBLAN glass and mechanically stronger. It is expected they could have a larger damage threshold for laser power transmission.
Keywords: fluoride glass, aluminium fluoride, thermal properties, density, elastic properties
Procedia PDF Downloads 241
1866 A Differential Scanning Calorimetric Study of Frozen Liquid Egg Yolk Thawed by Different Thawing Methods
Authors: Karina I. Hidas, Csaba Németh, Anna Visy, Judit Csonka, László Friedrich, Ildikó Cs. Nyulas-Zeke
Abstract:
Egg yolk is a popular ingredient in the food industry due to its gelling, emulsifying, colouring, and coagulating properties. Because of the heat sensitivity of its proteins, egg yolk can only be heat treated at low temperatures, so its shelf life, even with the addition of a preservative, is only a few weeks. Freezing can increase the shelf life of liquid egg yolk up to 1 year, but the yolk undergoes gelling below -6°C, which is an irreversible phenomenon. The degree of gelation depends on the time and temperature of freezing and is influenced by the process of thawing. Therefore, in our experiment, we examined egg yolks thawed in different ways. In this study, unpasteurized, industrially broken, separated, and homogenized liquid egg yolk was used. Freshly produced samples were frozen in plastic containers at -18°C in a laboratory freezer. Frozen storage was carried out for 90 days. Samples were analysed at day zero (unfrozen) and after frozen storage for 1, 7, 14, 30, 60, and 90 days. Samples were thawed in two ways (at 5°C for 24 hours and at 30°C for 3 hours) before testing. Calorimetric properties were examined by differential scanning calorimetry, where heat flow curves were recorded. Denaturation enthalpy values were calculated by fitting a linear baseline, and denaturation temperature values were evaluated. In addition, the dry matter content of the samples was measured by the oven method, with drying at 105°C to constant weight. For statistical analysis, two-way ANOVA (α = 0.05) was employed, with thawing mode and freezing time as the fixed factors. Denaturation enthalpy values decreased from 1.1 to 0.47 by the end of the storage experiment, which represents a reduction of about 60%. The effect of freezing time on these values was significant; even the enthalpy of samples stored frozen for 1 day was significantly reduced. However, the mode of thawing did not significantly affect the denaturation enthalpy of the samples, and no interaction was seen between the two factors. The denaturation temperature and dry matter content did not change significantly either during the freezing period or with the thawing mode. The results of our study show that slow freezing and frozen storage at -18°C greatly reduce the amount of protein that can be denatured in egg yolk, indicating that the proteins have been subjected to aggregation, denaturation, or other protein conversions regardless of how they were thawed.
Keywords: denaturation enthalpy, differential scanning calorimetry, liquid egg yolk, slow freezing
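The enthalpy calculation described above, integrating the heat flow endotherm over a fitted linear baseline, can be sketched as follows; the synthetic yolk-protein peak, sample mass, and heating rate are invented for illustration.

```python
import numpy as np

def denaturation_enthalpy(temp_c, heat_flow_mw, t_start, t_end,
                          heating_rate_c_per_s, mass_g):
    """Integrate a DSC endotherm above a linear baseline fitted between the
    chosen peak limits; returns enthalpy in J/g (sign conventions vary)."""
    sel = (temp_c >= t_start) & (temp_c <= t_end)
    t, hf = temp_c[sel], heat_flow_mw[sel]
    baseline = np.interp(t, [t[0], t[-1]], [hf[0], hf[-1]])  # linear baseline
    net = hf - baseline
    area_mw_c = np.sum(0.5 * (net[1:] + net[:-1]) * np.diff(t))  # trapezoid rule
    area_mj = area_mw_c / heating_rate_c_per_s   # mW*degC / (degC/s) = mJ
    return abs(area_mj) / 1000.0 / mass_g        # J per gram of sample

# Hypothetical yolk-protein endotherm near 75 degC, 10 mg sample, 5 degC/min ramp
temp = np.linspace(40, 100, 600)
flow = -0.8 * np.exp(-((temp - 75) ** 2) / 8) + 0.02 * temp
print(f"~{denaturation_enthalpy(temp, flow, 65, 85, 5 / 60, 0.010):.2f} J/g")
```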
Procedia PDF Downloads 129
1865 Green-Synthesized β-Cyclodextrin Membranes for Humidity Sensors
Authors: Zeineb Baatout, Safa Teka, Nejmeddine Jaballah, Nawfel Sakly, Xiaonan Sun, Mustapha Majdoub
Abstract:
Currently, the economic interests linked to the development of bio-based materials make biomass one of the most interesting areas for science development. We are interested in β-cyclodextrin (β-CD), a popular bio-sourced macromolecule produced from starch via enzymatic conversion. It is a cyclic oligosaccharide formed by the association of seven glucose units. It presents a rigid, conical, and amphiphilic structure with a hydrophilic exterior, allowing it to be water-soluble. It also has a hydrophobic interior enabling the formation of inclusion complexes, which supports its application in the elaboration of electrochemical and optical sensors. Nevertheless, the solubility of β-CD in water makes its use as a sensitive layer limited and difficult due to its instability in aqueous media. To overcome this limitation, we chose to proceed by modification of the hydroxyl groups to obtain hydrophobic derivatives, which lead to water-stable sensing layers. Hence, a series of benzylated β-CDs were synthesized in basic aqueous media in one pot. This work reports the synthesis of this new family of substituted amphiphilic β-CDs using a green methodology. The obtained β-CDs showed different degrees of substitution (DS) between 0.85 and 2.03. These organic macromolecular materials were soluble in common volatile organic solvents, and their structures were investigated by NMR, FT-IR, and MALDI-TOF spectroscopies. Thermal analysis showed a correlation between the thermal properties of these derivatives and the benzylation degree. The surface properties of thin films based on the benzylated β-CDs were characterized by contact angle measurements and atomic force microscopy (AFM). These organic materials were investigated as sensitive layers, deposited on a quartz crystal microbalance (QCM) gravimetric transducer, for humidity sensing at room temperature. The results showed that the performance of the prepared sensors is greatly influenced by the benzylation degree of the β-CD. The partially modified β-CD (DS = 1) shows a linear response with the best sensitivity, good reproducibility, low hysteresis, and fast response time (15 s) and recovery time (17 s) at relative humidity (RH) levels between 11% and 98% at room temperature.
Keywords: β-cyclodextrin, green synthesis, humidity sensor, quartz crystal microbalance
Procedia PDF Downloads 271
1864 Aerodynamics of Spherical Combat Platform Levitation
Authors: Aelina Franz
Abstract:
In recent years, the scientific community has witnessed a paradigm shift in the exploration of unconventional levitation methods, particularly in the domain of spherical combat platforms. This paper explores aerodynamics and levitational dynamics inherent in these spheres by examining interactions at the quantum level. Our research unravels the nuanced aerodynamic phenomena governing the levitation of spherical combat platforms. Through an analysis of the quantum fluid dynamics surrounding these spheres, we reveal the crucial interactions between air resistance, surface irregularities, and the quantum fluctuations that influence their levitational behavior. Our findings challenge conventional understanding, providing a perspective on the aerodynamic forces at play during the levitation of spherical combat platforms. Furthermore, we propose design modifications and control strategies informed by both classical aerodynamics and quantum information processing principles. These advancements not only enhance the stability and maneuverability of the combat platforms but also open new avenues for exploration in the interdisciplinary realm of engineering and quantum information sciences. This paper aims to contribute to levitation technologies and their applications in the field of spherical combat platforms. We anticipate that our work will stimulate further research to create a deeper understanding of aerodynamics and quantum phenomena in unconventional levitation systems.
Keywords: spherical combat platforms, levitation technologies, aerodynamics, maneuverable platforms
Procedia PDF Downloads 57
1863 Medical Ethics in the Hospital: Towards Quality Ethics Consultation
Authors: Dina Siniora, Jasia Baig
Abstract:
During the past few decades, the healthcare system has undergone profound changes in its healthcare decision-making competencies and moral aptitudes due to vast advancements in technology, clinical skills, and scientific knowledge. Healthcare decision-making deals with morally contentious dilemmas, ranging from illness to life-and-death judgments, that require sensitivity and awareness of the patient's preferences while taking into consideration medicine's abilities and boundaries. As the ever-evolving field of medicine continues to become more scientifically and morally multifarious, physicians and hospital administrators increasingly rely on ethics committees to resolve problems that arise in everyday patient care. The role and latitude of responsibilities of ethics committees, which include being dispute intermediaries, moral analysts, policy educators, counselors, advocates, and reviewers, suggest the importance and effectiveness of a fully integrated committee. Despite achievements in Integrated Ethics and progress in standards and competencies, there is an imminent necessity for further improvement in the quality of ethics consultation services in the areas of credentialing, professionalism, and standards of quality, as well as in the quality of healthcare throughout the system. These concerns can be addressed first by collecting data about particular quality gaps and understanding the extent to which ethics committees are consistent with newly published ASBH quality standards. Policymakers should pursue improvement strategies that target both the academic bioethics community and major stakeholders at hospitals, who directly influence ethics committees. This broader approach, oriented towards education and intervention outcomes in conjunction with preventive ethics, addresses disparities in quality on a systemic level. Adopting tools for improving competencies and processes within ethics consultation, by implementing a credentialing process, upholding the normative significance of the ASBH core competencies, advocating for a professional Code of Ethics, and further clarifying the internal structures, will improve productivity, patient satisfaction, and institutional integrity. This cannot be systemically achieved without a written certification exam for HCEC practitioners, credentialing and privileging of HCEC practitioners at the hospital level, and accreditation of HCEC services at the institutional level.
Keywords: ethics consultation, hospital, medical ethics, quality
Procedia PDF Downloads 189
1862 Educational Path for Pedagogical Skills: A Football School Experience
Authors: A. Giani
Abstract:
The current pedagogical culture recognizes an educational scope within sports practices. It is widely accepted in pedagogical culture that, thanks to the acquisition and development of motor skills, it is also possible to exercise abilities that concern the way of facing and managing the difficulties of everyday life. Sport is a peculiar educational environment: children have the opportunity to discover the possibilities of their body, to relate with their peers, and to learn how to manage rules and the relationship with authorities, such as coaches. The educational aspects of sport concern both non-formal and formal educational environments. Coaches play a critical role in the agonistic sphere: exactly like the competencies developed by the children, coaches have to work on their skills to properly set up the educational scene. Facing these new educational tasks, which are not new per se, but new because they are brought back to awareness, a few questions arise: does the coach have adequate preparation? Is the training of the coach in this specific area appropriate? This contribution aims to explore the issue in depth by focusing on the reality of the Football School. Starting from a possible sense of pedagogical inadequacy detected during a series of meetings with several football clubs in Piedmont (Italy), some important educational needs within the professional training of sports coaches have been highlighted. It is indeed necessary for the coach to know the processes underlying the educational relationship in order to better understand the centrality of assessment during the educational intervention and to be able to manage the asymmetry in the coach-athlete relationship. In order to respond to these pedagogical needs, a formative plan has been designed to allow both an in-depth study of educational issues and a correct self-evaluation, led by the coach, of the control levels of certain pedagogical skills. This plan is based on particular practices, the Educational Practices of Pre-test (EPP), a specific version of community practices designed for extracurricular activities. The above-mentioned practices, realized through the use of texts meant as pre-tests, promoted reflection within the group of coaches: they set up real and plausible sports experiences, in particular football, triggering reflection about the objects, spaces, and methods of the relationship. The characteristic aspect of pre-tests is that it is impossible to anticipate the reflection, as it is necessarily connected to personal experience and sensitivity, requiring strong interest and involvement from participants: situations must be considered by the coaches as possible settings in which they could find themselves on the field.
Keywords: relational needs, values, responsibility, self-evaluation
Procedia PDF Downloads 118
1861 Identifying the Structural Components of Old Buildings from Floor Plans
Authors: Shi-Yu Xu
Abstract:
The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.
Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence
Procedia PDF Downloads 89
1860 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
Technology for analysing, processing, and extracting meaningful information from enormous and complicated datasets can be termed "big data." The techniques of big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data brings special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining and its process, along with its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents the idea of data mining and analysis and the knowledge discovery techniques that have recently been developed, along with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses the main big data and data mining challenges faced by management.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 110
1859 Random Subspace Neural Classifier for Meteor Recognition in the Night Sky
Authors: Carlos Vera, Tetyana Baydyk, Ernst Kussul, Graciela Velasco, Miguel Aparicio
Abstract:
This article describes the Random Subspace Neural Classifier (RSC) for the recognition of meteors in the night sky. We used images of meteors entering the atmosphere at night between 8:00 p.m. and 5:00 a.m. The objective of this project is to classify meteor and star images (with stars as the image background). The monitoring of the sky and the classification of meteors are intended for future applications by scientists. The image database was collected from different websites. We worked with RGB images with dimensions of 220x220 pixels stored in the bitmap (BMP) format. Subsequent window scanning and processing were carried out for each image. The scan window from which the characteristics were extracted had a size of 20x20 pixels, with a scanning step size of 10 pixels. Brightness, contrast, and contour orientation histograms were used as inputs for the RSC. The RSC worked with two classes, classifying images into: 1) with meteors and 2) without meteors. Different tests were carried out by varying the number of training cycles and the number of images used for training and recognition. The percentage error of the neural classifier was calculated. The results show a good RSC classifier response, with 89% correct recognition. The results of these experiments are presented and discussed.
Keywords: contour orientation histogram, meteors, night sky, RSC neural classifier, stars
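The window scanning described above (20x20 windows with a 10-pixel step over a 220x220 frame) can be sketched with brightness and contrast features per window; the contour orientation histogram and the RSC itself are omitted, and the synthetic streak image is a stand-in for a real meteor frame.

```python
import numpy as np

def window_features(img_gray: np.ndarray, win: int = 20, step: int = 10):
    """Scan a grayscale night-sky image with a win x win window (step px)
    and emit simple brightness/contrast features per window; a contour
    orientation histogram would be appended in the full pipeline."""
    feats = []
    h, w = img_gray.shape
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            patch = img_gray[y:y + win, x:x + win].astype(float)
            brightness = patch.mean()
            contrast = patch.max() - patch.min()
            feats.append((brightness, contrast))
    return np.array(feats)

# Hypothetical 220x220 frame with a bright streak (a meteor-like line)
img = np.random.default_rng(5).integers(0, 30, (220, 220)).astype(np.uint8)
img[np.arange(50, 150), np.arange(50, 150)] = 255   # diagonal streak
print(window_features(img).shape)    # (441, 2): 21 x 21 windows, 2 features
```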
Procedia PDF Downloads 139
1858 Multi-Modal Feature Fusion Network for Speaker Recognition Task
Authors: Xiang Shijie, Zhou Dong, Tian Dan
Abstract:
Speaker recognition is a crucial task in the field of speech processing, aimed at identifying individuals based on their vocal characteristics. However, existing speaker recognition methods face numerous challenges. Traditional methods primarily rely on audio signals, which often suffer from limitations in noisy environments, variations in speaking style, and insufficient sample sizes. Additionally, relying solely on audio features can sometimes fail to capture the unique identity of the speaker comprehensively, impacting recognition accuracy. To address these issues, we propose a multi-modal network architecture that simultaneously processes both audio and text signals. By gradually integrating audio and text features, we leverage the strengths of both modalities to enhance the robustness and accuracy of speaker recognition. Our experiments demonstrate significant improvements with this multi-modal approach, particularly in complex environments, where recognition performance has been notably enhanced. Our research not only highlights the limitations of current speaker recognition methods but also showcases the effectiveness of multi-modal fusion techniques in overcoming these limitations, providing valuable insights for future research.
Keywords: feature fusion, memory network, multimodal input, speaker recognition
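A gradual audio-text fusion of the kind proposed above can be sketched in PyTorch: project each modality's embedding into a shared space, concatenate, and classify. All dimensions and the speaker count below are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class AudioTextFusion(nn.Module):
    """Minimal sketch of audio-text feature fusion for speaker identification;
    embedding sizes and the number of speakers are illustrative assumptions."""

    def __init__(self, audio_dim=192, text_dim=768, hidden=256, n_speakers=100):
        super().__init__()
        self.audio_proj = nn.Linear(audio_dim, hidden)  # per-modality projection
        self.text_proj = nn.Linear(text_dim, hidden)
        self.fuse = nn.Sequential(                      # joint processing stage
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_speakers),
        )

    def forward(self, audio_emb, text_emb):
        a = torch.relu(self.audio_proj(audio_emb))
        t = torch.relu(self.text_proj(text_emb))
        return self.fuse(torch.cat([a, t], dim=-1))     # logits per speaker

model = AudioTextFusion()
logits = model(torch.randn(4, 192), torch.randn(4, 768))
print(logits.shape)   # torch.Size([4, 100])
```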
Procedia PDF Downloads 33
1857 Solving Process Planning, Weighted Apparent Tardiness Cost Dispatching, and Weighted Processing plus Weight Due-Date Assignment Simultaneously Using a Hybrid Search
Authors: Halil Ibrahim Demir, Caner Erden, Abdullah Hulusi Kokcam, Mumtaz Ipek
Abstract:
Process planning, scheduling, and due date assignment are three important manufacturing functions that are studied independently in the literature. There are hundreds of works on the IPPS and SWDDA problems but only a few on the IPPSDDA problem. Integrating these three functions is crucial due to the strong relationship between them. Since the scheduling problem is in the NP-hard problem class even without any integration, the integrated problem is even harder to solve. This study focuses on the integration of these functions. The sum of weighted tardiness, earliness, and due-date-related costs is used as a penalty function. Random search and hybrid metaheuristics are used to solve the integrated problem. Marginal improvement in random search is very high in the early iterations and drops enormously in later iterations; at that point, directed search contributes more to marginal improvement than random search. In this study, random and genetic search methods are therefore combined to find better solutions. Results show that overall performance becomes better as the integration level increases.
Keywords: process planning, genetic algorithm, hybrid search, random search, weighted due-date assignment, weighted scheduling
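The penalty function, the sum of weighted earliness, tardiness, and due-date-related costs, can be written down directly; the per-job weights and the linear due-date cost term below are illustrative assumptions.

```python
def schedule_penalty(jobs):
    """Sum of weighted earliness, tardiness, and due-date costs for a
    schedule; each job is (completion, due_date, w_early, w_tardy, w_due).
    The due-date term penalizes setting loose due dates, as in weighted
    due-date assignment."""
    total = 0.0
    for completion, due, w_e, w_t, w_d in jobs:
        total += w_e * max(0, due - completion)   # earliness cost
        total += w_t * max(0, completion - due)   # tardiness cost
        total += w_d * due                        # due-date cost
    return total

# Hypothetical 3-job schedule: completion times vs. assigned due dates
jobs = [(5, 6, 1.0, 2.0, 0.1), (9, 8, 1.0, 2.0, 0.1), (12, 12, 1.0, 2.0, 0.1)]
print(schedule_penalty(jobs))   # 1.6 + 2.8 + 1.2 = 5.6
```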
Procedia PDF Downloads 362
1856 Electrochemical Treatment and Chemical Analyses of Tannery Wastewater Using Sacrificial Aluminum Electrode, Ethiopia
Authors: Dessie Tibebe, Muluken Asmare, Marye Mulugeta, Yezbie Kassa, Zerubabel Moges, Dereje Yenealem, Tarekegn Fentie, Agmas Amare
Abstract:
The performance of electrocoagulation (EC) using aluminium electrodes for the treatment of effluent containing chromium, in a fixed-bed electrochemical batch reactor, was studied. In the present work, the efficiency of EC in removing physicochemical pollutants and heavy metals from real industrial tannery wastewater in the Amhara region, collected from Bahir Dar, Debre Brihan, and Haik, was evaluated. The treated and untreated samples were analyzed by AAS and ICP-OES spectrometers. The results indicated that the selected heavy metals were removed in all experiments with high removal percentages. The optimal results, with regard to both cost and electrocoagulation efficiency, were obtained with initial pH = 3, initial concentration = 40 mg/L, electrolysis time = 30 min, current density = 40 mA/cm2, and temperature = 25°C, which favored metal removal. The maximum removal percentages of the selected metals obtained were 84.42% for Haik, 92.64% for Bahir Dar, and 94.90% for Debre Brihan. The sacrificial electrode and the sludge were characterized by FT-IR, SEM, and XRD. After treatment, some metals such as chromium can be used again as a tanning agent in leather processing to promote a circular economy.
Keywords: electrochemical, treatment, aluminum, tannery effluent
Procedia PDF Downloads 110
1855 Investigation of Glacier Activity Using Optical and Radar Data in Zardkooh
Authors: Mehrnoosh Ghadimi, Golnoush Ghadimi
Abstract:
Precise monitoring of glacier velocity is critical in determining glacier-related hazards. Zardkooh Mountain, in the Zagros mountainous region of Iran, was studied in terms of its glacial activity rate. In this study, we assessed the ability of optical and radar imagery to derive glacier-surface velocities in mountainous terrain. We processed Landsat 8 imagery for optical data and Sentinel-1a imagery for radar data. We used methods that are commonly employed to measure glacier surface movements, such as cross correlation of optical and radar satellite images, SAR tracking techniques, and multiple aperture InSAR (MAI). We also assessed time-series glacier surface displacement using our modified method, the Enhanced Small Baseline Subset (ESBAS). ESBAS has been implemented in the StaMPS software, with several aspects of the processing chain modified, including filtering prior to phase unwrapping, topographic correction within three-dimensional phase unwrapping, reducing atmospheric noise, and removing the ramp caused by ionospheric turbulence and/or orbit errors. Our findings indicate an average surface velocity rate of 32 mm/yr in the Zardkooh mountainous areas.
Keywords: active rock glaciers, landsat 8, sentinel-1a, zagros mountainous region
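The cross-correlation (offset-tracking) step mentioned above estimates the pixel shift between two co-registered acquisitions and converts it to a velocity given the pixel size and time separation. Below is a sketch using scikit-image's phase_cross_correlation on a synthetic pair; the pixel size and revisit interval are illustrative, not the study's actual parameters.

```python
import numpy as np
from skimage.registration import phase_cross_correlation  # assumption: scikit-image

rng = np.random.default_rng(7)

# Synthetic co-registered image chips from two acquisition dates
reference = rng.normal(size=(128, 128))
moved = np.roll(np.roll(reference, 3, axis=0), -2, axis=1)  # known 3 px / 2 px shift

shift, error, _ = phase_cross_correlation(reference, moved, upsample_factor=20)
print("estimated offset (rows, cols):", shift)  # recovers ~(3, 2) px up to sign

# With a known time separation, a pixel offset becomes a surface velocity:
pixel_size_m = 15.0          # e.g. Landsat 8 panchromatic band (illustrative)
dt_years = 16 / 365.25       # e.g. one 16-day revisit interval (illustrative)
velocity_m_per_yr = np.hypot(*shift) * pixel_size_m / dt_years
print(f"velocity ~ {velocity_m_per_yr:.0f} m/yr")
```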
Procedia PDF Downloads 77