Search results for: bagging ensemble methods
12086 Safety Climate Assessment and Its Impact on the Productivity of Construction Enterprises
Authors: Krzysztof J. Czarnocki, F. Silveira, E. Czarnocka, K. Szaniawska
Abstract:
Research background: Problems related to occupational health and a decreasing level of safety commonly occur in the construction industry. An important factor in occupational safety in the construction industry is scaffold use. All scaffolds used in construction, renovation, and demolition shall be erected, dismantled, and maintained in accordance with safety procedures. Increasing demand for new construction projects is, unfortunately, still linked to a high level of occupational accidents. Therefore, it is crucial to implement concrete actions when dealing with scaffolds and risk assessment in the construction industry; the way an assessment is done and its reliability are critical for both construction workers and the regulatory framework. Unfortunately, professionals, who tend to rely heavily on their own experience and knowledge when taking decisions regarding risk assessment, may show a lack of reliability in checking the results of the decisions taken. Purpose of the article: The aim was to indicate crucial parameters that could be modeled with a Risk Assessment Model (RAM) to improve both building enterprise productivity and/or development potential and safety climate. The developed RAM could be of benefit for predicting high-risk construction activities and thus preventing accidents, based on a set of historical accident data. Methodology/Methods: A RAM has been developed for assessing risk levels at various construction process stages, with various work trades impacting different spheres of enterprise activity. This project includes research carried out by teams of researchers on over 60 construction sites in Poland and Portugal, under which over 450 individual research cycles were carried out. The conducted research trials included variable conditions of employee exposure to harmful physical and chemical factors, variable levels of employee stress, and differences in staff behaviors and habits. A genetic modeling tool has been used for developing the RAM. Findings and value added: Common types of trades, accidents, and accident causes have been explored, in addition to suitable risk assessment methods and criteria. We have found that the initial worker stress level is a more direct predictor of the unsafe chain of events leading to an accident than the workload, the concentration of harmful factors at the workplace, or even training frequency and management involvement.
Keywords: safety climate, occupational health, civil engineering, productivity
Procedia PDF Downloads 318
12085 An Accurate Method for Phylogeny Tree Reconstruction Based on a Modified Wild Dog Algorithm
Authors: Essam Al Daoud
Abstract:
This study solves a phylogeny problem using a modified wild dog pack optimization algorithm. The least-squares error is used as the cost function to be minimized. Therefore, in each iteration, new distance matrices based on the constructed trees are calculated and used to select the alpha dog. To test the suggested algorithm, ten homologous genes were selected and collected from the National Center for Biotechnology Information (NCBI) databanks (i.e., 16S, 18S, 28S, Cox 1, ITS1, ITS2, ETS, ATPB, Hsp90, and STN). The data are divided into three categories: 50 taxa, 100 taxa, and 500 taxa. The empirical results show that the proposed algorithm is more reliable and accurate than other implemented methods.
Keywords: least squares, neighbor joining, phylogenetic tree, wild dog pack
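A minimal sketch of the least-squares cost described above, assuming simple symmetric distance matrices; the wild dog pack search itself and the conversion from a candidate tree to its implied distances are omitted, and the matrices below are toy values for illustration.

```python
import numpy as np

def least_squares_cost(observed, tree_implied):
    """Sum of squared differences between the observed pairwise distance
    matrix and the distances implied by a candidate tree. Only the upper
    triangle is counted so each taxon pair appears once."""
    diff = observed - tree_implied
    iu = np.triu_indices_from(diff, k=1)
    return float(np.sum(diff[iu] ** 2))

# Toy example with 4 taxa: a candidate tree's patristic distances are
# compared against the observed (e.g., sequence-derived) matrix.
observed = np.array([[0, 3, 7, 8],
                     [3, 0, 6, 7],
                     [7, 6, 0, 3],
                     [8, 7, 3, 0]], dtype=float)
candidate = np.array([[0, 3, 7, 7],
                      [3, 0, 6, 6],
                      [7, 6, 0, 3],
                      [7, 6, 3, 0]], dtype=float)
print(least_squares_cost(observed, candidate))  # 2.0 -> lower is better
```

In each iteration, the tree whose implied distances give the lowest cost would be selected as the alpha dog.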
Procedia PDF Downloads 320
12084 Performance of the SrSnO₃/SnO₂ Nanocomposite Catalyst on the Photocatalytic Degradation of Dyes
Authors: H. Boucheloukh, N. Aoun, M. Denni, A. Mahrouk, T. Sehili
Abstract:
Perovskite materials with the strontium alkaline earth metal have attracted researchers in photocatalysis. Thus, a strontium-based nanocomposite has been synthesized by the sol-gel method, calcined at 700 °C, and characterized by different methods such as X-ray diffraction (XRD), Fourier-transform infrared (FTIR) spectroscopy, and diffuse reflectance spectroscopy (DRS). After that, the photocatalytic performance of SrSnO₃/SnO₂ has been tested under sunlight in aqueous solution for two dyes, methylene blue and Congo red. The results reveal that 70% of methylene blue had already been degraded after 45 minutes of exposure to sunlight, while 80% of Congo red was eliminated by adsorption on SrSnO₃/SnO₂ within 120 minutes of contact.
Keywords: Congo red, methylene blue, photocatalysis, perovskite
Procedia PDF Downloads 55
12083 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for saving necessary quality control can be exploited through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes. As a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the cost of eliminating errors increases significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predicting the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and classification for the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.
Keywords: classification, CRISP-DM, machine learning, predictive quality, regression
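A minimal sketch of the regression-versus-classification comparison on synthetic stand-in data (not the Bosch Rexroth data); the random-forest models, the linear leakage model, and the pass/fail threshold below are assumptions for demonstration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                       # stand-in process features
leakage = X @ [0.8, -0.5, 0.3, 0.0, 0.1] + rng.normal(0, 0.3, 1000)
passed = (leakage < 0.5).astype(int)                 # binary inspection decision

X_tr, X_te, y_tr, y_te, c_tr, c_te = train_test_split(
    X, leakage, passed, test_size=0.2, random_state=0)

# Option A: regress the leakage volume flow directly
reg = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)
# Option B: classify the inspection decision
clf = RandomForestClassifier(random_state=0).fit(X_tr, c_tr)

print("regression MAE:         ", mean_absolute_error(y_te, reg.predict(X_te)))
print("classification accuracy:", accuracy_score(c_te, clf.predict(X_te)))
```

Comparing the two scores against the business requirement (a go/no-go inspection decision versus a physical leakage value) is exactly the trade-off the business understanding phase has to settle.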
Procedia PDF Downloads 144
12082 A Study on Computational Fluid Dynamics (CFD)-Based Design Optimization Techniques Using Multi-Objective Evolutionary Algorithms (MOEA)
Authors: Ahmed E. Hodaib, Mohamed A. Hashem
Abstract:
In engineering applications, a design has to be as close to perfect as possible for a defined case. The designer has to overcome many challenges in order to reach the optimal solution to a specific problem. This process is called optimization. Generally, there is always a function called the “objective function” that is required to be maximized or minimized by choosing input parameters called “degrees of freedom” within an allowed domain called the “search space” and computing the values of the objective function for these input values. The problem becomes more complex when there is more than one objective for the design. As an example of a Multi-Objective Optimization Problem (MOP): a structural design that aims to minimize weight and maximize strength. In such a case, the Pareto Optimal Frontier (POF) is used, which is a curve plotting the two objective functions for the best cases. At this point, the designer should make a decision to choose a point on the curve. Engineers use algorithms or iterative methods for optimization. In this paper, we discuss Evolutionary Algorithms (EA), which are widely used with multi-objective optimization problems due to their robustness, simplicity, and suitability to be coupled and parallelized. Evolutionary algorithms are developed to guarantee convergence to an optimal solution. An EA uses mechanisms inspired by Darwinian evolution principles. Technically, they belong to the family of trial-and-error problem solvers and can be considered global optimization methods with a stochastic optimization character. The optimization is initialized by picking random solutions from the search space, and then the solution progresses towards the optimal point by using operators such as selection, combination, crossover, and/or mutation. These operators are applied to the old solutions (“parents”) so that new sets of design variables called “children” appear. The process is repeated until the optimal solution to the problem is reached. Reliable and robust computational fluid dynamics solvers are nowadays commonly utilized in the design and analysis of various engineering systems, such as aircraft, turbomachinery, and automobiles. The coupling of Computational Fluid Dynamics (CFD) and Multi-Objective Evolutionary Algorithms (MOEA) has become substantial in aerospace engineering applications, such as aerodynamic shape optimization and advanced turbomachinery design.
Keywords: mathematical optimization, multi-objective evolutionary algorithms "MOEA", computational fluid dynamics "CFD", aerodynamic shape optimization
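A minimal single-objective sketch of the evolutionary loop described above (selection, crossover, mutation); a true MOEA such as NSGA-II would add Pareto ranking and crowding, which are omitted here for brevity, and all parameter values are illustrative.

```python
import random

def evolve(objective, bounds, pop_size=30, generations=100, mutation_rate=0.1):
    """Minimal real-coded evolutionary loop: tournament selection,
    arithmetic crossover, and Gaussian mutation."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            # tournament selection of two parents
            p1 = min(random.sample(pop, 3), key=objective)
            p2 = min(random.sample(pop, 3), key=objective)
            a = random.random()                      # arithmetic crossover
            child = [a * x + (1 - a) * y for x, y in zip(p1, p2)]
            if random.random() < mutation_rate:      # Gaussian mutation
                i = random.randrange(len(child))
                child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.5)))
            children.append(child)
        pop = children                               # children replace parents
    return min(pop, key=objective)

# Sphere function as a stand-in objective; the optimum is at (0, 0).
best = evolve(lambda v: v[0] ** 2 + v[1] ** 2, bounds=(-5, 5))
print(best)
```

In a CFD-coupled setup, the objective call would be replaced by a flow-solver evaluation of each candidate shape, which is why parallelizability is so valuable.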
Procedia PDF Downloads 256
12081 Mathematical and Fuzzy Logic in the Interpretation of the Quran
Authors: Morteza Khorrami
Abstract:
Logic, as an intellectual infrastructure, plays an essential role in the Islamic sciences. Hence, there are a few verses of the Holy Quran whose interpretation is not possible without proper logic. In many verses, the Quran presents arguments and questions its audience, which shows that rules of logic are present in the Quran. This paper, which uses a descriptive and analytic method, tries to show the role of logic in understanding the Quran's reasoning methods, displays some Quranic statements with mathematical symbols, and points out that these symbols can be helpful for interpretation and for answering some questions and doubts. In this paper, it is noted that the Quran did not use two-valued (Aristotelian) logic in all cases; fuzzy logic can also be found in the Quran.
Keywords: Aristotelian logic, fuzzy logic, interpretation, Holy Quran
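As a purely illustrative aside on the distinction the author draws, a tiny sketch contrasting two-valued and fuzzy truth assignment on a generic predicate; the example is generic and is not drawn from the Quranic material itself.

```python
def crisp_tall(height_cm):
    # Aristotelian two-valued logic: a sharp true/false cutoff
    return height_cm >= 180

def fuzzy_tall(height_cm):
    # Fuzzy logic: graded membership in [0, 1]
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

for h in (155, 175, 185, 195):
    print(h, crisp_tall(h), round(fuzzy_tall(h), 2))
```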
Procedia PDF Downloads 676
12080 An Analysis of Business Intelligence Requirements in South African Corporates
Authors: Adheesh Budree, Olaf Jacob, Louis CH Fourie, James Njenga, Gabriel D Hoffman
Abstract:
Business Intelligence (BI) is implemented by organisations for many reasons, chief among these being improved data support, decision support, and savings. The main purpose of this study is to determine BI requirements and availability within South African organisations. The study addresses the following areas, identified as part of a literature review: assessing BI practices in businesses over a range of industries, sectors, and managerial functions, and determining the functionality of BI (technologies, architecture, and methods). It was found that overall satisfaction with BI in larger organisations is low due to a lack of ability to meet user requirements.
Keywords: business intelligence, business value, data management, South Africa
Procedia PDF Downloads 577
12079 The Grammar of the Content Plane as a Style Marker in Forensic Authorship Attribution
Authors: Dayane de Almeida
Abstract:
This work aims at presenting a study that demonstrates the usability of categories of analysis from Discourse Semiotics – also known as Greimassian Semiotics – in authorship cases in forensic contexts. It is necessary to know whether the categories examined in semiotic analysis (the ‘grammar’ of the content plane) can distinguish authors. Thus, a study with 4 sets of texts from a corpus of ‘not on demand’ written samples (texts that differ in formality degree, purpose, addressees, themes, etc.) was performed. Each author contributed 20 texts, separated into 2 groups of 10 (Author1A, Author1B, and so on). The hypothesis was that texts from a single author are semiotically more similar to each other than texts from different authors. The assumptions and issues that led to this idea are as follows: (1) The features analyzed in authorship studies mostly relate to the expression plane: they are manifested on the ‘surface’ of texts. If language is both expression and content, content would also have to be considered for more accurate results. Style is present in both planes. (2) Semiotics postulates that the content plane is structured in a ‘grammar’ that underlies expression and presents different levels of abstraction. This ‘grammar’ would be a style marker. (3) Sociolinguistics demonstrates intra-speaker variation: an individual employs different linguistic uses in different situations. How, then, to determine whether someone is the author of several texts, distinct in nature (as is the case in most forensic sets), when intra-speaker variation is known to depend on so many factors? (4) The idea is that the more abstract the level in the content plane, the lower the intra-speaker variation, because there will be a greater chance for the author to choose the same thing. If two authors recurrently chose the same options, differently from one another, it means each one’s option has discriminatory power. (5) Size is another issue for various attribution methods. Since most texts in real forensic settings are short, methods relying only on the expression plane tend to fail. The analysis of the content plane as proposed by Greimassian semiotics would be less size-dependent. The semiotic analysis was performed using the software Corpus Tool, generating tags to allow the counting of data. Then, similarities and differences were quantitatively measured through the application of the Jaccard coefficient (a statistical measure that compares the similarities and differences between samples). The results showed the hypothesis was confirmed and, hence, that the grammatical categories of the content plane may successfully be used in questioned authorship scenarios.
Keywords: authorship attribution, content plane, forensic linguistics, Greimassian semiotics, intra-speaker variation, style
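A small sketch of the Jaccard coefficient used in the study to quantify similarity between samples; the tag sets below are hypothetical stand-ins for category counts exported from Corpus Tool.

```python
def jaccard(a, b):
    """Jaccard coefficient: |A ∩ B| / |A ∪ B| for two tag sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical semiotic category tags observed in three text samples.
author1_text_a = {"thematic_abstraction", "figurative_actor", "euphoria"}
author1_text_b = {"thematic_abstraction", "figurative_actor", "dysphoria"}
author2_text_a = {"figurative_space", "dysphoria", "concrete_actor"}

print(jaccard(author1_text_a, author1_text_b))  # 0.5  (same author)
print(jaccard(author1_text_a, author2_text_a))  # 0.0  (different authors)
```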
Procedia PDF Downloads 242
12078 Nondestructive Prediction and Classification of Gel Strength in Ethanol-Treated Kudzu Starch Gels Using Near-Infrared Spectroscopy
Authors: John-Nelson Ekumah, Selorm Yao-Say Solomon Adade, Mingming Zhong, Yufan Sun, Qiufang Liang, Muhammad Safiullah Virk, Xorlali Nunekpeku, Nana Adwoa Nkuma Johnson, Bridget Ama Kwadzokpui, Xiaofeng Ren
Abstract:
Enhancing starch gel strength and stability is crucial. However, traditional gel property assessment methods are destructive, time-consuming, and resource-intensive. Thus, understanding the effects of ethanol treatment on kudzu starch gel strength and developing a rapid, nondestructive gel strength assessment method are essential for optimizing the treatment process and ensuring product quality consistency. This study investigated the effects of different ethanol concentrations on the microstructure of kudzu starch gels using a comprehensive microstructural analysis. We also developed a nondestructive method for predicting gel strength and classifying treatment levels using near-infrared (NIR) spectroscopy and advanced data analytics. Scanning electron microscopy revealed progressive network densification and pore collapse with increasing ethanol concentration, correlating with enhanced mechanical properties. NIR spectroscopy, combined with various variable selection methods (CARS, GA, and UVE) and modeling algorithms (PLS, SVM, and ELM), was employed to develop predictive models for gel strength. The UVE-SVM model demonstrated exceptional performance, with the highest R² values (Rc = 0.9786, Rp = 0.9688) and the lowest error rates (RMSEC = 6.1340, RMSEP = 6.0283). Pattern recognition algorithms (PCA, LDA, and KNN) successfully classified gels based on ethanol treatment levels, achieving near-perfect accuracy. This integrated approach provided a multiscale perspective on ethanol-induced starch gel modification, from molecular interactions to macroscopic properties. Our findings demonstrate the potential of NIR spectroscopy, coupled with advanced data analysis, as a powerful tool for rapid, nondestructive quality assessment in starch gel production. This study contributes significantly to the understanding of starch modification processes and opens new avenues for research and industrial applications in food science, pharmaceuticals, and biomaterials.
Keywords: kudzu starch gel, near-infrared spectroscopy, gel strength prediction, support vector machine, pattern recognition algorithms, ethanol treatment
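A minimal scikit-learn sketch of the SVM-regression step, evaluated with the same R² and RMSE metrics reported above; the spectra are synthetic stand-ins (the UVE variable selection and the real NIR data are omitted), and the model settings are assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(1)
spectra = rng.normal(size=(120, 200))              # stand-in NIR absorbances
strength = (spectra[:, 40] * 30 + spectra[:, 90] * 15 + 60
            + rng.normal(0, 3, 120))               # synthetic gel strength

X_cal, X_pred, y_cal, y_true = train_test_split(
    spectra, strength, test_size=0.3, random_state=1)

model = SVR(kernel="rbf", C=100.0).fit(X_cal, y_cal)
y_hat = model.predict(X_pred)

print("Rp^2 :", r2_score(y_true, y_hat))
print("RMSEP:", np.sqrt(mean_squared_error(y_true, y_hat)))
```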
Procedia PDF Downloads 37
12077 Holistic Approach to Teaching Mathematics in Secondary School as a Means of Improving Students’ Comprehension of Study Material
Authors: Natalia Podkhodova, Olga Sheremeteva, Mariia Soldaeva
Abstract:
Creating favorable conditions for students’ comprehension of mathematical content is one of the primary problems in teaching mathematics in secondary school. Psychology research has demonstrated that positive comprehension becomes possible when new information becomes part of a student’s subjective experience and when linkages between the attributes of notions and the various ways of presenting them can be established. The fact of comprehension includes the ability to build a working situational model and thus becomes an important means of solving mathematical problems. The article describes the implementation of a holistic approach to teaching mathematics designed to address the primary challenges of such teaching, specifically, the challenge of students’ comprehension. This approach consists of (1) establishing links between the attributes of a notion: the sense, the meaning, and the term; (2) taking into account the components of a student’s subjective experience (emotional and value-related, contextual, procedural, communicative) during the educational process; (3) establishing links between different ways to present mathematical information; (4) identifying and leveraging the relationships between real, perceptual, and conceptual (scientific) mathematical spaces by applying real-life situational modeling. The article describes approaches to the practical use of these foundational concepts. Identifying how the proposed methods and technology influence understanding of material used in teaching mathematics was the research’s primary goal. The research included an experiment in which 256 secondary school students took part: 142 in the experimental group and 114 in the control group. All students in these groups had similar levels of achievement in math and studied math under the same curriculum. In the course of the experiment, comprehension of two topics, 'Derivative' and 'Trigonometric functions', was evaluated. Control group participants were taught using traditional methods. Students in the experimental group were taught using the holistic method: under the teacher’s guidance, they carried out problems designed to establish linkages between a notion’s characteristics and to convert information from one mode of presentation to another, as well as problems that required the ability to operate with all modes of presentation. The use of the technology that forms inter-subject notions based on linkages between perceptual, real, and conceptual mathematical spaces proved to be of special interest to the students. Results of the experiment were analyzed by presenting students in each of the groups with a final test on each of the studied topics. The test included problems that required building real situational models. Statistical analysis was used to aggregate test results. The Pearson criterion was used to reveal the statistical significance of results (pass/fail on the modeling test). A significant difference in results was revealed (p < 0.001), which allowed the authors to conclude that students in the study group showed better comprehension of mathematical information than those in the control group. Also, it was revealed (using Student’s t-test) that students in the experimental group solved significantly more problems (p = 0.0001) than those in the control group.
The results obtained allow us to conclude that the increase in comprehension and assimilation of study material was the result of applying the implemented methods and techniques.
Keywords: comprehension of mathematical content, holistic approach to teaching mathematics in secondary school, subjective experience, technology of the formation of inter-subject notions
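A sketch of the two significance tests reported above using SciPy; the pass/fail counts and per-student score distributions are hypothetical, chosen only to be consistent with the reported group sizes (142 vs. 114).

```python
import numpy as np
from scipy import stats

# Pass/fail counts on the final modeling test (hypothetical numbers).
#                pass  fail
experimental = [112, 30]
control = [62, 52]

chi2, p, dof, _ = stats.chi2_contingency([experimental, control])
print(f"chi-squared = {chi2:.2f}, p = {p:.5f}")

# Problems solved per student (hypothetical samples).
rng = np.random.default_rng(2)
exp_scores = rng.normal(7.5, 1.5, 142)
ctl_scores = rng.normal(6.5, 1.5, 114)
t, p_t = stats.ttest_ind(exp_scores, ctl_scores)
print(f"t = {t:.2f}, p = {p_t:.5f}")
```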
Procedia PDF Downloads 176
12076 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation
Authors: Jonathan Gong
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has expressed the possibility that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from the ones used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. The model is trained on 8577 images and validated on a validation split of 20%. The models’ accuracy, precision, recall, f1-score, IOU, and loss are calculated using the external dataset for validation. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
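A minimal Keras sketch of a DenseNet201 transfer-learning classifier for the three classes; the autoencoder stage, the preprocessing, and the paper's exact head architecture are omitted, and the hyperparameters below are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Frozen DenseNet201 backbone pretrained on ImageNet supplies the
# transferred features; a small dense head predicts the three classes.
backbone = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg")
backbone.trainable = False

model = models.Sequential([
    backbone,
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(3, activation="softmax"),   # COVID-19 / normal / pneumonia
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Freezing the backbone keeps the pretrained features intact while only the new head is trained, which is the essence of the transfer-learning step described above.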
Procedia PDF Downloads 130
12075 Assessment of Occupational Exposure and Individual Radio-Sensitivity in People Subjected to Ionizing Radiation
Authors: Oksana G. Cherednichenko, Anastasia L. Pilyugina, Sergey N.Lukashenko, Elena G. Gubitskaya
Abstract:
The estimation of accumulated radiation doses in people occupationally exposed to ionizing radiation was performed using methods of biological (chromosomal aberration frequency in lymphocytes) and physical (radionuclide analysis in urine, whole-body counter, individual thermoluminescent dosimeters) dosimetry. A group of 84 category "A" employees was investigated after their work in the territory of the former Semipalatinsk test site (Kazakhstan). The dose rate in some funnels exceeds 40 μSv/h. After radionuclide determination in urine using radiochemical and whole-body counting (WBC) methods, it was shown that the total effective dose of the personnel's internal exposure did not exceed 0.2 mSv/year, while the acceptable dose limit for staff is 20 mSv/year. The range of external radiation doses measured with individual thermoluminescent dosimeters was 0.3-1.406 µSv. The cytogenetic examination showed that the chromosomal aberration frequency in staff was 4.27±0.22%, which is significantly higher than in people from the unpolluted settlement of Tausugur (0.87±0.1%) (p ≤ 0.01) and in citizens of Almaty (1.6±0.12%) (p ≤ 0.01). Chromosomal-type aberrations accounted for 2.32±0.16%, of which 0.27±0.06% were dicentrics and centric rings. Cytogenetic analysis of group radiosensitivity among the «professionals» by different characteristics (age, sex, ethnic group, epidemiological data) revealed no significant differences between the compared values. Using various techniques based on the frequency of dicentrics and centric rings, the average cumulative radiation dose for the group was calculated at 0.084-0.143 Gy. To perform comparative individual dosimetry using physical and biological methods of dose assessment, calibration curves (including our own) and regression equations based on the general frequency of chromosomal aberrations, obtained after irradiation of blood samples by gamma radiation at a dose rate of 0.1 Gy/min, were used. Herewith, assuming individual variation of chromosomal aberration frequency (1-10%), the accumulated radiation dose varied from 0 to 0.3 Gy. The main problem in the interpretation of individual dosimetry results comes down to the different reactions of individuals to irradiation - radiosensitivity - which dictates the need for a quantitative definition of this individual reaction and its consideration in the calculation of the received radiation dose. The entire examined contingent was assigned to groups based on the received dose and the detected cytogenetic aberrations. Radiosensitive individuals, at the lowest received dose in a year, showed the highest frequency of chromosomal aberrations (5.72%). In contrast, radioresistant individuals showed the lowest frequency of chromosomal aberrations (2.8%). The cohort in our research was distributed according to the criterion of radiosensitivity as follows: radiosensitive (26.2%), medium radiosensitivity (57.1%), radioresistant (16.7%). Herewith, the dispersion for radioresistant individuals is 2.3; for the group with medium radiosensitivity, 3.3; and for the radiosensitive group, 9. These data indicate the highest variation of the characteristic (reaction to radiation) in the group of radiosensitive individuals. People with medium radiosensitivity show a significant long-term correlation (0.66; n=48, β ≥ 0.999) between dose values defined from the results of cytogenetic analysis and the external radiation dose obtained with thermoluminescent dosimeters.
Mathematical models based on the type of radiation dose deviation according to the professionals' radiosensitivity level were offered.
Keywords: biodosimetry, chromosomal aberrations, ionizing radiation, radiosensitivity
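A sketch of how biodosimetry recovers a dose from an aberration yield via a linear-quadratic calibration curve; the coefficients below are illustrative placeholders, not the study's own calibration values.

```python
import numpy as np

def dose_from_dicentrics(yield_per_cell, c=0.001, alpha=0.03, beta=0.06):
    """Invert the linear-quadratic calibration Y = c + alpha*D + beta*D^2
    to estimate absorbed dose D (Gy) from the dicentric + centric ring
    yield per cell. Coefficients are placeholders; real values come from
    the laboratory's own gamma-irradiation calibration curve."""
    # beta*D^2 + alpha*D + (c - Y) = 0  ->  positive root of the quadratic
    disc = alpha ** 2 - 4 * beta * (c - yield_per_cell)
    return (-alpha + np.sqrt(disc)) / (2 * beta)

# 0.27% dicentrics + centric rings per cell, as reported for the group
print(round(dose_from_dicentrics(0.0027), 3), "Gy")
```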
Procedia PDF Downloads 184
12073 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequence reads from the fragmented DNA sequences, stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which create a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrated that this original approach reaches high performance, comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, the DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine
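A minimal sketch of step (i), building a k-mer vocabulary from raw reads; the embedding training and the downstream multiple-instance classifier are omitted, and the reads here are toy strings rather than real fastq data.

```python
from collections import Counter

def kmer_tokens(read, k=6):
    """Split a DNA read into overlapping k-mers, the 'words' whose
    numerical embeddings are learned in step (i)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = ["ATGCGTACGTTAGC", "GGTACGTTAGCCAT"]   # toy reads, not real fastq
vocab = Counter()
for read in reads:
    vocab.update(kmer_tokens(read))

# k-mers shared by both toy reads appear with count 2
print(vocab.most_common(5))
```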
Procedia PDF Downloads 125
12072 A Virtual Set-Up to Evaluate Augmented Reality Effect on Simulated Driving
Authors: Alicia Yanadira Nava Fuentes, Ilse Cervantes Camacho, Amadeo José Argüelles Cruz, Ana María Balboa Verduzco
Abstract:
Augmented reality promises to be present in future driving: its immersive technology makes it possible to show directions and maps, indicating important places with graphic elements when the car driver requires the information. On the other hand, driving is considered a multitasking activity and, for some people, a complex activity in which different situations commonly occur that require the immediate attention of the car driver to make decisions that contribute to avoiding accidents; therefore, the main aim of the project is the instrumentation of a platform with biometric sensors that allows evaluating driving performance under the influence of augmented reality devices to detect the level of attention in drivers, since it is important to know the effect that it produces. In this study, the physiological sensors EPOC X (EEG), ECG06 PRO, and EMG Myoware are joined in the driving test platform with a Logitech G29 steering wheel and the simulation software City Car Driving, in which the level of traffic can be controlled, as well as the number of pedestrians within the simulation, obtaining driver interaction in real mode; data acquisition for storage is achieved through an MSP430 microcontroller. The sensors produce a continuous analog signal in time that needs conditioning: a signal amplifier is incorporated because the acquired signals have a sensitive range of 1.25 mm/mV, and filtering eliminates unwanted frequency bands so that the signal is interpretable and noise-free before being converted from an analog signal into a digital one used to analyze the physiological signals of the drivers; these values are stored in a database. Based on this compilation, we work on the extraction of signal features and implement K-NN (k-nearest neighbor) classification methods and decision trees that enable the study of the data for the identification of patterns and determine, by classification methods, the different effects of augmented reality on drivers. The expected results of this project include a test platform instrumented with biometric sensors for data acquisition during driving and a database with the required variables to determine the effect caused by augmented reality on people in simulated driving.
Keywords: augmented reality, driving, physiological signals, test platform
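A small sketch of the K-NN classification stage on stand-in feature vectors; real inputs would be features extracted from the conditioned EEG/ECG/EMG windows, and the class shift simulated below is an assumption made only so the toy example has a signal to find.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# Stand-in feature vectors (e.g., mean, variance, band power) for
# driving windows recorded with and without the AR overlay.
features_no_ar = rng.normal(0.0, 1.0, size=(60, 4))
features_ar = rng.normal(0.6, 1.0, size=(60, 4))   # assumed shift under AR

X = np.vstack([features_no_ar, features_ar])
y = np.array([0] * 60 + [1] * 60)   # 0 = plain driving, 1 = AR overlay

knn = KNeighborsClassifier(n_neighbors=5)
print("CV accuracy:", cross_val_score(knn, X, y, cv=5).mean())
```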
Procedia PDF Downloads 142
12071 Reasons and Complexities around Using Alcohol and Other Drugs among Aboriginal People Experiencing Homelessness
Authors: Mandy Wilson, Emma Vieira, Jocelyn Jones, Alice V. Brown, Lindey Andrews, Louise Southalan, Jackie Oakley, Dorothy Bagshaw, Patrick Egan, Laura Dent, Duc Dau, Lucy Spanswick
Abstract:
Alcohol and drug dependency are pertinent issues for those experiencing homelessness. This includes Aboriginal and Torres Strait Islander people, Australia’s traditional owners, living in Perth, Western Australia (WA). Societal narratives around the drivers behind drug and alcohol dependency in Aboriginal communities, particularly those experiencing homelessness, have been biased and unchanging, with little regard for complexity. This can include the idea that Aboriginal people have ‘chosen’ to use alcohol or other drugs without consideration for intergenerational trauma and the trauma of homelessness that may influence their choices. These narratives have flow-on impacts on policies and services that directly impact Aboriginal people experiencing homelessness. In 2021, we commenced a project which aimed to listen to and elevate the voices of 70-90 Aboriginal people experiencing homelessness in Perth. The project is community-driven, led by an Aboriginal Community Controlled Organisation in partnership with a university research institute. A community-ownership group of Aboriginal Elders endorsed the project’s methods, chosen to ensure their suitability for the Aboriginal community. In this paper, we detail these methods, including semi-structured interviews influenced by an Aboriginal yarning approach – an important style of conversation for Aboriginal people which follows cultural protocols; and photovoice – supporting people to share their stories through photography. Through these engagements, we detail the reasons Aboriginal people in Perth shared for using alcohol or other drugs while experiencing homelessness. These included supporting their survival on the streets, managing their mental health, and coping while on the journey to finding support. We also detail why they sought to discontinue alcohol and other drug use, including wanting to reconnect with family and changing priorities. Finally, we share how Aboriginal people experiencing homelessness have said they are impacted by their family’s alcohol and other drug use, including feeling uncomfortable living with a family who is drug and alcohol-dependent and having to care for grandchildren despite their own homelessness. These findings provide a richer understanding of alcohol and drug use for Aboriginal people experiencing homelessness in Perth, shedding light on potential changes to targeted policy and service approaches.
Keywords: Aboriginal and Torres Strait Islander peoples, alcohol and other drugs, homelessness, community-led research
Procedia PDF Downloads 131
12070 Fractional Calculus into Structural Dynamics
Authors: Jorge Lopez
Abstract:
In this work, we introduce fractional calculus in order to study the dynamics of a damped multistory building with some symmetry. Initially, we review the dynamics of a free and damped multistory building. Then we introduce those concepts of fractional calculus that will be involved in our study. It has been noticed that fractional calculus provides models with fewer parameters than those based on classical calculus. In particular, a damped classical oscillator is more naturally described by using fractional derivatives. Accordingly, we model our multistory building as a set of coupled fractional oscillators and compare its dynamics with the results coming from traditional methods.
Keywords: coupled oscillators, fractional calculus, fractional oscillator, structural dynamics
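A numerical sketch of a single fractionally damped oscillator using the Grünwald-Letnikov discretisation, consistent with the remark that a damped oscillator is naturally described by a fractional derivative; the equation form, parameter values, and time-stepping scheme are illustrative choices, not the paper's model.

```python
import numpy as np

def gl_weights(alpha, n):
    # Grünwald-Letnikov weights via the standard recurrence:
    # w_0 = 1,  w_j = w_{j-1} * (j - 1 - alpha) / j
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - alpha) / j
    return w

def fractional_oscillator(alpha=0.5, c=0.4, k=4.0, dt=0.01, steps=2000):
    """x'' + c * D^alpha x + k * x = 0, with the fractional damping term
    discretised by the GL formula and explicit central differences."""
    w = gl_weights(alpha, steps)
    x = np.zeros(steps)
    x[0] = x[1] = 1.0                       # released from rest at x = 1
    for n in range(1, steps - 1):
        # GL approximation of D^alpha x at t_n uses the whole history
        frac = dt ** (-alpha) * np.dot(w[: n + 1], x[n::-1])
        acc = -c * frac - k * x[n]
        x[n + 1] = 2 * x[n] - x[n - 1] + dt ** 2 * acc
    return x

x = fractional_oscillator()
print("displacement after 20 s:", x[-1])
```

Note how the fractional term depends on the entire displacement history, which is the memory effect that lets one damping parameter play the role of several classical ones.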
Procedia PDF Downloads 243
12069 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Since optimization issues of water resources are complicated due to the variety of decision-making criteria and objective functions, it is sometimes impossible to resolve them through regular optimization methods, or doing so is time- or money-consuming. Therefore, the use of modern tools and methods is inevitable in resolving such problems. An accurate and essential utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements. The dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural, and water-reservoir-related data, and the geometric characteristics of the reservoir. The system of Dez dam water resources was simulated applying this basic information in order to determine the capability of its reservoir to meet the objectives of the performed plan. As a metaheuristic method, a genetic algorithm was applied in order to provide utilization rule curves (intersecting the reservoir volume). MATLAB software was used in order to resolve the aforesaid model. Rule curves were first obtained through the genetic algorithm. Then the significance of using rule curves and the decrease in decision-making variables in the system was determined through system simulation and by comparing the results with optimization results (Standard Operating Procedure). One of the most essential issues in the optimization of a complicated water resource system is the increasing number of variables. Therefore, a lot of time is required to find an optimum answer, and in some cases, no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a modern model in order to reduce the number of variables. Water reservoir programming studies have been performed based on basic information, general hypotheses, and standards, applying a monthly simulation technique for a statistical period of 30 years. Results indicated that the application of rule curves prevents extreme shortages and decreases monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
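A toy sketch of a genetic algorithm searching for monthly rule-curve fractions that minimise shortages over a one-year simulation (the study itself uses a 30-year monthly simulation); the inflows, demands, capacity, and GA settings are invented for illustration only.

```python
import random

DEMAND = [60, 55, 50, 45, 50, 70, 90, 100, 95, 80, 70, 65]   # Mm^3/month
INFLOW = [80, 90, 110, 120, 100, 60, 40, 30, 35, 50, 60, 70]
CAPACITY = 400.0

def shortage(rule):
    """Simulate one year: the rule gives each month's target release as
    a fraction of current storage; penalise unmet demand."""
    storage, deficit = 200.0, 0.0
    for month in range(12):
        release = min(rule[month] * storage, storage + INFLOW[month])
        storage = min(CAPACITY, storage + INFLOW[month] - release)
        deficit += max(0.0, DEMAND[month] - release)
    return deficit

def genetic_algorithm(pop_size=40, generations=200):
    pop = [[random.uniform(0.1, 0.6) for _ in range(12)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=shortage)                      # elitist survival
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, 12)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            i = random.randrange(12)                # Gaussian mutation
            child[i] = min(0.6, max(0.1, child[i] + random.gauss(0, 0.05)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=shortage)

best = genetic_algorithm()
print("annual shortage:", round(shortage(best), 1), "Mm^3")
```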
Procedia PDF Downloads 265
12068 Critical Investigation on Performance of Polymeric Materials in Rehabilitation of Metallic Components
Authors: Parastou Kharazmi
Abstract:
Failure and leakage of metallic components because of corrosion in infrastructure is a considerably problematic and expensive issue, and the traditional solution of replacing the component is costly and time-consuming. Rehabilitation techniques using advanced polymeric materials are an alternative solution to this problem. This paper provides a summary of analyses of relined, rehabilitated metallic samples after exposure under practical, real conditions, to study the composite material's performance when exposed to water, heat, and chemicals. The study was carried out using different test methods such as microscopy and thermal, chemical, and mechanical analyses.
Keywords: composite, material, rehabilitation, structure
Procedia PDF Downloads 236
12067 Review of Cable Fault Locating Methods and Usage of VLF for Real Cases of High Resistance Fault Locating
Authors: Saadat Ali, Rashid Abdulla Ahmed Alshehhi
Abstract:
Cable faults are always probable and common during or after commissioning, causing significant delays and disrupting the power distribution or transmission network, which is intolerable for utilities and service providers given their reliability and business continuity measures. Therefore, the adoption of a rapid localization and rectification methodology is their main concern. This paper explores the present techniques available for high-voltage cable fault localization and rectification, and considers which is preferable with regard to being easier, faster, and also less harmful to cables. It also provides practical experience of high-resistance fault locating through utilization of the Very Low Frequency (VLF) method.
Keywords: faults, VLF, real cases, cables
Procedia PDF Downloads 112
12066 Elevating Healthcare Social Work: Implementing and Evaluating the (Introduction, Subjective, Objective, Assessment, Plan, Summary) Documentation Model
Authors: Shir Daphna-Tekoah, Nurit Eitan-Gutman, Uri Balla
Abstract:
Background: Systematic documentation is essential in social work practice. Collaboration between an institution of higher education and social work health care services enabled adaptation of the SOAP medical documentation model to the field of social work, creating the ISOAPS (Introduction, Subjective, Objective, Assessment, Plan, Summary) model. Aims: The article describes the ISOAPS model and its implementation in the field of social work as a tool for the standardization of documentation and the enhancement of multidisciplinary collaboration. Methods: We examined the changes in standardization using a mixed-methods study, both before and after implementation of the model. A review of social workers' documentation was carried out by medical staff and social workers in the Clalit Healthcare Services, the largest provider of public and semi-private health services in Israel. After implementation of the model, semi-structured qualitative interviews were undertaken. Main findings: The percentage of reviewers who evaluated their documentation as correct increased from 46%, prior to implementation, to 61% after implementation. After implementation, 81% of the social workers noted that their documentation had become standardized. The training process prepared them for the change in documentation, and most of them (83%) started using the model on a regular basis. The qualitative data indicate that the use of the ISOAPS model creates uniform documentation, improves standards, and is important for teaching social work students. Conclusions: The ISOAPS model standardizes documentation and promotes communication between social workers and medical staff. Implications for practice: In the intricate realm of healthcare, efficient documentation systems are pivotal to ensuring coherent interdisciplinary communication and patient care. The ISOAPS model emerges as a quintessential instrument, meticulously tailored to the nuances of social work documentation. While it extends its utility across the broad spectrum of social work, its specificity is most pronounced in the medical domain. This model not only exemplifies rigorous academic and professional standards but also serves as a testament to the potential of contextualized documentation systems in elevating the overall stature of social work within healthcare. Such a strategic documentation tool can not only streamline the intricate processes inherent in medical social work but also underscore the indispensable role that social workers play in the broader healthcare ecosystem.
Keywords: ISOAPS, professional documentation, medical social work, social work
Procedia PDF Downloads 70
12065 Automatic Intelligent Analysis of Malware Behaviour
Authors: Hermann Dornhackl, Konstantin Kadletz, Robert Luh, Paul Tavolato
Abstract:
In this paper, we describe the use of formal methods to model malware behaviour. The modelling of harmful behaviour rests upon syntactic structures that represent malicious procedures inside malware. The malicious activities are modelled by a formal grammar, where API call components are the terminals, and the sets of API calls used in combination to achieve a goal are designated as non-terminals. The combination of different non-terminals in various ways and tiers makes up the attack vectors that are used by harmful software. Based on these syntactic structures, a parser can be generated which takes execution traces as input for pattern recognition.
Keywords: malware behaviour, modelling, parsing, search, pattern matching
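As a toy stand-in for the grammar-based parser described above, a sketch that recognises one "non-terminal" (a suspicious API-call combination) in an execution trace; the call names and the regex pattern are illustrative, not an actual malware signature or the authors' grammar.

```python
import re

# Toy behaviour rule: a "self-replication" non-terminal is any
# FindFirstFile -> CreateFile -> WriteFile sequence in an API-call
# trace, with arbitrary calls in between.
SELF_REPLICATION = re.compile(
    r"FindFirstFile (?:\w+ )*?CreateFile (?:\w+ )*?WriteFile")

trace = "RegOpenKey FindFirstFile GetFileSize CreateFile WriteFile CloseHandle"
if SELF_REPLICATION.search(trace):
    print("matched: possible self-replication behaviour")
```

A real implementation would use a generated parser over a context-free grammar rather than regular expressions, since nesting and tiered non-terminals exceed what regular patterns can express.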
Procedia PDF Downloads 332
12064 Zero-Dissipative Explicit Runge-Kutta Method for Periodic Initial Value Problems
Authors: N. Senu, I. A. Kasim, F. Ismail, N. Bachok
Abstract:
In this paper, a zero-dissipative explicit Runge-Kutta method is derived for solving second-order ordinary differential equations with periodic solutions. The phase-lag and dissipation properties of Runge-Kutta (RK) methods are also discussed. The new method has algebraic order three with dissipation of order infinity. The numerical results for the new method are compared with those of an existing method when solving second-order differential equations with periodic solutions using a constant step size.
Keywords: dissipation, oscillatory solutions, phase-lag, Runge-Kutta methods
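A sketch of a classical third-order explicit Runge-Kutta scheme (not the paper's derived method) applied to a periodic test problem; the slow decay of the energy invariant illustrates the numerical dissipation that a zero-dissipative method is constructed to eliminate.

```python
import numpy as np

def rk3_step(f, t, y, h):
    # Kutta's classical third-order explicit Runge-Kutta step
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h, y + h * (2 * k2 - k1))
    return y + h / 6 * (k1 + 4 * k2 + k3)

# y'' = -y rewritten as the first-order system (y, y')' = (y', -y)
f = lambda t, y: np.array([y[1], -y[0]])

h, y = 0.1, np.array([1.0, 0.0])
steps = int(20 * np.pi / h)                 # roughly ten periods
for n in range(steps):
    y = rk3_step(f, n * h, y, h)

# The invariant y^2 + y'^2 equals 1 for the exact solution; a standard
# RK3 lets it decay slightly per step, which is exactly the dissipation
# a zero-dissipative method avoids.
print("energy after ~10 periods:", y[0] ** 2 + y[1] ** 2)
```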
Procedia PDF Downloads 411
12064 [Keynote Speech]: Facilitating Familial Support of Saudi Arabians Living with HIV/AIDS
Authors: Noor Attar
Abstract:
The paper provides an overview of the current situation of HIV/AIDS patients in the Kingdom of Saudi Arabia (KSA) and a literature review of the concepts of stigma communication and communication of social support. These concepts provide the basis for the proposed methods, which will include conducting a textual analysis of materials that are currently distributed to family members of persons living with HIV/AIDS (PLWHIV/A) in KSA and creating an educational brochure. The brochure will aim to help families of PLWHIV/A in KSA (1) understand how stigma shapes the experience of PLWHIV/A, (2) realize the role of positive communication as a form of helpful social support, and (3) develop the ability to provide positive social support for their loved ones.
Procedia PDF Downloads 312
12063 Professional Development in EFL Classroom: Motivation and Reflection
Authors: Iman Jabbar
Abstract:
Within the scope of professionalism, and in order to compete in the modern world, teachers are expected to develop their teaching skills and activities in addition to their professional knowledge. At the college level, the teacher should be able to face classroom challenges through engagement with the learning situation in order to understand the students and their needs. In our field of TESOL, the role of the English teacher is no longer restricted to teaching English texts; rather, the teacher should endeavor to enhance the students' skills, such as communication and critical analysis. Within the literature on professionalism, there are certain strategies and tools that an English teacher should adopt to develop his or her competence and performance. Reflective practice, which is an exploratory process, is one of these strategies. Another strategy contributing to classroom development is motivation. It is crucial to students' learning, as it affects the quality of learning English in the classroom and determines success or failure as well as language achievement. This is a qualitative study grounded in the interpretive perspectives of teachers and students regarding the process of professional development. The study aims at (a) understanding how teachers at the college level conceptualize reflective practice and motivation inside the EFL classroom, and (b) exploring the methods and strategies that they implement to practice reflection and motivation. The study is based on two questions: 1. How do EFL teachers perceive and view reflection and motivation in relation to their teaching and professional development? 2. How can reflective practice and motivation be developed into practical strategies and actions in EFL teachers' professional context? The study is organized into two parts, theoretical and practical. The theoretical part reviews the literature on the concepts of reflective practice and motivation in relation to professional development by providing definitions, theoretical models, and strategies. The practical part draws on the theoretical one; however, it is the core of the study, since it deals with two issues. It involves the research design, methodology, and methods of data collection, sampling, and data analysis. It ends with an overall discussion of findings and the researcher's reflections on the investigated topic. In terms of significance, the study is intended to contribute to the field of TESOL at the academic level through the selection of the topic and its investigation from theoretical and practical perspectives. Professional development is the path that leads to enhancing the quality of teaching English as a foreign or second language in a way that suits modern trends of globalization and advanced technology.
Keywords: professional development, motivation, reflection, learning
Procedia PDF Downloads 451
12062 Developing an Intonation Labeled Dataset for Hindi
Authors: Esha Banerjee, Atul Kumar Ojha, Girish Nath Jha
Abstract:
This study aims to develop an intonation-labeled database for Hindi. Although no single standard for prosody labeling exists in Hindi, researchers have employed perceptual and statistical methods in the literature to draw inferences about the behavior of prosody patterns in Hindi. Based on such existing research and largely agreed-upon intonational theories in Hindi, this study attempts to develop a manually annotated prosodic corpus of Hindi speech data, which can be used for training speech models for natural-sounding speech in the future. 100 sentences (500 words) each for declarative and interrogative types have been labeled using Praat.
Keywords: speech dataset, Hindi, intonation, labeled corpus
Procedia PDF Downloads 199
12061 Extension of the Simplified Theory of Plastic Zones for Analyzing Elastic Shakedown in a Multi-Dimensional Load Domain
Authors: Bastian Vollrath, Hartwig Hubel
Abstract:
In the case of over-elastic and cyclic loading, strain may accumulate due to a ratcheting mechanism until a state of shakedown is possibly achieved. Load-history-dependent numerical investigations by step-by-step analysis are rather costly in terms of engineering time and numerical effort. In the case of multi-parameter loading, where various independent loadings affect the final state of shakedown, the computational effort becomes an additional challenge. Therefore, direct methods like the Simplified Theory of Plastic Zones (STPZ) have been developed to solve the problem with a few linear elastic analyses. Post-shakedown quantities such as strain ranges and cyclically accumulated strains are calculated approximately by disregarding the load history. The STPZ is based on estimates of a transformed internal variable, which can be used to perform modified elastic analyses, where the elastic material parameters are modified and initial strains are applied as modified loading, resulting in residual stresses and strains. The STPZ has already turned out to work well with respect to cyclic loading between two states of loading: usually, a few linear elastic analyses are sufficient to obtain a good approximation of the post-shakedown quantities. In a multi-dimensional load domain, the approximation of the transformed internal variable turns from a plane problem into a hyperspace problem, where time-consuming approximation methods need to be applied. Therefore, a solution restricted to structures with four stress components was developed to estimate the transformed internal variable by means of three-dimensional vector algebra. This paper presents the extension to cyclic multi-parameter loading so that an unlimited number of load cases can be taken into account. The theoretical basis and basic presumptions of the Simplified Theory of Plastic Zones are outlined for the case of elastic shakedown. The extension of the method to many load cases is explained, and a workflow of the procedure is illustrated. An example adopting the FE implementation of the method in ANSYS and considering multilinear hardening is given, which highlights the advantages of the method compared to incremental, step-by-step analysis.
Keywords: cyclic loading, direct method, elastic shakedown, multi-parameter loading, STPZ
Procedia PDF Downloads 162
12060 Performance Comparison of a Low Cost Air Quality Sensor with a Commercial Electronic Nose
Authors: Ünal Kızıl, Levent Genç, Sefa Aksu, Ahmet Tapınç
Abstract:
The Figaro AM-1 sensor module, which employs the TGS 2600 model gas sensor, was used in air quality assessment. The system was coupled with a microprocessor that enables the sensor module to create warning messages via telephone. This low-cost sensor system's performance was compared with a Diagnose II commercial electronic nose system. Both the air quality sensor and the electronic nose system employ metal oxide chemical gas sensors. In the study, the experimental setup, data acquisition methods for the electronic nose system, and the performance of the low-cost air quality system were evaluated and explained.
Keywords: air quality, electronic nose, environmental quality, gas sensor
Procedia PDF Downloads 444
12059 Detection and Identification of Antibiotic Resistant Bacteria Using Infra-Red-Microscopy and Advanced Multivariate Analysis
Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel
Abstract:
Antimicrobial drugs have an important role in controlling illness associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global health-care problem. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for the optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing, like disk diffusion, are time-consuming, and other methods, including the E-test and genotyping, are relatively expensive. Fourier transform infrared (FTIR) microscopy is a rapid, safe, and low-cost method that has been widely and successfully used in different studies for the identification of various biological samples, including bacteria. Modern infrared (IR) spectrometers with high spectral resolution enable measuring unprecedented biochemical information from cells at the molecular level. Moreover, the development of new bioinformatics analyses combined with IR spectroscopy makes for a powerful technique that enables the detection of structural changes associated with resistance. The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics in a time span of a few minutes. The bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories of the Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 550 E. coli samples, were promising and showed that by using the infrared spectroscopic technique together with multivariate analysis, it is possible to classify the tested bacteria as sensitive or resistant, with a success rate higher than 85%, for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing the FTIR microscopy technique as a rapid and reliable method for identifying antibiotic susceptibility.
Keywords: antibiotics, E. coli, FTIR, multivariate analysis, susceptibility
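A scikit-learn sketch of one plausible multivariate pipeline (PCA followed by LDA) for classifying spectra as sensitive or resistant; the spectra below are synthetic stand-ins, and the pipeline is an assumption for illustration, not the study's exact analysis.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
# Stand-in FTIR spectra: 200 wavenumber bins per isolate.
sensitive = rng.normal(0.0, 1.0, size=(80, 200))
resistant = rng.normal(0.0, 1.0, size=(80, 200))
resistant[:, 50:60] += 0.8          # a synthetic resistance-linked band

X = np.vstack([sensitive, resistant])
y = np.array([0] * 80 + [1] * 80)   # 0 = sensitive, 1 = resistant

model = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```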
Procedia PDF Downloads 265
12058 Artificial Membrane Comparison for Skin Permeation in Skin PAMPA
Authors: Aurea C. L. Lacerda, Paulo R. H. Moreno, Bruna M. P. Vianna, Cristina H. R. Serra, Airton Martin, André R. Baby, Vladi O. Consiglieri, Telma M. Kaneko
Abstract:
The modified Franz cell is the most widely used model for in vitro permeation studies; however, it still presents some disadvantages. Thus, some alternative methods have been developed, such as Skin PAMPA, a bio-artificial membrane that has been applied for estimating the skin penetration of xenobiotics based on an HT permeability model. Skin PAMPA's greatest advantage is that it allows carrying out more tests in a fast and inexpensive way. The membrane system mimics the characteristics of the stratum corneum, which is the primary skin barrier. The barrier properties are given by corneocytes embedded in a multilamellar lipid matrix. This layer is the main penetration route through the paracellular permeation pathway and consists of a mixture of cholesterol, ceramides, and fatty acids as the dominant components. However, there is no consensus on the membrane composition. The objective of this work was to compare the performance of different bio-artificial membranes for studying permeation in the Skin PAMPA system. Material and methods: In order to mimic the lipid composition present in the human stratum corneum, six membranes were developed. The membrane composition was an equimolar mixture of cholesterol, ceramides 1-O-C18:1, C22, and C20, plus fatty acids C20 and C24. The membrane integrity assay was based on the transport of Brilliant Cresyl Blue, which has low permeability, and Lucifer Yellow, with very poor permeability, which should effectively be completely rejected. The membrane characterization was performed using confocal laser Raman spectroscopy, using a stabilized laser at 785 nm with a 10-second integration time and 2 accumulations. The membrane behaviour results in the PAMPA system were statistically evaluated, and all of the compositions showed integrity and permeability. The confocal Raman spectra were obtained in the region of 800-1200 cm-1, which is associated with the C-C stretches of the carbon scaffold of the stratum corneum lipids, and showed a similar pattern for all the membranes. The ceramides, long-chain fatty acids, and cholesterol in equimolar ratio permitted obtaining lipid mixtures with self-organization capability, similar to that occurring in the stratum corneum. Conclusion: The artificial biological membranes studied for Skin PAMPA proved to be similar, with properties comparable to the stratum corneum.
Keywords: bio-artificial membranes, comparison, confocal Raman, skin PAMPA
Procedia PDF Downloads 509
12057 Challenges to Developing a Trans-European Programme for Health Professionals to Recognize and Respond to Survivors of Domestic Violence and Abuse
Authors: June Keeling, Christina Athanasiades, Vaiva Hendrixson, Delyth Wyndham
Abstract:
Recognition and education in violence, abuse, and neglect for medical and healthcare practitioners (REVAMP) is a trans-European project aiming to introduce a training programme that has been specifically developed by partners across seven European countries to meet the needs of medical and healthcare practitioners. Amalgamating the knowledge and experience of clinicians, researchers, and educators from interdisciplinary and multi-professional backgrounds, REVAMP has tackled the under-resourced and underdeveloped area of domestic violence and abuse. The team designed an online training programme to support medical and healthcare practitioners to recognise and respond appropriately to survivors of domestic violence and abuse at their point of contact with a health provider. The REVAMP partner countries are France, Lithuania, Germany, Greece, Iceland, Norway, and the UK. The training is delivered through a series of interactive online modules, adapting evidence-based pedagogical approaches to learning. Capturing and addressing the complexities of the project impacted the methodological decisions and approaches to evaluation. The challenge was to find an evaluation methodology that captured valid data across all partner languages to demonstrate the extent of the change in knowledge and understanding. Co-development by all team members was a lengthy, iterative process, challenged by a lack of consistency in terminology. A mixed-methods approach enabled both qualitative and quantitative data to be collected at the start, during, and at the conclusion of the training for the purposes of evaluation. The module content and evaluation instrument were accessible in each partner country's language. Collecting both types of data provided a high-level snapshot of attainment via the quantitative dataset and an in-depth understanding of the impact of the training from the qualitative dataset. The analysis was mixed-methods, with integration at multiple interfaces. The primary focus of the analysis was to support the overall project evaluation for the funding agency. A key project outcome was identifying that the trans-European approach posed several challenges. Firstly, the project partners did not share a first language or a legal or professional approach to domestic abuse and neglect. This was negotiated through complex, systematic, and iterative interaction between team members so that consensus could be achieved. Secondly, the context of the data collection in several different cultural, educational, and healthcare systems across Europe challenged the development of a robust evaluation. The participants in the pilot evaluation shared that the training was contemporary, well designed, and of great relevance to inform practice. Initial results from the evaluation indicated that the participants were drawn from more than eight partner countries due to the online nature of the training. The primary results indicated a high level of engagement with the content and achievement through the online assessment. The main finding was that the participants perceived the impact of domestic abuse and neglect in very different ways in their individual professional contexts. Most significantly, the participants recognised the need for the training and the gap that existed previously. It is notable that a mixed-methods evaluation of a trans-European project is unusual at this scale.
Keywords: domestic violence, e-learning, health professionals, trans-European
Procedia PDF Downloads 83