Search results for: coping techniques
5844 The Staphylococcus aureus Exotoxin Recognition Using Nanobiosensor Designed by an Antibody-Attached Nanosilica Method
Authors: Hamed Ahari, Behrouz Akbari Adreghani, Vadood Razavilar, Amirali Anvar, Sima Moradi, Hourieh Shalchi
Abstract:
Considering the ever-increasing population and the industrialization of modern life, toxins produced in food products can no longer be detected adequately with traditional techniques: the isolation time for food samples is not cost-effective, and in most cases the precision of practical techniques such as bacterial cultivation suffers from operator errors or from errors in the reagent mixtures used. With the advent of nanotechnology, the design of selective, smart sensors is therefore one of the major developments in the quality control of food products, allowing the quantity and toxicity of bacteria to be identified within a few minutes and with very high precision. Methods and Materials: In this technique, a sensor based on linking a bacterial antibody to nanoparticles was used. As the absorption basis for recognizing the bacterial toxin, medium-sized silica nanoparticles of 10 nm (Notrino brand) in solid powder form were utilized. The suspension produced from the agent-linked nanosilica conjugated to the bacterial antibody was then placed next to samples of distilled water contaminated with Staphylococcus aureus toxin at a dilution of 10⁻³, so that if any toxin were present in the sample, a bond would form between the toxin antigen and the antibody. Finally, the light absorption associated with the binding of the antigen to the particle-attached antibody was measured by spectrophotometry. The 23S rRNA gene, which is conserved in all Staphylococcus spp., was also used as a control. The accuracy of the test was monitored using a serial dilution (10⁻⁶) of an overnight cell culture of Staphylococcus spp. (OD600 of 0.02 ≈ 10⁷ cells); this showed that the sensitivity of the PCR is 10 bacteria per ml within a few hours. Result: The results indicate that the sensor detects the toxin down to a dilution of 10⁻⁴. Additionally, the sensitivity of the sensors was examined after 60 days: the sensor gave confirmatory results up to day 56, after which its response began to decrease. Conclusions: Compared with conventional methods such as culture-based and biotechnological methods (e.g., the polymerase chain reaction), the practical nanobiosensor offers accuracy, sensitivity, and specificity, and it reduces the analysis time from hours to about 30 minutes. Keywords: exotoxin, nanobiosensor, recognition, Staphylococcus aureus
Procedia PDF Downloads 385
5843 Indigenous Understandings of Climate Vulnerability in Chile: A Qualitative Approach
Authors: Rosario Carmona
Abstract:
This article aims to discuss the importance of indigenous people participation in climate change mitigation and adaptation. Specifically, it analyses different understandings of climate vulnerability among diverse actors involved in climate change policies in Chile: indigenous people, state officials, and academics. These data were collected through participant observation and interviews conducted during October 2017 and January 2019 in Chile. Following Karen O’Brien, there are two types of vulnerability, outcome vulnerability and contextual vulnerability. How vulnerability to climate change is understood determines the approach, which actors are involved and which knowledge is considered to address it. Because climate change is a very complex phenomenon, it is necessary to transform the institutions and their responses. To do so, it is fundamental to consider these two perspectives and different types of knowledge, particularly those of the most vulnerable, such as indigenous people. For centuries and thanks to a long coexistence with the environment, indigenous societies have elaborated coping strategies, and some of them are already adapting to climate change. Indigenous people from Chile are not an exception. But, indigenous people tend to be excluded from decision-making processes. And indigenous knowledge is frequently seen as subjective and arbitrary in relation to science. Nevertheless, last years indigenous knowledge has gained particular relevance in the academic world, and indigenous actors are getting prominence in international negotiations. There are some mechanisms that promote their participation (e.g., Cancun safeguards, World Bank operational policies, REDD+), which are not absent from difficulties. And since 2016 parties are working on a Local Communities and Indigenous Peoples Platform. This paper also explores the incidence of this process in Chile. Although there is progress in the participation of indigenous people, this participation responds to the operational policies of the funding agencies and not to a real commitment of the state with this sector. The State of Chile omits a review of the structure that promotes inequality and the exclusion of indigenous people. In this way, climate change policies could be configured as a new mechanism of coloniality that validates a single type of knowledge and leads to new territorial control strategies, which increases vulnerability.Keywords: indigenous knowledge, climate change, vulnerability, Chile
Procedia PDF Downloads 126
5842 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations. Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
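As a rough illustration of the kind of pipeline described above, the Python sketch below combines TF-IDF features extracted from report text with a few image-derived measurements and trains a Random Forest. The file name, column names, and label are assumptions for illustration only, not the authors' dataset or implementation.

```python
# Minimal sketch: combine TF-IDF features from report text with image-derived
# features and train a Random Forest, loosely following the pipeline described
# in the abstract. File and column names are assumed placeholders.
import pandas as pd
from scipy.sparse import hstack, csr_matrix
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical table: one row per study with the report text, a few
# image-derived measurements, and a binary "critical finding" label.
df = pd.read_csv("indiana_reports.csv")            # assumed file layout
text = df["findings"].fillna("")                    # radiology report text
image_feats = df[["lung_area", "cardiac_ratio"]]    # assumed image-derived columns
y = df["critical"]                                  # assumed binary label

# NLP step: turn the free-text findings into a sparse TF-IDF matrix.
vectorizer = TfidfVectorizer(max_features=2000, ngram_range=(1, 2), stop_words="english")
X_text = vectorizer.fit_transform(text)

# Merge text features with the numeric image-derived features.
X = hstack([X_text, csr_matrix(image_feats.values)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```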
Procedia PDF Downloads 45
5841 Effects of Waist-to-Hip Ratio and Visceral Fat Measurements Improvement on Offshore Petrochemical Company Shift Employees' Work Efficiency
Authors: Essam Amerian
Abstract:
The aim of this study was to investigate the effects of improving waist-to-hip ratio (WHR) and visceral fat components on the health of shift workers in an offshore petrochemical company. A total of 100 male shift workers participated in the study, with an average age of 40.5 years and an average BMI of 28.2 kg/m². The study employed a randomized controlled trial design, with participants assigned to either an intervention group or a control group. The intervention group received a 12-week program that included dietary counseling, physical activity recommendations, and stress management techniques. The control group received no intervention. The outcomes measured were changes in WHR, visceral fat components, blood pressure, and lipid profile. The results showed that the intervention group had a statistically significant improvement in WHR (p<0.001) and visceral fat components (p<0.001) compared to the control group. Furthermore, there were statistically significant improvements in systolic blood pressure (p=0.015) and total cholesterol (p=0.034) in the intervention group compared to the control group. These findings suggest that implementing a 12-week program that includes dietary counseling, physical activity recommendations, and stress management techniques can effectively improve WHR, visceral fat components, and cardiovascular health among shift workers in an offshore petrochemical company.Keywords: body composition, waist-hip-ratio, visceral fat, shift worker, work efficiency
Procedia PDF Downloads 79
5840 Different Processing Methods to Obtain a Carbon Composite Element for Cycling
Authors: Maria Fonseca, Ana Branco, Joao Graca, Rui Mendes, Pedro Mimoso
Abstract:
The present work is focused on the production of a carbon composite element for cycling through different techniques, namely, blow-molding and high-pressure resin transfer injection (HP-RTM). The main objective of this work is to compare both processes to produce carbon composite elements for the cycling industry. It is well known that the carbon composite components for cycling are produced mainly through blow-molding; however, this technique depends strongly on manual labour, resulting in a time-consuming production process. Comparatively, HP-RTM offers a more automated process which should lead to higher production rates. Nevertheless, a comparison of the elements produced through both techniques must be done, in order to assess if the final products comply with the required standards of the industry. The main difference between said techniques lies in the used material. Blow-moulding uses carbon prepreg (carbon fibres pre-impregnated with a resin system), and the material is laid up by hand, piece by piece, on a mould or on a hard male. After that, the material is cured at a high temperature. On the other hand, in the HP-RTM technique, dry carbon fibres are placed on a mould, and then resin is injected at high pressure. After some research regarding the best material systems (prepregs and braids) and suppliers, an element was designed (similar to a handlebar) to be constructed. The next step was to perform FEM simulations in order to determine what the best layup of the composite material was. The simulations were done for the prepreg material, and the obtained layup was transposed to the braids. The selected material was a prepreg with T700 carbon fibre (24K) and an epoxy resin system, for the blow-molding technique. For HP-RTM, carbon fibre elastic UD tubes and ± 45º braids were used, with both 3K and 6K filaments per tow, and the resin system was an epoxy as well. After the simulations for the prepreg material, the optimized layup was: [45°, -45°,45°, -45°,0°,0°]. For HP-RTM, the transposed layup was [ ± 45° (6k); 0° (6k); partial ± 45° (6k); partial ± 45° (6k); ± 45° (3k); ± 45° (3k)]. The mechanical tests showed that both elements can withstand the maximum load (in this case, 1000 N); however, the one produced through blow-molding can support higher loads (≈1300N against 1100N from HP-RTM). In what concerns to the fibre volume fraction (FVF), the HP-RTM element has a slightly higher value ( > 61% compared to 59% of the blow-molding technique). The optical microscopy has shown that both elements have a low void content. In conclusion, the elements produced using HP-RTM can compare to the ones produced through blow-molding, both in mechanical testing and in the visual aspect. Nevertheless, there is still space for improvement in the HP-RTM elements since the layup of the braids, and UD tubes could be optimized.Keywords: HP-RTM, carbon composites, cycling, FEM
Procedia PDF Downloads 132
5839 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada
Authors: Bilel Chalghaf, Mathieu Varin
Abstract:
Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR), have the potential to overcome the limitations of aerial imagery. To date, few studies have used that data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual high tree species ( > 17m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For the individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77). With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR
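The semi-hierarchical idea described above can be sketched as a two-level classifier: a first model assigns each crown a tree type, and a type-specific model then assigns the species. The Python sketch below assumes 16 selected variables and placeholder column names; it is not the study's implementation.

```python
# Minimal sketch of the semi-hierarchical approach: first classify tree type
# (broadleaf vs. conifer), then route each crown to a species-level SVM.
# Feature columns and labels are placeholders, not the study's 313 variables.
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

crowns = pd.read_csv("crown_objects.csv")            # assumed: one row per segmented crown
X = crowns[[f"var_{i}" for i in range(16)]].values   # assumed 16 selected variables
tree_type = crowns["type"].values                    # "broadleaf" or "conifer"
species = crowns["species"].values                   # one of the 11 species

# Level 1: tree type (the abstract reports k-NN worked best here).
level1 = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
level1.fit(X, tree_type)

# Level 2: one SVM per tree type, trained only on crowns of that type.
level2 = {}
for t in ("broadleaf", "conifer"):
    mask = tree_type == t
    level2[t] = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
    level2[t].fit(X[mask], species[mask])

def predict_species(x_row):
    """Route a single crown through the two-level classifier."""
    t = level1.predict(x_row.reshape(1, -1))[0]
    return level2[t].predict(x_row.reshape(1, -1))[0]

print(predict_species(X[0]))
```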
Procedia PDF Downloads 134
5838 Task Validity in Neuroimaging Studies: Perspectives from Applied Linguistics
Authors: L. Freeborn
Abstract:
Recent years have seen an increasing number of neuroimaging studies related to language learning as imaging techniques such as fMRI and EEG have become more widely accessible to researchers. By using a variety of structural and functional neuroimaging techniques, these studies have already made considerable progress in terms of our understanding of neural networks and processing related to first and second language acquisition. However, the methodological designs employed in neuroimaging studies to test language learning have been questioned by applied linguists working within the field of second language acquisition (SLA). One of the major criticisms is that tasks designed to measure language learning gains rarely have a communicative function, and seldom assess learners’ ability to use the language in authentic situations. This brings the validity of many neuroimaging tasks into question. The fundamental reason why people learn a language is to communicate, and it is well-known that both first and second language proficiency are developed through meaningful social interaction. With this in mind, the SLA field is in agreement that second language acquisition and proficiency should be measured through learners’ ability to communicate in authentic real-life situations. Whilst authenticity is not always possible to achieve in a classroom environment, the importance of task authenticity should be reflected in the design of language assessments, teaching materials, and curricula. Tasks that bear little relation to how language is used in real-life situations can be considered to lack construct validity. This paper first describes the typical tasks used in neuroimaging studies to measure language gains and proficiency, then analyses to what extent these tasks can validly assess these constructs.Keywords: neuroimaging studies, research design, second language acquisition, task validity
Procedia PDF Downloads 138
5837 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, false positives and negatives tuning, and automated feedback. The initial approach using natural language processing techniques to extract features achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite predicted specific vulnerabilities such as OS-Command Injection, Cryptographic, and Cross-Site Scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS-Command Injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
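The study itself learns Code2Vec-style path-context embeddings over Java/C++ ASTs; as a much-simplified stand-in, the Python sketch below represents each snippet by a bag of AST node types and trains a Random Forest on a toy labelled set, only to illustrate the idea of classifying code from its structure rather than its raw text.

```python
# Simplified stand-in for the AST-based idea: represent each function by a
# bag of AST node types and train a classifier. The study used Java/C++ path
# contexts with Code2Vec; this Python/ast version only illustrates learning
# from code structure instead of raw text.
import ast
from collections import Counter
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction import DictVectorizer

def ast_node_counts(source: str) -> Counter:
    """Count AST node types in a Python snippet (a crude structural feature)."""
    tree = ast.parse(source)
    return Counter(type(node).__name__ for node in ast.walk(tree))

# Toy labelled snippets: 1 = risky pattern (shell command built from input), 0 = benign.
snippets = [
    ("import os\ndef run(cmd):\n    os.system('ping ' + cmd)\n", 1),
    ("import subprocess\ndef run(cmd):\n    subprocess.call('ls ' + cmd, shell=True)\n", 1),
    ("def add(a, b):\n    return a + b\n", 0),
    ("def greet(name):\n    return f'hello {name}'\n", 0),
]
features = [ast_node_counts(src) for src, _ in snippets]
labels = [y for _, y in snippets]

vec = DictVectorizer(sparse=False)
X = vec.fit_transform(features)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

test = "import os\ndef wipe(path):\n    os.system('rm -rf ' + path)\n"
print(clf.predict(vec.transform([ast_node_counts(test)])))
```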
Procedia PDF Downloads 107
5836 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases
Authors: Mohammad A. Bani-Khaled
Abstract:
In this work, we use the discrete proper orthogonal decomposition (POD) transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical databases obtained from finite element simulations. The outcomes of the analysis will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, both in large-scale structures (aeronautical structures) and in nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and in particular of the cross-sections, it is necessary to involve computational mechanics to numerically simulate the dynamics. When using numerical computational techniques, it is not necessary to over-simplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. In order to extract the system's dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The time proper orthogonal decomposition transform is a powerful tool for processing such databases. It will be used to study the coupled dynamics of thin-walled basic structures, which are ideal for forming the basis of a systematic study of coupled dynamics in structures of complex geometry. Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams
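A minimal sketch of the discrete POD on a snapshot matrix, assuming the finite element output has been arranged with one column per time snapshot; the synthetic two-mode field below merely stands in for the FE database.

```python
# Minimal sketch of the discrete POD: stack displacement snapshots into a
# matrix and extract proper orthogonal modes and their energy content via SVD.
# The synthetic two-mode signal below stands in for the finite element database.
import numpy as np

n_dof, n_snap = 200, 400                    # spatial DOFs, time snapshots
x = np.linspace(0.0, 1.0, n_dof)
t = np.linspace(0.0, 2.0, n_snap)

# Synthetic coupled response: two spatial shapes modulated in time.
U = (np.outer(np.sin(np.pi * x), np.cos(2 * np.pi * 3 * t))
     + 0.3 * np.outer(np.sin(2 * np.pi * x), np.sin(2 * np.pi * 7 * t)))

# Subtract the temporal mean, then take the SVD of the snapshot matrix.
U0 = U - U.mean(axis=1, keepdims=True)
phi, s, vt = np.linalg.svd(U0, full_matrices=False)

energy = s**2 / np.sum(s**2)                # relative energy captured per POD mode
print("energy of first three modes:", np.round(energy[:3], 4))
# phi[:, k] is the k-th POD mode shape; vt[k] is its time coefficient history.
```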
Procedia PDF Downloads 418
5835 Effect of Post Circuit Resistance Exercise Glucose Feeding on Energy and Hormonal Indexes in Plasma and Lymphocyte in Free-Style Wrestlers
Authors: Miesam Golzadeh Gangraj, Younes Parvasi, Mohammad Ghasemi, Ahmad Abdi, Saeid Fazelifar
Abstract:
The purpose of the study was to determine the effect of glucose feeding on energy and hormonal indexes in plasma and lymphocytes immediately after a wrestling-based techniques circuit exercise (WBTCE) in young male freestyle wrestlers. Sixteen wrestlers (weight = 75.45 ± 12.92 kg, age = 22.29 ± 0.90 years, BMI = 26.23 ± 2.64 kg/m²) were randomly divided into two groups: control (water) and glucose (2 g per kg body weight). Blood samples were obtained before exercise, immediately after exercise, and at 90 minutes of the post-exercise recovery period. Glucose (2 g/kg of body weight, 1W/5V) and water (equal volume) solutions were given immediately after the second blood sampling. Data were analyzed using a repeated-measures ANOVA and a suitable post hoc test (LSD). A significant decrease was observed in lymphocyte glycogen immediately after exercise (P < 0.001). In the experimental group, lymphocyte glycogen concentration increased (P < 0.028) relative to the control group at 90 min post-exercise. Plasma glucose concentrations increased in both groups immediately after exercise (P < 0.05). Plasma insulin concentrations decreased in both groups immediately after exercise, but at 90 min after exercise they were significantly increased only in the glucose group (P < 0.001). Our results suggest that the WBTCE protocol can affect cellular energy sources and the hormonal response, and that glucose consumption can increase lymphocyte glycogen and improve energy availability within the cell. Keywords: glucose feeding, lymphocyte, wrestling-based techniques circuit exercise
Procedia PDF Downloads 271
5834 Modeling and Simulation of Ship Structures Using Finite Element Method
Authors: Javid Iqbal, Zhu Shifan
Abstract:
The development of unconventional ship construction and the implementation of lightweight materials have given a strong impulse to the finite element (FE) method, making it a general tool for ship design. This paper briefly presents the modeling and analysis techniques of ship structures using the FE method for complex boundary conditions that are difficult to analyze by existing Ship Classification Societies rules. During operation, all ships experience complex loading conditions. These loads are generally categorized into thermal loads, linear static loads, dynamic loads, and non-linear loads. The general strength of the ship structure is analyzed using static FE analysis. The FE method is also suitable for considering the local loads generated by ballast tanks and cargo in addition to hydrostatic and hydrodynamic loads. Vibration analysis of a ship structure and its components can be performed using the FE method, which helps in assessing the dynamic stability of the ship. The FE method has led to better techniques for calculating natural frequencies and the different mode shapes of a ship structure so as to avoid resonance both globally and locally. Over the past few years, there has been considerable progress towards ideal design in the ship industry through solving complex engineering problems with the data stored in the FE model. This paper provides an overview of ship modeling methodology for FE analysis and its general application. The historical background, the basic concept of FE, and the advantages and disadvantages of FE analysis are also reported, along with examples related to hull strength and structural components. Keywords: dynamic analysis, finite element methods, ship structure, vibration analysis
Procedia PDF Downloads 137
5833 Modelling Conceptual Quantities Using Support Vector Machines
Authors: Ka C. Lam, Oluwafunmibi S. Idowu
Abstract:
Uncertainty in cost is a major factor affecting performance of construction projects. To our knowledge, several conceptual cost models have been developed with varying degrees of accuracy. Incorporating conceptual quantities into conceptual cost models could improve the accuracy of early predesign cost estimates. Hence, the development of quantity models for estimating conceptual quantities of framed reinforced concrete structures using supervised machine learning is the aim of the current research. Using measured quantities of structural elements and design variables such as live loads and soil bearing pressures, response and predictor variables were defined and used for constructing conceptual quantities models. Twenty-four models were developed for comparison using a combination of non-parametric support vector regression, linear regression, and bootstrap resampling techniques. R programming language was used for data analysis and model implementation. Gross soil bearing pressure and gross floor loading were discovered to have a major influence on the quantities of concrete and reinforcement used for foundations. Building footprint and gross floor loading had a similar influence on beams and slabs. Future research could explore the modelling of other conceptual quantities for walls, finishes, and services using machine learning techniques. Estimation of conceptual quantities would assist construction planners in early resource planning and enable detailed performance evaluation of early cost predictions.Keywords: bootstrapping, conceptual quantities, modelling, reinforced concrete, support vector regression
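The study implemented its models in R; purely as an illustration of the same idea in Python, the sketch below fits a support vector regression for one conceptual quantity and uses bootstrap resampling to put a spread on a prediction. The file name, predictor columns, and target are assumptions, not the study's data.

```python
# Sketch of the modelling idea: fit a support vector regression for a
# conceptual quantity and use bootstrap resampling to obtain a spread on the
# prediction for a new design. Predictor names are assumed placeholders.
import numpy as np
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

data = pd.read_csv("frame_quantities.csv")             # assumed dataset layout
X = data[["footprint_m2", "floor_load_kpa", "bearing_kpa"]].values
y = data["concrete_m3"].values                          # e.g., foundation concrete volume

def fit_svr(X_fit, y_fit):
    return make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100, epsilon=0.1)).fit(X_fit, y_fit)

# Bootstrap: refit on resampled data and collect predictions for a new design.
rng = np.random.default_rng(0)
x_new = np.array([[450.0, 4.0, 200.0]])                 # hypothetical new building
preds = []
for _ in range(200):
    idx = rng.integers(0, len(y), size=len(y))
    preds.append(fit_svr(X[idx], y[idx]).predict(x_new)[0])

preds = np.array(preds)
print(f"concrete estimate: {preds.mean():.1f} m3 "
      f"(2.5-97.5 percentile: {np.percentile(preds, 2.5):.1f}-{np.percentile(preds, 97.5):.1f})")
```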
Procedia PDF Downloads 206
5832 Sustainable Production of Tin Oxide Nanoparticles: Exploring Synthesis Techniques, Formation Mechanisms, and Versatile Applications
Authors: Yemane Tadesse Gebreslassie, Henok Gidey Gebretnsae
Abstract:
Nanotechnology has emerged as a highly promising field of research with wide-ranging applications across various scientific disciplines. In recent years, tin oxide has garnered significant attention due to its intriguing properties, particularly when synthesized in the nanoscale range. While numerous physical and chemical methods exist for producing tin oxide nanoparticles, these approaches tend to be costly, energy-intensive, and involve the use of toxic chemicals. Given the growing concerns regarding human health and environmental impact, there has been a shift towards developing cost-effective and environmentally friendly processes for tin oxide nanoparticle synthesis. Green synthesis methods utilizing biological entities such as plant extracts, bacteria, and natural biomolecules have shown promise in successfully producing tin oxide nanoparticles. However, scaling up the production to an industrial level using green synthesis approaches remains challenging due to the complexity of biological substrates, which hinders the elucidation of reaction mechanisms and formation processes. Thus, this review aims to provide an overview of the various sources of biological entities and methodologies employed in the green synthesis of tin oxide nanoparticles, as well as their impact on nanoparticle properties. Furthermore, this research delves into the strides made in comprehending the mechanisms behind the formation of nanoparticles as documented in existing literature. It also sheds light on the array of analytical techniques employed to investigate and elucidate the characteristics of these minuscule particles.Keywords: nanotechnology, tin oxide, green synthesis, formation mechanisms
Procedia PDF Downloads 53
5831 Numerical Investigation of Pressure Drop and Erosion Wear by Computational Fluid Dynamics Simulation
Authors: Praveen Kumar, Nitin Kumar, Hemant Kumar
Abstract:
Modern computer technology and commercial computational fluid dynamics (CFD) simulation give more detailed results than experimental investigation techniques. CFD techniques are widely used in different fields due to their flexibility and performance. Pipeline erosion is a complex phenomenon to evaluate by purely numerical-arithmetic techniques, whereas CFD simulation is a convenient tool for resolving that type of problem. Erosion wear behaviour due to the solid-liquid mixture in a slurry pipeline has been investigated using the commercial CFD code FLUENT. A multiphase Euler-Lagrange model was adopted to predict solid-particle erosion wear in a 22.5° pipe bend for the flow of a bottom ash-water suspension. The present study addresses erosion prediction in a three-dimensional 22.5° pipe bend for two-phase (solid-liquid) flow using the finite volume method with the standard k-ε turbulence model and a discrete phase model, and evaluates the erosion wear rate for velocities varying from 2 to 4 m/s. The results show that the velocity of the solid-liquid mixture is the most dominant parameter compared with solid concentration, density, and particle size. At low velocity, settling takes place in the pipe bend due to the low inertia of, and the gravitational effect on, the solid particulates, which leads to high erosion on the bottom side of the pipeline. Keywords: computational fluid dynamics (CFD), erosion, slurry transportation, k-ε Model
Procedia PDF Downloads 408
5830 Design and Performance Improvement of Three-Dimensional Optical Code Division Multiple Access Networks with NAND Detection Technique
Authors: Satyasen Panda, Urmila Bhanja
Abstract:
In this paper, we have presented and analyzed three-dimensional (3-D) matrices of wavelength/time/space code for optical code division multiple access (OCDMA) networks with NAND subtraction detection technique. The 3-D codes are constructed by integrating a two-dimensional modified quadratic congruence (MQC) code with one-dimensional modified prime (MP) code. The respective encoders and decoders were designed using fiber Bragg gratings and optical delay lines to minimize the bit error rate (BER). The performance analysis of the 3D-OCDMA system is based on measurement of signal to noise ratio (SNR), BER and eye diagram for a different number of simultaneous users. Also, in the analysis, various types of noises and multiple access interference (MAI) effects were considered. The results obtained with NAND detection technique were compared with those obtained with OR and AND subtraction techniques. The comparison results proved that the NAND detection technique with 3-D MQC\MP code can accommodate more number of simultaneous users for longer distances of fiber with minimum BER as compared to OR and AND subtraction techniques. The received optical power is also measured at various levels of BER to analyze the effect of attenuation.Keywords: Cross Correlation (CC), Three dimensional Optical Code Division Multiple Access (3-D OCDMA), Spectral Amplitude Coding Optical Code Division Multiple Access (SAC-OCDMA), Multiple Access Interference (MAI), Phase Induced Intensity Noise (PIIN), Three Dimensional Modified Quadratic Congruence/Modified Prime (3-D MQC/MP) code
Procedia PDF Downloads 412
5829 Improvement of Sleep Quality Through Manual and Non-Pharmacological Treatment
Authors: Andreas Aceranti, Sergio Romanò, Simonetta Vernocchi, Silvia Arnaboldi, Emilio Mazza
Abstract:
As a result of the SARS-CoV-2 pandemic, the incidence of mood (thymic) disorders has significantly increased, and patients are often reluctant to take drugs aimed at stabilizing mood. In order to provide an alternative to drug therapies, we designed a study to evaluate whether the quality of life of these subjects can be improved through osteopathic treatment. Patients were divided into visceral and fascial manual treatment groups, with the aim of increasing serotonin levels and stimulating the vagus nerve through validated techniques. The results were evaluated by administering targeted questionnaires assessing quality of life, mood, sleep, and intestinal functioning. At the first endpoint, patients undergoing fascial treatment showed an increase in quality of life and sleep: they reported a decrease in the number of nocturnal awakenings, a reduction in the time needed to fall asleep, and greater rest upon waking. In contrast, patients undergoing visceral treatment, as well as those in the control group, did not show significant improvements. Patients in the fascial group also reported an improvement in mood and subjective quality of life, with a generalized improvement in function. Although the study is still ongoing, based on the results of the first endpoint we can hypothesize that fascial stimulation of the vagus nerve with manual and osteopathic techniques may be a valid alternative to pharmacological treatments for mood and sleep disorders. Keywords: osteopathy, insomnia, nocturnal awakening, thymism
Procedia PDF Downloads 90
5828 Considerations upon Structural Health Monitoring of Small to Medium Wind Turbines
Authors: Nicolae Constantin, Ştefan Sorohan
Abstract:
The small and medium wind turbines are running in quite different conditions as compared to the big ones. Consequently, they need also a different approach concerning the structural health monitoring (SHM) issues. There are four main differences between the above mentioned categories: (i) significantly smaller dimensions, (ii) considerably higher rotation speed, (iii) generally small distance between the turbine and the energy consumer and (iv) monitoring assumed in many situations by the owner. In such conditions, nondestructive inspections (NDI) have to be made as much as possible with affordable, yet effective techniques, requiring portable and accessible equipment. Additionally, the turbines and accessories should be easy to mount, dispose and repair. As the materials used for such unit can be metals, composites and combined, the technologies should be adapted accordingly. An example in which the two materials co-exist is the situation in which the damaged metallic skin of a blade is repaired with a composite patch. The paper presents the inspection of the bonding state of the patch, using portable ultrasonic equipment, able to put in place the Lamb wave method, which proves efficient in global and local inspections as well. The equipment is relatively easy to handle and can be borrowed from specialized laboratories or used by a community of small wind turbine users, upon the case. This evaluation is the first in a row, aimed to evaluate efficiency of NDI performed with rather accessible, less sophisticated equipment and related inspection techniques, having field inspection capabilities. The main goal is to extend such inspection procedures to other components of the wind power unit, such as the support tower, water storage tanks, etc.Keywords: structural health monitoring, small wind turbines, non-destructive inspection, field inspection capabilities
Procedia PDF Downloads 339
5827 A Literature Review on Emotion Recognition Using Wireless Body Area Network
Authors: Christodoulou Christos, Politis Anastasios
Abstract:
The utilization of Wireless Body Area Network (WBAN) is experiencing a notable surge in popularity as a result of its widespread implementation in the field of smart health. WBANs utilize small sensors implanted within the human body to monitor and record physiological indicators. These sensors transmit the collected data to hospitals and healthcare facilities through designated access points. Bio-sensors exhibit a diverse array of shapes and sizes, and their deployment can be tailored to the condition of the individual. Multiple sensors may be strategically placed within, on, or around the human body to effectively observe, record, and transmit essential physiological indicators. These measurements serve as a basis for subsequent analysis, evaluation, and therapeutic interventions. In conjunction with physical health concerns, numerous smartwatches are engineered to employ artificial intelligence techniques for the purpose of detecting mental health conditions such as depression and anxiety. The utilization of smartwatches serves as a secure and cost-effective solution for monitoring mental health. Physiological signals are widely regarded as a highly dependable method for the recognition of emotions due to the inherent inability of individuals to deliberately influence them over extended periods of time. The techniques that WBANs employ to recognize emotions are thoroughly examined in this article.Keywords: emotion recognition, wireless body area network, WBAN, ERC, wearable devices, psychological signals, emotion, smart-watch, prediction
Procedia PDF Downloads 50
5826 Non-Cognitive Skills Associated with Learning in a Serious Gaming Environment: A Pretest-Posttest Experimental Design
Authors: Tanja Kreitenweis
Abstract:
Lifelong learning is increasingly seen as essential for coping with the rapidly changing work environment. To this end, serious games can provide convenient and straightforward access to complex knowledge for all age groups. However, learning achievements depend largely on a learner’s non-cognitive skill disposition (e.g., motivation, self-belief, playfulness, and openness). With the aim of combining the fields of serious games and non-cognitive skills, this research focuses in particular on the use of a business simulation, which conveys change management insights. Business simulations are a subset of serious games and are perceived as a non-traditional learning method. The presented objectives of this work are versatile: (1) developing a scale, which measures learners’ knowledge and skills level before and after a business simulation was played, (2) investigating the influence of non-cognitive skills on learning in this business simulation environment and (3) exploring the moderating role of team preference in this type of learning setting. First, expert interviews have been conducted to develop an appropriate measure for learners’ skills and knowledge assessment. A pretest-posttest experimental design with German management students was implemented to approach the remaining objectives. By using the newly developed, reliable measure, it was found that students’ skills and knowledge state were higher after the simulation had been played, compared to before. A hierarchical regression analysis revealed two positive predictors for this outcome: motivation and self-esteem. Unexpectedly, playfulness had a negative impact. Team preference strengthened the link between grit and playfulness, respectively, and learners’ skills and knowledge state after completing the business simulation. Overall, the data underlined the potential of business simulations to improve learners’ skills and knowledge state. In addition, motivational factors were found as predictors for benefitting most from the applied business simulation. Recommendations are provided for how pedagogues can use these findings.Keywords: business simulations, change management, (experiential) learning, non-cognitive skills, serious games
Procedia PDF Downloads 108
5825 Discovering Social Entrepreneurship: A Qualitative Study on Stimulants and Obstacles for Social Entrepreneurs in the Hague
Authors: Loes Nijskens
Abstract:
The city of The Hague is coping with several social issues: high unemployment rates, segregation and environmental pollution. The amount of social enterprises in The Hague that want to tackle these issues is increasing, but no clear image exists of the stimulants and obstacles social entrepreneurs encounter. In this qualitative study 20 starting and established social entrepreneurs, investors and stimulators of social entrepreneurship have been interviewed. The findings indicate that the majority of entrepreneurs situated in The Hague focuses on creating jobs (the so called social nurturers) and diminishing food waste. Moreover, the study found smaller groups of social connectors, (who focus on stimulating the social cohesion in the city) and social traders (who create a market for products from developing countries). For the social nurturers, working together with local government to find people with a distance to the labour market is a challenge. The entrepreneurs are missing a governance approach within the local government, wherein space is provided to develop suitable legislation and projects in cooperation with several stakeholders in order to diminish social problems. All entrepreneurs in the sample face(d) the challenge of having a clear purpose of their business in the beginning. Starting social entrepreneurs tend to be idealistic without having defined a business model. Without a defined business model it is difficult to find proper funding for their business. The more advanced enterprises cope with the challenge of measuring social impact. The larger they grow, the more they have to ‘defend’ themselves towards the local government and their customers, of mainly being social. Hence, the more experienced social nurturers still find it difficult to work together with the local government. They tend to settle their business in other municipalities, where they find more effective public-private partnerships. Al this said, the eco-system for social enterprises in The Hague is on the rise. To stimulate the amount and growth of social enterprises the cooperation between entrepreneurs and local government, the developing of social business models and measuring of impact needs more attention.Keywords: obstacles, social enterprises, stimulants, the Hague
Procedia PDF Downloads 218
5824 Testing of Protective Coatings on Automotive Steel, a Correlation Between Salt Spray, Electrochemical Impedance Spectroscopy, and Linear Polarization Resistance Test
Authors: Dhanashree Aole, V. Hariharan, Swati Surushe
Abstract:
Corrosion can cause serious and expensive damage to automobile components. Various proven techniques for controlling and preventing corrosion depend on the specific material to be protected. Electrochemical Impedance Spectroscopy (EIS) and salt spray tests are commonly used to assess the corrosion degradation mechanism of coatings on metallic surfaces, while the only test that monitors the corrosion rate in real time is Linear Polarization Resistance (LPR). In this study, electrochemical tests (EIS and LPR) and the salt spray test are reviewed to assess the corrosion resistance and durability of different coatings. The main objective of this study is to correlate the test results obtained using Linear Polarization Resistance (LPR) and Electrochemical Impedance Spectroscopy (EIS) with the results obtained using the standard salt spray test. Another objective of this work is to evaluate the performance of various coating systems - CED, epoxy, powder coating, autophoretic, and Zn-trivalent coatings - for vehicle underbody applications, and to assess their corrosion resistance. From this study, a promising correlation between the different corrosion testing techniques is noted. The most profound observation is that electrochemical tests give a quick estimation of corrosion resistance and can detect the degradation of coatings well before visible signs of damage appear. Furthermore, the corrosion resistances and salt spray lives of the coatings investigated were found to follow the order CED > powder coating > autophoretic > epoxy coating > Zn-trivalent plating. Keywords: Linear Polarization Resistance (LPR), Electrochemical Impedance Spectroscopy (EIS), salt spray test, sacrificial and barrier coatings
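For readers unfamiliar with how an LPR reading becomes a corrosion rate, the short sketch below applies the standard Stern-Geary relation; the Tafel slopes, equivalent weight, density, and Rp values are illustrative defaults roughly suited to steel, not measurements from this study.

```python
# Sketch of how an LPR measurement is turned into a corrosion rate via the
# Stern-Geary relation. Tafel slopes, area normalization, and Rp values are illustrative.
def corrosion_rate_mm_per_year(rp_ohm_cm2, beta_a=0.12, beta_c=0.12,
                               eq_weight=27.92, density=7.87):
    """Rp in ohm*cm^2; Tafel slopes in V/decade; defaults roughly for steel."""
    B = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))   # Stern-Geary constant, V
    i_corr = B / rp_ohm_cm2                               # corrosion current density, A/cm^2
    # Faraday-law conversion: 3.27e3 * i_corr(A/cm^2) * EW / density -> mm/year
    return 3.27e3 * i_corr * eq_weight / density

# Example: a well-coated panel with high polarization resistance corrodes slowly.
for rp in (1e4, 1e6):
    print(f"Rp = {rp:.0e} ohm*cm2 -> {corrosion_rate_mm_per_year(rp):.4f} mm/year")
```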
Procedia PDF Downloads 526
5823 Application of Interferometric Techniques for Quality Control Oils Used in the Food Industry
Authors: Andres Piña, Amy Meléndez, Pablo Cano, Tomas Cahuich
Abstract:
The purpose of this project is to propose a quick and environmentally friendly alternative to measure the quality of oils used in food industry. There is evidence that repeated and indiscriminate use of oils in food processing cause physicochemical changes with formation of potentially toxic compounds that can affect the health of consumers and cause organoleptic changes. In order to assess the quality of oils, non-destructive optical techniques such as Interferometry offer a rapid alternative to the use of reagents, using only the interaction of light on the oil. Through this project, we used interferograms of samples of oil placed under different heating conditions to establish the changes in their quality. These interferograms were obtained by means of a Mach-Zehnder Interferometer using a beam of light from a HeNe laser of 10mW at 632.8nm. Each interferogram was captured, analyzed and measured full width at half-maximum (FWHM) using the software from Amcap and ImageJ. The total of FWHMs was organized in three groups. It was observed that the average obtained from each of the FWHMs of group A shows a behavior that is almost linear, therefore it is probable that the exposure time is not relevant when the oil is kept under constant temperature. Group B exhibits a slight exponential model when temperature raises between 373 K and 393 K. Results of the t-Student show a probability of 95% (0.05) of the existence of variation in the molecular composition of both samples. Furthermore, we found a correlation between the Iodine Indexes (Physicochemical Analysis) and the Interferograms (Optical Analysis) of group C. Based on these results, this project highlights the importance of the quality of the oils used in food industry and shows how Interferometry can be a useful tool for this purpose.Keywords: food industry, interferometric, oils, quality control
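The FWHM measurement mentioned above (done in the study with ImageJ) can be illustrated with a short numerical sketch: given a line cut through an interferogram, find where the intensity crosses the half-maximum level on both flanks. The Gaussian-like profile below is synthetic, not data from this study.

```python
# Sketch of the FWHM measurement applied to a fringe intensity profile.
# A synthetic Gaussian-like profile stands in for a line cut through an interferogram.
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile via linear interpolation."""
    half = (y.max() + y.min()) / 2.0
    above = np.where(y >= half)[0]
    left, right = above[0], above[-1]
    # Interpolate the half-level crossing positions on both flanks.
    xl = np.interp(half, [y[left - 1], y[left]], [x[left - 1], x[left]])
    xr = np.interp(half, [y[right + 1], y[right]], [x[right + 1], x[right]])
    return xr - xl

x = np.linspace(-50, 50, 1001)                      # pixel coordinate along the cut
profile = 20 + 200 * np.exp(-x**2 / (2 * 8.0**2))   # synthetic fringe intensity
print(f"FWHM = {fwhm(x, profile):.2f} px (expected ~{2.355 * 8.0:.2f} px)")
```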
Procedia PDF Downloads 372
5822 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging
Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen
Abstract:
Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, these parameters require human inspection, a multitude of instruments employing a variety of sensor technologies, and predictive model types or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests where protein value has been determined by the FOSS Infratec NOVA which is the golden industry standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. In the first dataset, protein regression analysis is the problem to solve while variety classification analysis is the problem to solve in the second dataset. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate the qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data reducing the need for advanced preprocessing techniques required for classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted. These results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested. These include centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques
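Two of the chemometric preprocessing steps named above, standard normal variate (SNV) and Savitzky-Golay filtering, are easy to sketch; in the snippet below they are applied to synthetic spectra on the stated 900-1700 nm range, standing in for pixel spectra from the hyperspectral cubes.

```python
# Sketch of two preprocessing steps named in the abstract: standard normal
# variate (SNV) and Savitzky-Golay filtering, applied to synthetic NIR spectra.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
wavelengths = np.linspace(900, 1700, 200)              # nm, matching the stated range
spectra = (np.exp(-((wavelengths - 1200) / 150.0) ** 2)[None, :]
           * rng.uniform(0.8, 1.2, size=(50, 1))        # multiplicative scatter
           + rng.normal(0, 0.01, size=(50, 200)))       # additive noise

def snv(X):
    """Standard normal variate: centre and scale each spectrum individually."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

# Savitzky-Golay smoothing (window of 11 bands, 2nd-order polynomial),
# optionally taking the first derivative to suppress baseline offsets.
smoothed = savgol_filter(snv(spectra), window_length=11, polyorder=2, axis=1)
first_deriv = savgol_filter(snv(spectra), window_length=11, polyorder=2, deriv=1, axis=1)
print(smoothed.shape, first_deriv.shape)
```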
Procedia PDF Downloads 99
5821 Integrated Free Space Optical Communication and Optical Sensor Network System with Artificial Intelligence Techniques
Authors: Yibeltal Chanie Manie, Zebider Asire Munyelet
Abstract:
5G and 6G technology offers enhanced quality of service with high data transmission rates, which necessitates the implementation of the Internet of Things (IoT) in 5G/6G architecture. In this paper, we proposed the integration of free space optical communication (FSO) with fiber sensor networks for IoT applications. Recently, free-space optical communications (FSO) are gaining popularity as an effective alternative technology to the limited availability of radio frequency (RF) spectrum. FSO is gaining popularity due to flexibility, high achievable optical bandwidth, and low power consumption in several applications of communications, such as disaster recovery, last-mile connectivity, drones, surveillance, backhaul, and satellite communications. Hence, high-speed FSO is an optimal choice for wireless networks to satisfy the full potential of 5G/6G technology, offering 100 Gbit/s or more speed in IoT applications. Moreover, machine learning must be integrated into the design, planning, and optimization of future optical wireless communication networks in order to actualize this vision of intelligent processing and operation. In addition, fiber sensors are important to achieve real-time, accurate, and smart monitoring in IoT applications. Moreover, we proposed deep learning techniques to estimate the strain changes and peak wavelength of multiple Fiber Bragg grating (FBG) sensors using only the spectrum of FBGs obtained from the real experiment.Keywords: optical sensor, artificial Intelligence, Internet of Things, free-space optics
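As a toy illustration of the FBG sensing idea (not the authors' model or experimental data), the sketch below trains a small neural network to estimate a Bragg peak wavelength directly from a sampled reflection spectrum; the Gaussian-shaped spectra are synthetic.

```python
# Toy sketch: learn to estimate an FBG peak wavelength from a sampled
# reflection spectrum. Synthetic Gaussian-shaped spectra stand in for the
# measured FBG spectra used in the study.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
wl = np.linspace(1545.0, 1555.0, 256)                   # nm, sampled wavelength grid
peaks = rng.uniform(1547.0, 1553.0, size=2000)          # strain shifts the Bragg peak
spectra = (np.exp(-((wl[None, :] - peaks[:, None]) / 0.15) ** 2)
           + rng.normal(0, 0.02, size=(2000, wl.size)))  # reflection spectra + noise

X_tr, X_te, y_tr, y_te = train_test_split(spectra, peaks, test_size=0.2, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
err = np.abs(model.predict(X_te) - y_te)
print(f"mean absolute peak-wavelength error: {err.mean() * 1000:.1f} pm")
```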
Procedia PDF Downloads 63
5820 Identification of High-Rise Buildings Using Object Based Classification and Shadow Extraction Techniques
Authors: Subham Kharel, Sudha Ravindranath, A. Vidya, B. Chandrasekaran, K. Ganesha Raj, T. Shesadri
Abstract:
Digitization of urban features is a tedious and time-consuming process when done manually. In addition to this problem, Indian cities have complex habitat patterns and convoluted clustering patterns, which make it even more difficult to map features. This paper makes an attempt to classify urban objects in the satellite image using object-oriented classification techniques in which various classes such as vegetation, water bodies, buildings, and shadows adjacent to the buildings were mapped semi-automatically. Building layer obtained as a result of object-oriented classification along with already available building layers was used. The main focus, however, lay in the extraction of high-rise buildings using spatial technology, digital image processing, and modeling, which would otherwise be a very difficult task to carry out manually. Results indicated a considerable rise in the total number of buildings in the city. High-rise buildings were successfully mapped using satellite imagery, spatial technology along with logical reasoning and mathematical considerations. The results clearly depict the ability of Remote Sensing and GIS to solve complex problems in urban scenarios like studying urban sprawl and identification of more complex features in an urban area like high-rise buildings and multi-dwelling units. Object-Oriented Technique has been proven to be effective and has yielded an overall efficiency of 80 percent in the classification of high-rise buildings.Keywords: object oriented classification, shadow extraction, high-rise buildings, satellite imagery, spatial technology
Procedia PDF Downloads 155
5819 System Identification of Timber Masonry Walls Using Shaking Table Test
Authors: Timir Baran Roy, Luis Guerreiro, Ashutosh Bagchi
Abstract:
Dynamic studies are important for the design, repair, and rehabilitation of structures. They have played an important role in characterizing the behavior of structures such as bridges, dams, and high-rise buildings. There has been substantial development in this area over the last few decades, especially in the field of dynamic identification techniques for structural systems. Frequency domain decomposition (FDD) and time domain decomposition are the most commonly used methods to identify modal parameters such as natural frequency, modal damping, and mode shape. The focus of the present research is to study the dynamic characteristics of typical timber masonry walls commonly used in Portugal. For that purpose, multi-storey structural prototypes of such walls have been tested on a seismic shake table at the National Laboratory for Civil Engineering, Portugal (LNEC). Signal processing was performed on the output response, collected with accelerometers during the shaking table experiment on the prototype. In the present work, the output response was processed, with reference to the input response, in two ways: FDD and stochastic subspace identification (SSI). In order to estimate the values of the modal parameters, algorithms for FDD were formulated and parametric functions for the SSI were computed. Finally, the values estimated by both methods are compared to measure the accuracy of the two techniques. Keywords: frequency domain decomposition (fdd), modal parameters, signal processing, stochastic subspace identification (ssi), time domain decomposition
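A minimal sketch of the FDD step, assuming only that several acceleration channels are available: build the cross power spectral density matrix, take an SVD at each frequency line, and read candidate natural frequencies from the peaks of the first singular value. The two synthetic channels below stand in for the shake-table accelerometer records.

```python
# Minimal Frequency Domain Decomposition sketch: cross power spectral density
# (CPSD) matrix of the measured channels, SVD per frequency line, and peaks of
# the first singular value as candidate modes.
import numpy as np
from scipy.signal import csd

fs = 200.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
# Two synthetic channels dominated by modes near 4 Hz and 11 Hz, plus noise.
ch1 = np.sin(2 * np.pi * 4 * t) + 0.4 * np.sin(2 * np.pi * 11 * t) + 0.2 * rng.normal(size=t.size)
ch2 = 0.7 * np.sin(2 * np.pi * 4 * t) - 0.6 * np.sin(2 * np.pi * 11 * t) + 0.2 * rng.normal(size=t.size)
channels = [ch1, ch2]

n = len(channels)
nperseg = 1024
freqs, G = None, None
for i in range(n):
    for j in range(n):
        f, Gij = csd(channels[i], channels[j], fs=fs, nperseg=nperseg)
        if G is None:
            freqs, G = f, np.zeros((len(f), n, n), dtype=complex)
        G[:, i, j] = Gij

# First singular value of the CPSD matrix at every frequency line.
s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(freqs))])
peak = freqs[np.argmax(s1)]
print(f"dominant peak near {peak:.2f} Hz")   # should sit close to 4 Hz here
```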
Procedia PDF Downloads 264
5818 Developing Oral Communication Competence in a Second Language: The Communicative Approach
Authors: Ikechi Gilbert
Abstract:
Oral communication is the transmission of ideas or messages through the speech process. Acquiring competence in this area which, by its volatile nature, is prone to errors and inaccuracies would require the adoption of a well-suited teaching methodology. Efficient oral communication facilitates exchange of ideas and easy accomplishment of day-to-day tasks, by means of a demonstrated mastery of oral expression and the making of fine presentations to audiences or individuals while recognizing verbal signals and body language of others and interpreting them correctly. In Anglophone states such as Nigeria, Ghana, etc., the French language, for instance, is studied as a foreign language, being used majorly in teaching learners who have their own mother tongue different from French. The same applies to Francophone states where English is studied as a foreign language by people whose official language or mother tongue is different from English. The ideal approach would be to teach these languages in these environments through a pedagogical approach that properly takes care of the oral perspective for effective understanding and application by the learners. In this article, we are examining the communicative approach as a methodology for teaching oral communication in a foreign language. This method is a direct response to the communicative needs of the learner involving the use of appropriate materials and teaching techniques that meet those needs. It is also a vivid improvement to the traditional grammatical and audio-visual adaptations. Our contribution will focus on the pedagogical component of oral communication improvement, highlighting its merits and also proposing diverse techniques including aspects of information and communication technology that would assist the second language learner communicate better orally.Keywords: communication, competence, methodology, pedagogical component
Procedia PDF Downloads 266
5817 Parameter Identification Analysis in the Design of Rock Fill Dams
Authors: G. Shahzadi, A. Soulaimani
Abstract:
This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code PLAXIS has been utilized for the numerical simulations. Polynomial and neural-network-based response surfaces have been generated to analyze the relationship between soil parameters and displacements. The performance of the surrogate models has been analyzed and compared by evaluating the root mean square error. A comparative study has been carried out on objective functions and optimization techniques. The objective functions are categorized by considering measured data with and without instrument uncertainty and are defined by the least-squares method, which estimates the norm between the predicted displacements and the measured values. Hydro Quebec provided the data sets of measured values for the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, is used to obtain an optimum value. Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) are compared on the minimization problem; all of these techniques take time to converge to an optimum value, but PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis can be used effectively for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers in assessing dam performance and dam safety. Keywords: Rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS
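The identification loop can be sketched compactly: a particle swarm searches the parameter space for the set that minimizes the least-squares misfit between measured and predicted displacements. In the Python sketch below, a simple algebraic forward model stands in for the PLAXIS runs or the response-surface surrogate, and the bounds and parameter names are invented for illustration.

```python
# Toy sketch of the identification loop: particle swarm optimization searching
# soil parameters that minimize the least-squares gap between "measured" and
# predicted displacements. The forward model is only a stand-in surrogate.
import numpy as np

rng = np.random.default_rng(0)

def forward_model(params):
    """Stand-in surrogate: displacements as a simple function of (E, phi)."""
    E, phi = params
    return np.array([50.0 / E + 0.1 * phi, 80.0 / E - 0.05 * phi, 30.0 / E])

true_params = np.array([2.0, 35.0])                  # "unknown" stiffness and friction angle
measured = forward_model(true_params) + rng.normal(0, 0.01, 3)

def objective(params):
    return np.sum((forward_model(params) - measured) ** 2)

# Plain PSO over box bounds.
lo, hi = np.array([0.5, 20.0]), np.array([10.0, 45.0])
n_particles, n_iter = 30, 200
x = rng.uniform(lo, hi, size=(n_particles, 2))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    vals = np.array([objective(p) for p in x])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = x[better], vals[better]
    gbest = pbest[np.argmin(pbest_val)]

print("identified parameters:", np.round(gbest, 3), "true:", true_params)
```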
Procedia PDF Downloads 146
5816 Interpretation and Prediction of Geotechnical Soil Parameters Using Ensemble Machine Learning
Authors: Goudjil kamel, Boukhatem Ghania, Jlailia Djihene
Abstract:
This paper delves into the development of a desktop application designed to calculate soil bearing capacity and predict limit pressure. Drawing on an extensive review of existing methodologies, the study examines the various approaches employed in soil bearing capacity calculations, elucidating their theoretical foundations and practical applications. Furthermore, the study explores the growing intersection of artificial intelligence (AI) and geotechnical engineering, underscoring the transformative potential of AI-driven solutions in enhancing predictive accuracy and efficiency. Central to the research is the use of machine learning techniques, including Artificial Neural Networks (ANN), XGBoost, and Random Forest, for predictive modeling. Through comprehensive experimentation and rigorous analysis, the efficacy and performance of each method are evaluated, with XGBoost emerging as the preeminent algorithm, showing superior predictive capabilities compared to its counterparts. The study culminates in a nuanced understanding of the dynamics at play in geotechnical analysis, offering valuable insights into optimizing soil bearing capacity calculations and limit pressure predictions. By harnessing advanced computational techniques and AI-driven algorithms, the paper points to a shift in geotechnical engineering practice, promising enhanced precision and reliability in civil engineering projects. Keywords: limit pressure of soil, xgboost, random forest, bearing capacity
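A compact sketch of the kind of model comparison described above: fit Random Forest and gradient-boosted tree regressors (the latter standing in for XGBoost) on a hypothetical table of soil descriptors and compare held-out error. The file name, feature columns, and target are assumptions, not the study's dataset.

```python
# Sketch of the model comparison: Random Forest vs. gradient-boosted trees
# (an XGBoost-style learner) predicting limit pressure from soil descriptors.
# Column names and the target are assumed placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

soils = pd.read_csv("pressuremeter_tests.csv")           # assumed layout
X = soils[["depth_m", "spt_n", "water_content", "plasticity_index"]]
y = soils["limit_pressure_kpa"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "random_forest": RandomForestRegressor(n_estimators=400, random_state=0),
    "gradient_boosting": GradientBoostingRegressor(n_estimators=400, learning_rate=0.05,
                                                   max_depth=3, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: R2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```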
Procedia PDF Downloads 22
5815 The Relationship between Resilient Qualities and Health Management in Video Testimonials of Adolescents and Young Adults with Cancer
Authors: A. Sainvil, J. Mallela, L. M. Pereira
Abstract:
Adolescents and young adults (AYA) diagnosed with cancer are tasked with managing their health through treatment, a time when reliance on and independence from parents may change in unexpected ways. Resilience allows patients to cope and manage their own health through treatment, promoting motivation and a healthier lifestyle. The film acts as a source of reflection through the cancer journey, which may have an impact on how patients cope. The current research investigated relationships between resilient linguistic qualities of the video narratives and attitudes toward personal health management. N=24 patients diagnosed between ages 11-18 were recruited. First, participants provided demographic information, then made a video testimonial about their cancer experience. After filming, participants then completed a questionnaire on the perceived benefits for themselves and others for making the video. Videos were transcribed and analyzed for thematic content via codebook and for linguistic qualities, indicating resilience with the use of the Linguistic Inquiry and Word Count Analysis Program (LIWC). Linear regressions were then calculated to explore relationships between resilient qualities, thematic content, and participants’ perceptions of their medical team and willingness to care for themselves. Participants who spoke with greater narrator connectedness were more likely to change their view of their medical team (β=.628 p=.034). When a participant believed that providers were likely to view their video, they were marginally more likely to want to take better care of themselves (β=.367, p=.078). Participants who spoke in depth about their health reported higher intention to take better care of themselves (β=.785, p=.033). AYAs with cancer who showcased certain resilient qualities within their narrative were more likely to consider taking better care of themselves. Additionally, the more patients reflected on their health, the more they wanted to take better care of themselves. These relationships were stronger when a patient believed that a provider would watch their video. Study findings highlight the utility of film in uncovering aspects of resilience and coping that may lead to healthier behaviors in AYAs with cancer.Keywords: adolescents, cancer, resilience, health management
Procedia PDF Downloads 89