Search results for: learning methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20344

14164 Dynamical Relation of Poisson Spike Trains in Hodgkin-Huxley Neural Ion Current Model and Formation of Non-Canonical Bases, Islands, and Analog Bases in DNA, mRNA, and RNA at or near the Transcription

Authors: Michael Fundator

Abstract:

This work applies biomathematical and biochemical research on neural network processes to the formation of non-canonical bases, islands, and analog bases in DNA and mRNA at or near transcription, a formation that contradicts long-standing statistical assumptions about the distribution of bases and analog base compounds. The analysis combines statistical and stochastic methods with quantum principles, in which the usual transience of the Poisson spike train becomes an instrumental tool for finding almost-periodic solutions to the Fokker-Planck stochastic differential equation. The article develops new multidimensional methods for solving stochastic differential equations through a more rigorous treatment based on the Kolmogorov-Chentsov continuity theorem, which allows stochastic processes with jumps, under certain conditions, to admit a γ-Hölder continuous modification; this modification serves as the basis for drawing parallels between the dynamics of neural networks and the formation of analog bases and transcription in DNA.
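For reference, the two mathematical objects named above take the following standard textbook forms (generic statements of the equation and the theorem, not the author's specific model):

```latex
% One-dimensional Fokker-Planck equation for the density p(x,t)
% with drift \mu and diffusion \sigma^2
\frac{\partial p(x,t)}{\partial t}
  = -\frac{\partial}{\partial x}\bigl[\mu(x,t)\,p(x,t)\bigr]
  + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl[\sigma^{2}(x,t)\,p(x,t)\bigr]

% Kolmogorov-Chentsov continuity criterion: if for some \alpha,\beta,C>0
\mathbb{E}\,\lvert X_t - X_s \rvert^{\alpha} \le C\,\lvert t-s \rvert^{1+\beta},
% then X admits a modification that is \gamma-H\"older continuous
% for every \gamma \in (0, \beta/\alpha).
```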

Keywords: Fokker-Planck stochastic differential equation, Kolmogorov-Chentsov continuity theorem, neural networks, translation and transcription

Procedia PDF Downloads 399
14163 Learning from Flood: A Case Study of a Frequently Flooded Village in Hubei, China

Authors: Da Kuang

Abstract:

Resilience is a hotly debated topic in many research fields (e.g., engineering, ecology, society, psychology). In flood management studies, a paradigm shift is under way from flood resistance to flood resilience. Flood resilience refers to the capacity to tolerate flooding through adaptation or transformation. It is increasingly argued that a city, as a social-ecological system, can learn from experience and adapt to flooding rather than simply resist it. This research investigates what adaptation knowledge a frequently flooded village has learned from past experience, and the advantages and limitations of that knowledge in coping with floods. The study area, Xinnongcun village, located west of Wuhan city, is a linear village that has suffered repeatedly from both flash floods and drainage floods over the past 30 years. We made a field trip to the site in June 2017 and conducted semi-structured interviews with local residents. The research identifies two types of adaptation knowledge that people have learned from past floods. First, at the village scale, a collective urban form has developed that helps people live through both the flood and dry seasons. All houses and front yards are elevated about 2 m above the road, and all front yards in the village are linked without barriers. During floods, people walk to their neighbours through the linked yards and travel out of the village by boat along the lower road. Second, at the individual scale, local people have acquired tacit knowledge of flood preparedness and emergency response. Regarding advantages and limitations, this adaptation knowledge effectively helps people live with floods and reduces the risk of injury, but it cannot reduce local farmers' losses on their agricultural land. After a flood, it is impossible for local people to recover to the pre-disaster state, because a flood in June or July results in no harvest. We therefore argue that learning from past flood experience can increase people's adaptive capacity; however, once that adaptive capacity can no longer reduce people's losses, a transformation to a better regime is required.

Keywords: adaptation, flood resilience, tacit knowledge, transformation

Procedia PDF Downloads 329
14162 Design and Optimization of the Damping System for Optical Scanning Equipment

Authors: Duy Nhat Tran, Van Tien Pham, Quang Trung Trinh, Tien Hai Tran, Van Cong Bui

Abstract:

In recent years, artificial intelligence and the Internet of Things have advanced significantly. Collecting image data and analyzing and processing tasks in real time have become increasingly common in many aspects of life. Optical scanning devices are widely used to observe and analyze different environments, whether fixed outdoors, mounted on mobile devices, or carried by unmanned aerial vehicles. As a result, the interaction between the physical environment and these devices has become more critical in terms of safety. Two commonly used methods for addressing these challenges are active and passive approaches. Each method has its advantages and disadvantages, but combining both can yield higher efficiency. One solution is to use direct-drive motors for position control, with real-time feedback within the operational range to determine appropriate control parameters with high precision. If the maximum motor torque is smaller than the inertial torque and the rotor reaches the operational limit, a spring system absorbs the impact force. Numerous experiments have been conducted to demonstrate the effectiveness of device protection during operation.

Keywords: optical device, collision safety, collision absorption, precise mechanics

Procedia PDF Downloads 56
14161 Effect of Polarized Light Therapy on Oral Mucositis in Cancer Patients Receiving Chemotherapy

Authors: Zakaria Mowafy Emam Mowafy, Hamed Abd Allah Hamed, Marwa Mahmoud Abd-Elmotalb, Andrew Anis Fakhray Mosaad

Abstract:

The purpose of this paper is to determine the efficacy of polarized light therapy for chemotherapy-treated cancer patients with oral mucositis. Evaluation was based on the WHO oral mucositis scale and the common toxicity criteria scale. Methods: Thirty cancer patients receiving chemotherapy (males and females) who had oral mucositis and ulceration pain, aged 30 to 55 years, were divided into two groups. Group A, composed of 15 patients, received Bioptron light therapy (BLT) in addition to routine medical care for oral mucositis. Group B received only routine medical care; BLT was applied for 10 minutes daily for 30 days. Results and conclusion: The application of BLT had valuable healing effects on oral mucositis in cancer patients receiving chemotherapy, as evidenced by marked decreases in both the WHO oral mucositis scale and the common toxicity criteria scale.

Keywords: Bioptron light therapy, oral mucositis, WHO oral mucositis scale, common toxicity criteria scale

Procedia PDF Downloads 96
14160 Needs of Omani Children in First Grade during Their Transition from Kindergarten to Primary School: An Ethnographic Study

Authors: Zainab Algharibi, Julie McAdam, Catherine Fagan

Abstract:

The purpose of this paper is to shed light on how Omani children in the first grade experience their needs during the transition to primary school. Theoretically, the paper builds on two perspectives: Dewey's concept of continuity of experience and the concept of boundary objects within Vygotsky's cultural-historical activity theory (CHAT). The methodology rests on the crucial role of children's agency, an important educational tool for enhancing children's participation in the learning process and developing their ability to face various issues in their lives. Data were obtained from 45 grade-one children in 4 different primary schools using drawing and visual narrative activities, in addition to researcher observations during the first weeks of the academic year. As the study dealt with children, all necessary ethical requirements were followed. This paper is original in that it addresses children's transition from kindergarten to primary school in Oman, and perhaps in the Arab region more broadly; it is therefore expected to fill an important gap and open a door for later research in this field. The drawings and visual narratives were analysed according to a social semiotics approach in two phases. The first reads the surface message, the "denotation"; the second goes deeper into the symbolism in what the children said and in the letters and signs they drew, known as the "signified". A video was recorded of each child talking about their drawing and expressing themselves, and the data were then organised and classified in a cross-data network. The researcher observations were analysed according to a model developed for grounded theory: recently collected observational data were compared with data previously encoded by the other methods (the children's drawings alongside the visual narratives) to identify similarities and differences, to clarify the meaning of the resulting categories, and to identify sub-categories with descriptions of possible links between them. This constitutes a form of triangulation in data collection. The study produced a set of findings, the most important being that the children's greatest interest concerns their social and psychological needs, such as friends, their teacher, and playing. Their biggest fears are a new place, a new teacher, and not having friends, while they showed less concern for their need for educational knowledge and skills.

Keywords: children’s academic needs, children’s social needs, transition, primary school

Procedia PDF Downloads 103
14159 Comparison of Methods for the Detection of Biofilm Formation in Yeast and Lactic Acid Bacteria Species Isolated from Dairy Products

Authors: Goksen Arik, Mihriban Korukluoglu

Abstract:

Lactic acid bacteria (LAB) and some yeast species are common microorganisms in dairy products, and most of them are responsible for food fermentation. Such cultures are isolated and used as starter cultures in the food industry because they standardise the final product during processing. The choice of starter culture is the most important step in the production of fermented food. Isolated LAB and yeast cultures that can form a biofilm layer may be preferred as starters, since biofilm formation can extend the period over which microorganisms remain usable as a starter. On the other hand, biofilm formation is an undesirable property in pathogens, since the biofilm structure allows a microorganism to become more resistant to stress conditions such as the presence of antibiotics. This resistance mechanism could, however, be turned into an advantage by promoting it in the beneficial microorganisms used as starter cultures in the food industry, including those with potential to stimulate the gastrointestinal system. Development of a biofilm layer is observed in some LAB and yeast strains; the resulting resistance could make these strains dominant in the human gastrointestinal microflora, allowing them to out-compete pathogenic microorganisms more easily. On this basis, 10 LAB and 10 yeast strains were isolated from various dairy products, such as cheese, yoghurt, kefir, and cream; samples were obtained from farmer markets and bazaars in Bursa, Turkey. All isolated strains were identified, and their ability to form biofilm was detected with two different methods and compared. The first goal of this research was to determine whether the isolates have the potential for biofilm production, and the second was to compare the validity of the two methods, known as the "tube method" and the "96-well plate-based method".
This study may offer an insight into developing a point of view about biofilm formation and its beneficial properties in LAB and yeast cultures used as a starter in the food industry.
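The 96-well plate-based method mentioned above is commonly read out by classifying crystal-violet optical densities against cut-offs derived from the negative control (the widely used scheme of Stepanovic et al.). A minimal sketch with hypothetical OD values, not the study's measurements:

```python
import statistics

def classify_biofilm(od, negative_control):
    """Classify a crystal-violet OD reading against the cut-off
    ODc = mean(negative control) + 3 * SD(negative control)."""
    odc = statistics.mean(negative_control) + 3 * statistics.stdev(negative_control)
    if od <= odc:
        return "non-producer"
    if od <= 2 * odc:
        return "weak"
    if od <= 4 * odc:
        return "moderate"
    return "strong"

# Hypothetical blank (uninoculated broth) wells and one isolate's reading
blanks = [0.08, 0.09, 0.10, 0.09]
print(classify_biofilm(0.50, blanks))  # strong
```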

Keywords: biofilm, dairy products, lactic acid bacteria, yeast

Procedia PDF Downloads 253
14158 Analysis of Atomic Models in High School Physics Textbooks

Authors: Meng-Fei Cheng, Wei Feng

Abstract:

New Taiwan high school standards emphasize employing scientific models and modeling practices in physics learning. However, to our knowledge, few studies address how scientific models and modeling are approached in current science teaching, and none examine the views of scientific models portrayed in textbooks. To explore these views, this study investigated the atomic unit in different textbook versions as an example and provides suggestions for a modeling curriculum. The study adopted a quantitative analysis of qualitative data in the atomic units of four mainstream versions of Taiwan high school physics textbooks. The models were analyzed along five dimensions of the views of scientific models (nature of models, multiple models, purpose of the models, testing models, and changing models), each with three levels (low, medium, high). Descriptive statistics were employed to compare the frequency with which the five dimensions were described in the atomic unit, to understand which views are emphasized, and to compare the frequency of use of the eight scientific models, to identify the atomic model used most often in the textbooks. Descriptive statistics were further used to examine the average levels of the five dimensions, to assess whether the textbooks' views approach the scientific view; the average levels of the eight atomic models were compared for the same purpose. The results revealed the following three major findings from the atomic unit. (1) Among the five dimensions of the views of scientific models, the most portrayed dimension was the 'purpose of models,' and the least portrayed dimension was 'multiple models.'
The most diverse view was the 'purpose of models,' and the most sophisticated scientific view was the 'nature of models.' The least sophisticated scientific view was 'multiple models.' (2) Among the eight atomic models, the most mentioned model was the atomic nucleus model, and the least mentioned model was the three states of matter. (3) Among the correlations between the five dimensions, the dimension of 'testing models' was highly related to the dimension of 'changing models.' In short, this study examined the views of scientific models based on the atomic units of physics textbooks to identify the emphasized and disregarded views in the textbooks. The findings suggest how future textbooks and curriculum can provide a thorough view of scientific models to enhance students' model-based learning.

Keywords: atomic models, textbooks, science education, scientific model

Procedia PDF Downloads 152
14157 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer

Authors: Nabil Saad, David Morgan, Manish Gupta

Abstract:

Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of aerosols' climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, accurate assessment of this effect on global warming, climate change, and air quality is made difficult by uncertainties in the calculation of the single scattering albedo (SSA). Experimental complications arise in determining the single scattering albedo of an aerosol particle, since it requires the simultaneous measurement of both scattering and extinction. Aerosol optical absorption, in particular, is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that the OEA offers for deriving the complex refractive index of aerosols and their single scattering albedo. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value of this novel open-path OEA.
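The single scattering albedo discussed above is simply the scattered fraction of total extinction (scattering plus absorption). A minimal sketch with illustrative coefficient values, not measurements from the presented instrument:

```python
def single_scattering_albedo(b_scat, b_abs):
    """SSA = b_scat / (b_scat + b_abs), for scattering and absorption
    coefficients in the same units (e.g. inverse megameters)."""
    return b_scat / (b_scat + b_abs)

# Example: a moderately absorbing aerosol
ssa = single_scattering_albedo(b_scat=80.0, b_abs=20.0)
print(round(ssa, 2))  # 0.8
```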

Keywords: aerosols, extinction, visibility, albedo

Procedia PDF Downloads 82
14156 Diagnosis of Rotavirus Infection among Egyptian Children by Using Different Laboratory Techniques

Authors: Mohamed A. Alhammad, Hadia A. Abou-Donia, Mona H. Hashish, Mohamed N. Massoud

Abstract:

Background: Rotavirus is the leading etiologic agent of severe diarrheal disease in infants and young children worldwide. The present study aimed (1) to detect rotavirus infection as a cause of diarrhoea among children under 5 years of age using two serological methods (ELISA and LA) and the PCR technique, and (2) to evaluate the three methodologies for human rotavirus detection in stool samples. Materials and Methods: This study was carried out on 247 children less than 5 years old, diagnosed clinically with acute gastroenteritis and attending Alexandria University Children Hospital at EL-Shatby. Rotavirus antigen was screened by ELISA and LA tests in all stool samples, whereas only 100 samples were subjected to the RT-PCR method for detection of rotavirus RNA. Results: Of the 247 studied cases with diarrhoea, rotavirus antigen was detected in 83 (33.6%) by ELISA and 73 (29.6%) by LA, while 44% of the 100 cases tested by RT-PCR had rotavirus RNA. Rotavirus diarrhoea showed a marked seasonal peak during autumn and winter (61.4%). Conclusion: The present study confirms the huge burden of rotavirus as a major cause of acute diarrhoea in Egyptian infants and young children. It was concluded that LA is equal in sensitivity to ELISA, ELISA is more specific than LA, and RT-PCR is more specific than both in the diagnosis of rotavirus infection.
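The sensitivity and specificity comparisons in the conclusion follow from standard 2x2 contingency-table calculations against a reference method. A minimal sketch with hypothetical counts (the abstract reports only aggregate positivity rates, so these numbers are illustrative):

```python
def sensitivity(tp, fn):
    """True-positive rate: fraction of reference-positive cases detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: fraction of reference-negative cases cleared."""
    return tn / (tn + fp)

# Hypothetical 2x2 table: antigen test vs. RT-PCR reference
tp, fp, fn, tn = 40, 3, 4, 53
print(f"sensitivity = {sensitivity(tp, fn):.2f}")  # 0.91
print(f"specificity = {specificity(tn, fp):.2f}")  # 0.95
```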

Keywords: rotavirus, diarrhea, immunoenzyme techniques, latex fixation tests, RT-PCR

Procedia PDF Downloads 364
14155 Content-Aware Image Augmentation for Medical Imaging Applications

Authors: Filip Rusak, Yulia Arzhaeva, Dadong Wang

Abstract:

Machine learning based Computer-Aided Diagnosis (CAD) is gaining popularity in medical imaging and diagnostic radiology. However, it requires a large amount of high-quality, labeled training image data. The training images may come from different sources, be acquired from radiography machines produced by different manufacturers, or be digital or digitized copies of film radiographs, with various sizes and different pixel intensity distributions. In this paper, a content-aware image augmentation method is presented to deal with these variations. The results of the proposed method have been validated graphically by plotting the removed and added seams of pixels on the original images. Two different chest X-ray (CXR) datasets are used in the experiments. The CXRs in the datasets differ in size; some are digital CXRs while others are digitized from analog CXR films. In the proposed content-aware augmentation method, the seam carving algorithm is employed to resize CXRs and the corresponding labels (in the form of image masks), followed by histogram matching to normalize the pixel intensities of digital radiographs based on the pixel intensity values of digitized radiographs. We implemented the algorithms, resized the well-known Montgomery dataset to the size of the most frequently used Japanese Society of Radiological Technology (JSRT) dataset, and normalized our digital CXRs for testing. This work resulted in a unified, off-the-shelf CXR dataset composed of radiographs from both the Montgomery and JSRT datasets. The experimental results show that even though the amount of augmentation is large, our algorithm adequately preserves the important information in lung fields, local structures, and the global visual effect.
The proposed method can be used to augment training and testing image data sets so that the trained machine learning model can be used to process CXRs from various sources, and it can be potentially used broadly in any medical imaging applications.
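The histogram-matching step of the pipeline described above can be sketched in pure NumPy: remap the source image's intensities so its cumulative distribution matches that of a reference image. This is a minimal grayscale illustration with toy arrays, not the paper's implementation (the seam-carving resizing step is omitted):

```python
import numpy as np

def match_histograms(source, reference):
    """Remap source intensities so their empirical CDF matches reference's."""
    src_vals, src_idx, src_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    # Empirical CDFs of both images
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # Map each source quantile onto the reference intensity scale
    matched_vals = np.interp(src_cdf, ref_cdf, ref_vals)
    return matched_vals[src_idx].reshape(source.shape)

src = np.array([[0, 0], [1, 2]], dtype=float)
ref = np.array([[10, 10], [20, 30]], dtype=float)
print(match_histograms(src, ref))
```

Because the two toy images have identical intensity distributions up to relabeling, the matched output reproduces the reference values exactly.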

Keywords: computer-aided diagnosis, image augmentation, lung segmentation, medical imaging, seam carving

Procedia PDF Downloads 209
14154 Recovery and Encapsulation of Marine Derived Antifouling Agents

Authors: Marina Stramarkou, Sofia Papadaki, Maria Kaloupi, Ioannis Batzakas

Abstract:

Biofouling is a complex problem for the aquaculture industry, as it reduces the efficiency of equipment and causes significant losses of cultured organisms. Current antifouling methods have proved labour-intensive, have limited lifetimes, and use toxic substances that result in fish mortality. Several species of marine algae produce a wide variety of biogenic compounds with antibacterial and antifouling properties, which are effective in the prevention and control of biofouling and can be incorporated into antifouling coatings. In the present work, Fucus spiralis, a species of macroalgae, and Chlorella vulgaris, a well-known species of microalgae, were used for the isolation and recovery of bioactive compounds belonging to the groups of fatty acids, lipopeptides, and amides. Recovery of the compounds was achieved by ultrasound-assisted extraction, an environmentally friendly method using green, non-toxic solvents. The antifouling agents were then coated by innovative encapsulation and coating methods, such as electro-hydrodynamic processing, with natural matrices such as polysaccharides and proteins used for encapsulation of the bioactive compounds. Water extracts incorporated in protein matrices were found to be the most efficient antifouling coating.

Keywords: algae, electrospinning, fatty acids, ultrasound-assisted extraction

Procedia PDF Downloads 336
14153 Difference between HDR Ir-192 and Co-60 Sources for High Dose Rate Brachytherapy Machines

Authors: Md Serajul Islam

Abstract:

High Dose Rate (HDR) brachytherapy is used to treat cancer patients. In our country, HDR is currently used only for cervical and breast cancer treatment. The air kerma rate in air at a reference distance of one meter from the source is the recommended quantity for the specification of the gamma-ray source Ir-192 in brachytherapy. The absorbed dose to the patient is directly proportional to the air kerma rate; therefore, the air kerma rate should be determined before the first use of the source on patients by a qualified medical physicist who is independent of the source manufacturer. The air kerma rate is then applied in the calculation of the dose delivered to patients in the treatment planning system. In practice, HDR Ir-192 afterloader machines are most commonly used in brachytherapy, although HDR Co-60 machines are increasingly coming into operation. The essential advantage of Co-60 sources is their longer half-life compared to Ir-192, which also makes HDR Co-60 afterloading machines attractive for developing countries. This work describes the dosimetry of HDR afterloading machines according to the protocol IAEA-TECDOC-1274 (2002) for the nuclides Ir-192 and Co-60. We used three different measurement methods (with a ring chamber, with a solid phantom and in free air, and with a well chamber) for each protocol. We have shown that the standard deviations of the measured air kerma rate for the Co-60 source are generally larger than those for the Ir-192 source. The measurements with the well chamber had the lowest deviation from the certificate value. Across all protocols and methods, the deviations remained within a maximum of about 1% for Ir-192 and 2.5% for Co-60 sources.
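The practical consequence of the half-life difference can be sketched with the standard exponential decay of the air kerma rate (half-life values are textbook figures, approximately 73.8 days for Ir-192 and 5.27 years for Co-60; the calculation is illustrative, not from the paper):

```python
import math

def remaining_fraction(days, half_life_days):
    """Fraction of the initial air kerma rate left after `days`,
    from A(t) = A0 * exp(-ln(2) * t / T_half)."""
    return math.exp(-math.log(2) * days / half_life_days)

# Compare both nuclides 90 days after source calibration
for name, t_half in [("Ir-192", 73.8), ("Co-60", 5.27 * 365.25)]:
    frac = remaining_fraction(90, t_half)
    print(f"{name}: {frac:.1%} of initial air kerma rate after 90 days")
```

An Ir-192 source decays to well under half its initial output within a few months and must be exchanged several times a year, whereas a Co-60 source loses only a few percent over the same period.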

Keywords: Ir-192 source, cancer, patients, cheap treatment cost

Procedia PDF Downloads 231
14152 Electrochemical Detection of Polycyclic Aromatic Hydrocarbons in Urban Air by Exfoliated Graphite Based Electrode

Authors: A. Sacko, H. Nyoni, T. A. M. Msagati, B. Ntsendwana

Abstract:

Carbon-based materials for targeting environmental pollutants have become increasingly recognized in science, and electrochemical methods using such materials are notable for highly sensitive detection of organic pollutants in air. It is in this light that an exfoliated graphite electrode was fabricated for the electrochemical analysis of PAHs in urban atmospheric air. The electrochemical properties of the graphite electrode were studied using CV and EIS in an acetate buffer supporting electrolyte with 2 mM ferricyanide as a redox probe. The graphite electrode showed an enhanced current response, confirming facile kinetics and enhanced sensitivity; however, the peak-to-peak separation (ΔE) increased as a function of scan rate, and EIS showed a high charge transfer resistance. The detection of phenanthrene on the exfoliated graphite electrode was studied in acetate buffer solution at pH 3.5 using DPV. The oxidation peak of phenanthrene was observed at 0.4 V. Under optimized conditions (supporting electrolyte, pH, deposition time, etc.), the detection limit was 5 × 10⁻⁸ M. The results thus demonstrate that, with further optimization and modification, detection of lower concentrations can be achieved.
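Detection limits like the one reported above are conventionally estimated from the blank noise and the calibration slope (LOD = 3 * SD of blank / sensitivity). A minimal sketch with illustrative numbers, not the paper's calibration data:

```python
def limit_of_detection(sd_blank, slope):
    """Calibration-curve LOD: 3 * (standard deviation of the blank
    signal) / (slope of the calibration line)."""
    return 3 * sd_blank / slope

# e.g. blank noise of 0.05 uA and a sensitivity of 3.0 uA per uM
print(round(limit_of_detection(0.05, 3.0), 4))  # 0.05 (uM)
```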

Keywords: electrochemical detection, exfoliated graphite, PAHs (polycyclic aromatic hydrocarbons), urban air

Procedia PDF Downloads 200
14151 Generation and Diagnostics of Atmospheric Pressure Dielectric Barrier Discharge in Argon/Air

Authors: R. Shrestha, D. P. Subedi, R. B. Tyata, C. S. Wong

Abstract:

In this paper, a technique for the determination of electron temperature and electron density in an atmospheric pressure argon/air discharge by analysis of optical emission spectra (OES) is reported. The discharge was produced using a high voltage (0-20 kV) power supply operating at a frequency of 27 kHz in a parallel electrode system, with glass as the dielectric. The dielectric layers covering the electrodes act as current limiters and prevent the transition to an arc discharge. Optical emission spectra in the range 300-850 nm were recorded for the discharge at different inter-electrode gaps, keeping the electric field constant. The electron temperature (Te) and electron density (ne) were estimated by electrical and optical methods: the electron density was calculated using the power balance method, while the optical methods rely on the line intensity ratio obtained from the relative intensities of Ar I and Ar II lines in the argon plasma. The electron density calculated by the line intensity ratio method was compared with that calculated by the Stark broadening method. The effect of dielectric thickness on the plasma parameters (Te and ne) was also studied; both Te and ne were found to increase as the thickness of the dielectric decreases for the same inter-electrode distance and applied voltage.
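The line intensity ratio method mentioned above rests on the Boltzmann relation between two emission lines of the same species. In standard notation (a generic statement, not the authors' specific Ar line pair), with I the line intensity, A the transition probability, g the statistical weight, λ the wavelength, and E the upper-level energy:

```latex
\frac{I_1}{I_2}
  = \frac{A_1 g_1 \lambda_2}{A_2 g_2 \lambda_1}
    \exp\!\left(-\frac{E_1 - E_2}{k_B T_e}\right)
\quad\Longrightarrow\quad
T_e = \frac{E_2 - E_1}
       {k_B \,\ln\!\dfrac{I_1 A_2 g_2 \lambda_1}{I_2 A_1 g_1 \lambda_2}}
```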

Keywords: electron density, electron temperature, optical emission spectra

Procedia PDF Downloads 489
14150 Impact of COVID-19 on Antenatal Care Provision at Public Hospitals in Ethiopia: A Mixed Method Study

Authors: Zemenu Yohannes

Abstract:

Introduction: The pandemic overstretched the weak health systems of developing countries, including Ethiopia. This study aims to assess and explore the effect of COVID-19 on antenatal care (ANC) provision. Methods: A concurrent mixed methods study was applied: an interrupted time series design for the quantitative component, and in-depth interviews for the qualitative component to explore maternity care providers' perceptions of ANC provision during COVID-19. We used routinely collected monthly data from the health management information system (HMIS) of fifteen hospitals in the Sidama region, Ethiopia, from March 2019 to February 2020 (12 months before COVID-19) and from March to August 2020 (6 months during COVID-19). Data were imported into STATA V.17 for analysis. The mean monthly incidence rate ratio (IRR) of ANC provision was calculated using Poisson regression with a 95% confidence interval. The qualitative data were analysed using thematic analysis, and findings from the quantitative and qualitative elements were integrated using a contiguous approach. Results: The rate of ANC provision decreased significantly in the first six months of COVID-19. Three main themes were identified: barriers to ANC provision, an inadequate COVID-19 prevention approach, and delays in providing ANC. Conclusion and recommendation: The pandemic affected ANC provision in the study area. The health bureau and stakeholders should take a novel and sustainable approach to preparing for future pandemics. The health bureau and hospital administrators should establish a financially self-reliant task force to close gaps in medical supplies during future pandemics, and maternity care providers should deliver care to pregnant women promptly. To foster contact and avoid discrimination during future pandemics, hospital administrators should also set up a platform for community members and maternity care providers.
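The incidence rate ratio reported by such a design can be illustrated with a simple rate comparison and a normal-approximation confidence interval on the log scale (the counts below are hypothetical; the study itself fitted Poisson regression models on HMIS data):

```python
import math

def irr_with_ci(events_a, time_a, events_b, time_b):
    """IRR of period B vs. period A with a 95% Wald CI on log(IRR),
    using SE = sqrt(1/events_a + 1/events_b)."""
    irr = (events_b / time_b) / (events_a / time_a)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, (lo, hi)

# Hypothetical: 12 pre-pandemic months vs. 6 pandemic months of ANC visits
irr, (lo, hi) = irr_with_ci(events_a=2400, time_a=12, events_b=900, time_b=6)
print(f"IRR = {irr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

An IRR below 1 with a confidence interval excluding 1 would indicate a statistically significant drop in the monthly visit rate during the pandemic period.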

Keywords: ANC provision, COVID-19, mixed methods study, Ethiopia

Procedia PDF Downloads 66
14149 Stress Reduction Techniques for First Responders: Scientifically Proven Methods

Authors: Esther Ranero Carrazana, Maria Karla Ramirez Valdes

Abstract:

First responders, including firefighters, police officers, and emergency medical personnel, are frequently exposed to high-stress scenarios that significantly increase their risk of mental health issues such as depression, anxiety, and post-traumatic stress disorder (PTSD). Their work involves life-threatening situations, witnessing suffering, and making critical decisions under pressure, all contributing to psychological strain. The objectives of this research on "Stress Reduction Techniques for First Responders: Scientifically Proven Methods" are as follows. One of them is to evaluate the effectiveness of stress reduction techniques. The primary objective is to assess the efficacy of various scientifically proven stress reduction techniques explicitly tailored for first responders. Heart Rate Variability (HRV) Training, Interoception and Exteroception, Sensory Integration, and Body Perception Awareness are scrutinized for their ability to mitigate stress-related symptoms. Furthermore, we evaluate and enhance the understanding of stress mechanisms in first responders by exploring how different techniques influence the physiological and psychological responses to stress. The study aims to deepen the understanding of stress mechanisms in high-risk professions. Additionally, the study promotes psychological resilience by seeking to identify and recommend methods that can significantly enhance the psychological resilience of first responders, thereby supporting their mental health and operational efficiency in high-stress environments. Guide training and policy development is an additional objective to provide evidence-based recommendations that can be used to guide training programs and policy development aimed at improving the mental health and well-being of first responders. Lastly, the study aims to contribute valuable insights to the existing body of knowledge in stress management, specifically tailored to the unique needs of first responders. 
This study involved a comprehensive literature review assessing the effectiveness of various stress reduction techniques tailored for first responders. Techniques evaluated include Heart Rate Variability (HRV) Training, Interoception and Exteroception, Sensory Integration, and Body Perception Awareness, focusing on their ability to alleviate stress-related symptoms. The review indicates promising results for several stress reduction methods. HRV Training demonstrates the potential to reflect stress vulnerability and enhance physiological and behavioral flexibility. Interoception and Exteroception help modulate the stress response by enhancing awareness of the body's internal state and its interaction with the environment. Sensory Integration plays a crucial role in adaptive responses to stress by focusing on individual senses and their integration. Finally, Body Perception Awareness addresses stress and anxiety through enhanced body perception and mindfulness. The evaluated techniques show significant potential in reducing stress and improving the mental health of first responders. Implementing these scientifically supported methods into routine training could significantly enhance their psychological resilience and operational effectiveness in high-stress environments.
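HRV training relies on quantifiable heart-rate-variability metrics. As an illustration (the metric choice and the data below are ours, not the study's), the widely used time-domain RMSSD statistic can be computed from successive RR intervals:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD), a standard
    time-domain HRV statistic; higher values broadly indicate stronger
    parasympathetic (vagal) activity."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR intervals (ms): small beat-to-beat swings at rest versus
# the large respiratory oscillations induced by slow, paced breathing.
at_rest = [812, 806, 815, 804, 818, 802]
paced   = [850, 780, 860, 775, 865, 770]

print(f"RMSSD at rest: {rmssd(at_rest):.1f} ms, paced: {rmssd(paced):.1f} ms")
```

In HRV biofeedback protocols, a rise in such a metric during paced breathing is one common operational target.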

Keywords: first responders, HRV training, mental health, sensory integration, stress reduction

Procedia PDF Downloads 25
14148 Typification and Determination of Antibiotic Susceptibility Profiles with E Test Methods of Anaerobic Gram Negative Bacilli Isolated from Various Clinical Specimen

Authors: Cengiz Demir, Recep Keşli, Gülşah Aşık

Abstract:

Objective: This study was carried out to identify, using the E-test method, and to determine the antibiotic resistance profiles of Gram-negative anaerobic bacilli isolated from various clinical specimens obtained from patients with suspected anaerobic infections referred to the Medical Microbiology Laboratory of Afyon Kocatepe University, ANS Application and Research Hospital. Methods: Two hundred and seventy-eight clinical specimens were examined for the isolation of anaerobic bacteria in the Medical Microbiology Laboratory between 1 November 2014 and 30 October 2015. Specimens were cultivated on Schaedler agar supplemented with 5% defibrinated sheep blood, and in Schaedler broth. The isolated anaerobic Gram-negative bacilli were identified by conventional methods and Vitek 2 (ANC ID Card, bioMerieux, France) cards. Resistance rates against penicillin G, clindamycin, cefoxitin, metronidazole, moxifloxacin, imipenem, meropenem, ertapenem, and doripenem were determined with the E-test method for each isolate. Results: Of the twenty-eight anaerobic Gram-negative bacilli isolated, fourteen were identified as the B. fragilis group, nine as the Prevotella group, and five as the Fusobacterium group. The highest resistance rate was found against penicillin (78.5%); resistance rates against clindamycin and cefoxitin were 17.8% and 21.4%, respectively. No resistance was found against metronidazole, moxifloxacin, imipenem, meropenem, ertapenem, or doripenem. Conclusion: Since a high resistance rate against penicillin was detected in the study, penicillin should not be preferred in empirical treatment. Cefoxitin can be preferred in empirical treatment; however, carrying out antibiotic sensitivity testing will be more proper and beneficial. No resistance was observed against the carbapenem group antibiotics or metronidazole; for that reason, these antibiotics should be reserved for the treatment of infections caused by resistant strains in the future.
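The reported percentages follow from isolate counts. A quick sketch (the counts 22, 5, and 6 of 28 are back-calculated assumptions, not reported figures; they reproduce the abstract's values only if rates are truncated, rather than rounded, to one decimal):

```python
def resistance_rate(resistant, total, decimals=1):
    """Resistance rate in percent, truncated (not rounded) to `decimals`."""
    scale = 10 ** decimals
    return int(100 * scale * resistant / total) / scale

# Hypothetical counts out of the 28 isolates (back-calculated assumptions):
for drug, r in [("penicillin G", 22), ("clindamycin", 5), ("cefoxitin", 6)]:
    print(f"{drug}: {resistance_rate(r, 28)}%")
```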

Keywords: anaerobic gram-negative bacilli, anaerobe, antibiotics and resistance profiles, e-test method

Procedia PDF Downloads 295
14147 The Case for Reparations: Systemic Injustice and Human Rights in the United States

Authors: Journey Whitfield

Abstract:

This study investigates the United States' ongoing violation of Black Americans' fundamental human rights, as evidenced by mass incarceration, social injustice, and economic deprivation. It argues that the U.S. contravenes Article 9 of the International Covenant on Civil and Political Rights through policies that uphold systemic racism. The analysis dissects current practices within the criminal justice system, social welfare programs, and economic policy, uncovering the racially disparate impacts of seemingly race-neutral policies. This study establishes a clear lineage between past systems of oppression – slavery and Jim Crow – and present-day racial disparities, demonstrating their inextricable link. The thesis proposes that only a comprehensive reparations program for Black Americans can begin to redress these systemic injustices. This program must transcend mere financial compensation, demanding structural reforms within U.S. institutions to dismantle systemic racism and promote transformative justice. This study explores potential forms of reparations, drawing upon historical precedents, comparative case studies from other nations, and contemporary debates within political philosophy and legal studies. The research employs both qualitative and quantitative methods. Qualitative methods include historical analysis of legal frameworks and policy documents, as well as discourse analysis of political rhetoric. Quantitative methods involve statistical analysis of socioeconomic data and criminal justice outcomes to expose racial disparities. This study makes a significant contribution to the existing literature on reparations, human rights, and racial injustice in the United States. It offers a rigorous analysis of the enduring consequences of historical oppression and advocates for bold, justice-centered solutions.

Keywords: Black Americans, reparations, mass incarceration, racial injustice, human rights, united states

Procedia PDF Downloads 52
14146 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms

Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee

Abstract:

Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques to assess the condition of the composites to prevent continual growth of fiber damage are required. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by means of detecting damage or defects from static or dynamic responses induced by external loading. A variety of techniques based on detecting the changes in static or dynamic behavior of isotropic structures has been developed in the last two decades. These methods, based on analytical approaches, are limited in their capabilities in dealing with complex systems, primarily because of their limitations in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristics techniques and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA) methods, and neural networks (NN), and have promisingly applied these methods to the field of structural identification. 
Among these, GAs attract our attention because they do not require a considerable amount of prior data when dealing with complex problems, and they make a global solution search possible, as opposed to classical gradient-based optimization techniques. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect fiber property variation in laminated composite plates from the micromechanical point of view. A finite element model is used to study the free vibrations of laminated composite plates with fiber stiffness degradation. To solve the inverse problem using the combined method, only the first mode shapes of the structure are used for the measured frequency data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
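The GA-based inverse search can be illustrated with a deliberately simplified one-parameter version of the problem (the stiffness model, values, and GA settings below are hypothetical stand-ins, not the study's FE model):

```python
import math
import random

random.seed(42)

# Toy inverse problem: recover a fiber-stiffness degradation factor d from a
# "measured" natural frequency. The real study matches FE mode shapes of GFRP
# laminates; this 1-D model (f ~ sqrt(k0*(1-d))) is only an illustration.
K0 = 4.0e4                       # hypothetical undamaged stiffness
TRUE_D = 0.30                    # hypothetical damage factor to identify
f_measured = math.sqrt(K0 * (1.0 - TRUE_D))

def error(d):
    return (math.sqrt(K0 * (1.0 - d)) - f_measured) ** 2

POP, GENS, SIGMA = 40, 60, 0.05
population = [random.random() for _ in range(POP)]

for _ in range(GENS):
    population.sort(key=error)                        # rank by fitness
    parents = population[:POP // 2]                   # elitist selection
    children = []
    for _ in range(POP // 2):
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b)                         # arithmetic crossover
        child += random.gauss(0.0, SIGMA)             # Gaussian mutation
        children.append(min(max(child, 0.0), 0.999))  # keep d in [0, 1)
    population = parents + children                   # elitism keeps the best

best = min(population, key=error)
print(f"identified damage factor: {best:.3f} (true: {TRUE_D})")
```

No gradients are needed, which is exactly the property that makes GAs attractive for this class of inverse problem.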

Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences

Procedia PDF Downloads 261
14145 A Mathematical Model for Reliability Redundancy Optimization Problem of K-Out-Of-N: G System

Authors: Gak-Gyu Kim, Won Il Jung

Abstract:

With the remarkable development of science and technology, the functions and roles of engineered systems have recently diversified. Systems have become increasingly complex and precise, and system designers intent on maximizing reliability therefore concentrate more effort at the design stage. This study deals with the reliability redundancy optimization problem (RROP) for a k-out-of-n: G system configuration with cold-standby and warm-standby components. It presents an optimal mathematical model through which three elements, (i) the choice among multiple components, (ii) the quantity of redundant components, and (iii) the choice of redundancy strategy, may be combined in order to maximize the reliability of the system. We therefore focus on three issues. First, we consider an RROP in which components may be in a warm-standby as well as a cold-standby state. Second, eliminating the approximation approaches of previous RROP studies, we construct a precise model for system reliability. Third, given the transition time when the state of a component changes, we present not simply a workable solution but an advanced method. Moreover, for the wide applicability of RROPs, we use an absorbing continuous-time Markov chain and matrix analytic methods in the suggested mathematical model.
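The absorbing-CTMC machinery can be sketched for the simplest special case: a k-out-of-n:G system of identical components with no standby or repair (a simplification made here for brevity; the paper's model also covers cold and warm standby). The mean time to failure is the row sum of the fundamental matrix N = -Q_TT^{-1} of the transient-state generator:

```python
import numpy as np

def mttf_k_out_of_n(n, k, lam):
    """MTTF of a k-out-of-n:G system of identical components (failure rate
    lam, no standby, no repair) via the absorbing CTMC: transient states are
    n, n-1, ..., k working components; fewer than k working is absorbing."""
    states = list(range(n, k - 1, -1))          # transient states
    m = len(states)
    Q = np.zeros((m, m))                        # transient-to-transient generator
    for i, s in enumerate(states):
        Q[i, i] = -s * lam                      # total exit rate from state s
        if i + 1 < m:
            Q[i, i + 1] = s * lam               # next component failure
    N = -np.linalg.inv(Q)                       # fundamental matrix
    return N[0].sum()                           # expected time to absorption

# 2-out-of-3:G system; matches the analytic sum 1/(3*lam) + 1/(2*lam)
print(mttf_k_out_of_n(3, 2, 0.1))
```

The same fundamental-matrix step carries over when standby states and transition times enlarge the generator.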

Keywords: RROP, matrix analytic methods, k-out-of-n: G system, MTTF, absorbing continuous time Markov Chain

Procedia PDF Downloads 251
14144 Feasibility Studies through Quantitative Methods: The Revamping of a Tourist Railway Line in Italy

Authors: Armando Cartenì, Ilaria Henke

Abstract:

Recently, the Italian government approved a new law on public contracts and has been laying the groundwork for restarting a planning phase. The government adopted the indications given by the European Commission regarding the estimation of external costs within cost-benefit analysis, and approved the 'Guidelines for Assessment of Investment Projects'. In compliance with the new Italian law, the aim of this research was to perform a feasibility study applying quantitative methods to the revamping of an Italian tourist railway line. A cost-benefit analysis was performed, starting from the quantification of the passenger demand potentially interested in using the revamped rail services. The benefits due to the reduction of external costs were also quantified, as variations with respect to the do-nothing scenario: climate change, air pollution, noise, congestion, and accidents. The estimation results are expressed in terms of measures of effectiveness, showing a positive net present value of about 27 million euros, an internal rate of return much greater than the discount rate, a benefit/cost ratio equal to 2, and a payback period of 15 years.
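The measures of effectiveness quoted here follow from standard discounting formulas. A minimal sketch with hypothetical cash flows (illustrative values, not the project's actual figures):

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[t] is the net cash flow in year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """First year in which the cumulative (undiscounted) balance turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical stream (EUR millions): a year-0 investment followed by
# constant annual net benefits from demand and external-cost savings.
rate = 0.03
flows = [-45.0] + [3.0] * 30
benefits = [0.0] + [3.0] * 30

print(f"NPV     = {npv(rate, flows):.1f} M EUR")
print(f"B/C     = {npv(rate, benefits) / 45.0:.2f}")
print(f"payback = {payback_period(flows)} years")
```

The internal rate of return would be found as the rate at which `npv` crosses zero, e.g. by bisection over `rate`.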

Keywords: cost-benefit analysis, evaluation analysis, demand management, external cost, transport planning, quality

Procedia PDF Downloads 212
14143 Deconvolution of Anomalous Fast Fourier Transform Patterns for Tin Sulfide

Authors: I. Shuro

Abstract:

The crystal structure of tin sulfide prepared by certain chemical methods is investigated using high-resolution transmission electron microscopy (HRTEM), scanning electron microscopy (SEM), and X-ray diffraction (XRD). An anomalous HRTEM fast Fourier transform (FFT) pattern was observed: a central cluster of diffraction spots surrounded by secondary clusters of spots arranged in a hexagonal pattern around it. FFT analysis revealed a long lattice parameter, with the structure mostly viewed along a hexagonal axis in which many columns of atoms are slightly displaced from one another. This analysis indicates that the metal sulfide has a long-range-ordered, interwoven chain of atoms in its crystal structure. The observed crystalline structure is inconsistent with commonly observed FFT patterns of chemically synthesized tin sulfide nanocrystals and thin films. SEM analysis showed a myriad of multi-shaped crystals, ranging from hexagonal, cubic, and spherical micro- to nanostructured crystals. This study also investigates the presence of quasi-crystals, as reflected by the presence of mixed local symmetries.

Keywords: fast fourier transform, high resolution transmission electron microscopy, tin sulfide, crystalline structure

Procedia PDF Downloads 136
14142 Study on the Geometric Similarity in Computational Fluid Dynamics Calculation and the Requirement of Surface Mesh Quality

Authors: Qian Yi Ooi

Abstract:

At present, airfoil parameters are still designed and optimized for the scale of conventional aircraft, leaving slight deviations related to scale differences. If these small deviations are carried over to a future civil aircraft of a very different size, such as a potentially promising blended-wing-body (BWB) aircraft, insufficient parameters or poor surface mesh quality may result, producing large deviations in geometric similarity in computational fluid dynamics (CFD) simulations. To avoid this situation, this study examines the geometric similarity of airfoil parameters and surface mesh quality in CFD calculations, assessing how different parameterization methods perform at different airfoil scales. The research objects are three airfoil scales, corresponding to the wing root and wingtip of a conventional civil aircraft and the wing root of the giant hybrid wing, each parameterized by three methods so that calculation differences between airfoil sizes can be compared. This study holds constant the NACA 0012 profile, a Reynolds number of 10 million, an angle of attack of zero, a C-grid for meshing, and the k-epsilon (k-ε) turbulence model. The experimental variables are three airfoil parameterization methods: the point cloud method, the B-spline curve method, and the class function/shape function transformation (CST) method. The airfoil dimensions are set to 3.98, 17.67, and 48 meters, respectively. In addition, different numbers of edge mesh divisions and the same bias factor are used in the CFD simulation. The results show that as the airfoil scale changes, different parameterization methods, numbers of control points, and mesh division counts should be used to preserve the accuracy of the wing's aerodynamic performance. 
As the airfoil scale increases, the most basic point cloud parameterization requires ever more and larger data to maintain the accuracy of the airfoil's aerodynamic performance, facing the severe test of insufficient computer capacity. With the B-spline curve method, the number of control points and the number of mesh divisions must be set appropriately to obtain higher accuracy; however, the balance between them cannot be defined directly and must be found iteratively by adding and subtracting. Lastly, with the CST method, a limited number of control points is enough to accurately parameterize the larger-sized wing, and a higher degree of accuracy and stability can be obtained even on a lower-performance computer.
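For reference, a minimal sketch of the CST parameterization discussed above (Kulfan's class/shape transformation; the Bernstein weights here are placeholders, not values fitted to NACA 0012):

```python
from math import comb

def cst_airfoil(x, weights, n1=0.5, n2=1.0, dz_te=0.0):
    """Class-Shape Transformation surface: y = C(x)*S(x) + x*dz_te, with
    class function C(x) = x^n1 * (1-x)^n2 (round nose, sharp trailing edge
    for n1=0.5, n2=1.0) and shape function S(x) as a Bernstein expansion."""
    n = len(weights) - 1
    c = x ** n1 * (1.0 - x) ** n2
    s = sum(w * comb(n, i) * x ** i * (1.0 - x) ** (n - i)
            for i, w in enumerate(weights))
    return c * s + x * dz_te

# With all Bernstein weights equal to 1, S(x) = 1 and y = sqrt(x) * (1 - x).
weights = [1.0] * 6
for x in (0.0, 0.25, 1.0):
    print(x, cst_airfoil(x, weights))
```

The point the abstract makes follows directly from this form: the entire surface at any chord length is captured by the handful of entries in `weights`, rather than by a scale-dependent cloud of points.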

Keywords: airfoil, computational fluid dynamics, geometric similarity, surface mesh quality

Procedia PDF Downloads 215
14141 Various Models of Quality Management Systems

Authors: Mehrnoosh Askarizadeh

Abstract:

People, processes, and IT are the most important assets of any organization, and the optimal utilization of these resources has been a research question in business for many decades. The business world has responded by inventing various methodologies for addressing problems of quality improvement, process efficiency, continuous improvement, waste reduction, automation, strategy alignment, etc. Some of these methodologies can be collectively called business process quality management (BPQM) methodologies. In essence, the first references to process management can be traced back to Frederick Taylor and scientific management: time and motion study was directed at improving the efficiency of manufacturing processes. The ideas of scientific management remained in use for quite a long period, until more advanced quality management techniques were developed in Japan and the USA. One of the first prominent methods was Total Quality Management (TQM), which evolved during the 1980s. At about the same time, Six Sigma (SS) originated at Motorola as a separate method; it spread, evolved, and later joined with the ideas of lean manufacturing to form Lean Six Sigma. In the 1990s, owing to emerging IT technologies, the beginning of globalization, and strengthening competition, companies recognized the need for better process and quality management. Business Process Management (BPM) emerged as a novel methodology that takes all of this into account and helps align IT technologies with business processes and quality management. This article studies various aspects of the above-mentioned methods and identifies their relations.

Keywords: e-process, quality, TQM, BPM, lean, six sigma, CPI, information technology, management

Procedia PDF Downloads 431
14140 Assessment of Rainfall Erosivity, Comparison among Methods: Case of Kakheti, Georgia

Authors: Mariam Tsitsagi, Ana Berdzenishvili

Abstract:

Change in rainfall intensity is one of the main indicators of climate change, and it greatly influences agriculture as one of the main factors causing soil erosion. Splash and sheet erosion are among the most prevalent types and the most harmful to agriculture: invisible to the eye at first, the process gradually develops into stream-cutting erosion. Our study assesses rainfall erosivity potential in the Kakheti region using modern research methods. The region is the country's major provider of wheat and wine. Kakheti is located in the eastern part of Georgia and is characterized by quite varied natural conditions; the climate is dry subtropical. Assessing the exact rainfall erosivity potential requires several years of rainfall data recorded at short intervals. Unfortunately, of the 250 meteorological stations operating during the Soviet period, only 55 are active now, five of them in the Kakheti region. Rainfall intensity data for the region exist from 1936, and rainfall erosivity potential was assessed in some older papers, but since 1990 no intensity data are available, even though intensity is a necessary parameter for determining rainfall erosivity potential. On the other hand, researchers and local communities believe that rainfall intensity has been changing and that the number of hail days has been increasing. It is therefore important to find a method that determines rainfall erosivity potential in Kakheti as accurately as possible. The study period was divided into three sections: 1936-1963, 1963-1990, and 1990-2015. For the first two periods, rainfall erosivity potential was determined from the scientific literature and data from the old meteorological stations; it is known that in eastern Georgia, at the boundary between the steppe and forest zones, rainfall erosivity in 1963-1990 was 20-75% higher than in 1936-1963. 
For the third period (1990-2015), no rainfall intensity data exist. A variety of studies discuss alternative ways of calculating rainfall erosivity potential when data are lacking, e.g., from daily rainfall data, or from average annual rainfall and the elevation of the area. It should be noted that such methods give totally different results under different climatic conditions, sometimes with huge errors. Three of the most common methods were selected for our research, and each was tested on the first two sections of the study period. Based on the outcomes, the method most suitable for the regional climatic conditions was selected, and rainfall erosivity potential for the third section was then determined using it. The resulting attribute tables and graphs were linked to the GIS database of Kakheti, and appropriate thematic maps were created. The results allowed us to analyze the changes in rainfall erosivity potential from 1936 to the present and to consider future prospects. The method we implemented can also be used for other regions of Georgia.
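One widely used alternative of the kind mentioned, applicable when only monthly totals are available, is the modified Fournier index. A sketch with hypothetical monthly data (offered as an example of such a proxy method, not necessarily one of the three the study tested):

```python
def modified_fournier_index(monthly_mm):
    """MFI = sum(p_i^2) / P over the 12 monthly totals p_i, with P the
    annual total. A common proxy for rainfall erosivity when short-interval
    intensity records are unavailable: the more the same annual total is
    concentrated in a few months, the higher the index."""
    annual = sum(monthly_mm)
    return sum(p * p for p in monthly_mm) / annual

# Hypothetical monthly precipitation (mm) for a dry-subtropical station
monthly = [25, 30, 45, 60, 90, 80, 55, 50, 45, 55, 40, 30]
print(f"MFI = {modified_fournier_index(monthly):.1f}")
```

For perfectly uniform rainfall the index equals the mean monthly total, and it grows as the distribution becomes more concentrated, which is why it tracks erosive potential.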

Keywords: erosivity potential, Georgia, GIS, Kakheti, rainfall

Procedia PDF Downloads 215
14139 Correlation of Material Mechanical Characteristics Obtained by Means of Standardized and Miniature Test Specimens

Authors: Vaclav Mentl, P. Zlabek, J. Volak

Abstract:

New methods of mechanical testing based on miniature test specimens (e.g., the small punch test) have been developed recently. Their most important advantage is the nearly non-destructive withdrawal of test material and the small size of the test specimens, which is attractive for remaining-lifetime assessment when a sufficient volume of representative material cannot be withdrawn from the component in question. Conversely, their most important disadvantage stems from the necessity to correlate test results with the results of standardized test procedures and to build up a database of in-service material data. Correlations between miniature-specimen data and the results of standardized tests are therefore necessary. The paper describes the results of fatigue tests performed on miniature test specimens in comparison with traditional fatigue tests for several steels used in the power-producing industry. Special fixtures for the miniature test specimens were designed and manufactured for fatigue testing on the Zwick/Roell 10HPF5100 testing machine, and the miniature test specimens were produced from the traditional test specimens. Seven different steels were fatigue loaded (R = 0.1) at room temperature.

Keywords: mechanical properties, miniature test specimens, correlations, small punch test, micro-tensile test, mini-charpy impact test

Procedia PDF Downloads 529
14138 Streamlining .NET Data Access: Leveraging JSON for Data Operations in .NET

Authors: Tyler T. Procko, Steve Collins

Abstract:

New features in .NET (6 and above) permit streamlined access to information residing in JSON-capable relational databases, such as SQL Server (2016 and above). Traditional methods of data access involve comparatively unnecessary steps which compromise system performance. This work posits that the established ORM (Object Relational Mapping) based methods of data access in applications and APIs result in common issues, e.g., object-relational impedance mismatch. Recent developments in C# and .NET Core, combined with a framework of modern SQL Server coding conventions, have allowed better technical solutions to the problem. As an amelioration, this work details the language features and coding conventions which enable this streamlined approach, resulting in an open-source .NET library implementation called Codeless Data Access (CODA). Canonical approaches rely on ad-hoc mapping code to perform type conversions between the client and back-end database; with CODA, no mapping code is needed, as JSON is freely mapped to SQL and vice versa. CODA streamlines API data access by improving on three aspects of immediate concern to web developers, database engineers, and cybersecurity professionals: simplicity, speed, and security. Simplicity is engendered by cutting out the "middleman" steps, effectively making API data access a white box, whereas traditional methods are a black box. Speed is improved because fewer translational steps are taken, and security is improved as attack surfaces are minimized. An empirical evaluation of the speed of the CODA approach in comparison to ORM approaches is provided and demonstrates that the CODA approach is significantly faster. CODA presents substantial benefits for API developer workflows by simplifying data access, resulting in better speed and security and allowing developers to focus on productive development rather than being mired in data access code. 
Future considerations include a generalization of the CODA method and extension outside of the .NET ecosystem to other programming languages.
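The "no mapping code" pattern can be sketched outside the .NET ecosystem as well. The example below uses Python with SQLite's built-in JSON functions (`json_each`, `json_extract`, `json_group_array`, `json_object`) standing in for SQL Server's FOR JSON/OPENJSON; the table, columns, and data are hypothetical, and this is not the CODA API:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")

payload = json.dumps([{"id": 1, "name": "Ada"}, {"id": 2, "name": "Alan"}])

# JSON in: shred the document into rows inside SQL itself; no per-field
# client-side mapping objects are ever defined.
conn.execute("""
    INSERT INTO person (id, name)
    SELECT json_extract(value, '$.id'), json_extract(value, '$.name')
    FROM json_each(?)""", (payload,))

# JSON out: aggregate rows straight back into a JSON document.
(doc,) = conn.execute("""
    SELECT json_group_array(json_object('id', id, 'name', name))
    FROM (SELECT id, name FROM person ORDER BY id)""").fetchone()
print(doc)
```

Because the conversion happens inside the database engine, the client handles only JSON text, which is the sense in which the data path becomes a white box.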

Keywords: API data access, database, JSON, .NET core, SQL server

Procedia PDF Downloads 60
14137 Using the Smith-Waterman Algorithm to Extract Features in the Classification of Obesity Status

Authors: Rosa Figueroa, Christopher Flores

Abstract:

Text categorization is the problem of assigning a new document to a set of predetermined categories on the basis of a training set of free-text data containing documents whose category membership is known. To train a classification model, it is necessary to extract characteristics in the form of tokens that facilitate the learning and classification process. In text categorization, the feature extraction process involves the use of word sequences, also known as N-grams. In general, documents belonging to the same category are expected to share similar features. The Smith-Waterman (SW) algorithm is a dynamic programming algorithm that performs a local sequence alignment in order to determine similar regions between two strings or protein sequences. This work explores the use of the SW algorithm as an alternative for feature extraction in text categorization. The dataset used for this purpose contains 2,610 annotated documents with the classes Obese/Non-Obese. This dataset was represented in matrix form using the bag-of-words approach. The score selected to represent the occurrence of tokens in each document was the term frequency-inverse document frequency (TF-IDF). In order to extract features for classification, four experiments were conducted: the first used SW to extract features, the second used unigrams (single words), the third used bigrams (two-word sequences), and the last used a combination of unigrams and bigrams. To test the effectiveness of the extracted feature sets in the four experiments, a Support Vector Machine (SVM) classifier was tuned using 20% of the dataset. The remaining 80% of the dataset, together with 5-fold cross-validation, was used to evaluate and compare the performance of the four feature extraction experiments. Results from the tuning process suggest that SW performs better than the N-gram based feature extraction. 
These results were confirmed on the remaining 80% of the dataset, where SW performed best (accuracy = 97.10%, weighted average F-measure = 97.07%). The second best result was obtained by the combination of unigrams and bigrams (accuracy = 96.04%, weighted average F-measure = 95.97%), closely followed by bigrams (accuracy = 94.56%, weighted average F-measure = 94.46%) and finally unigrams (accuracy = 92.96%, weighted average F-measure = 92.90%).
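A minimal sketch of the Smith-Waterman scoring recurrence, applied here to token sequences as in the text-categorization setting (the scoring parameters are illustrative, not those used in the study):

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Local alignment score via the Smith-Waterman dynamic program:
    H[i][j] = max(0, H[i-1][j-1] + s(a_i, b_j), H[i-1][j] + gap,
    H[i][j-1] + gap); the best score over all cells is returned."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0,
                          H[i - 1][j - 1] + s,   # match/mismatch
                          H[i - 1][j] + gap,     # gap in b
                          H[i][j - 1] + gap)     # gap in a
            best = max(best, H[i][j])
    return best

# The "sequences" may be characters or token lists; two documents sharing
# the consecutive tokens "obese patient" score 4 with these parameters.
print(smith_waterman("obese patient with diabetes".split(),
                     "the obese patient".split()))
```

Because the alignment is local and the floor at zero discards poorly matching regions, shared word runs score highly even when the surrounding text differs, which is what makes SW usable as a document-similarity feature.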

Keywords: comorbidities, machine learning, obesity, Smith-Waterman algorithm

Procedia PDF Downloads 289
14136 Industrial Policy Directions in Georgia

Authors: Nino Grigolaia

Abstract:

Introduction - The paper discusses the role of industrial policy in the development of the country's economy. The main challenges to the implementation of industrial policy are analyzed: the long-term horizon of industrial policy, the risk of changes in priorities, limited scope, and external shocks. Methodology - Various research methods are used in the paper: induction, deduction, analysis, synthesis, analogy, correlation, and statistical observation. Main Findings - Based on an analysis of the current situation in Georgia, the obstacles to the country's industrialization and the factors supporting it are identified, as are the challenges facing the country's core industrial policies. Specific industry development strategies, forms of state support, and the main directions of a new industrial policy are identified. Conclusion - The paper concludes that the development of the industrial sector is critical for the future growth and development of the Georgian economy: it will accelerate the industrialization and structural transformation processes, reduce the trade deficit, increase exports, and create more jobs in the country, thereby improving the socio-economic situation of the population. Accordingly, the study of industrial policy in Georgia remains relevant. Based on the analysis, conclusions on the industrialization of the country are drawn and recommendations proposed.

Keywords: industrialization, industrial policy, industrialization of the economy, Georgia, priorities

Procedia PDF Downloads 186
14135 Data and Model-based Metamodels for Prediction of Performance of Extended Hollo-Bolt Connections

Authors: M. Cabrera, W. Tizani, J. Ninic, F. Wang

Abstract:

Open-section beam to concrete-filled tubular column structures have been increasingly utilized in construction over the past few decades due to their enhanced structural performance, as well as economic and architectural advantages. However, the use of this configuration is limited by the difficulty of connecting the structural members, as there is no access to the inner part of the tube to install standard bolts. Blind-bolted systems are a relatively new approach to overcome this limitation, as they require access to only one side of the tubular section to tighten the bolt. The performance of these connections in concrete-filled steel tubular sections remains uncharacterized due to the complex interactions between the concrete, the bolt, and the steel section. In recent years, research into structural performance has moved to a more sophisticated and efficient approach: using machine learning algorithms to generate metamodels. This approach reduces the need to develop complex and computationally expensive finite element models, optimizing the search for desirable design variables. Metamodels generated by a data-fusion approach combine numerical and experimental results from multiple models to capture the dependency between the simulation design variables and connection performance, learning the relations between different design parameters and predicting a given output. Fully characterizing this connection would transform high-rise and multistorey construction by introducing design guidance for moment-resisting blind-bolted connections, which is currently unavailable. This paper presents a review of the steps taken to develop metamodels, generated by means of artificial neural network algorithms, which predict the connection stress and stiffness from the design parameters when using Extended Hollo-Bolt blind bolts. 
It also considers the failure modes and mechanisms that contribute to the deformability, as well as the feasibility of achieving blind-bolted rigid connections when using the blind fastener.
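The metamodeling workflow can be sketched with a toy one-hidden-layer network fitted to a synthetic response surface (the "simulator" function, the architecture, and the training settings below are placeholders, not the study's finite element data or model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy metamodel: fit a small neural network to samples of a smooth,
# made-up stiffness-like response of two design parameters. The real
# study fuses FE and experimental data for Extended Hollo-Bolt
# connections; this only illustrates the surrogate-fitting step.
X = rng.uniform(0.0, 1.0, (200, 2))
y = (1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2).reshape(-1, 1)

W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros((1, 16))
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros((1, 1))

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = float(np.mean((pred0 - y) ** 2))

lr = 0.05
for _ in range(2000):                        # plain batch gradient descent
    h, pred = forward(X)
    g = 2.0 * (pred - y) / len(X)            # dL/dpred for the MSE loss
    gh = (g @ W2.T) * (1.0 - h ** 2)         # backprop through tanh
    W2 -= lr * h.T @ g;  b2 -= lr * g.sum(0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)

_, pred = forward(X)
loss = float(np.mean((pred - y) ** 2))
print(f"MSE: {loss0:.4f} -> {loss:.4f}")
```

Once trained, such a surrogate is cheap to evaluate, which is what enables the design-variable searches that full FE re-analysis would make impractical.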

Keywords: blind-bolted connections, concrete-filled tubular structures, finite element analysis, metamodeling

Procedia PDF Downloads 153