Search results for: 20th century architecture
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3141

921 Integrative Biology Teaching and Learning Model Based on STEM Education

Authors: Narupot Putwattana

Abstract:

Changes in the global situation, such as environmental and economic crises, have brought a new perspective to science education called integrative biology. STEM has been increasingly mentioned in educational research as an approach that combines concepts from Science (S), Technology (T), Engineering (E) and Mathematics (M) in the teaching and learning process so as to strengthen 21st-century skills such as creativity and critical thinking. Recent studies have presented STEM as a pedagogy that embeds the engineering design process in science classroom activities. So far, pedagogical content applying STEM to biology has been scarce. A qualitative literature review was conducted to gather articles from electronic databases (Google Scholar). 'STEM education', 'engineering design', and 'teaching and learning of biology' were used as the main keywords to find research involving the application of STEM in biology teaching and learning. All articles were analyzed to obtain an appropriate teaching and learning model unifying the core concepts of biology. The synthesized model comprises engineering design, inquiry-based learning, biological prototypes, and biologically-inspired design (BID). STEM content and context integration were used as the theoretical framework to create the integrative biology instructional model for STEM education. Content from several disciplines, such as biology, engineering, and technology, was drawn on in inquiry-based learning to build biological prototypes. Direct and indirect integration were used to bring this knowledge into the biology-related STEM strategy. Meanwhile, engineering design and BID provided the occupational context for engineers and biologists. Technological and mathematical aspects remain to be examined in terms of co-teaching methods. Lastly, other variables, such as critical thinking and problem-solving skills, should be given more consideration in further research.

Keywords: biomimicry, engineering approach, STEM education, teaching and learning model

Procedia PDF Downloads 257
920 Detection of Micro-Unmanned Aerial Vehicles Using a Multiple-Input Multiple-Output Digital Array Radar

Authors: Tareq AlNuaim, Mubashir Alam, Abdulrazaq Aldowesh

Abstract:

The usage of micro-Unmanned Aerial Vehicles (UAVs) has witnessed an enormous increase recently. Detection of such drones has become a necessity to prevent harmful activities. Typically, such targets have low velocity and low Radar Cross Section (RCS), making them difficult to distinguish from clutter and phase noise. Multiple-Input Multiple-Output (MIMO) radar has many potential advantages: it increases the degrees of freedom on both the transmit and receive ends. Such an architecture allows for flexibility in operation by utilizing direct access to every element in the transmit/receive array. MIMO systems permit several array processing techniques that let the system stare at targets for longer times, which improves the Doppler resolution. In this paper, a 2×2 MIMO radar prototype is developed using Software Defined Radio (SDR) technology, and its performance is evaluated against a slow-moving, low-RCS micro-UAV of the kind used by hobbyists. Radar cross section simulations were carried out using the FEKO simulator, yielding an average of -14.42 dBsm at S-band. The developed prototype was experimentally evaluated, achieving more than 300 meters of detection range for a DJI Mavic Pro drone.
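The reported detection range can be sanity-checked against the classical radar range equation. The sketch below is illustrative only: the transmit power, antenna gain, carrier frequency, and receiver sensitivity are assumed values, not figures from the paper; only the -14.42 dBsm RCS comes from the abstract.

```python
import math

def radar_max_range(pt_w, gain, wavelength_m, rcs_m2, smin_w):
    """Classical radar range equation, maximum detection range in metres:

    R_max = (Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * S_min)) ** (1/4)
    """
    num = pt_w * gain ** 2 * wavelength_m ** 2 * rcs_m2
    den = (4 * math.pi) ** 3 * smin_w
    return (num / den) ** 0.25

# Illustrative S-band numbers (assumed): 1 W transmit power, 20 dBi gain,
# f = 3 GHz, receiver sensitivity 1e-13 W; RCS of -14.42 dBsm from the abstract.
wavelength = 3e8 / 3e9            # ~0.1 m
rcs = 10 ** (-14.42 / 10)         # dBsm -> m^2
print(round(radar_max_range(1.0, 10 ** 2, wavelength, rcs, 1e-13)))
```

With these assumed parameters the equation lands in the few-hundred-metre range, consistent with the >300 m result reported for the prototype.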

Keywords: digital beamforming, drone detection, micro-UAV, MIMO, phased array

Procedia PDF Downloads 140
919 Impact Force Difference on Natural Grass Versus Synthetic Turf Football Fields

Authors: Nathaniel C. Villanueva, Ian K. H. Chun, Alyssa S. Fujiwara, Emily R. Leibovitch, Brennan E. Yamamoto, Loren G. Yamamoto

Abstract:

Introduction: In previous studies of high school sports, over 15% of concussions were attributed to contact with the playing surface. While artificial turf fields are increasing in popularity due to lower maintenance costs, artificial turf has been associated with more ankle and knee injuries, with inconclusive data on concussions. In this study, natural grass and artificial turf football fields were compared in terms of deceleration on fall impact. Methods: Accelerometers were placed on the forehead, apex of the head, and right ear of a Century Body Opponent Bag (BOB) manikin. A Riddell HITS football helmet was secured onto the head of the manikin over the accelerometers. This manikin was dropped onto natural grass (n = 10) and artificial turf (n = 9) high school football fields. The manikin was dropped from a stationary position at a height of 60 cm onto its front, back, and left side. Each of these drops was conducted 10 times at the 40-yard line, 20-yard line, and endzone. The net deceleration on impact was calculated as a net vector from each of the three accelerometers’ x, y, and z components at the three locations on the manikin’s head (9 vector measurements per drop). Results: Mean values over the repeated drops were calculated for each accelerometer and drop type on each field. All accelerometers in forward and backward falls, and one accelerometer in side falls, showed significantly greater impact force on synthetic turf than on the natural grass surfaces. Conclusion: Impact force was higher on synthetic fields for all drop types for at least one of the accelerometer locations. These findings suggest that concussion risk might be higher for athletes playing on artificial turf fields.
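The net deceleration described in the methods is the Euclidean norm of each accelerometer's x, y, and z components, averaged over the three sensors for a drop. A minimal sketch with hypothetical readings (the abstract reports no raw values):

```python
import math

def net_magnitude(ax, ay, az):
    """Net deceleration as the Euclidean norm of the x, y, z components."""
    return math.sqrt(ax ** 2 + ay ** 2 + az ** 2)

def drop_net_vector(readings):
    """Mean net magnitude across the accelerometers for one drop.

    `readings` is a list of (x, y, z) tuples, one per sensor (values in g).
    """
    return sum(net_magnitude(*r) for r in readings) / len(readings)

# Hypothetical readings from the forehead, apex, and right-ear sensors:
drop = [(40.0, 12.0, 5.0), (38.5, 10.2, 6.1), (41.2, 11.8, 4.9)]
print(round(drop_net_vector(drop), 2))
```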

Keywords: concussion, football, biomechanics, sports

Procedia PDF Downloads 160
918 Research on Resilience-Oriented Disintegration in Systems-of-Systems

Authors: Hang Yang, Jiahao Liu, Jichao Li, Kewei Yang, Minghao Li, Bingfeng Ge

Abstract:

Systems-of-systems (SoS) characterize networks formed by integrating individual complex systems that are interdependent and interconnected. Research on the disintegration problem in SoS is significant for improving network survivability, maintaining network security, and optimizing SoS architecture. Accordingly, this study proposes an integrated framework, resilience-oriented disintegration in SoS (SoSRD), for modeling and solving the SoS disintegration problem. First, an SoS disintegration index (SoSDI) is presented to evaluate the disintegration effect. This index provides a practical description of the disintegration process and is the first to integrate network disintegration and resilience models. Subsequently, we propose a resilience-oriented disintegration method based on reinforcement learning (RDRL) to enhance the efficiency of SoS disintegration. This method is not restricted to a particular problem scenario and considers the coexistence of disintegration (node/link removal) and recovery (node/link addition) during the disintegration process. Finally, the effectiveness and superiority of the proposed SoSRD are demonstrated through a case study: the framework outperforms existing indexes and methods in both node and link disintegration scenarios, providing a fresh perspective on network disintegration. The findings provide crucial insights into dismantling harmful SoS and designing more resilient SoS.
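The abstract does not give the form of the SoSDI, but a much-simplified proxy for a disintegration effect, the drop in the fraction of nodes belonging to the largest connected component after a removal, can be sketched in plain Python:

```python
def largest_component_fraction(nodes, edges):
    """Fraction of nodes in the largest connected component (iterative DFS)."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, best = set(), 0
    for n in nodes:
        if n in seen:
            continue
        stack, size = [n], 0
        seen.add(n)
        while stack:
            cur = stack.pop()
            size += 1
            for nb in adj[cur]:
                if nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, size)
    return best / len(nodes)

def disintegration_effect(nodes, edges, removed):
    """Drop in largest-component fraction after removing a set of nodes.

    A crude stand-in for a disintegration index; the paper's SoSDI is richer.
    """
    kept = [n for n in nodes if n not in removed]
    kept_edges = [(u, v) for u, v in edges
                  if u not in removed and v not in removed]
    return (largest_component_fraction(nodes, edges)
            - largest_component_fraction(kept, kept_edges))

# Star-like SoS: hub 0 connects subsystems 1..4; removing the hub is most damaging.
nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(disintegration_effect(nodes, edges, {0}))  # 0.75
```

An RL disintegration agent would then learn which removals maximize such an effect while a recovery process adds nodes or links back.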

Keywords: system-of-systems, disintegration index, resilience, reinforcement learning

Procedia PDF Downloads 18
917 Luminescent Functionalized Graphene Oxide Based Sensitive Detection of Deadly Explosive TNP

Authors: Diptiman Dinda, Shyamal Kumar Saha

Abstract:

In the 21st century, sensitive and selective detection of trace amounts of explosives has become a serious problem. Nitro compounds and their derivatives are used worldwide to prepare explosives. Recently, TNP (2,4,6-trinitrophenol) has become one of the most common constituents of powerful explosives worldwide; it is even more powerful than TNT or RDX. As explosives are electron deficient in nature, it is very difficult to detect one separately from a mixture, and because of its tremendous water solubility, detecting TNP in water in the presence of other explosives is very challenging. Simple instrumentation, cost-effectiveness, speed, and high sensitivity make fluorescence-based optical sensing a great success compared to other techniques. Graphene oxide (GO), with its large number of epoxy groups, hosts localized nonradiative electron-hole centres on its surface and therefore shows only very weak fluorescence. In this work, GO is functionalized with 2,6-diaminopyridine to remove those epoxy groups through an SN2 reaction. This converts GO into a bright blue luminescent fluorophore (DAP/rGO) showing an intense PL spectrum at ∼384 nm when excited at 309 nm. We have also characterized the material by FTIR, XPS, UV, XRD and Raman measurements. Using this fluorophore, a large fluorescence quenching (96%) is observed after the addition of only 200 µL of 1 mM TNP in water. Other nitro explosives give very moderate PL quenching compared to TNP. Such high selectivity is attributed to FRET from the fluorophore to TNP during the quenching experiment. TCSPC measurement also reveals that the lifetime of DAP/rGO decreases drastically from 3.7 to 1.9 ns after the addition of TNP. The material is sensitive down to the 125 ppb level of TNP. Finally, we believe that this graphene-based luminescent material will emerge as a new class of sensing material for detecting trace amounts of explosives from aqueous solution.
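The two headline numbers, 96% quenching and the 3.7 to 1.9 ns lifetime drop, follow from standard definitions; a quick sketch (the intensity values are illustrative, only the percentages and lifetimes come from the abstract):

```python
def quenching_efficiency(i0, i):
    """Fluorescence quenching efficiency as a percentage: 100 * (1 - I/I0)."""
    return 100.0 * (1.0 - i / i0)

def lifetime_ratio(tau0_ns, tau_ns):
    """tau0/tau; a ratio > 1 after adding the quencher indicates a dynamic
    (e.g. FRET or collisional) contribution to the quenching."""
    return tau0_ns / tau_ns

# 96% quenching corresponds to the emission dropping to 4% of its initial value:
print(round(quenching_efficiency(100.0, 4.0), 1))   # 96.0
# Lifetimes reported in the abstract: 3.7 ns -> 1.9 ns after TNP addition.
print(round(lifetime_ratio(3.7, 1.9), 2))           # 1.95
```

The shortened lifetime is what distinguishes an energy-transfer/dynamic quenching pathway from purely static complex formation, consistent with the FRET interpretation above.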

Keywords: graphene, functionalization, fluorescence quenching, FRET, nitroexplosive detection

Procedia PDF Downloads 441
916 A Comparative Analysis of Thermal Performance of Building Envelope Types over Time

Authors: Aram Yeretzian, Yaser Abunnasr, Zahraa Makki, Betina Abi Habib

Abstract:

Developments in architectural building typologies that are informed by prevalent construction techniques and socio-cultural practices generate different adaptations in the building envelope. While different building envelope types exhibit different climate responsive passive strategies, the individual and comparative thermal performance analysis resulting from these technologies is yet to be understood. This research aims to develop this analysis by selecting three building envelope types from three distinct building traditions by measuring the heat transmission in the city of Beirut. The three typical residential buildings are selected from the 1920s, 1940s, and 1990s within the same street to ensure similar climatic and urban conditions. Climatic data loggers are installed inside and outside of the three locations to measure indoor and outdoor temperatures, relative humidity, and heat flow. The analysis of the thermal measurements is complemented by site surveys on window opening, lighting, and occupancy in the three selected locations and research on building technology from the three periods. Apart from defining the U-value of the building envelopes, the collected data will help evaluate the indoor environments with respect to the thermal comfort zone. This research, thus, validates and contextualizes the role of building technologies in relation to climate responsive design.
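Given the logged heat flow and indoor/outdoor temperatures, the envelope U-value follows from the steady-state definition U = q / ΔT. A minimal sketch with hypothetical logger readings (the study's measured values are not given in the abstract):

```python
def u_value(heat_flux_w_m2, t_in_c, t_out_c):
    """Steady-state envelope U-value in W/(m^2·K): U = q / (T_in - T_out)."""
    dt = t_in_c - t_out_c
    if dt == 0:
        raise ValueError("no temperature difference across the envelope")
    return heat_flux_w_m2 / dt

# Hypothetical readings: 18 W/m^2 through the wall, 24 °C inside, 14 °C outside.
print(u_value(18.0, 24.0, 14.0))  # 1.8 W/(m^2·K)
```

In practice the heat-flux and temperature series would be averaged over many days to suppress thermal-mass transients before applying this ratio.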

Keywords: architecture, wall construction, envelope performance, thermal comfort

Procedia PDF Downloads 234
915 Aging and Falls Profile from Hospital Databases

Authors: Nino Chikhladze, Tamar Dochviri, Nato Pitskhelauri, Maia Bitskhinashvili

Abstract:

Population aging is a key social and demographic trend of the 21st century. Falls represent a prevalent geriatric syndrome that poses significant risks to the health and independence of older adults. The World Health Organization notes a lack of comprehensive data on falls in low- and middle-income countries, complicating the creation of effective prevention programs. To the authors’ best knowledge, no such studies have been conducted in Georgia. The aim of the study is to explore the epidemiology of falls in the elderly population. The hospitalization database of the National Center for Disease Control and Public Health of Georgia was used for this retrospective study. Fall-related injuries were identified using ICD-10 class XIX for the type of injury (S and T codes) and class XX for the external cause (V-Y codes). Statistical analyses were done using SPSS software version 23.0. The total number of fall-related hospitalizations for individuals aged 65 and older from 2015 to 2021 was 29,697. The study revealed that falls accounted for an average of 63% (ranging from 59% to 66%) of all hospitalizations and 68% (ranging from 65% to 70%) of injury-related hospitalizations during this period. Of all patients, 69% were women and 31% men (χ² = 4482.1, p < 0.001). The highest hospitalization rates were in the 80-84 and 75-79 age groups. The probability of fall-related hospitalization was significantly higher in women than in men (p < 0.001) in all age groups except 65-69 years. In the target group of 65 years and older, the probability of hospitalization increased significantly with age (p < 0.001). The study's results can be leveraged to create evidence-based awareness programs, design targeted multi-domain interventions addressing specific risk factors, and enhance the quality of geriatric healthcare services in Georgia.

Keywords: elderly population, falls, geriatric patients, hospitalization, injuries

Procedia PDF Downloads 31
914 Comparison of Impulsivity Trait in Males and Females: Exploring the Sex Difference in Impulsivity

Authors: Pinhas Dannon, Aviv Weinstein

Abstract:

Impulsivity is attracting major clinical interest because it is associated with conditions such as delinquency, antisocial behavior, suicide attempts, aggression, and criminal activity. The evolutionary perspective argues that impulsivity relates to self-regulation and predicts that females should have evolved a greater ability to inhibit pre-potent responses. There is supportive evidence showing that females perform better on cognitive tasks measuring impulsivity, such as delay of gratification and delay discounting, mainly in childhood. During adolescence, brain imaging studies using diffusion tensor imaging of white matter architecture indicated, contrary to the evolutionary hypothesis, that young adolescent males may be less vulnerable than age-matched females to risk- and reward-related maladaptive behaviors. In adults, the results are mixed, presumably owing to hormonal effects on the neurobiological mechanisms of reward; consequently, females were less impulsive than males only during fertile stages of the menstrual cycle. Finally, there is evidence that the serotonin (5-HT) system is more involved in the impulsivity of men than in that of women. Overall, there seem to be sex differences in impulsivity, but these differences are most pronounced in childhood and are later subject to maturational and hormonal changes during adolescence and adulthood and their effects on the brain, cognition, and behavior.

Keywords: impulse control, male population, female population, gender differences, reward, neurocognitive tests

Procedia PDF Downloads 343
913 Nelder-Mead Parametric Optimization of Elastic Metamaterials with Artificial Neural Network Surrogate Model

Authors: Jiaqi Dong, Qing-Hua Qin, Yi Xiao

Abstract:

Some of the most fundamental challenges in the optimization of elastic metamaterials (EMMs) can be attributed to the high computational cost of the finite element analysis (FEA) simulations, which renders the optimization process inefficient. Furthermore, due to the inherent mesh dependence of FEA, minuscule geometry features, which often emerge during the later stages of optimization, induce very fine elements, resulting in enormously high time consumption, particularly when repeated solutions are needed to compute the objective function. In this study, a surrogate modelling algorithm is developed to reduce the computational time of structural optimization of EMMs. The surrogate model is constructed on a multilayer feedforward artificial neural network (ANN) architecture, trained with eigenfrequency data prepopulated from FEA simulations and optimized through regime selection with a genetic algorithm (GA) to improve its accuracy in predicting the location and width of the primary elastic band gap. With the optimized ANN surrogate at its core, a Nelder-Mead (NM) optimization loop is established and its performance inspected in comparison to the FEA solution. The ANN-NM model shows remarkable accuracy in predicting the band gap width, together with a 47% reduction in time consumption.
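The surrogate-in-the-loop idea can be sketched with SciPy's Nelder-Mead driver. Here a smooth analytic function stands in for the trained ANN surrogate; the function, its optimum, and the starting point are all hypothetical, chosen only to show the mechanics of derivative-free optimization over a cheap surrogate instead of FEA:

```python
from scipy.optimize import minimize

def surrogate_bandgap_width(x):
    """Stand-in for the ANN surrogate: maps two geometry parameters to the
    negated band-gap width (negated because `minimize` minimizes)."""
    a, b = x
    return -(1.0 - (a - 0.6) ** 2 - (b - 0.4) ** 2)

res = minimize(
    surrogate_bandgap_width,
    x0=[0.2, 0.2],                      # hypothetical initial geometry
    method="Nelder-Mead",
    options={"xatol": 1e-6, "fatol": 1e-6},
)
print(res.x)  # converges near the surrogate's optimum (0.6, 0.4)
```

Because each surrogate evaluation is microseconds instead of an FEA solve, the simplex can afford the many function calls Nelder-Mead needs; only the final candidate would be verified with a full FEA run.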

Keywords: artificial neural network, machine learning, mechanical metamaterials, Nelder-Mead optimization

Procedia PDF Downloads 128
912 Performance Improvement of SOI Tri-Gate FinFET Transistors Using High-K Dielectric with Metal Gate

Authors: Fatima Zohra Rahou, A. Guen Bouazza, B. Bouazza

Abstract:

SOI tri-gate FinFET transistors have emerged as novel devices due to their simple architecture and better performance: better control over short channel effects (SCEs) and reduced power dissipation thanks to lower gate leakage currents. As the oxide thickness scales below 2 nm, leakage currents due to tunneling increase drastically, leading to high power consumption and reduced device reliability. Replacing the SiO2 gate oxide with a high-κ material allows increased gate capacitance without the associated leakage effects. In this paper, an SOI tri-gate FinFET structure is simulated with both a high-κ gate dielectric (HfO2) and a SiO2 dielectric using the 3-D device simulators DevEdit and Atlas of Silvaco TCAD. The simulated results exhibit significant performance improvements for the SOI tri-gate FinFET with an HfO2 gate oxide compared with a conventional SiO2 gate oxide on the same structure: the saturation current, threshold voltage, on-state current and Ion/Ioff ratio increase, while the off-state current, subthreshold slope and DIBL effect decrease.
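The benefit of a high-κ gate dielectric can be illustrated via the equivalent oxide thickness (EOT): a physically thick HfO2 film matches the capacitance of a much thinner SiO2 layer, sidestepping the tunneling leakage of sub-2 nm SiO2. The κ value below is a typical textbook figure, not one taken from the paper:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def gate_capacitance_per_area(k, thickness_m):
    """Gate oxide capacitance per unit area: C = k * eps0 / t (F/m^2)."""
    return k * EPS0 / thickness_m

def equivalent_oxide_thickness(k, thickness_m, k_sio2=3.9):
    """EOT: the SiO2 thickness giving the same capacitance as the high-k film."""
    return thickness_m * k_sio2 / k

# A 5 nm HfO2 film (k ~ 25, assumed) is electrically equivalent to ~0.78 nm
# of SiO2, i.e. high capacitance with a physically thick, low-leakage film.
print(round(equivalent_oxide_thickness(25.0, 5e-9) * 1e9, 2), "nm")
```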

Keywords: technology SOI, short-channel effects (SCEs), multi-gate SOI MOSFET, SOI-TRI Gate FinFET, high-K dielectric, Silvaco software

Procedia PDF Downloads 348
911 Cable De-Commissioning of Legacy Accelerators at CERN

Authors: Adya Uluwita, Fernando Pedrosa, Georgi Georgiev, Christian Bernard, Raoul Masterson

Abstract:

CERN is an international organisation funded by 23 countries that provides the particle physics community with excellence in particle accelerators and other related facilities. Founded in 1954, CERN operates a wide range of accelerators that allow groundbreaking science to be conducted. Accelerators bring particles to high energies and make them collide with each other or with fixed targets, creating specific conditions that are of high interest to physicists. A chain of accelerators is used to ramp up the energy of the particles and eventually inject them into the largest and most recent machine: the Large Hadron Collider (LHC). Among this chain of machines is, for instance, the Proton Synchrotron, which was started in 1959 and is still in operation. These machines, called "injectors", keep evolving over time, as does the related infrastructure. Massive decommissioning of obsolete cables started at CERN in 2015 in the framework of the so-called "injectors de-cabling project, phase 1". Its goal was to replace aging cables and remove unused ones, freeing space for the new cables needed for upgrade and consolidation campaigns. To carry out the de-cabling, a project coordination team was assembled. The start of this project led to the investigation of legacy cables throughout the organisation; identifying cables stacked over half a century proved arduous. Phase 1 of the injectors de-cabling ran for three years and was completed successfully after overcoming some difficulties. Phase 2, started three years later, focused on improving safety and structure with the introduction of a quality assurance procedure. This paper discusses the implementation of this quality assurance procedure throughout phase 2 of the project and the transition between the two phases. Hundreds of kilometres of cable were removed in the injectors complex at CERN from 2015 to 2023.

Keywords: CERN, de-cabling, injectors, quality assurance procedure

Procedia PDF Downloads 96
910 Psychedelic-Assisted Treatment for Patients with Opioid Use Disorder

Authors: Daniele Zullino, Gabriel Thorens, Léonice Furtado, Federico Seragnoli, Radu Iuga, Louise Penzenstadler

Abstract:

Context: Since the start of the 21st century, there has been a resurgence of interest in psychedelics, marked by a renewed focus on scientific investigations into their therapeutic potential. Psychedelic therapy has gained recognition for effectively treating depression and anxiety disorders, and notable progress has been made in the clinical development of substances such as psilocybin. Moreover, mounting evidence suggests promising applications of lysergic acid diethylamide (LSD) and psilocybin in addiction medicine. In Switzerland, compassionate treatment with LSD and psilocybin has been permitted since 2014 through exceptional licenses granted by the Federal Office of Public Health. This treatment approach is also available within the Geneva treatment program, extending its accessibility to patients undergoing opioid-assisted treatment with substances such as morphine and diacetylmorphine. The aim of this study is to assess the feasibility of psychedelic-assisted therapy in patients with opioid use disorder who are undergoing opioid-assisted treatment, and to explore its effects on patients' experiences and outcomes. Methodology: This is an open case series of six patients who underwent at least one session with either LSD (100-200 micrograms) or psilocybin (20-40 mg). The patients were assessed using the Five Dimensional Altered States of Consciousness (5D-ASC) Scale, and the data were analyzed descriptively to identify patterns and trends in the patients' experiences. Results: The patients experienced substantial positive psychedelic effects during the sessions without significant adverse effects, and reported improvements in their condition.
Conclusion: The findings of this study support the feasibility and potential efficacy of psychedelic-assisted therapy in patients undergoing opioid-assisted treatment.

Keywords: psychedelics, psychedelic-assisted treatment, opioid use disorder, addiction, LSD, psilocybin

Procedia PDF Downloads 56
909 Looking beyond Corporate Social Responsibility to Sustainable Development: Conceptualisation and Theoretical Exploration

Authors: Mercy E. Makpor

Abstract:

The traditional idea of Corporate Social Responsibility (CSR) has gone beyond just ensuring safe environments, caring about global warming, and securing good living standards and conditions for society at large. The paradigm has shifted towards strategic objectives and long-term value creation for both businesses and society. As an important approach to solving social and environmental issues, CSR has been accepted globally, yet it is expected to go beyond where it currently stands; much is expected from businesses and governments at every level, globally and locally. This leads back to the original idea of the concept: how it originated and how it has been perceived over the years. Little wonder there have been many definitions of the concept without a single globally accepted one; for the purpose of this paper, the European Commission's definition of CSR is used. Sustainable Development (SD), on the other hand, has in recent years been viewed as an ethical concept explained in the UN report "Our Common Future," also known as the Brundtland report, which summarises the need for development to meet the needs of the present without compromising the future. The more recent 21st-century framework on sustainability, the "Triple Bottom Line" (TBL), has added its voice to the concepts of CSR and sustainable development. The TBL model holds that businesses should report not only on their financial performance but also on their social and environmental performance, highlighting that CSR has moved beyond a "material-impact" approach towards a "future-oriented" approach (sustainability). In this paper, the concept of CSR is revisited by exploring its various theories. The discourse on sustainable development and its frameworks is also presented, showing how CSR can benefit businesses and their stakeholders as well as society as a whole, not just in the present but in the future. The paper does this by exploring the importance of both concepts (CSR and SD) and concludes by making recommendations for more empirical research in the near future.

Keywords: corporate social responsibility, sustainable development, sustainability, triple bottom line model

Procedia PDF Downloads 247
908 The Conceptual Relationships in N+N Compounds in Arabic Compared to English

Authors: Abdel Rahman Altakhaineh

Abstract:

This paper has analysed the conceptual relations between the elements of NN compounds in Arabic and compared them to those found in English, based on the framework of Conceptual Semantics and a modified version of Parallel Architecture referred to as Relational Morphology (RM). The analysis revealed that the repertoire of possible semantic relations between the two nouns in Arabic NN compounds reproduces that in English NN compounds and that, therefore, the main difference lies in headedness (right-headed in English, left-headed in Arabic). Adopting RM allows productive and idiosyncratic elements to interweave with each other naturally. Semantically transparent compounds can be stored in memory or produced and understood online, while compounds with different degrees of semantic idiosyncrasy are stored in memory. Furthermore, the predictable parts of idiosyncratic compounds are captured by general schemas, which in compounds pick out the range of possible semantic relations between the two nouns. Finally, conducting a cross-linguistic study of the systematic patterns of possible conceptual relationships between compound elements is an area worthy of further exploration. In addition, comparing and contrasting compounding in Arabic and Hebrew, especially as they are both Semitic languages, is another area that needs to be investigated thoroughly. It would help morphologists understand the extent to which Jackendoff’s repertoire of semantic relations in compounds is universal: if a language as distant from English as Arabic displays a similar range of cases, this is evidence for a (relatively) universal set of relations from which individual languages may pick and choose.

Keywords: conceptual semantics, morphology, compounds, Arabic, English

Procedia PDF Downloads 100
907 A 1.57 GHz Mixer Design for GPS Receiver

Authors: Hamd Ahmed

Abstract:

During the Persian Gulf War in 1991, coalition forces were surprised to come under fire from friendly forces in the Iraqi desert. They had been misled by the lack of proper guidance technology, resulting in unnecessary loss of life. This incident, along with many others, led the US Department of Defense to open up GPS. In the very beginning this technology was for military use, but it is now widely used and increasingly popular among the public due to its high accuracy and immense significance. The GPS system consists of three segments: the space segment (the satellites), the control segment (ground control) and the user segment (the receiver). This project concerns the design of a 1.57 GHz mixer for a triple-conversion GPS receiver. The GPS front-end is based on the superheterodyne receiver, which improves selectivity and image-frequency rejection, and the main principle of the superheterodyne receiver depends on the mixer. Many different types of mixers (single-ended, single-balanced, double-balanced) can be used in a GPS receiver, depending on the required specifications. This report provides an overview of the GPS system and details the basic architecture of the GPS receiver. The basic emphasis is on investigating the general concept of the mixer circuit, defining some related terms, and presenting the types of mixers, along with the advantages of the single-balanced mixer and its applications. The focus is on how to design a mixer for a GPS receiver, with a discussion of the simulation results.
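The mixer's role is frequency translation: an ideal multiplier of the RF and LO signals produces their sum and difference frequencies, and the difference is kept as the intermediate frequency (IF) in a heterodyne receiver. A sketch using the GPS L1 carrier and a hypothetical first LO (the receiver's actual LO plan is not given in the abstract):

```python
def mixer_outputs(f_rf_hz, f_lo_hz):
    """Ideal mixer products: multiplying RF by LO yields the sum and
    difference frequencies; the difference is taken as the IF."""
    return f_rf_hz + f_lo_hz, abs(f_rf_hz - f_lo_hz)

# GPS L1 carrier at 1575.42 MHz with an assumed 1400 MHz first LO:
sum_f, if_f = mixer_outputs(1575.42e6, 1400e6)
print(round(if_f / 1e6, 2), "MHz")  # first IF
```

A triple-conversion receiver repeats this step three times with successively lower LOs, stepping the signal down to a frequency where cheap, sharp filtering is practical.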

Keywords: GPS, RF filter, heterodyne, mixer

Procedia PDF Downloads 324
906 Regression of Hand Kinematics from Surface Electromyography Data Using a Long Short-Term Memory-Transformer Model

Authors: Anita Sadat Sadati Rostami, Reza Almasi Ghaleh

Abstract:

Surface electromyography (sEMG) offers important insights into muscle activation and has applications in fields including rehabilitation and human-computer interaction. The purpose of this work is to predict the degree of activation of two joints in the index finger using an LSTM-Transformer architecture trained on sEMG data from the Ninapro DB8 dataset. We apply advanced preprocessing techniques, such as multi-band filtering and customizable rectification methods, to enhance the encoding of sEMG data into features that are beneficial for regression tasks. The processed data is converted into spike patterns and simulated using Leaky Integrate-and-Fire (LIF) neuron models, allowing for neuromorphic-inspired processing. Our findings demonstrate that adjusting filtering parameters and neuron dynamics and employing the LSTM-Transformer model improves joint angle prediction performance. This study contributes to the ongoing development of deep learning frameworks for sEMG analysis, which could lead to improvements in motor control systems.
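Typical sEMG conditioning of the kind described, rectification followed by an envelope estimate, can be sketched in plain Python. The trace and window length are illustrative; the paper's actual pipeline additionally uses multi-band filtering and LIF spike encoding:

```python
def full_wave_rectify(emg):
    """Full-wave rectification: take |x| so muscle-activity bursts are positive."""
    return [abs(x) for x in emg]

def moving_average_envelope(signal, window):
    """Simple linear envelope: causal moving average over `window` samples."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

# Toy sEMG trace (arbitrary units): a short burst surrounded by baseline noise.
emg = [0.02, -0.03, 0.9, -1.1, 1.0, -0.8, 0.05, -0.02]
env = moving_average_envelope(full_wave_rectify(emg), window=4)
print(round(env[-1], 4))  # mean rectified amplitude over the last window
```

The resulting envelope (or, in the paper, the spike-encoded features) is what the LSTM-Transformer regresses onto the two index-finger joint angles.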

Keywords: surface electromyography, LSTM-transformer, spiking neural networks, hand kinematics, leaky integrate-and-fire neuron, band-pass filtering, muscle activity decoding

Procedia PDF Downloads 18
905 Alternative Method of Determining Seismic Loads on Buildings Without Response Spectrum Application

Authors: Razmik Atabekyan, V. Atabekyan

Abstract:

This article discusses a new alternative method for determining seismic loads on buildings, based on the resistance of structures to vibration deformations. The basic principles for determining seismic loads by the spectral method were developed in the 1940s and 1950s and have since been refined in pursuit of truer assessments of seismic effects. The basis of the existing methods is the response spectrum, or the dynamicity coefficient β (in the norms of the Russian Federation), neither of which is definitively established. To this day there is no single, universal method for determining seismic loads, and attempts to apply the norms of different countries yield significant discrepancies between the results. On the other hand, the results of macroseismic surveys of strong earthquakes contradict the principle of calculation based on accelerations: it is well known that on soft soils destruction increases (mainly due to large displacements) even though accelerations decrease. Seismic effects are obviously transmitted to the building through the foundation, yet paradoxically the existing methods do not even include foundation data, even though the acceleration of a building's foundation can differ several times from that of the ground. During earthquakes each building behaves in its own way, depending on the interaction between the soil and the foundation, their dynamic characteristics, and many other factors. In this paper we consider a new, alternative method of determining the seismic loads on buildings without the use of a response spectrum. The main conclusions are: 1) Seismic loads are evaluated at the foundation level, which leads to redistribution and reduction of the seismic loads on structures. 2) The proposed method is universal and allows the seismic loads to be determined without the use of a response spectrum or any implicit coefficients. 3) Important factors such as the strength characteristics of the soil, the size of the foundation, and the angle of incidence of the seismic ray can be taken into account. 4) Existing methods can adequately determine the seismic loads on buildings only for the first mode of vibration, under average soil conditions.

Keywords: seismic loads, response spectrum, dynamic characteristics of buildings, momentum

Procedia PDF Downloads 505
904 Land Cover Remote Sensing Classification Advanced Neural Networks Supervised Learning

Authors: Eiman Kattan

Abstract:

This study aims to evaluate the impact of classifying labelled remote sensing images with a convolutional neural network (CNN) architecture, i.e., AlexNet, on different land cover scenarios based on two remotely sensed datasets, considered from the points of view of both computational time and performance. A set of experiments was conducted to assess the effectiveness of the selected network using two implementation approaches, named fully trained and fine-tuned. For validation purposes, two publicly available remote sensing datasets with different land cover features, AID and RSSCN7, were used in the experiments. These datasets differ widely in input data, number of classes, amount of labelled data, and texture patterns. A GPU training platform specifically designed for interactive deep learning image classification (NVIDIA DIGITS) was employed in the experiments; it proved efficient in training, validation, and testing. The fully trained approach achieved modest results on the two datasets, AID and RSSCN7: 73.346% and 71.857% within 24 min 1 s and 8 min 3 s, respectively. However, a dramatic improvement in classification performance was recorded with the fine-tuning approach: 92.5% and 91%, within 24 min 44 s and 8 min 41 s, respectively. These results open opportunities for better classification performance in various applications such as agriculture and crop remote sensing.
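As a rough illustration of the fine-tuned approach, the sketch below freezes a stand-in "pretrained" feature extractor and retrains only a small classification head on toy data. The random-projection backbone, synthetic samples, and hyperparameters are all assumptions for illustration, not the AlexNet/DIGITS setup used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pretrained backbone: a fixed random projection.
# In the fine-tuning setup, AlexNet's convolutional layers play this
# role and only the final classifier is retrained.
W_backbone = rng.normal(size=(64, 16))

def features(x):
    return np.tanh(x @ W_backbone)  # frozen feature extractor

# Two synthetic "scene" classes with shifted means, mimicking
# separable land-cover categories.
X = np.vstack([rng.normal(0.5, 1.0, (100, 64)),
               rng.normal(-0.5, 1.0, (100, 64))])
y = np.array([1] * 100 + [0] * 100)

# Fine-tune only a logistic-regression head on the frozen features.
F = features(X)
w = np.zeros(16)
b = 0.0
for _ in range(500):  # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    grad = p - y
    w -= 0.1 * F.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = ((F @ w + b > 0).astype(int) == y).mean()
print(f"head-only fine-tuning accuracy: {acc:.2f}")
```

Retraining only the head touches far fewer parameters than full training, which is one reason the fine-tuned runs above converge to better accuracy in comparable wall-clock time.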

Keywords: convolutional neural network, remote sensing, land cover, land use

Procedia PDF Downloads 372
903 Faster, Lighter, More Accurate: A Deep Learning Ensemble for Content Moderation

Authors: Arian Hosseini, Mahmudul Hasan

Abstract:

To address the increasing need for efficient and accurate content moderation, we propose an efficient and lightweight deep classification ensemble structure. Our approach is based on a combination of simple visual features, designed for high-accuracy classification of violent content with low false positives. Our ensemble architecture utilizes a set of lightweight models with narrowed-down color features, and we apply it to both images and videos. We evaluated our approach using a large dataset of explosion and blast content and compared its performance to popular deep learning models such as ResNet-50. Our evaluation results demonstrate significant improvements in prediction accuracy, while benefiting from 7.64x faster inference and lower computation cost. While our approach is tailored to explosion detection, it can be applied to other similar content moderation and violence detection use cases as well. Based on our experiments, we propose a "think small, think many" philosophy in classification scenarios. We argue that transforming a single, large, monolithic deep model into a verification-based step model ensemble of multiple small, simple, and lightweight models with narrowed-down visual features can possibly lead to predictions with higher accuracy.
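The "think small, think many" idea can be sketched as a verification cascade: each small model must confirm the previous stage's positive prediction before a stricter stage runs, and any stage can reject early, which is where the inference savings come from. The mean-intensity "classifiers" and thresholds below are hypothetical placeholders for the lightweight color-feature models described in the abstract.

```python
# Hypothetical verification-based step ensemble: each stage must
# confirm the detection before the next (stricter) stage runs.
def make_stage(threshold):
    # Stand-in for a lightweight classifier over narrowed-down color
    # features: here just a mean-intensity score against a threshold.
    def stage(features):
        score = sum(features) / len(features)
        return score >= threshold
    return stage

def cascade_predict(features, stages):
    """Return (label, stages_run). Positive only if ALL stages agree."""
    for n, stage in enumerate(stages, start=1):
        if not stage(features):
            return 0, n          # early rejection: cheap negative
    return 1, len(stages)        # every stage verified the detection

stages = [make_stage(t) for t in (0.3, 0.5, 0.7)]  # increasingly strict

print(cascade_predict([0.1, 0.2, 0.1], stages))  # rejected at stage 1
print(cascade_predict([0.9, 0.8, 0.9], stages))  # verified by all stages
```

Because most real-world content is benign, the common case exits at the first cheap stage, so average inference cost stays far below that of one monolithic model.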

Keywords: deep classification, content moderation, ensemble learning, explosion detection, video processing

Procedia PDF Downloads 55
902 Econophysical Approach on Predictability of Financial Crisis: The 2001 Crisis of Turkey and Argentina Case

Authors: Arzu K. Kamberli, Tolga Ulusoy

Abstract:

Technological developments and the resulting global communication have made the 21st century one in which large amounts of capital can be moved from one end of the world to the other at the push of a button. As a result, capital flows have accelerated, bringing with them crisis contagion. Combined with irrational human behaviour, financial crises have become a fundamental problem for countries around the world and have heightened researchers' interest in their causes and in the periods in which they occur. The complex nature of financial crises and their structure, which cannot be explained linearly, have therefore been taken up by the new discipline of econophysics. As is well known, although mechanisms for predicting financial crises exist, none provides definitive information. In this context, this study uses the concept of the electric field from electrostatics to develop an early econophysical approach to global financial crises. The aim is to define a model that can operate before a financial crisis occurs, identify financial fragility at an earlier stage, and help public and private sector actors, policy makers, and economists through an econophysical approach. The 2001 Turkey crisis was assessed with data from the Central Bank of Turkey covering 1992 to 2007, and for the 2001 Argentina crisis, data were taken from the IMF and the Central Bank of Argentina covering 1997 to 2007. As an econophysical method, an analogy is drawn between Gauss's law, used in the calculation of the electric field, and the forecasting of financial crises. Taking advantage of this analogy, which is based on currency movements and money mobility, the concept of Φ (Financial Flux) has been adopted for pre-warning of a crisis. The Φ (Financial Flux) values obtained from the formula, used here for the first time, were analyzed with MATLAB software, and in this context the Φ values were confirmed to give pre-warning of the 2001 Turkey and Argentina crises.
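The abstract does not reproduce the Φ formula itself, so the sketch below only illustrates the shape of the Gauss's-law analogy: net capital movements play the role of a field crossing the economy's boundary (the analog of Φ = Σ E·ΔA), and the accumulated "flux" over a rolling window triggers a pre-warning flag. The window length, threshold, and toy flow series are all assumptions, not the study's calibration.

```python
# Illustrative sketch of the Gauss's-law analogy, not the paper's formula.
def financial_flux(net_flows, boundary_weight=1.0):
    """Accumulated 'flux' of net capital flows (analog of sum E.dA)."""
    return boundary_weight * sum(net_flows)

def pre_warning(monthly_flows, window=6, threshold=-3.0):
    """Flag months where the rolling flux falls below the threshold."""
    flags = []
    for i in range(window, len(monthly_flows) + 1):
        phi = financial_flux(monthly_flows[i - window:i])
        flags.append(phi < threshold)
    return flags

# Hypothetical series: steady inflows, then accelerating outflows,
# loosely mimicking capital flight ahead of a crisis.
flows = [0.5, 0.4, 0.3, 0.1, -0.5, -1.0, -1.5, -2.0, -2.5]
print(pre_warning(flows))
```

The flag trips only once sustained outflows dominate the window, which is the qualitative behaviour an early-warning flux indicator needs.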

Keywords: econophysics, financial crisis, Gauss's Law, physics

Procedia PDF Downloads 155
901 An Adaptive Distributed Incremental Association Rule Mining System

Authors: Adewale O. Ogunde, Olusegun Folorunso, Adesina S. Sodiya

Abstract:

Most existing Distributed Association Rule Mining (DARM) systems still face several challenges. One such challenge, which has received little research attention, is the inability of existing systems to adapt to constantly changing databases and mining environments. In this work, an Adaptive Incremental Mining Algorithm (AIMA) is therefore proposed to address this problem. AIMA employs multiple mobile agents for the entire mining process. AIMA was designed to adapt to changes in the distributed databases by mining only the incremental database updates and using these to update the existing rules, thereby improving the overall response time of the DARM system. In AIMA, global association rules are integrated incrementally from one data site to another through Results Integration Coordinating Agents. The mining agents in AIMA were made adaptive by defining mining goals with reasoning and behavioural capabilities and protocols that enable them either to maintain or to change their goals. AIMA employs the Java Agent Development Environment Extension for the internal agents' architecture. Results from experiments conducted on real datasets showed that the adaptive system, AIMA, performed better than the non-adaptive systems, with lower communication costs and higher task completion rates.
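A minimal sketch of the incremental idea, mining only the database updates: support counts for already-tracked itemsets are merged with counts from the incremental batch instead of rescanning the whole database. The toy transactions, the 0.5 support threshold, and the restriction to previously known itemsets (no new candidate generation, which a full system like AIMA would also handle) are simplifications.

```python
def count_supports(transactions, itemsets):
    """Support counts of the given itemsets in a batch of transactions."""
    counts = {s: 0 for s in itemsets}
    for t in transactions:
        t = set(t)
        for s in itemsets:
            if set(s) <= t:
                counts[s] += 1
    return counts

def incremental_update(old_counts, old_n, delta, min_support):
    """Merge counts from the incremental batch 'delta' into the existing
    counts instead of re-mining the whole database."""
    new_counts = dict(old_counts)
    for s, c in count_supports(delta, list(old_counts)).items():
        new_counts[s] = new_counts.get(s, 0) + c
    n = old_n + len(delta)
    frequent = {s: c for s, c in new_counts.items() if c / n >= min_support}
    return frequent, n

# Hypothetical run: initial database of 4 transactions, then 2 updates.
base = [("a", "b"), ("a", "c"), ("a", "b", "c"), ("b",)]
itemsets = [("a",), ("b",), ("a", "b")]
counts = count_supports(base, itemsets)
frequent, n = incremental_update(counts, len(base), [("a", "b"), ("a",)], 0.5)
print(frequent, n)
```

Only the two new transactions are scanned in the update step, which is the source of the response-time gains the abstract reports.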

Keywords: adaptivity, data mining, distributed association rule mining, incremental mining, mobile agents

Procedia PDF Downloads 393
900 Conceptualizing Clashing Values in the Field of Media Ethics

Authors: Saadia Izzeldin Malik

Abstract:

Lack of ethics is the crisis of the 21st century. Today's global world is filled with economic, political, environmental, media/communication, and social crises, all generated by the eroding fabric of ethics and the moral values that guide human decisions in all aspects of life. Our global world is guided by liberal western democratic principles and liberal capitalist economic principles that define and reinforce each other. In economic terms, capitalism has turned world economic systems into one marketplace of ideas and products controlled by big multinational corporations that not only determine the conditions and terms of commodity production and exchange between countries but also transform the political economy of media systems around the globe. The citizen (read: the consumer) is today the target of persuasion by all types of media, at a time when her/his interests should, ethically and in principle, be the decisive factor in the selection of media content. At this juncture of clashing media values, professional and commercial, and widespread ethical lapses by media organizations and media professionals, it is very important to theorize these conflicting values within a broader framework of media ethics. Thus, the aim of this paper is, epistemologically, to bring to the center a perspective on media ethics as a basis for reconciling the clashing values of the media. The paper focuses on conflicting ethical values in the current media debate, namely media ownership vs. press freedom, the individual's right to privacy vs. the public's right to know, and global western consumerist values vs. media values. The paper concludes that a framework for reconciling the conflicting values of media ethics should focus on the individual journalist and his/her moral development, as well as on maintaining the ethical principles of the media as an institution with a primary social responsibility to the public it serves.

Keywords: ethics, media, journalism, social responsibility, conflicting values, global

Procedia PDF Downloads 495
899 Graph Neural Networks and Rotary Position Embedding for Voice Activity Detection

Authors: YingWei Tan, XueFeng Ding

Abstract:

Attention-based voice activity detection models have gained significant attention in recent years due to their fast training speed and ability to capture a wide contextual range. The inclusion of multiple heads and position embeddings in the attention architecture is crucial: multiple attention heads allow differential focus on different parts of the sequence, while position embeddings provide guidance for modeling dependencies between elements at various positions in the input sequence. In this work, we propose an approach that treats each head as a node, enabling the application of graph neural networks (GNNs) to identify correlations among the different nodes. In addition, we adopt rotary position embedding (RoPE), which encodes absolute positional information into the input sequence by a rotation matrix and naturally incorporates explicit relative position information into the self-attention module. We evaluate the effectiveness of our method on a synthetic dataset, and the results demonstrate its superiority over the baseline CRNN in scenarios with low signal-to-noise ratios, while also exhibiting robustness across different noise types. In summary, our proposed framework effectively combines the strengths of CNNs and RNNs (LSTM) and further enhances detection performance through the integration of graph neural networks and rotary position embedding.
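RoPE itself is well defined and can be sketched in a few lines: consecutive dimension pairs of a vector are rotated by position-dependent angles, and because rotations compose, the inner product of two rotated vectors depends only on their position offset. The embedding size and frequency base below follow common convention and are not taken from the paper.

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Apply rotary position embedding to vector x at position `pos`:
    each consecutive dimension pair is rotated by a position-dependent
    angle, encoding absolute position as a rotation matrix."""
    d = x.shape[-1]
    half = d // 2
    freqs = base ** (-np.arange(half) * 2.0 / d)
    theta = pos * freqs
    cos, sin = np.cos(theta), np.sin(theta)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin
    out[1::2] = x1 * sin + x2 * cos
    return out

rng = np.random.default_rng(1)
q, k = rng.normal(size=8), rng.normal(size=8)

# The attention score <RoPE(q, m), RoPE(k, n)> depends only on m - n:
# absolute rotations yield relative position information for free.
s1 = rope(q, 3) @ rope(k, 1)   # offset 2
s2 = rope(q, 7) @ rope(k, 5)   # offset 2
print(np.isclose(s1, s2))      # True
```

This relative-offset property is exactly why the abstract describes RoPE as "naturally" incorporating explicit relative position information into self-attention.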

Keywords: voice activity detection, CRNN, graph neural networks, rotary position embedding

Procedia PDF Downloads 76
898 High Resolution Image Generation Algorithm for Archaeology Drawings

Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu

Abstract:

To address the low accuracy and the susceptibility to surface deterioration ("diseases") of cultural relics that affect current image generation algorithms when producing high-resolution archaeology drawings, an archaeology drawing generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added to the backbone of the high-resolution image generation network, which enhances its line feature extraction capability and improves the accuracy of line drawing generation. A dual-branch parallel architecture consisting of two backbone networks is implemented, where the semantic translation branch extracts semantic features from orthophotographs of cultural relics and the gradient screening branch extracts effective gradient features. Finally, a fusion fine-tuning module combines these two types of features to generate high-quality, high-resolution archaeology drawings. Experimental results on a self-constructed dataset of archaeology drawings of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in terms of pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR), and can be used to assist in producing archaeology drawings.
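A highly simplified sketch of the dual-branch idea: finite differences stand in for the gradient screening branch, per-image normalization stands in for the semantic branch, and a weighted sum stands in for the fusion fine-tuning module. The actual branches are full conditional-GAN backbones, so this only illustrates the data flow, not the method.

```python
import numpy as np

def gradient_branch(img):
    """Toy gradient features: finite-difference magnitude, a crude
    stand-in for the learned gradient screening branch."""
    gx = np.diff(img, axis=1, append=img[:, -1:])
    gy = np.diff(img, axis=0, append=img[-1:, :])
    return np.hypot(gx, gy)

def semantic_branch(img):
    """Toy semantic features: per-image normalized content."""
    return (img - img.mean()) / (img.std() + 1e-8)

def fuse(sem, grad, alpha=0.5):
    """Placeholder fusion: weighted combination of the two branches."""
    return alpha * sem + (1 - alpha) * grad

img = np.arange(16.0).reshape(4, 4)   # tiny synthetic "orthophotograph"
out = fuse(semantic_branch(img), gradient_branch(img))
print(out.shape)
```

The point of keeping two parallel feature streams is that line drawings depend heavily on edges, which a purely semantic branch tends to smooth away.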

Keywords: archaeology drawings, digital heritage, image generation, deep learning

Procedia PDF Downloads 60
897 Racism as a Biopolitical Bordering: Experiences of the Lhotshampa People Displaced from Bhutan

Authors: Karun Karki

Abstract:

The Lhotshampa are Bhutanese people of Nepali origin who have been in Bhutan since the early 1600s. A significant number of these people migrated to Bhutan in the nineteenth century. The 1958 Nationality Law of Bhutan granted citizenship to many Lhotshampa people; however, in the late 1970s, the government of Bhutan introduced a series of laws and policies intended for the socio-political and cultural exclusion of the Lhotshampa due to their ancestry. These exclusionary policies and ethnic and racial injustices not only removed the rights and citizenship of the Lhotshampa but also forcibly displaced thousands of families with no choice but to seek refuge in Nepal. In this context, racism becomes a biopolitical tool designed to govern and regulate populations in a way that determines who may live and who must die. The governance and the management of the population, what Stephan Scheel terms as biopolitical bordering, depends on boundaries between residents and non-residents, citizens and non-citizens, and emigrants and immigrants. Drawing on Foucault’s biopolitics and Mbembe’s necropolitics, this paper argues that the concept of racism should be examined within the context of political discourses because it is intertwined with the colonial project, enslavement, and diaspora. This paper critically explores ethnic and racial injustices the Lhotshampa people experienced and the ways in which they negotiated and resisted such injustices in their resettlement processes, including before displacement, in refugee camps, and after the third-country resettlement. Critical examination of these issues helps shed light on the notion of racial difference that justifies dehumanization, discrimination, and racist attitudes against the Lhotshampa people. The study's findings are critical in promoting human rights, social justice, and the health and well-being of the Lhotshampa community in the context of trauma and stressors in their resettlement processes.

Keywords: lhotshampa people, bhutanese refugees, racism, dehumanization, social justice, biopower, necropower

Procedia PDF Downloads 54
896 Hybrid Lateral-Directional Robust Flight Control with Propulsive Systems

Authors: Alexandra Monteiro, K. Bousson, Fernando J. O. Moreira, Ricardo Reis

Abstract:

Fixed-wing flying vehicles are usually controlled by means of control surfaces such as elevators, ailerons, and rudders. The failure of these systems may lead to severe or even fatal crashes, and such failures have motivated growing research activity on propulsion control over the last decades. The present work deals with a hybrid control architecture in which the propulsion-controlled vehicle retains its traditional control surfaces, addressing the issue of robust lateral-directional dynamics control. The challenges stem from parameter uncertainties in the stability and control derivatives and from unknown terms in the flight dynamics model. Two approaches are implemented and tested: linear quadratic regulation with robustness characteristics and H∞ control. The problem is centered on roll-yaw controller design with full state feedback, able to deal with a standalone propulsion control mode as well as a hybrid mode combining propulsion control with conventional control surfaces while maintaining the original flight maneuverability characteristics. Both controllers achieved very good control performance; however, the H∞ controller showed higher stabilization rates and greater robustness, albeit with a slightly higher control magnitude than the linear quadratic regulator.
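Of the two approaches, the linear quadratic regulator is easy to sketch: iterate the discrete-time Riccati equation to a fixed point, form the state-feedback gain, and check that the closed loop is stable. The two-state model below uses made-up numbers, not the paper's lateral-directional derivatives.

```python
import numpy as np

# Toy discrete-time two-state lateral model (illustrative numbers only).
A = np.array([[1.0, 0.1],
              [0.0, 0.95]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)          # state weighting
R = np.array([[1.0]])  # control weighting

# Iterate the discrete Riccati recursion to a fixed point:
# P <- Q + A' P (A - B K), with K the optimal gain at each step.
P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# Full state feedback u = -K x; the closed loop A - B K should be
# stable, i.e., spectral radius strictly below 1.
eigs = np.linalg.eigvals(A - B @ K)
print(max(abs(eigs)) < 1.0)
```

The H∞ design solves a different (game-theoretic) Riccati equation with a disturbance-attenuation level, which is what buys the extra robustness reported in the abstract at the cost of slightly larger control magnitudes.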

Keywords: robust propulsion control, h-infinity control, lateral-directional flight dynamics, parameter uncertainties

Procedia PDF Downloads 156
895 Mapping the Quotidian Life of Practitioners of Various Religious Sects in Late Medieval Bengal: Portrayals on the Front Façades of the Baranagar Temple Cluster

Authors: I. Gupta, B. Karmakar

Abstract:

Bengal has a long history (8th century A.D. onwards) of decorating the walls of brick-built temples with carved terracotta plaques on a diverse range of subjects, which can be considered one of the most significant visual archives for understanding the various facets of the societies of the time. The temples under focus include the Char-bangla temple complex (circa 1755 A.D.), the Bhavanishvara temple (circa 1755 A.D.), and the Gangeshvara Shiva Jor-bangla temple (circa 1753 A.D.), located within a part of the river Bhagirathi basin in Baranagar, Murshidabad, West Bengal, India. Though a diverse range of subjects has been intricately carved, mainly on the front façades of the Baranagar temple cluster, the study concentrates specifically on depictions of religious and non-religious acts performed by practitioners of the various religious sects of late medieval Bengal, with the intention of acquiring knowledge about the various facets of their lives. Apart from this, the paper also maps the spatial location of these religious performers on the temple façades to examine whether any systematic plan or arrangement was employed to convey a particular idea. Further, an attempt is made to provide a commentary on the attire worn by followers of the various religious sects of late medieval Bengal. The primary materials for the study comprise the depictions of religious activities carved on the terracotta plaques. The secondary material has been collected from published and unpublished theses, journals, and books. These data are further supplemented with photographic documentation, line drawings, and descriptions in table format to give a clear understanding of the issues concerned.

Keywords: attire, scheme of allocation, terracotta temple, various religious sect

Procedia PDF Downloads 138
894 Recommendation of Semi Permanent Buildings for Tsunami Prone Areas

Authors: Fitri Nugraheni, Adwitya Bhaskara, N. Faried Hanafi

Abstract:

The coast is one area where people can live, and various buildings can be built in the area around a beach. Many Indonesians use beaches for housing and work, but coastal areas are also synonymous with tsunami and wind hazards. The costs incurred through permanent damage caused by tsunami and wind disasters in Indonesia can be minimized by replacing permanent buildings with semi-permanent ones. Semi-permanent buildings can be realized by using cold-formed steel as the structural material. Thus, the purpose of this research is to provide efficient semi-permanent building recommendations for residents living near the coast. The research was done by first designing the building model in SketchUp software; a validation phase was then carried out in consultation with an expert consultant on cold-formed steel structures. Based on the interviews, several sides of the building were revised by adding bracing rods to the roof, walls, and floor frame. The result of this research is a recommended semi-permanent building model in which the building is easy to disassemble and install (knockdown), tsunami-friendly (it lets the tsunami load pass through), cost- and time-efficient (using cold-formed steel and prefabricated GRC), zero waste, and does not require many workers (less labour). The recommended building design concept also keeps the architectural side in mind, so it remains a comfortable dwelling for residents.

Keywords: construction method, cold-formed steel, efficiency, semi-permanent building, tsunami

Procedia PDF Downloads 285
893 Biologic Materials- Ecological Living Network

Authors: Ina Dajci

Abstract:

Biologic Materials presents groundbreaking transdisciplinary research aimed at fostering new collaborative models across the Built Environment, Forestry, and Agriculture sectors. This initiative seeks to establish innovative paradigms for local and global material flows by developing a biocompatible, regenerative material economy. The project focuses on creating materials derived from biowaste and silvicultural practices, ensuring the preservation of endangered indigenous and vernacular techniques through the integration of emerging biosciences. By utilizing biomaterials sourced from agricultural waste and forest byproducts, the initiative incorporates fabrication methods recognized by UNESCO as ‘intangible cultural heritage of humanity,’ which are currently at risk. The structural, mechanical, and environmental properties of these materials are enhanced through advanced CAD-CAM fabrication, along with energy-efficient biochemical and bacterial processes that promote healthy indigo coloration. Furthermore, the integration of AI technologies in species selection facilitates a novel partnership model, enabling designers to collaborate effectively with forest managers and silviculture practitioners. This collaborative approach not only optimizes the use of plant-based materials but also enhances biodiversity and climate resilience in regional ecosystems. Overall, this project embodies a holistic strategy for addressing environmental challenges while revitalizing traditional practices and fostering sustainable innovation.

Keywords: material, architecture, culture, heritage, ecology, environment

Procedia PDF Downloads 16
892 Theoretical Comparisons and Empirical Illustration of Malmquist, Hicks–Moorsteen, and Luenberger Productivity Indices

Authors: Fatemeh Abbasi, Sahand Daneshvar

Abstract:

Productivity is one of the essential goals of companies seeking to improve performance; as a strategy-oriented measure, it underpins a company's economic growth. The history of productivity goes back centuries, but in the early twentieth century most researchers came to define productivity as the relationship between a product and the factors used in its production. Productivity as the optimal use of available resources, that is, "more output using less input", can increase companies' capacity for economic growth and prosperity. A good quality of life based on economic progress also depends on productivity growth in society, so productivity is a national priority for any developed country. There are several methods for measuring productivity growth, which can be divided into parametric and non-parametric methods. Parametric methods rely on the existence of a functional form in their hypotheses, while non-parametric methods do not require a function and are based on empirical evidence. One of the most popular non-parametric methods is Data Envelopment Analysis (DEA), which measures changes in productivity over time. DEA evaluates the productivity of decision-making units (DMUs) based on mathematical models. The method uses multiple inputs and outputs to compare the productivity of similar DMUs such as banks, government agencies, companies, and airports. Non-parametric methods are themselves divided into frontier and non-frontier approaches. The Malmquist productivity index (MPI) proposed by Caves, Christensen, and Diewert (1982), the Hicks–Moorsteen productivity index (HMPI) proposed by Bjurek (1996), and the Luenberger productivity indicator (LPI) proposed by Chambers (2002) are powerful tools for measuring productivity changes over time. This study compares the Malmquist, Hicks–Moorsteen, and Luenberger indices theoretically and empirically on the basis of DEA models and reviews their strengths and weaknesses.
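For the simplest case, one input and one output under constant returns to scale, the output distance function reduces to a DMU's productivity ratio relative to the period's best ratio, which makes the Caves-Christensen-Diewert geometric-mean form of the Malmquist index easy to sketch. The DMU data below are hypothetical.

```python
from math import sqrt

def distance(prod, frontier):
    """Output distance function under CRS with one input and one output:
    the unit's output/input ratio relative to the period's best ratio."""
    best = max(y / x for x, y in frontier)
    x, y = prod
    return (y / x) / best

def malmquist(unit_t, unit_t1, frontier_t, frontier_t1):
    """Caves-Christensen-Diewert Malmquist index: geometric mean of the
    index measured against each period's frontier."""
    m_t = distance(unit_t1, frontier_t) / distance(unit_t, frontier_t)
    m_t1 = distance(unit_t1, frontier_t1) / distance(unit_t, frontier_t1)
    return sqrt(m_t * m_t1)

# Hypothetical DMUs as (input, output) pairs in periods t and t+1.
frontier_t = [(2.0, 4.0), (4.0, 6.0), (5.0, 5.0)]
frontier_t1 = [(2.0, 5.0), (4.0, 7.0), (5.0, 6.0)]
mpi = malmquist(frontier_t[1], frontier_t1[1], frontier_t, frontier_t1)
print(round(mpi, 3))  # > 1 indicates productivity growth
```

With multiple inputs and outputs the distance functions are instead obtained by solving DEA linear programs for each period pair, but the index formula above is unchanged.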

Keywords: data envelopment analysis, Hicks–Moorsteen productivity index, Luenberger productivity indicator, Malmquist productivity index

Procedia PDF Downloads 194