Search results for: performing art
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1117

277 Signal Processing Techniques for Adaptive Beamforming with Robustness

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

Adaptive beamforming using an antenna array of sensors is useful for adaptively detecting and preserving the presence of the desired signal while suppressing interference and background noise. Conventional adaptive array beamforming requires prior information about either the impinging direction or the waveform of the desired signal in order to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to maintain a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. In the literature, it is well known that the performance of an adaptive beamformer is degraded by any steering angle error encountered in many practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Developing effective signal processing techniques to deal with the steering angle error in array beamforming systems has therefore become an important research topic. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against the steering angle error. The proposed array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vectors of the signal sources. Two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information and the received array data are utilized to iteratively estimate the actual direction vector of the desired signal. The estimated direction vector of the desired signal is then used to appropriately find the quiescent weight vector. The other projection matrix is set to be the signal blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided to evaluate and compare the proposed technique with existing robust techniques.
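
To make the steered-beam constraint concrete, the following is a minimal Python sketch of a conventional minimum-variance beamformer with a presumed steering vector for a uniform linear array. It illustrates only the classical constrained-minimization step described above, not the authors' projection-based direction estimation; the array geometry, spacing and loading factor are illustrative assumptions.

```python
import numpy as np

def ula_steering(num_sensors, angle_rad):
    """Presumed steering vector for a half-wavelength-spaced uniform linear array."""
    n = np.arange(num_sensors)
    return np.exp(1j * np.pi * n * np.sin(angle_rad))

def mvdr_weights(snapshots, steering_vector, loading=1e-3):
    """Minimum-variance weights under a unit-gain (steered-beam) constraint.
    snapshots: (num_sensors, num_snapshots) complex array data."""
    num_sensors, num_snapshots = snapshots.shape
    R = snapshots @ snapshots.conj().T / num_snapshots                     # sample covariance
    R += loading * np.trace(R).real / num_sensors * np.eye(num_sensors)   # diagonal loading
    a = steering_vector.reshape(-1, 1)
    Rinv_a = np.linalg.solve(R, a)
    return (Rinv_a / (a.conj().T @ Rinv_a)).ravel()                        # w = R^-1 a / (a^H R^-1 a)
```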

Keywords: adaptive beamforming, robustness, signal blocking, steering angle error

Procedia PDF Downloads 100
276 Embedded Hybrid Intuition: A Deep Learning and Fuzzy Logic Approach to Collective Creation and Computational Assisted Narratives

Authors: Roberto Cabezas H

Abstract:

The current work shows the methodology developed to create narrative lighting spaces for the multimedia performance piece 'cluster: the vanished paradise.' This empirical research is focused on exploring unconventional roles for machines in subjective creative processes, by delving into the semantics of data and machine intelligence algorithms in hybrid technological, creative contexts to expand epistemic domains through human-machine cooperation. The creative process in scenic and performing arts is guided mostly by intuition; from that idea, we developed an approach to embed collective intuition in computational creative systems by joining the properties of Generative Adversarial Networks (GANs) and fuzzy clustering, based on a semi-supervised data creation and analysis pipeline. The model makes use of GANs to learn from phenomenological data (data generated from experience with lighting scenography) and algorithmic design data (data augmented by procedural design methods); fuzzy logic clustering is then applied to the data artificially created by the GANs to define narrative transitions built on a membership index. This process allowed for the creation of simple and complex spaces with expressive capabilities, based on position and light intensity as the parameters to guide the narrative. Hybridization comes not only from the human-machine symbiosis but also from the integration of different techniques for the implementation of the aided design system. Machine intelligence tools, as proposed in this work, are well suited to redefine collaborative creation by learning to express and expand a conglomerate of ideas and a wide range of opinions for the creation of sensory experiences. We found GANs and fuzzy logic to be ideal tools for developing new computational models based on interaction, learning, emotion and imagination to expand the traditional algorithmic model of computation.
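
As a concrete illustration of the clustering half of the pipeline, the sketch below is a minimal fuzzy c-means implementation in Python (not the authors' code); the data array, number of clusters and fuzzifier m are placeholder assumptions, and the membership matrix U plays the role of the membership index used here to define narrative transitions.

```python
import numpy as np

def fuzzy_cmeans(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means; U[i, k] is the membership of sample k in cluster i."""
    rng = np.random.default_rng(seed)
    U = rng.random((n_clusters, len(X)))
    U /= U.sum(axis=0)                                    # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centres = Um @ X / Um.sum(axis=1, keepdims=True)  # weighted cluster centres
        d = np.linalg.norm(X[None, :, :] - centres[:, None, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=0)                                # standard FCM membership update
    return centres, U

# Hypothetical usage: lighting states described by (position, intensity) pairs
X = np.random.rand(200, 2)
centres, U = fuzzy_cmeans(X, n_clusters=4)
```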

Keywords: fuzzy clustering, generative adversarial networks, human-machine cooperation, hybrid collective data, multimedia performance

Procedia PDF Downloads 118
275 Systems Intelligence in Management (High Performing Organizations and People Score High in Systems Intelligence)

Authors: Raimo P. Hämäläinen, Juha Törmänen, Esa Saarinen

Abstract:

Systems thinking has been acknowledged as an important approach in the strategy and management literature ever since the seminal works of Ackoff in the 1970s and Senge in the 1990s. The early literature was very much focused on structures and organizational dynamics. Understanding systems is important, but making improvements also requires ways to understand human behavior in systems. Peter Senge's book The Fifth Discipline gave the inspiration for the development of the concept of Systems Intelligence (SI). The concept integrates personal mastery and systems thinking. SI refers to intelligent behavior in the context of complex systems involving interaction and feedback. It is a competence related to the skills needed in strategy and in the environment of modern industrial engineering and management, where people skills and systems play an increasingly important role. The eight factors of Systems Intelligence have been identified from extensive surveys, and the factors relate to perceiving, attitude, thinking and acting. The personal self-evaluation test developed consists of 32 items and can also be applied in a peer-evaluation mode. The concept and test extend to organizations too; one can talk about organizational systems intelligence. This paper reports the results of an extensive survey based on peer evaluation. The results show that systems intelligence correlates positively with professional performance. People in a managerial role score higher in SI than others. Age improves the SI score, but there is no gender difference. Top organizations score higher in all SI factors than lower-ranked ones. The SI tests can also be used as leadership and management development tools, helping self-reflection and learning. Finding ways of enhancing learning and organizational development is important. Today, gamification is a new promising approach. The items in the SI test have been used to develop an interactive card game following the Topaasia game approach. It is an easy way of engaging people in a process which helps participants see and approach problems in their organization. It also helps individuals in identifying challenges in their own behavior and in improving their SI.

Keywords: gamification, management competence, organizational learning, systems thinking

Procedia PDF Downloads 71
274 Tonal Pitch Structure as a Tool of Social Consolidation

Authors: Piotr Podlipniak

Abstract:

Social consolidation has often been indicated as an adaptive function of music which led to the evolution of the music faculty. According to many scholars, this function is possible thanks to musical rhythm, which enables sensorimotor synchronization to a musical beat. The ability to synchronize to music allows performing music collectively, which enhances social cohesion. However, the collective performance of music also involves spectral synchronization, which depends on musical pitch structure. Similarly to rhythmic synchronization, spectral synchronization is a result of 'brain states alignment' between people who collectively listen to or perform music. In order to successfully synchronize pitches, performers have to adequately anticipate the pitch structure. The most common form of music, which predominates among all human societies, is tonal music. In fact, tonality, understood in the broadest sense as an organization of musical pitches in which some pitches are more important than others, is the only kind of musical pitch structure that has been observed in all currently known musical cultures. The perception of such a musical pitch structure elicits specific emotional reactions which are often described as tensions and relaxations. These facts raise some important questions. What is the evolutionary reason that people use pitch structure as a form of vocal communication? Why do different pitch structures elicit different emotional states independent of extra-musical context? It is proposed in the current presentation that, in the course of evolution, pitch structure became a human-specific tool of communication whose function is to induce emotional states such as uncertainty and cohesion. By eliciting these emotions during collective music performance, people are able to unconsciously give cues concerning social acceptance. This is probably one of the reasons why in all cultures people collectively perform tonal music. It is also suggested that tonal pitch structure had been invented socially before it became an evolutionary innovation of Homo sapiens. This means that a predisposition to tonally organize pitches evolved by means of the 'Baldwin effect', a process in which natural selection transforms the learned response of an organism into an instinctive response. A hypothetical evolutionary scenario of the emergence of tonal pitch structure will be proposed. In this scenario, social forces such as a need for closer cooperation play the crucial role.

Keywords: emotion, evolution, tonality, social consolidation

Procedia PDF Downloads 296
273 A 3-Dimensional Memory-Based Model for Planning Working Postures Reaching Specific Area with Postural Constraints

Authors: Minho Lee, Donghyun Back, Jaemoon Jung, Woojin Park

Abstract:

The current 3-dimensional (3D) posture prediction models commonly provide only a few optimal postures to achieve a specific objective. The problem with such models is that they are incapable of rapidly providing several optimal posture candidates according to various situations. In order to solve this problem, this paper presents a 3D memory-based posture planning (3D MBPP) model, which is a new digital human model that can analyze the feasible postures in 3D space for reaching tasks that have postural constraints and a specific reaching space. The 3D MBPP model can be applied to the types of work that are done with constrained working postures and have a specific reaching space. Examples of such work include driving an excavator, driving automobiles, painting buildings, working at an office, pitching/batting, and boxing. For these types of work, a limited amount of space is required to store all of the feasible postures, as the hand-reach boundary can be determined prior to performing the task. This prevents computation time from increasing exponentially, which has been one of the major drawbacks of memory-based posture planning models in 3D space. This paper validates the utility of the 3D MBPP model using a practical example of analyzing baseball batting posture. In baseball, batters swing with both feet fixed to the ground. This motion is appropriate for use with the 3D MBPP model since the player must try to hit the ball when the ball is located inside the strike zone (a limited area) in a constrained posture. The results from the analysis showed that the stored and the optimal postures vary depending on the ball's flying path, the hitting location, the batter's body size, and the batting objective. These results can be used to establish the optimal postural strategies for achieving the batting objective and performing effective hitting. The 3D MBPP model can also be applied to various domains to determine the optimal postural strategies and improve worker comfort.

Keywords: baseball, memory-based, posture prediction, reaching area, 3D digital human models

Procedia PDF Downloads 186
272 A Semantic and Concise Structure to Represent Human Actions

Authors: Tobias Strübing, Fatemeh Ziaeetabar

Abstract:

Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, we need to use a semantic framework. For this purpose, the Semantic Event Chain (SEC) method has already been presented, which considers the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information on static (e.g. top, bottom) and dynamic spatial relations (e.g. moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a large matrix with thirty rows and a large set of spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used in the category of manipulation actions, which essentially involve the two hands. Here, we would like to extend this approach to a whole-body action descriptor and make a conjoint activity representation structure. For this purpose, we need to do a statistical analysis to modify the current eSEC by summarizing it while preserving its features, and introduce a new version called Enhanced eSEC (e2SEC). This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row in a statistical way, to see if it is possible to remove a particular one while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the unity of the predefined manipulation actions. Therefore, by performing the above analyses, we obtained the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure in representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on the interactions between humans and robots. It also creates a comprehensive platform to integrate with body-limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
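
A minimal sketch of the row-reduction idea described above (not the authors' analysis): given a set of eSEC descriptor matrices, it checks, for each row index, whether dropping that row keeps every manipulation distinguishable from the others. The matrix encoding and variable names are assumptions for illustration.

```python
import numpy as np

def removable_rows(esec_matrices):
    """For each row index, test whether dropping that row still leaves every
    manipulation descriptor distinguishable from all the others.
    Assumes descriptors are encoded as equal-shaped integer matrices."""
    n_rows = esec_matrices[0].shape[0]
    removable = []
    for r in range(n_rows):
        reduced = [tuple(np.delete(m, r, axis=0).ravel()) for m in esec_matrices]
        if len(set(reduced)) == len(reduced):   # all descriptors remain unique
            removable.append(r)
    return removable
```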

Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis

Procedia PDF Downloads 85
271 Management as a Proxy for Firm Quality

Authors: Petar Dobrev

Abstract:

There is no agreed-upon definition of firm quality. While profitability and stock performance often qualify as popular proxies of quality, in this project we aim to identify quality without relying on a firm's financial statements or stock returns as selection criteria. Instead, we use firm-level data on management practices across small to medium-sized U.S. manufacturing firms from the World Management Survey (WMS) to measure firm quality. Each firm in the WMS dataset is assigned a mean management score from 0 to 5, with higher scores identifying better-managed firms. This management score serves as our proxy for firm quality and is the sole criterion we use to separate firms into portfolios comprising high-quality and low-quality firms. We define high-quality (low-quality) firms as those with a management score of one standard deviation above (below) the mean. To study whether this proxy for firm quality can identify better-performing firms, we link this data to Compustat and the Center for Research in Security Prices (CRSP) to obtain firm-level data on financial performance and monthly stock returns, respectively. We find that from 1999 to 2019 (our sample period), firms in the high-quality portfolio are consistently more profitable, with higher operating profitability and return on equity compared to low-quality firms. In addition, high-quality firms also exhibit a lower risk of bankruptcy, reflected in a higher Altman Z-score. Next, we test whether the stocks of the firms in the high-quality portfolio earn superior risk-adjusted excess returns. We regress the monthly excess returns of each portfolio on the Fama-French 3-factor, 4-factor, and 5-factor models, the betting-against-beta factor, and the quality-minus-junk factor. We find no statistically significant differences in excess returns between the two portfolios, suggesting that stocks of high-quality (well-managed) firms do not earn superior risk-adjusted returns compared to low-quality (poorly managed) firms. In short, our proxy for firm quality, the WMS management score, can identify firms with superior financial performance (higher profitability and reduced risk of bankruptcy). However, our management proxy cannot identify stocks that earn superior risk-adjusted returns, suggesting no statistically significant relationship between managerial quality and stock performance.
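
A minimal sketch of the factor-regression step in Python (not the authors' code): it regresses each portfolio's monthly excess returns on the Fama-French 3-factor model and reads the intercept as the risk-adjusted alpha. File names and column labels are placeholder assumptions.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical inputs: monthly excess returns of the two portfolios and the
# Fama-French factors, indexed by month (file and column names are assumed)
returns = pd.read_csv("portfolio_excess_returns.csv", index_col="month")
factors = pd.read_csv("ff3_factors.csv", index_col="month")

X = sm.add_constant(factors[["mkt_rf", "smb", "hml"]])     # 3-factor model
for portfolio in ["high_quality", "low_quality"]:
    fit = sm.OLS(returns[portfolio], X, missing="drop").fit()
    # the intercept is the risk-adjusted alpha; a non-significant alpha matches
    # the finding of no superior risk-adjusted returns
    print(portfolio, fit.params["const"], fit.pvalues["const"])
```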

Keywords: excess stock returns, management, profitability, quality

Procedia PDF Downloads 72
270 The Effect of Expanding the Early Pregnancy Assessment Clinic and COVID-19 on Emergency Department and Urgent Care Visits for Early Pregnancy Bleeding

Authors: Harley Bray, Helen Pymar, Michelle Liu, Chau Pham, Tomislav Jelic, Fran Mulhall

Abstract:

Background: Our study assesses the impact of the COVID-19 pandemic on Early Pregnancy Assessment Clinic (EPAC) referrals and the use of virtual consultation in Winnipeg, Manitoba. Our clinic expanded to accept referrals from all Winnipeg Emergency Department (ED)/Urgent Care (UC) sites from November 2019 to April 2020. By May 2020, the COVID-19 pandemic had reached Manitoba, and EPAC virtual care was expanded by performing hCG testing remotely and reviewing blood and ED/UC ultrasound results by phone. Methods: Emergency Department Information Systems (EDIS) and EPAC data were reviewed for ED/UC visits for pregnancy <20 weeks and vaginal bleeding one year pre-COVID (March 12, 2019, to March 11, 2020) and during COVID (March 12, 2020, the date of the first case in Manitoba, to March 11, 2021). Results: There were fewer patient visits for vaginal bleeding or pregnancy of <20 weeks (4264 vs. 5180), diagnoses of threatened abortion (1895 vs. 2283), and ectopic pregnancy (78 vs. 97) during COVID compared with pre-COVID, respectively. International Classification of Diseases 10 (ICD-10) codes were missing in 849 (20%) and 1183 (23%) of patients during COVID and pre-COVID, respectively. Wait times for all patient visits improved during COVID compared to pre-COVID (5.1 ± 4.4 hours vs. 5.5 ± 3.8 hours), more patients received obstetrical ultrasounds (761 (18%) vs. 787 (15%)), and fewer patients returned within 30 days (1360 (32%) vs. 1848 (36%); p<0.01). EPAC saw 708 patients (218; 31% new ED/UC) during COVID compared to 552 (37; 7% new ED/UC) pre-COVID. Fewer operative interventions for pregnancy loss (346 vs. 456) and retained products (236 vs. 272) were noted. Surgeries to treat ectopic pregnancy (106 vs. 113) remained stable during the study time interval. Conclusion: Accurate identification of pregnancy complications was difficult, with over 20% of visits missing ICD-10 diagnostic codes. There were fewer ED/UC visits and less surgical management for threatened abortion during COVID, but operative management of ectopic pregnancy remained unchanged.

Keywords: obstetrics and gynecology, EPAC, early pregnancy assessment, first trimester, emergency department, abortion, pregnancy, COVID-19

Procedia PDF Downloads 41
269 Comparison of the Efficacy of Ketamine-Propofol versus Thiopental Sodium-Fentanyl in Procedural Sedation in the Emergency Department: A Randomized Double-Blind Clinical Trial

Authors: Maryam Bahreini, Mostafa Talebi Garekani, Fatemeh Rasooli, Atefeh Abdollahi

Abstract:

Introduction: Procedural sedation and analgesia are desirable for handling painful procedures. The choice of agent offering more efficacy and fewer complications remains controversial; thus, many sedative regimens have been studied. This study assessed the effectiveness and adverse effects of thiopental sodium-fentanyl compared with the established combination ketamine-propofol for procedural sedation in the emergency department. Methods: Consenting patients were enrolled in this randomized double-blind trial to receive either ketamine-propofol (KP) or thiopental-fentanyl (TF) in a 1:1 mg:mg proportion on a weight-based dosing basis, to reach a sedation level of American Society of Anesthesiologists class III/IV. The respiratory and hemodynamic complications, nausea and vomiting, recovery agitation, patient recall and satisfaction, provider satisfaction, and recovery time were compared. The study was registered in the Iranian Randomized Control Trial Registry (Code: IRCT2015111325025N1). Results: 96 adult patients were included and randomized, 47 in the KP group and 49 in the TF group. Transient hypoxia occurred in 2.1% of the KP group and 8.1% of the TF group, leading to airway maneuvers in 4.2% versus 8.1% of the two groups, respectively; however, no statistically significant difference was observed between the two combinations, and there was no report of endotracheal tube placement or further admission. Patient and physician satisfaction were significantly higher in the KP group. There was no difference in respiratory, gastrointestinal, cardiovascular and psychiatric adverse events, recovery time or patient recall of the procedure between groups. The efficacy and complications were not related to the type of procedure or to patients' smoking or addiction habits. Conclusion: The ketamine-propofol and thiopental-fentanyl combinations were comparably effective, although KP resulted in higher patient and provider satisfaction. It is estimated that the thiopental-fentanyl combination can be as potent and efficacious as ketofol, with a relatively similar incidence of adverse events in procedural sedation.

Keywords: adverse effects, conscious sedation, fentanyl, propofol, ketamine, safety, thiopental

Procedia PDF Downloads 187
268 Elevated Reductive Defluorination of Branched Per and Polyfluoroalkyl Substances by Soluble Metal-Porphyrins and New Mechanistic Insights on the Degradation

Authors: Jun Sun, Tsz Tin Yu, Maryam Mirabediny, Matthew Lee, Adele Jones, Denis M. O’Carroll, Michael J. Manefield, Björn Åkermark, Biswanath Das, Naresh Kumar

Abstract:

Reductive defluorination has emerged as a sustainable approach to removing per- and polyfluoroalkyl substances (PFASs), also known as forever organic contaminants, from water. For the last few decades, nano zero-valent metals (nZVMs) have been intensively applied in the reductive remediation of groundwater contaminated with chlorinated organic compounds due to their low redox potential, easy application, and low production cost. However, there is inadequate information on the effective reductive defluorination of linear or branched PFASs using nZVMs as reductants because of the lack of suitable catalysts. CoII-5,10,15,20-tetraphenyl-21H,23H-porphyrin (CoTPP) has recently been reported to effectively catalyze the reductive defluorination of branched (br-) perfluorooctane sulfonate (PFOS) using TiIII citrate as the reductant. However, the low water solubility of CoTPP limited its applicability. Here, we explored a series of structurally related soluble cobalt porphyrin catalysts based on our previously reported best-performing CoTPP. All soluble porphyrins, [[meso-tetra(4-carboxyphenyl)porphyrinato]cobalt(III)]Cl·7H₂O (CoTCPP), [[meso-tetra(4-sulfonatophenyl)porphyrinato]cobalt(III)]·9H₂O (CoTPPS), and [[meso-tetra(4-N-methylpyridyl)porphyrinato]cobalt(II)](I)₄·4H₂O (CoTMpyP), displayed better defluorination efficiencies than CoTPP. In particular, CoTMpyP presented the best defluorination efficiency for br-PFOS (94%), branched perfluorooctanoic acid (PFOA) (89%), and 3,7-perfluorodecanoic acid (PFDA) (60%) after 1 day at 70 °C. The CoTMpyP-nZn⁰ system showed an 88-164 times higher defluorination rate than the VB12-nZn⁰ system for all investigated br-PFASs. The CoTMpyP-nZn⁰ system also performed effectively at room temperature, demonstrating its potential for in-situ reductive systems. Based on the analysis of the intermediate products, the calculated bond dissociation energies (BDEs) and the possible first interaction between CoTMpyP and PFAS, degradation pathways of 3,7-PFDA and 6-PFOS are proposed.

Keywords: cationic, soluble porphyrin, cobalt, vitamin B12, PFAS, reductive defluorination

Procedia PDF Downloads 53
267 Sorghum Resilience and Sustainability under Limiting and Non-limiting Conditions of Water and Nitrogen

Authors: Muhammad Tanveer Altaf, Mehmet Bedir, Waqas Liaqat, Gönül Cömertpay, Volkan Çatalkaya, Celaluddin Barutçular, Nergiz Çoban, Ibrahim Cerit, Muhammad Azhar Nadeem, Tolga Karaköy, Faheem Shehzad Baloch

Abstract:

Food production needs to almost double by 2050 in order to feed around 9 billion people around the globe. Plant production mostly relies on fertilizers, which are also one of the main contributors to environmental pollution. In addition, climatic conditions are unpredictable, and the earth is expected to face severe drought conditions in the future. Therefore, water and fertilizers, especially nitrogen, are considered the main constraints for future food security. To face these challenges, developing integrative approaches for germplasm characterization and selecting resilient genotypes that perform under limiting conditions is crucial for effective breeding to meet the food requirement under climate change scenarios. This study is part of a European Research Area Network (ERANET) project for the characterization of a diversity panel of 172 sorghum accessions and six hybrids as control cultivars under limiting (+N/-H2O, -N/+H2O) and non-limiting (+N/+H2O) conditions. The study was planned to characterize the sorghum diversity in relation to Resource Use Efficiency (RUE), with special attention to harnessing the interaction between genotype and environment (GxE) from a physiological and agronomic perspective. Experiments were conducted at Adana, under a Mediterranean climate, with an augmented design, and data on various agronomic and physiological parameters were recorded. Plentiful diversity was observed in the sorghum diversity panel, and significant variations were seen between the limiting water and nitrogen conditions and the control experiment. Potential genotypes with the best performance under limiting conditions were identified. Whole-genome resequencing was performed for the whole germplasm under investigation for diversity analysis. GWAS analysis will be performed using the genotypic and phenotypic data, and linked markers will be identified. The results of this study will inform the adaptation and improvement of sorghum under climate change conditions for future food security.

Keywords: germplasm, sorghum, drought, nitrogen, resources use efficiency, sequencing

Procedia PDF Downloads 50
266 Structure-Guided Optimization of Sulphonamide as Gamma–Secretase Inhibitors for the Treatment of Alzheimer’s Disease

Authors: Vaishali Patil, Neeraj Masand

Abstract:

Alzheimer's disease (AD) is proving to be a lethal disease in older people. According to the amyloid hypothesis, aggregation of the amyloid β-protein (Aβ), particularly its 42-residue variant (Aβ42), plays a direct role in the pathogenesis of AD. Aβ is generated through sequential cleavage of the amyloid precursor protein (APP) by β-secretase (BACE) and γ-secretase (GS). Thus, in the treatment of AD, γ-secretase modulators (GSMs) are potentially disease-modifying, as they selectively lower pathogenic Aβ42 levels by shifting the enzyme cleavage sites without inhibiting γ-secretase activity. This possibly avoids the known adverse effects observed with complete inhibition of the enzyme complex. Virtual screening, via a drug-like ADMET filter, QSAR and molecular docking analyses, has been utilized to identify novel γ-secretase modulators with a sulphonamide nucleus. Based on the QSAR analyses and docking scores, some novel analogs have been synthesized. The results obtained by the in silico studies have been validated by performing in vivo analysis. In the first step, behavioral assessment was carried out using the scopolamine-induced amnesia methodology. Later, the same series was evaluated for neuroprotective potential against the oxidative stress induced by scopolamine. Biochemical estimation was performed to evaluate the changes in biochemical markers of Alzheimer's disease such as lipid peroxidation (LPO), glutathione (GSH), and catalase. The scopolamine-induced amnesia model showed increased acetylcholinesterase (AChE) levels, and the inhibitory effect of the test compounds on brain AChE levels was evaluated. In all the studies, donepezil (dose: 50 µg/kg) was used as the reference drug. Reduced AChE activity was shown by compounds 3f, 3c, and 3e. In the later stage, the most potent compounds were evaluated for their Aβ42 inhibitory profile. It can be hypothesized that this series of alkyl-aryl sulphonamides exhibits anti-AD activity by inhibition of the acetylcholinesterase (AChE) enzyme as well as inhibition of plaque formation on prolonged dosage, along with neuroprotection from oxidative stress.

Keywords: gamma-secretase inhibitors, Alzheimer's disease, sulphonamides, QSAR

Procedia PDF Downloads 228
265 The Effect of Different Parameters on a Single Invariant Lateral Displacement Distribution to Consider the Higher Modes Effect in a Displacement-Based Pushover Procedure

Authors: Mohamad Amin Amini, Mehdi Poursha

Abstract:

Nonlinear response history analysis (NL-RHA) is a robust analytical tool for estimating the seismic demands of structures responding in the inelastic range. However, because of its conceptual and numerical complications, the nonlinear static procedure (NSP) is being increasingly used as a suitable tool for seismic performance evaluation of structures. The conventional pushover analysis methods presented in various codes (FEMA 356; Eurocode 8; ATC-40) are limited to first-mode-dominated structures and cannot take the higher modes effect into consideration. Therefore, for more than a decade, researchers have been developing enhanced pushover analysis procedures to take the higher modes effect into account. The main objective of this study is to propose an enhanced invariant lateral displacement distribution that takes the higher modes effect into consideration when performing a displacement-based pushover analysis, whereby a set of laterally applied displacements, rather than forces, is monotonically applied to the structure. For this purpose, the effect of different parameters, such as the spectral displacement of the ground motion, the modal participation factor, and the effective modal participating mass ratio, on the lateral displacement distribution is investigated to find the best distribution. The major simplification of this procedure is that the effect of higher modes is concentrated into a single invariant lateral load distribution. Therefore, only one pushover analysis is sufficient, without any need to utilize a modal combination rule for combining the responses. The invariant lateral displacement distribution for pushover analysis is then calculated by combining the modal story displacements using the modal combination rules, as sketched below. The seismic demands resulting from the different procedures are compared to those from the more accurate nonlinear response history analysis (NL-RHA) as a benchmark solution. Two structures of different heights, namely 10- and 20-story special steel moment-resisting frames (MRFs), were selected and evaluated. Twenty ground motion records were used to conduct the NL-RHA. The results show that more accurate responses can be obtained in comparison with the conventional lateral loads when the enhanced modal lateral displacement distributions are used.
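
The sketch below illustrates, in Python, how an invariant lateral displacement profile of the kind described above could be assembled by combining modal storey displacements with a combination rule such as SRSS. It is a minimal illustration, not the authors' procedure, and the mode shapes, participation factors and spectral displacements are placeholder inputs.

```python
import numpy as np

def invariant_displacement_profile(phi, gamma, Sd, rule="srss"):
    """Combine modal storey displacements u_ij = Gamma_j * Sd_j * phi_ij into a
    single invariant lateral displacement profile.
    phi: (n_storeys, n_modes) mode shapes, gamma: (n_modes,) participation
    factors, Sd: (n_modes,) spectral displacements of the ground motion."""
    modal_disp = phi * (gamma * Sd)               # per-mode storey displacements
    if rule == "srss":
        return np.sqrt((modal_disp ** 2).sum(axis=1))
    if rule == "abssum":
        return np.abs(modal_disp).sum(axis=1)
    raise ValueError(f"unknown combination rule: {rule}")

# Placeholder example: 10 storeys, first three modes
phi = np.random.rand(10, 3)
profile = invariant_displacement_profile(phi, gamma=np.array([1.3, 0.5, 0.2]),
                                         Sd=np.array([0.15, 0.05, 0.02]))
```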

Keywords: displacement-based pushover, enhanced lateral load distribution, higher modes effect, nonlinear response history analysis (NL-RHA)

Procedia PDF Downloads 249
264 Mediterranean Diet, Duration of Admission and Mortality in Elderly, Hospitalized Patients: A Cross-Sectional Study

Authors: Christos Lampropoulos, Maria Konsta, Ifigenia Apostolou, Vicky Dradaki, Tamta Sirbilatze, Irini Dri, Christina Kordali, Vaggelis Lambas, Kostas Argyros, Georgios Mavras

Abstract:

Objectives: The Mediterranean diet has been associated with a lower incidence of cardiovascular disease and cancer. The purpose of our study was to examine the hypothesis that the Mediterranean diet may protect against mortality and reduce admission duration in elderly, hospitalized patients. Methods: The sample population included 150 patients (78 men, 72 women, mean age 80±8.2). The following data were taken into account in the analysis: anthropometric and laboratory data, dietary habits (MedDiet score), patients' nutritional status [Mini Nutritional Assessment (MNA) score], physical activity (International Physical Activity Questionnaire, IPAQ), smoking status, cause and duration of the current admission, and medical history (co-morbidities, previous admissions). Primary endpoints were mortality (from admission until 6 months afterwards) and duration of admission, compared to national guidelines for closed consolidated medical expenses. Logistic regression and linear regression analyses were performed in order to identify independent predictors of mortality and of the admission duration difference, respectively. Results: According to the MNA, nutrition was normal in 54/150 (36%) of patients, 46/150 (30.7%) were at risk of malnutrition, and the remaining 50/150 (33.3%) were malnourished. After performing multivariate logistic regression analysis, we found that the odds of death decreased by 30% per unit increase of the MedDiet score (OR=0.7, 95% CI: 0.6-0.8, p < 0.0001). Patients with a cancer-related admission were 37.7 times more likely to die compared to those admitted with infection (OR=37.7, 95% CI: 4.4-325, p=0.001). According to the multivariate linear regression analysis, admission duration was inversely related to the Mediterranean diet, since it decreased by 0.18 days on average for each unit increase of the MedDiet score (b: -0.18, 95% CI: -0.33 to -0.035, p=0.02). Additionally, the duration of the current admission increased by 0.83 days on average for each previous hospital admission (b: 0.83, 95% CI: 0.5-1.16, p<0.0001). The admission duration of patients with cancer was on average 4.5 days longer than that of patients admitted due to infection (b: 4.5, 95% CI: 0.9-8, p=0.015). Conclusion: The Mediterranean diet adequately protects elderly, hospitalized patients against mortality and reduces the duration of hospitalization.
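
As an illustration of the mortality model described above, the following is a minimal Python sketch of a multivariate logistic regression whose coefficients are exponentiated into odds ratios; the data file and column names are assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical dataset: one row per patient (file and column names assumed)
df = pd.read_csv("patients.csv")
X = sm.add_constant(df[["meddiet_score", "mna_score", "age", "cancer_admission"]])
y = df["died_within_6_months"]

fit = sm.Logit(y, X, missing="drop").fit()
odds_ratios = np.exp(fit.params)          # e.g. OR per one-unit MedDiet increase
print(pd.DataFrame({"OR": odds_ratios, "p": fit.pvalues}))
```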

Keywords: Mediterranean diet, malnutrition, nutritional status, prognostic factors for mortality

Procedia PDF Downloads 283
263 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation

Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke

Abstract:

Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. Difficulty in finding a robust approach to model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for the automatic calibration of a hydrologic model was developed in the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, namely initial loss, reduction factor, time of concentration and time-lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments in the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained using the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results within the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, and therefore the associated uncertainty in the predictions can be obtained. In contrast, MIKE URBAN just provides a point estimate. Based on the results of the analysis, it appears that the developed ABC framework performs well for automatic calibration.
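
A minimal rejection-ABC sketch (written in Python rather than the R implementation described, and not the authors' code) showing the core loop: draw parameters from the prior, run the rainfall-runoff simulator, and keep draws whose simulated output is within a tolerance of the observed data. The simulator, prior, distance and tolerance are placeholder assumptions.

```python
import numpy as np

def rejection_abc(observed, simulate, sample_prior, distance, n_draws=10000, eps=0.05):
    """Keep parameter draws whose simulated output lies within eps of the
    observed data; the accepted draws form an approximate posterior sample."""
    accepted = []
    for _ in range(n_draws):
        theta = sample_prior()            # e.g. initial loss, reduction factor,
        sim = simulate(theta)             # time of concentration, time-lag
        if distance(sim, observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

def nrmse(sim, obs):
    """Normalised RMSE between simulated and observed runoff series."""
    return np.sqrt(np.mean((sim - obs) ** 2)) / (obs.max() - obs.min())
```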

Keywords: automatic calibration framework, approximate bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform

Procedia PDF Downloads 278
262 The Effectiveness of Virtual Reality Training for Improving Interpersonal Communication Skills: An Experimental Study

Authors: Twinkle Sara Joseph

Abstract:

Virtual reality technology has emerged as a revolutionary force that can transform the education sector in many ways. VR environments can break the boundaries of the traditional classroom setting by immersing students in realistic 3D environments where they can interact with virtual characters without fearing being judged. Communication skills are essential for every profession, and studies suggest the importance of implementing basic-level communication courses at both the school and graduate levels. Interpersonal communication is a skill that gains prominence as it is required in every profession. Traditional means of training have limitations for trainees as well as participants. The fear of being judged, audience interaction, and other factors can affect the performance of a participant in a traditional classroom setting. Virtual reality offers a unique opportunity for its users to participate in training without the boundaries that prevent participants from performing in front of an audience. Specialised applications designed for VR headsets offer a range of training and exercises for participants without any time, space, or audience limitations. The present study aims at measuring the effectiveness of VR training in improving interpersonal communication skills among students. The study uses a mixed-method approach, in which a pre- and post-test will be designed to measure effectiveness. A preliminary selection process involving a questionnaire and a screening test will identify suitable candidates based on their current communication proficiency levels. Participants will undergo specialised training through the VR application Virtual Speech, tailored for interpersonal communication and public speaking and designed to operate without the traditional constraints of time, space, or audience. The training's impact will subsequently be measured through situational exercises that engage the participants in interpersonal communication tasks, thereby assessing the improvement in their skills. The significance of this study lies in its potential to provide empirical evidence supporting VR technology's role in enhancing communication skills, thereby offering valuable insights for integrating VR-based methodologies into educational frameworks to prepare students more effectively for their professional futures.

Keywords: virtual reality, VR training, interpersonal communication, communication skills, 3D environments

Procedia PDF Downloads 15
261 Characterization of Kevlar 29 for Multifunction Applications

Authors: Doaa H. Elgohary, Dina M. Hamoda, S. Yahia

Abstract:

Technical textiles refer to textile materials that are engineered and designed to have specific functionalities and performance characteristics beyond their traditional use as apparel or upholstery fabrics. These textiles are usually developed for their unique properties such as strength, durability, flame retardancy, chemical resistance, waterproofing, insulation and other special properties. The development and use of technical textiles are constantly evolving, driven by advances in materials science, manufacturing technologies and the demand for innovative solutions in various industries. Kevlar 29 is a type of aramid fiber developed by DuPont. It is a high-performance material known for its exceptional strength and resistance to impact, abrasion, and heat. Kevlar 29 belongs to the Kevlar family, which includes different types of aramid fibers. Kevlar 29 is primarily used in applications that require strength and durability, such as ballistic protection and body armor for military and law enforcement personnel. It is also used in the aerospace and automotive industries to reinforce composite materials, as well as in various industrial applications. Two different Kevlar samples coated with copper lithium silicate (CLS) were used; ten different mechanical and physical property tests (weight, thickness, tensile strength, elongation, stiffness, air permeability, puncture resistance, thermal conductivity, stiffness, and spray test) were conducted to verify their functional performance efficiency. The influence of the different mechanical properties was statistically analyzed using an independent t-test with a significance level of P = 0.05. A radar plot was calculated and evaluated to determine the best-performing samples. The independent t-test showed that all variables were significantly affected by yarn count except water permeability, which showed no significant effect. All properties were evaluated for samples 1 and 2, and a radar chart was used to determine the overall performance of the samples. The radar chart area was calculated, showing that sample 1 recorded the best performance, followed by sample 2. The surface morphology of all samples and of the coating material was determined using a scanning electron microscope (SEM), and Fourier transform infrared spectroscopy (FTIR) measurements were performed for the two samples.
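
A brief sketch of the radar-chart area calculation used to rank the samples: an illustrative Python implementation under the assumption that each measured property is normalised to a common scale and placed at equal angular spacing (it is not the authors' code, and the example scores are invented).

```python
import numpy as np

def radar_area(scores):
    """Area of the radar-chart polygon: each normalised property score is a
    radius at equal angular spacing; the polygon is summed as triangles."""
    r = np.asarray(scores, dtype=float)
    theta = 2.0 * np.pi / len(r)
    return 0.5 * np.sin(theta) * np.sum(r * np.roll(r, -1))

# Hypothetical normalised scores for the ten measured properties of one sample
sample_1 = [0.9, 0.8, 0.95, 0.7, 0.85, 0.6, 0.9, 0.75, 0.8, 0.7]
print(radar_area(sample_1))   # larger area = better overall performance
```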

Keywords: copper lithium silicate, independent t-test, Kevlar, technical textiles

Procedia PDF Downloads 48
260 A Smartphone-Based Real-Time Activity Recognition and Fall Detection System

Authors: Manutchanok Jongprasithporn, Rawiphorn Srivilai, Paweena Pongsopha

Abstract:

Falls are the most serious accidents leading to increased unintentional injuries and mortality. Falls are not only the cause of suffering and functional impairments to individuals, but also the cause of increasing medical costs and days away from work. The early detection of falls could be an advantage in reducing fall-related injuries and the consequences of falls. Smartphones, with embedded accelerometers, have become common devices in everyday life due to decreasing technology costs. This paper explores a physical activity monitoring and fall detection application on smartphones, which acts as a non-invasive biomedical device to determine physical activities and fall events. The combination of the application and sensors can perform as a biomedical sensor to monitor physical activities and recognize a fall. We chose an Android-based smartphone in this study since the Android operating system is open-source and free of cost. Moreover, Android phone users constitute the majority of Thai smartphone users. We developed Thai 3 Axis (TH3AX) as a physical activity and fall detection application, which includes commands, a manual, and results in the Thai language. The smartphone was attached to the right hip of 10 young, healthy adult subjects (5 males, 5 females; aged < 35 y) to collect accelerometer and gyroscope data while performing physical activities (e.g., walking, running, sitting, and lying down) and falling, in order to determine a threshold for each activity. Dependent variables include accelerometer data (acceleration, peak acceleration, average resultant acceleration, and time between peak accelerations). A repeated measures ANOVA was performed to test whether there were any differences between the means of the dependent variables. Statistical analyses were considered significant at p<0.05. After finding the thresholds, the results were used as training data for a predictive model of activity recognition. In the future, the accuracy of activity recognition will be assessed to evaluate the overall performance of the classifier. Moreover, to help improve quality of life, our system will be implemented with patients and elderly people who need intensive care in hospitals and nursing homes in Thailand.
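
A minimal sketch of the threshold idea described above, in Python (not the TH3AX implementation): it computes the resultant acceleration from the tri-axial accelerometer and flags a fall when a large impact is followed by a short period of near-stillness. The threshold values and window lengths are illustrative assumptions, not the study's trained values.

```python
import numpy as np

G = 9.81  # m/s^2

def resultant_acceleration(ax, ay, az):
    """Signal magnitude vector from the tri-axial accelerometer."""
    return np.sqrt(ax**2 + ay**2 + az**2)

def detect_fall(acc, fs, impact_threshold=2.5 * G, still_threshold=1.2 * G):
    """Flag a fall when an impact peak is followed by ~1 s of near-stillness."""
    impact_idx = np.where(acc > impact_threshold)[0]
    for i in impact_idx:
        window = acc[i + int(0.5 * fs): i + int(1.5 * fs)]
        if window.size and window.max() < still_threshold:
            return True
    return False
```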

Keywords: activity recognition, accelerometer, fall, gyroscope, smartphone

Procedia PDF Downloads 666
259 Coaches Attitudes, Efficacy and Proposed Behaviors towards Athletes with Hidden Disabilities: A Review of Recent Survey Research

Authors: Robbi Beyer, Tiffanye Vargas, Margaret Flores

Abstract:

Within the United States, youths with hidden disabilities (specific learning disabilities, attention deficit hyperactivity disorder, emotional behavioral disorders, mild intellectual disabilities and speech/language disorders) can often be part of the kindergarten through twelfth grade school population. Because individuals with hidden disabilities have no apparent physical disability, learning difficulties may be overlooked, and these youths may be mistakenly labeled as unmotivated or defiant because they do not understand and follow directions, or do not maintain enough attention to remember and perform. These behaviors are considered especially challenging for youth sport coaches to manage, and coaches often find it difficult to successfully select and deliver effective accommodations for these athletes. These deficits can be remediated and compensated for through the use of research-validated strategies and instructional methods. However, while these techniques are commonly included in teacher preparation, they are rarely, if ever, included in coaching preparation. Therefore, the purpose of this presentation is to summarize consecutive research studies that examined coaching education within the United States for youth athletes with hidden disabilities. Each study utilized a questionnaire format to collect data from coaches on attitudes, efficacy and solutions for addressing challenging behaviors. Results indicated that although the majority of coaches' attitudes were positive and they perceived themselves as confident in working with athletes who have hidden disabilities, there were significant differences in the understanding of appropriate teaching strategies and techniques for this population. For example, when asked to describe a videotaped situation of why an athlete was not performing correctly, coaches often found the athlete to be at fault, as opposed to considering the possibility of faulty directions or the need for accommodations in teaching/coaching style. When considering coaches' preparation, 83% of participants declared they were inadequately prepared to coach athletes with hidden disabilities, and 92% strongly supported improved preparation for coaches. The comprehensive examination of coaches' perceptions and efficacy in working with youth athletes with hidden disabilities has provided valuable insight and highlights the need for continued research in this area.

Keywords: health, hidden disabilities, physical activity, youth recreational sports

Procedia PDF Downloads 325
258 Administrative Traits and Capabilities of Mindanao State University Heads of Office as Perceived by Their Subordinates

Authors: Johanida L. Etado

Abstract:

The study determined the administrative traits and capabilities of Mindanao State University (MSU) heads of office as perceived by their subordinates. To obtain the primary data, a self-constructed survey questionnaire was used, validated by a panel of experts including the adviser. Most of the MSU heads of office were aware of their duties and responsibilities as managers. Considering their vast knowledge of and expertise in the technical or task aspects of the job, it is not surprising that respondents perceived them to a high degree as work- or task-oriented. MSU heads of office were knowledgeable and capable in performing field-specific, specialized tasks, enabling them to coordinate work, solve problems, communicate effectively, and understand the big picture in light of the front-line work that must be performed. The significance of coaching or mentoring in this instance may be explained by the smaller number of Master's or Doctorate degree holders among employees, which results in close supervision and mentorship of these employees by the heads of office. Interpersonal or human relations capability is a very effective way of dealing with people, as it gives heads the opportunity to influence their employees. In the case of the MSU heads of office, the best way of dealing with problematic employees is by establishing trust and allowing them to partake in decision making, even in setting organizational goals, as it makes them feel part of the organization. It is therefore recommended, first, that since the success of an organization depends largely on the effectiveness of the head of unit, being development-oriented should mean encouraging both heads of office and employees to know not only the technical know-how of the organization but also its visions, missions and goals, and the employees' aspirations, in order to establish cooperation and a harmonious working environment; hence, orientation and reorientation from time to time would enable them to be more development-oriented. Second, with respect to human relations, an effective interpersonal relationship between the head of unit and employees is of paramount importance. In order to strengthen the relationship between the two, management should establish upward and downward communication, in which the two parties maintain open and transparent communication, both verbal and non-verbal.

Keywords: administrator, administrative traits, leadership traits, work orientation

Procedia PDF Downloads 40
257 Durability Analysis of a Knuckle Arm Using VPG System

Authors: Geun-Yeon Kim, S. P. Praveen Kumar, Kwon-Hee Lee

Abstract:

A steering knuckle arm is the component that connects the steering system and the suspension system. Structural performances such as stiffness, strength, and durability are considered in its design process. A former study suggested the lightweight design of a knuckle arm considering these structural performances and using metamodel-based optimization. Six shape design variables were defined, and the optimum design was calculated by applying the kriging interpolation method. The finite element method was utilized to predict the structural responses. The suggested knuckle was made of the aluminum alloy Al6082, and its weight was reduced by about 60% in comparison with the base steel knuckle while satisfying the design requirements. Then, we investigated its manufacturability by performing forging analysis. The forging was done as a hot process, and the product was made through two-step forging. As the final step of its development process, the durability is investigated by using the flexible dynamic analysis software LS-DYNA and the pre- and post-processor eta/VPG. Generally, a car maker does not share all the information with the part manufacturer. Thus, the part manufacturer is limited in predicting the durability performance at the full-car level. The eta/VPG has libraries of suspensions, tires, and roads, which are commonly used parts. This makes full-car modeling possible. First, the full car is modeled by referencing the following information: Overall Length: 3,595 mm, Overall Width: 1,595 mm, CVW (Curb Vehicle Weight): 910 kg, Front Suspension: MacPherson Strut, Rear Suspension: Torsion Beam Axle, Tire: 235/65R17. Second, the road is selected as cobblestone. The road condition of cobblestone is almost 10 times more severe than that of a usual paved road. Third, dynamic finite element analysis using LS-DYNA is performed to predict the durability performance of the suggested knuckle arm. The life of the suggested knuckle arm is calculated as 350,000 km, which satisfies the design requirement set up by the part manufacturer. In this study, the overall design process of a knuckle arm is suggested, and it can be seen that the developed knuckle arm satisfies the durability design requirement at the full-car level. The VPG analysis is successfully performed even though it does not give an exact prediction, since the full-car model is a rough one. Thus, this approach can be used effectively when detailed full-car information is not given.

Keywords: knuckle arm, structural optimization, Metamodel, forging, durability, VPG (Virtual Proving Ground)

Procedia PDF Downloads 398
256 A Sense of Belonging: Music Learning and School Connectedness

Authors: Johanna Gamboa-Kroesen

Abstract:

School connectedness, or the sense of belonging at school, is a critical factor in adolescent health, academic achievement, and socioemotional well-being. In educational research, the construct of the psychological sense of school membership is often referred to as school engagement, school bonding, or school attachment. While current research recognizes school connectedness as integral to a child's mental health and academic success, many schools have yet to develop adequate interventions to promote a child's overall sense of belonging at school. However, prior research in music education indicates that, among other benefits, music classrooms may provide an environment where students feel they belong. While studies indicate that music learning environments, specifically performing ensemble learning environments, instill a sense of school connectedness and, more broadly, contribute to a student's socio-emotional development, there has been inadequate research on how the actions of music teachers contribute to this phenomenon. The purpose of this study was to examine the relationship between school connectedness and music learning environments among middle school music students enrolled in a school-based music ensemble. In addition, the study aimed to provide a descriptive analysis of the instructional practices that music teachers use to promote an inclusive environment in their classrooms and an overall sense of belonging in their students. Using 191 student surveys of school membership, student reflective writings, 5 teacher interviews, and 10 classroom observations, this study examined the relationship between 7th- and 8th-grade students' reported levels of connectedness within their school-based music ensemble and teacher instructional practice. The study found that students reported high levels of positive school membership within their music classes. Students who participate in school-based orchestra ensembles reported a positive change in emotional state during music instruction. In addition, evidence in this study indicated that music teachers use instructional practices that build connectedness by de-emphasizing competition and strengthening a student's sense of relational value within their music learning experience. The findings offer implications for future music teacher instruction to create environments of inclusion, strengthen student-teacher relationships, and promote strategies that enhance student connection to school.

Keywords: music education, belonging, instructional practice, school connectedness

Procedia PDF Downloads 37
255 Development of Electrochemical Biosensor Based on Dendrimer-Magnetic Nanoparticles for Detection of Alpha-Fetoprotein

Authors: Priyal Chikhaliwala, Sudeshna Chandra

Abstract:

Liver cancer is one of the most common malignant tumors, with poor prognosis. This is because liver cancer does not exhibit any symptoms in the early stage of the disease. An increased serum level of AFP is clinically considered a diagnostic marker for liver malignancy. The present diagnostic modalities include various types of immunoassays, radiological studies, and biopsy. However, these tests suffer from slow response times, require significant sample volumes, achieve limited sensitivity, and are ultimately expensive and burdensome to patients. Considering all these aspects, an electrochemical biosensor based on dendrimer-magnetic nanoparticles (MNPs) was designed. Dendrimers are novel nano-sized, three-dimensional molecules with monodispersed structures. Poly(amidoamine) (PAMAM) dendrimers with eight –NH₂ groups were synthesized by the Michael addition reaction using ethylenediamine as the core molecule. Dendrimers provide the added advantage of not only stabilizing Fe₃O₄ NPs but also performing multiple electron redox events and binding multiple biological ligands to their dendritic end-surface. Fe₃O₄ NPs, owing to their superparamagnetic behavior, can be exploited for the magneto-separation process. Fe₃O₄ NPs were stabilized with the PAMAM dendrimer by an in situ co-precipitation method. The surface coating was examined by FT-IR, XRD, VSM, and TGA analysis. The electrochemical behavior and kinetics were evaluated using cyclic voltammetry (CV), which revealed that the dendrimer-Fe₃O₄ NPs can be regarded as electrochemically active materials. The electrochemical immunosensor was designed by immobilizing anti-AFP onto the dendrimer-MNPs via a glutaraldehyde conjugation reaction. The bioconjugates were then incubated with the AFP antigen. The immunosensor was characterized electrochemically, indicating successful immuno-binding events. The binding events were further studied using magnetic particle imaging (MPI), a novel imaging modality in which Fe₃O₄ NPs are used as tracer molecules with positive contrast. Multicolor MPI was able to clearly localize the AFP antigen and antibody and their binding. The results demonstrate immense potential in terms of biosensing and enabling MPI of AFP in clinical diagnosis.

Keywords: alpha-fetoprotein, dendrimers, electrochemical biosensors, magnetic nanoparticles

Procedia PDF Downloads 118
254 Sound Source Localisation and Augmented Reality for On-Site Inspection of Prefabricated Building Components

Authors: Jacques Cuenca, Claudio Colangeli, Agnieszka Mroz, Karl Janssens, Gunther Riexinger, Antonio D'Antuono, Giuseppe Pandarese, Milena Martarelli, Gian Marco Revel, Carlos Barcena Martin

Abstract:

This study presents an on-site acoustic inspection methodology for the quality and performance evaluation of building components. The work focuses on global and detailed sound source localisation, by successively performing acoustic beamforming and sound intensity measurements. A portable experimental setup is developed, consisting of an omnidirectional broadband acoustic source, a microphone array, and a sound intensity probe. Three main acoustic indicators are of interest, namely the sound pressure distribution on the surface of components such as walls, windows and junctions, the three-dimensional sound intensity field in the vicinity of junctions, and the sound transmission loss of partitions. The measurement data are post-processed and converted into a three-dimensional numerical model of the acoustic indicators with the help of the simultaneously acquired geolocation information. The three-dimensional acoustic indicators are then integrated into an augmented reality platform, superimposing them onto a real-time visualisation of the spatial environment. The methodology thus enables a measurement-supported inspection process of buildings and the correction of errors during construction and refurbishment. Two experimental validation cases are shown. The first consists of a laboratory measurement on a full-scale mock-up of a room featuring a prefabricated panel, which is installed with controlled defects such as missing insulation and joint sealing material. It is demonstrated that the combined acoustic and augmented reality tool is capable of identifying acoustic leakages arising from the building defects and of assisting in correcting them. The second validation case is performed on a prefabricated room at a near-completion stage in the factory. With the help of the measurement and visualisation tools, the homogeneity of the partition installation is evaluated and leakages from junctions and doors are identified. Furthermore, the integration of acoustic indicators together with thermal and geometrical indicators via the augmented reality platform is shown.
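As an illustrative sketch of the global localisation step, the code below performs frequency-domain delay-and-sum beamforming over a grid of candidate positions using a small planar microphone array. The array geometry, analysis frequency, and synthetic single-source measurement are assumptions for demonstration and do not reproduce the actual inspection setup or its post-processing chain.

```python
# Minimal sketch of frequency-domain delay-and-sum beamforming, assuming a
# small planar microphone array and free-field propagation; the actual
# inspection setup, array geometry and processing differ.
import numpy as np

c = 343.0            # speed of sound [m/s]
f = 2000.0           # analysis frequency [Hz]
k = 2 * np.pi * f / c

# Hypothetical 4x4 planar array, 5 cm spacing, in the plane z = 0
mx, my = np.meshgrid(np.arange(4) * 0.05, np.arange(4) * 0.05)
mics = np.column_stack([mx.ravel(), my.ravel(), np.zeros(16)])

# Synthetic measurement: one point source ("leak") 0.5 m in front of the array
src = np.array([0.08, 0.12, 0.5])
r_src = np.linalg.norm(mics - src, axis=1)
p = np.exp(-1j * k * r_src) / r_src          # complex pressures at the mics

# Scan a grid of candidate source positions on a parallel plane at z = 0.5 m
xs = ys = np.linspace(0.0, 0.2, 41)
power = np.zeros((len(ys), len(xs)))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        r = np.linalg.norm(mics - np.array([x, y, 0.5]), axis=1)
        steer = np.exp(1j * k * r)           # conjugate focusing (delay-and-sum)
        power[iy, ix] = np.abs(steer @ p) ** 2

iy, ix = np.unravel_index(np.argmax(power), power.shape)
print("estimated leak position [m]:", xs[ix], ys[iy])
```

In the methodology above, the resulting source map would then be geolocated and superimposed onto the spatial environment through the augmented reality platform.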

Keywords: acoustic inspection, prefabricated building components, augmented reality, sound source localization

Procedia PDF Downloads 352
253 A Cross-Sectional Study on Clinical Self-Efficacy of Final Year School of Nursing Students among Universities of Tigray Region, Northern Ethiopia

Authors: Awole Seid, Yosef Zenebe, Hadgu Gerensea, Kebede Haile Misgina

Abstract:

Background: Clinical competence is one of the ultimate goals of nursing education. Clinical skills are more than successfully performing tasks; they incorporate client assessment, identification of deficits, and the ability to think critically to provide solutions. Assessment of clinical competence, particularly identifying gaps that need improvement and determining the educational needs of nursing students, has great importance in nursing education. Thus, this study aims to determine the clinical self-efficacy of final-year school of nursing students in three universities of the Tigray Region. Methods: A cross-sectional study was conducted on 224 final-year school of nursing students from the departments of nursing, psychiatric nursing, and midwifery at three universities of the Tigray region. An anonymous self-administered questionnaire was used to collect data in June 2017. The data were analyzed using SPSS version 20. The results are described using tables and charts as required. Logistic regression was employed to test associations. Result: The mean age of the students was 22.94 ± 1.44 years. Overall, 21% of students were graduating from a department in which they were not interested. The study demonstrated that 28.6% had poor and 71.4% had good perceived clinical self-efficacy. Besides this, 43.8% of psychiatric nursing and 32.6% of comprehensive nursing students had poor clinical self-efficacy. Among the four domains, 39.3% and 37.9% had poor clinical self-efficacy with regard to ‘professional development’ and ‘management of care’, respectively. Place of the institution [AOR=3.480 (1.333 - 9.088), p=0.011], interest during department selection [AOR=2.202 (1.045 - 4.642), p=0.038], and the theory-practice gap [AOR=0.224 (0.110 - 0.457), p<0.001] were significantly associated with perceived clinical self-efficacy. Conclusion: The proportion of students with poor clinical self-efficacy was high. Place of institution, the theory-practice gap, and students’ interest in the discipline were the significant predictors of clinical self-efficacy. Students from the youngest universities had good clinical self-efficacy. During department selection, students’ interests should be respected. The universities and other stakeholders should improve the capacity of the surrounding affiliated teaching hospitals to set and improve care standards in order to narrow the theory-practice gap. School faculties should provide training to hospital staff and monitor the standards of clinical procedures.
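For readers unfamiliar with how adjusted odds ratios such as those above are obtained, the following Python sketch fits a binary logistic regression on synthetic data and reports AORs with 95% confidence intervals. The variable names, coding, and data are hypothetical; the study itself used SPSS version 20, not this code.

```python
# Illustrative sketch, not the study data: fitting a binary logistic regression
# and reporting adjusted odds ratios (AOR) with 95% confidence intervals, as
# done for the predictors of perceived clinical self-efficacy.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 224
df = pd.DataFrame({
    "interest_respected": rng.integers(0, 2, n),   # hypothetical binary coding
    "theory_practice_gap": rng.integers(0, 2, n),
    "urban_institution": rng.integers(0, 2, n),
})
# Synthetic outcome generated from two of the predictors
logit_p = -0.4 + 0.8 * df["interest_respected"] - 1.2 * df["theory_practice_gap"]
df["good_self_efficacy"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["interest_respected", "theory_practice_gap", "urban_institution"]])
model = sm.Logit(df["good_self_efficacy"], X).fit(disp=False)

aor = np.exp(model.params)                 # adjusted odds ratios
ci = np.exp(model.conf_int())              # 95% CI on the odds-ratio scale
print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```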

Keywords: clinical self-efficacy, nursing students, Tigray, northern Ethiopia

Procedia PDF Downloads 142
252 Russian Pipeline Natural Gas Export Strategy under Uncertainty

Authors: Koryukaeva Ksenia, Jinfeng Sun

Abstract:

Europe has been a traditional importer of Russian natural gas for more than 50 years. In 2021, the Russian state-owned company Gazprom supplied about a third of all gas consumed in Europe. The mutual Russia-Europe dependence in terms of natural gas supplies has long been causing many concerns about the energy security of the two sides. These days the issue has become more urgent than ever, considering the recent Russian invasion of Ukraine followed by increased large-scale geopolitical conflicts, making the future of Russian natural gas supplies, as well as of global gas markets, highly uncertain. Hence, the main purpose of this study is to gain insight into the possible futures of Russian pipeline natural gas exports by a scenario planning method based on Monte-Carlo simulation within the LUSS model framework, and to propose Russian pipeline natural gas export strategies based on the obtained scenario planning results. The scenario analysis revealed that recent geopolitical disputes disturbed the traditional, longstanding model of Russian pipeline gas exports, and, as a result, the prospects and pathways for Russian pipeline gas on the world markets will differ significantly from those before 2022. Specifically, our main findings show that (i) the events of 2022 generated many uncertainties for the long-term future of Russian pipeline gas export perspectives on both the western and eastern supply directions, including geopolitical, regulatory, economic, infrastructure and other uncertainties; (ii) according to the scenario modelling results, Russian pipeline exports will face many challenges in the future, in both the western and eastern directions; a decrease in pipeline gas exports will inevitably affect the country’s natural gas production and significantly reduce fossil fuel export revenues, jeopardizing the energy security of the country; (iii) according to the proposed strategies, in order to ensure long-term stable export supplies in the changing environment, Russia may need to adjust its traditional export strategy by diversifying export flows and products, entering new markets, adapting its contracting mechanism, increasing competitiveness and gaining a reputation as a reliable gas supplier.
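A highly simplified sketch of the Monte-Carlo scenario idea is given below: uncertain western market access and eastern demand growth are sampled to produce a distribution of total pipeline export volumes. The parameter ranges, starting volumes, and horizon are illustrative assumptions and do not reproduce the LUSS model framework used in the study.

```python
# A highly simplified Monte-Carlo scenario sketch: sampling uncertain demand
# and access parameters to produce a distribution of pipeline export volumes.
# All numbers are illustrative assumptions, not the study's calibration.
import numpy as np

rng = np.random.default_rng(42)
n_scenarios, horizon = 10_000, 10          # 10-year horizon

base_exports = 150.0                       # bcm/year, illustrative pre-2022 level
west_access = rng.uniform(0.0, 0.6, n_scenarios)              # share of western demand retained
east_growth = rng.normal(0.05, 0.03, (n_scenarios, horizon))  # annual eastern growth rates

east_volume = 15.0 * np.cumprod(1.0 + east_growth, axis=1)    # bcm/year path, eastern direction
west_volume = base_exports * west_access[:, None] * np.ones(horizon)

total_final_year = (east_volume + west_volume)[:, -1]
print("median exports in year 10 [bcm]:", np.median(total_final_year))
print("10th-90th percentile range:", np.percentile(total_final_year, [10, 90]))
```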

Keywords: Russian natural gas, pipeline natural gas, uncertainty, scenario simulation, export strategy

Procedia PDF Downloads 32
251 A Biophysical Model of CRISPR/Cas9 On- and Off-Target Binding for Rational Design of Guide RNAs

Authors: Iman Farasat, Howard M. Salis

Abstract:

The CRISPR/Cas9 system has revolutionized genome engineering by enabling site-directed and high-throughput genome editing, genome insertion, and gene knockdowns in several species, including bacteria, yeast, flies, worms, and human cell lines. This technology has the potential to enable human gene therapy to treat genetic diseases and cancer at the molecular level; however, the current CRISPR/Cas9 system suffers from seemingly sporadic off-target genome mutagenesis that prevents its use in gene therapy. A comprehensive mechanistic model that explains how the CRISPR/Cas9 system functions would enable the rational design of the guide RNAs responsible for target site selection while minimizing unexpected genome mutagenesis. Here, we present the first quantitative model of the CRISPR/Cas9 genome mutagenesis system that predicts how guide-RNA sequences (crRNAs) control target site selection and cleavage activity. We used statistical thermodynamics and the law of mass action to develop a five-step biophysical model of Cas9 cleavage and examined it in vivo and in vitro. To predict a crRNA's binding specificities and cleavage rates, we then compiled a nearest-neighbor (NN) energy model that accounts for all possible base pairings and mismatches between the crRNA and the possible genomic DNA sites. These calculations correctly predicted crRNA specificity across 5,518 sites. Our analysis reveals that Cas9 activity and specificity are anti-correlated, and the trade-off between them is the determining factor in performing RNA-mediated cleavage with minimal off-targets. To find an optimal solution, we first created a scheme of safe-design criteria for Cas9 target selection by systematic analysis of available high-throughput measurements. We then used our biophysical model to determine the optimal Cas9 expression levels and timing that maximize on-target cleavage and minimize off-target activity. We successfully applied this approach in bacterial and mammalian cell lines to reduce off-target activity to near the background mutagenesis level while maintaining a high on-target cleavage rate.
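To convey the flavour of a nearest-neighbour mismatch energy model, the toy sketch below assigns a free-energy penalty to crRNA/DNA mismatches (weighted towards PAM-proximal positions) and converts site energies into relative binding occupancies via Boltzmann weights. The penalty values, PAM weighting, and sequences are invented for illustration and are not the fitted parameters of the model described here.

```python
# Toy sketch of a mismatch-energy scoring scheme with Boltzmann weighting.
# Energy values are made up for illustration, not fitted model parameters.
import numpy as np

def mismatch_energy(crRNA, site, penalty_per_mismatch=1.5, pam_weight=3.0):
    """Crude free-energy penalty (kcal/mol): PAM-proximal mismatches weighted more."""
    n = len(crRNA)
    dg = 0.0
    for i, (a, b) in enumerate(zip(crRNA, site)):
        if a != b:
            # positions near the PAM (taken as the 3' end here) get a larger weight
            dg += penalty_per_mismatch * (1.0 + pam_weight * i / (n - 1))
    return dg

def occupancies(crRNA, sites, RT=0.616):   # RT in kcal/mol at 37 C
    dgs = np.array([mismatch_energy(crRNA, s) for s in sites])
    w = np.exp(-dgs / RT)                  # Boltzmann weights
    return w / w.sum()                     # relative binding probabilities

crRNA = "GACGCATAAAGATGAGACGC"
sites = ["GACGCATAAAGATGAGACGC",           # perfect on-target
         "GACGCATAAAGATGAGACGA",           # PAM-proximal mismatch
         "AACGCATAAAGATGAGACGC"]           # PAM-distal mismatch
print(occupancies(crRNA, sites))
```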

Keywords: biophysical model, CRISPR, Cas9, genome editing

Procedia PDF Downloads 377
250 Applications of Artificial Intelligence (AI) in Cardiac Imaging

Authors: Angelis P. Barlampas

Abstract:

The purpose of this study is to inform the reader about the various applications of artificial intelligence (AI) in cardiac imaging. AI is growing fast, and its role is crucial in medical specialties that use large amounts of digital data, which are very difficult or even impossible for human beings, and especially doctors, to manage. Artificial intelligence (AI) refers to the ability of computers to mimic human cognitive function, performing tasks such as learning, problem-solving, and autonomous decision-making based on digital data. Whereas AI describes the concept of using computers to mimic human cognitive tasks, machine learning (ML) describes the category of algorithms that enable most current applications described as AI. Some of the current applications of AI in cardiac imaging are the following. Ultrasound: automated segmentation of cardiac chambers across five common views and, consequently, quantification of chamber volumes/mass, ascertainment of ejection fraction, and determination of longitudinal strain through speckle tracking; determination of the severity of mitral regurgitation (accuracy > 99% for every degree of severity); identification of myocardial infarction; distinction between athlete's heart and hypertrophic cardiomyopathy, as well as between restrictive cardiomyopathy and constrictive pericarditis; prediction of all-cause mortality. CT: reduction of radiation doses; calculation of the calcium score; diagnosis of coronary artery disease (CAD); prediction of all-cause 5-year mortality; prediction of major cardiovascular events in patients with suspected CAD. MRI: segmentation of cardiac structures and infarct tissue; calculation of cardiac mass and function parameters; distinction between patients with myocardial infarction and control subjects, which could potentially reduce costs since it would preclude the need for gadolinium-enhanced CMR; prediction of 4-year survival in patients with pulmonary hypertension. Nuclear imaging: classification of normal and abnormal myocardium in CAD; detection of locations with abnormal myocardium; prediction of cardiac death; ML was comparable to or better than two experienced readers in predicting the need for revascularization. AI emerges as a helpful tool in cardiac imaging and for doctors who cannot manage the ever-increasing demand for examinations such as ultrasound, computed tomography, MRI, or nuclear imaging studies.

Keywords: artificial intelligence, cardiac imaging, ultrasound, MRI, CT, nuclear medicine

Procedia PDF Downloads 47
249 A Heteroskedasticity Robust Test for Contemporaneous Correlation in Dynamic Panel Data Models

Authors: Andreea Halunga, Chris D. Orme, Takashi Yamagata

Abstract:

This paper proposes a heteroskedasticity-robust Breusch-Pagan test of the null hypothesis of zero cross-section (or contemporaneous) correlation in linear panel-data models, without necessarily assuming independence of the cross-sections. The procedure allows for fixed, strictly exogenous and/or lagged dependent regressor variables, as well as quite general forms of both non-normality and heteroskedasticity in the error distribution. The asymptotic validity of the test procedure is predicated on the number of time series observations, T, being large relative to the number of cross-section units, N, in that: (i) either N is fixed as T→∞; or (ii) N²/T→0, as both T and N diverge, jointly, to infinity. Given this, it is not expected that asymptotic theory would provide an adequate guide to finite sample performance when T/N is "small". Because of this, we also propose, and establish the asymptotic validity of, a number of wild bootstrap schemes designed to provide improved inference when T/N is small. Across a variety of experimental designs, a Monte Carlo study suggests that the predictions from asymptotic theory do, in fact, provide a good guide to the finite sample behaviour of the test when T is large relative to N. However, when T and N are of similar orders of magnitude, discrepancies between the nominal and empirical significance levels occur, as predicted by the first-order asymptotic analysis. On the other hand, for all the experimental designs, the proposed wild bootstrap approximations do improve agreement between nominal and empirical significance levels when T/N is small, with a recursive-design wild bootstrap scheme performing best, in general, and providing quite close agreement between the nominal and empirical significance levels of the test even when T and N are of similar size. Moreover, in comparison with the wild bootstrap "version" of the original Breusch-Pagan test, our experiments indicate that the corresponding version of the heteroskedasticity-robust Breusch-Pagan test appears reliable. As an illustration, the proposed tests are applied to a dynamic growth model for a panel of 20 OECD countries.
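The sketch below conveys the wild bootstrap idea in simplified form: a Breusch-Pagan-type statistic built from pairwise residual correlations is recomputed under Rademacher sign flips that impose the null of zero cross-section correlation while preserving the heteroskedasticity pattern. The statistic, the synthetic residuals, and the bootstrap design are illustrative assumptions, not the authors' exact test or their recursive-design scheme.

```python
# Simplified sketch (not the authors' exact statistic or bootstrap design):
# a Breusch-Pagan-type cross-section correlation statistic from panel
# residuals, with a Rademacher wild bootstrap approximating its null
# distribution. Residuals are synthetic placeholders for a regression step.
import numpy as np

rng = np.random.default_rng(0)
N, T = 10, 50
e = rng.standard_normal((N, T)) * rng.uniform(0.5, 2.0, (N, 1))  # heteroskedastic residuals

def bp_statistic(res):
    """T times the sum of squared pairwise sample correlations of the residuals."""
    R = np.corrcoef(res)
    iu = np.triu_indices(res.shape[0], k=1)
    return res.shape[1] * np.sum(R[iu] ** 2)

stat = bp_statistic(e)

# Wild bootstrap: independent sign flips kill cross-section correlation
# (imposing the null) while leaving the squared residuals unchanged.
B = 999
boot = np.empty(B)
for b in range(B):
    signs = rng.choice([-1.0, 1.0], size=(N, T))
    boot[b] = bp_statistic(e * signs)

p_value = (1 + np.sum(boot >= stat)) / (B + 1)
print("statistic:", round(stat, 2), "bootstrap p-value:", round(p_value, 3))
```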

Keywords: cross-section correlation, time-series heteroskedasticity, dynamic panel data, heteroskedasticity robust Breusch-Pagan test

Procedia PDF Downloads 408
248 Factors of Non-Conformity Behavior and the Emergence of a Ponzi Game in the Riba-Free (Interest-Free) Banking System of Iran

Authors: Amir Hossein Ghaffari Nejad, Forouhar Ferdowsi, Reza Mashhadi

Abstract:

In the interest-free banking system of Iran, the savings of society take the form of bank deposits, and banks, using Islamic contracts, allocate the resources to applicants seeking facilities and credit. In the meantime, the central bank, with the aim of conducting monetary policy, determines the maximum interest rate on bank deposits according to macroeconomic requirements. But in recent years, the country's economic constraints, with stagflation and the consequences of the institutional weaknesses of Iran's financial market, have resulted in massive disturbances in the balance sheet of the banking system, leading to a period of maturity mismatch between the banks' assets and liabilities and the implementation of a Ponzi game. This issue caused the determination of interest rates in long-term bank deposit contracts to be associated with non-observance of the maximum rate set by the central bank. The result of this condition was the allocation of newly mobilized resources to meet past commitments towards old depositors; as a consequence, a significant part of the supply of resources leaked out of the facility-granting cycle and a credit crunch emerged. The purpose of this study is to identify the most important factors affecting the occurrence of non-conformity banking behavior using data from 19 public and private banks of Iran. For this purpose, the causes of this non-conformity behavior of banks have been investigated using the panel vector autoregression (PVAR) method for the period 2007-2015. Granger causality test results suggest that the returns of markets parallel to bank deposits, non-performing loans, and the high ratio of facilities to banks' deposits are all causes of the formation of non-conformity behavior. Also, according to the results of the impulse response functions and variance decomposition, non-performing loans (NPL) and the ratio of facilities to deposits have the highest long-term effect and also contribute strongly to explaining the changes in banks' non-conformity behavior in determining the interest rate on deposits.
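As a simplified illustration of the causality step, the sketch below runs a Granger causality test on synthetic series with an ordinary VAR from statsmodels. The study itself estimates a panel VAR (PVAR) across 19 banks; the variable names and the data-generating process here are assumptions for demonstration only.

```python
# Simplified sketch: Granger causality on synthetic series with an ordinary
# VAR, standing in for the panel VAR (PVAR) analysis described above.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)
T = 120
npl_ratio = np.cumsum(rng.normal(0, 0.2, T)) + 10                # non-performing loans ratio
parallel_return = 0.4 * np.roll(npl_ratio, 1) + rng.normal(0, 1, T)
deposit_rate = 0.3 * np.roll(npl_ratio, 2) + rng.normal(0, 0.5, T)

data = pd.DataFrame({
    "deposit_rate": deposit_rate,
    "npl_ratio": npl_ratio,
    "parallel_return": parallel_return,
}).iloc[5:]                                                      # drop roll-induced artefacts

model = VAR(data).fit(maxlags=2)
# Does the pair (npl_ratio, parallel_return) Granger-cause the deposit rate?
test = model.test_causality("deposit_rate", ["npl_ratio", "parallel_return"], kind="f")
print(test.summary())
```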

Keywords: non-conformity behavior, Ponzi game, panel vector autoregression, non-performing loans

Procedia PDF Downloads 184