Search results for: time history response analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 44020

40090 Initial Dip: An Early Indicator of Neural Activity in Functional Near Infrared Spectroscopy Waveform

Authors: Mannan Malik Muhammad Naeem, Jeong Myung Yung

Abstract:

Functional near-infrared spectroscopy (fNIRS) holds a favorable position among non-invasive brain imaging techniques. This neuro-imaging modality is based on the concentration changes of oxygenated and de-oxygenated hemoglobin during a particular cognitive activity. Measurements at two near-infrared wavelengths, combined with the modified Beer-Lambert law, provide an indirect index of neuronal activity inside the brain. The temporal resolution of fNIRS is good enough for real-time brain-computer interface applications, and its portability, low cost, and acceptable temporal resolution place it favorably among neuro-imaging modalities. In this study, an optimization model for the impulse response function was used to estimate/predict the initial dip from fNIRS data. In addition, the activity strength parameter related to a motor-based cognitive task was analyzed. We found an initial dip that lasts around 200-300 milliseconds and localizes neural activity better.
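The two-wavelength inversion of the modified Beer-Lambert law mentioned in the abstract can be sketched as follows. The extinction coefficients, source-detector distance, and differential pathlength factor below are illustrative placeholders, not calibrated values, and the helper name is hypothetical:

```python
# Sketch of the modified Beer-Lambert law (MBLL) inversion used in fNIRS:
# from optical-density changes at two wavelengths, recover concentration
# changes of oxy- (HbO) and deoxy-hemoglobin (HbR).

def mbll_inversion(dOD, eps, d, dpf):
    """dOD: optical-density changes at (lambda1, lambda2);
    eps: 2x2 extinction coefficients [[eHbO(l1), eHbR(l1)],
                                      [eHbO(l2), eHbR(l2)]];
    d: source-detector distance; dpf: differential pathlength factor."""
    L = d * dpf  # effective optical pathlength
    a, b = eps[0]
    c, e = eps[1]
    det = a * e - b * c
    # Solve the 2x2 linear system  eps @ [dHbO, dHbR] * L = dOD  by Cramer's rule
    dHbO = (e * dOD[0] - b * dOD[1]) / (det * L)
    dHbR = (a * dOD[1] - c * dOD[0]) / (det * L)
    return dHbO, dHbR

# Illustrative numbers only: at the shorter wavelength HbR absorbs more,
# at the longer wavelength HbO absorbs more.
eps = [[1.5, 3.8],   # ~760 nm
       [2.5, 1.8]]   # ~850 nm
dHbO, dHbR = mbll_inversion((0.01, 0.02), eps, d=3.0, dpf=6.0)
```

An initial dip would then show up as a brief rise in dHbR (and fall in dHbO) before the main hemodynamic response.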

Keywords: fNIRS, brain-computer interface, optimization algorithm, adaptive signal processing

Procedia PDF Downloads 228
40089 Deep Learning in Chest Computed Tomography to Differentiate COVID-19 from Influenza

Authors: Hongmei Wang, Ziyun Xiang, Ying Liu, Li Yu, Dongsheng Yue

Abstract:

Intro: COVID-19 (Coronavirus Disease 2019) has profoundly changed the global economic, political, and financial ecology. The mutation of the coronavirus in the UK in December 2020 brought new panic to the world. Deep learning was applied to chest computed tomography (CT) of COVID-19 and influenza to describe their characteristics. The predominant features of COVID-19 pneumonia were ground-glass opacification, followed by consolidation. Lesion density: most lesions appear as ground-glass shadows, and some coexist with solid lesions. Lesion distribution: lesions concentrate on the dorsal side of the lung periphery, mainly in the lower lobes, and often lie close to the pleura. Other features include grid-like shadows within ground-glass lesions, thickening of diseased vessels, air bronchograms, and halo signs. Severe disease involves both lungs entirely, showing white-lung signs with visible air bronchograms, and a small amount of pleural effusion may be present in both chest cavities. At the same time, this year's flu season could be near its peak after surging throughout the United States for months. Chest CT of influenza infection is characterized by focal ground-glass shadows in the lungs, with or without patchy consolidation, and bronchiolar air bronchograms visible within the consolidation. There are patchy ground-glass shadows, consolidation, air bronchograms, mosaic lung perfusion, etc. The lesions are mostly fused and prominent near the hilum of both lungs; grid-like shadows and small patchy ground-glass shadows are visible. Deep neural networks have great potential in image analysis and diagnosis that traditional machine learning algorithms lack. Method: For COVID-19 and influenza, the two major infectious diseases currently circulating in the world, chest CT scans of patients were classified and diagnosed using deep learning algorithms.
The residual network (ResNet) was proposed to solve the degradation problem that arises when a deep neural network (DNN) has too many hidden layers. ResNet is a milestone in the history of convolutional neural networks (CNNs) for images, as it solved the problem of training very deep CNN models; many visual tasks achieve excellent results by fine-tuning it. Here, a pre-trained ResNet is introduced as a feature extractor, eliminating the need to design complex models and perform time-consuming training from scratch. Fastai, built on PyTorch, packages best practices for deep learning and was used to find an effective way to handle the diagnosis problem. Based on Fastai's one-cycle training policy, the classification of lung CT scans for the two infectious diseases was realized with a high recognition rate. Results: A deep learning model was developed to efficiently identify the differences between COVID-19 and influenza on chest CT.
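The one-cycle policy mentioned above can be sketched as a learning-rate schedule: the rate ramps up from a small starting value to a maximum, then anneals down to a very small final value. The function below is an illustrative reconstruction modeled on common implementations; the parameter names and the cosine shape are assumptions, not Fastai's exact API.

```python
import math

def one_cycle_lr(step, total_steps, lr_max=1e-2, pct_warmup=0.25,
                 div=25.0, div_final=1e4):
    """Learning rate at a given step under a one-cycle-style schedule."""
    lr_start = lr_max / div        # warmup begins well below lr_max
    lr_end = lr_max / div_final    # annealing ends near zero
    warmup_steps = int(total_steps * pct_warmup)
    if step < warmup_steps:
        # cosine ramp from lr_start up to lr_max
        t = step / max(warmup_steps, 1)
        return lr_start + (lr_max - lr_start) * (1 - math.cos(math.pi * t)) / 2
    # cosine anneal from lr_max down to lr_end
    t = (step - warmup_steps) / max(total_steps - warmup_steps, 1)
    return lr_max + (lr_end - lr_max) * (1 - math.cos(math.pi * t)) / 2

schedule = [one_cycle_lr(s, 100) for s in range(101)]
```

The large mid-training learning rate acts as a regularizer, which is one reason the policy works well when fine-tuning a pre-trained ResNet.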

Keywords: COVID-19, Fastai, influenza, transfer network

Procedia PDF Downloads 147
40088 The Role of Cholesterol Oxidase of Mycobacterium tuberculosis in the Down-Regulation of TLR2-Signaling Pathway in Human Macrophages during Infection Process

Authors: Michal Kielbik, Izabela Szulc-Kielbik, Anna Brzostek, Jaroslaw Dziadek, Magdalena Klink

Abstract:

The goal of many research groups worldwide is to find new components that are important for the survival of mycobacteria in host cells. Mycobacterium tuberculosis (Mtb) possesses a number of cholesterol-degrading enzymes that are considered an important factor for its survival and persistence in host macrophages. One of them, cholesterol oxidase (ChoD), although not essential for cholesterol degradation, is discussed as a virulence compound; however, its involvement in the macrophage response to Mtb is still not sufficiently determined. The recognition of tubercle bacilli antigens by pathogen recognition receptors is crucial for the initiation of the host innate immune response. An important receptor implicated in the recognition and/or uptake of Mtb is Toll-like receptor type 2 (TLR2). Engagement of TLR2 results in the activation and phosphorylation of intracellular signaling proteins, including IRAK-1, IRAK-4, and TRAF-6, which in turn leads to the activation of target kinases and transcription factors responsible for the bactericidal and pro-inflammatory response of macrophages. The aim of these studies was a detailed clarification of the role of Mtb cholesterol oxidase as a virulence factor affecting the TLR2 signaling pathway in human macrophages. Differentiated THP-1 cells were used as human macrophages. The virulent wild-type Mtb strain (H37Rv), its mutant lacking a functional copy of the gene encoding cholesterol oxidase (ΔchoD), and the complemented strain (ΔchoD-choD) were used. We tested the impact of the Mtb strains on the expression of TLR2-dependent signaling proteins (mRNA level, cytosolic level, and phosphorylation status). The cytokine and bactericidal responses of THP-1-derived macrophages infected with the Mtb strains, in relation to their dependence on the TLR2 signaling pathway, were also determined.
We found that during the 24-hour infection process, the wild-type and complemented Mtb strains significantly reduced the cytosolic level and phosphorylation status of the IRAK-4 and TRAF-6 proteins in macrophages, which was not observed in the case of the ΔchoD mutant. The decrease in TLR2-dependent signaling proteins induced by the wild-type Mtb did not depend on proteasome activity. Blocking TLR2 expression before infection effectively prevented the wild-type-induced reduction of the cytosolic level and phosphorylation of IRAK-4. None of the strains affected the surface expression of TLR2. The mRNA levels of the IRAK-4 and TRAF-6 genes were significantly increased in macrophages 24 hours post-infection with either of the tested strains; however, the impact of the wild-type Mtb strain on both genes was significantly stronger than that of its ΔchoD mutant. We also found that, compared to the mutant lacking cholesterol oxidase, the wild-type strain stimulated macrophages to release a high amount of immunosuppressive IL-10, accompanied by low amounts of pro-inflammatory IL-8 and bactericidal nitric oxide. This influence of wild-type Mtb on the macrophage response strongly depended on fully active IRAK-1 and IRAK-4 signaling proteins. In conclusion, Mtb uses cholesterol oxidase to cause over-activation of TLR2 signaling proteins, leading to a reduction of their cytosolic level and activity and thereby modulating the macrophage response to allow intracellular survival. Supported by grant 2014/15/B/NZ6/01565, National Science Center, Poland.

Keywords: Mycobacterium tuberculosis, cholesterol oxidase, macrophages, TLR2-dependent signaling pathway

Procedia PDF Downloads 423
40087 The Effects of Chamomile on Serum Levels of Inflammatory Indexes to a Bout of Eccentric Exercise in Young Women

Authors: K. Azadeh, M. Ghasemi, S. Fazelifar

Abstract:

Aim: Changes in stress hormones can modify the immune response. Cortisol, the most important corticosteroid in the body, is an anti-inflammatory and immunosuppressive hormone. Normal cortisol levels in humans fluctuate during the day; in other words, cortisol is released periodically and is regulated through ACTH release following a daily circadian rhythm. The aim of this study was therefore to determine the effects of chamomile on serum levels of inflammatory indexes after a bout of eccentric exercise in young women. Methodology: 32 women were randomly divided into 4 groups: high-dose chamomile, low-dose chamomile, ibuprofen, and placebo. The eccentric exercise consisted of 5 sets with a 1-minute rest between sets. Subjects warmed up for 10 minutes and then performed the eccentric exercise. Each participant completed 15 repetitions with an optional 20 kg weight or continued until she could no longer perform the movement; at that point, 5 kg was immediately removed from the weight and the protocol continued until exhaustion or until 15 repetitions were completed. Subjects in the target groups received specified amounts of ibuprofen or chamomile capsules. Blood samples were obtained at 6 stages (before starting the pills, before the exercise protocol, and 4, 24, 48, and 72 hours after eccentric exercise). Cortisol and adrenocorticotropic hormone (ACTH) levels were measured by ELISA. The K-S test was used to check the normality of the data, and repeated-measures analysis of variance was used to analyze the data; significance was accepted at p < 0.05. Results: Individual characteristics, including height, weight, age, and body mass index, were not significantly different among the four groups. The analysis showed that basal cortisol and ACTH levels significantly decreased after supplement consumption but then gradually and significantly increased at all post-exercise stages.
In the high-dose chamomile group, the post-exercise increase tended to be somewhat smaller than in the other groups, but not to a significant level. The inter-group analysis indicated that time had a significant effect across the stages in all groups. Conclusion: One session of eccentric exercise increased cortisol and ACTH. The results indicate the effectiveness of high-dose chamomile in preventing and reducing the increase in stress hormone levels. Since the use of medicinal plants and of ibuprofen as a pain and inflammation medication has spread among athletes and non-athletes, the results of this research can provide information about the advantages and disadvantages of using medicinal plants and ibuprofen.

Keywords: chamomile, inflammatory indexes, eccentric exercise, young girls

Procedia PDF Downloads 421
40086 Evaluation of the Gas Exchange Characteristics of Selected Plant Species of Universiti Tun Hussein Onn Malaysia, UTHM

Authors: Yunusa Audu, Alona Cuevas Linatoc, Aisha Idris

Abstract:

The maximum carboxylation rate of Rubisco (Vcmax), the maximum electron transport rate (Jmax), the light compensation point (LCP), the light saturation point (LSP), the maximum photosynthesis rate (Amax), and the apparent quantum yield (Aqy) are gas exchange characteristics derived from the carbon dioxide (CO2) and light response curves. These characteristics can be affected by the levels of CO2 and light received by the plant, and they determine the photosynthetic capacity of the plant. The objective of the study was to evaluate the gas exchange characteristics of selected plant species of UTHM. Photosynthetic carbon dioxide (A/Ci) and light (A/Q) response curves were measured using a portable photosynthesis system (LI-COR). The results show that both the A/Ci and A/Q curves increase as CO2 and light increase, up to a point where the curves become saturated. Spathodea campanulata had the highest Vcmax (52.14±0.005 µmol CO2 m-2 s-1), Jmax (104.461±0.011 µmol CO2 m-2 s-1), and Aqy (0.072±0.001 mol CO2 mol-1 photons). The highest LCP was observed in Rhapis excelsa (69.60±0.067 µmol photons m-2 s-1), while the highest LSP was recorded for Costus spicatus (1576.69±0.173 µmol photons m-2 s-1). It was concluded that the plants need high light intensity and CO2 to reach their maximum assimilation rate.
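The saturating A/Q curve described above can be sketched with the non-rectangular hyperbola commonly used in gas exchange analysis, from which Amax, Aqy, LCP, and LSP are read off. The parameter values below are illustrative assumptions, not the values measured in this study.

```python
import math

def net_assimilation(Q, phi=0.06, Amax=20.0, theta=0.7, Rd=1.5):
    """Net CO2 assimilation (umol m-2 s-1) at PPFD Q (umol photons m-2 s-1),
    using the non-rectangular hyperbola light-response model:
    phi = apparent quantum yield, Amax = light-saturated gross rate,
    theta = curvature, Rd = dark respiration."""
    s = phi * Q + Amax
    gross = (s - math.sqrt(s * s - 4.0 * theta * phi * Q * Amax)) / (2.0 * theta)
    return gross - Rd  # subtract dark respiration

# Light compensation point: lowest Q where net assimilation reaches zero
lcp = next(q for q in range(0, 2001) if net_assimilation(q) >= 0.0)
```

The curve rises roughly linearly (slope ~phi) at low light, then flattens toward Amax - Rd, which is the saturation behavior the abstract reports.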

Keywords: gas exchange, CO2, plants

Procedia PDF Downloads 23
40085 Realistic Testing Procedure of Power Swing Blocking Function in Distance Relay

Authors: Farzad Razavi, Behrooz Taheri, Mohammad Parpaei, Mehdi Mohammadi Ghalesefidi, Siamak Zarei

Abstract:

As one of the major problems in protecting large power systems, power swing and its effect on distance protection have caused a lot of damage to energy transfer systems in many parts of the world. Power swing has therefore gained the attention of many researchers, leading to the invention of different methods for its detection. The power swing detection algorithm is highly important in a distance relay, but protection relays must also meet general requirements such as correct fault detection, fast response, and minimal disturbance to the power system. To ensure these requirements are met, protection relays need different tests during the development, setup, maintenance, configuration, and troubleshooting steps. This paper covers the power swing scheme of a modern numerical protection relay, the 7SA522, to address the effect of different fault types on the power swing blocking function. In this study, it was shown that different fault types occurring during a power swing lead to different unblocking times for the distance relay.
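A minimal sketch of the classic two-blinder timing logic that such power-swing blocking tests exercise: the impedance trajectory is timed between an outer and an inner characteristic, and a slow crossing (a swing) blocks the distance elements while a near-instant crossing (a fault) does not. The thresholds and sample trajectories are illustrative assumptions, not 7SA522 settings.

```python
def crossing_time(trajectory, outer_r, inner_r, dt):
    """trajectory: sampled |Z| magnitudes (ohms); returns the seconds spent
    between the outer and inner impedance boundaries before entering the
    inner zone, or None if the inner zone is never reached."""
    t_outer = None
    for i, z in enumerate(trajectory):
        if t_outer is None and z < outer_r:
            t_outer = i * dt          # trajectory enters the outer blinder
        if t_outer is not None and z < inner_r:
            return i * dt - t_outer   # trajectory enters the inner blinder
    return None

def is_power_swing(trajectory, outer_r=60.0, inner_r=30.0, dt=0.001,
                   block_threshold=0.03):
    """Block distance elements only if |Z| drifts slowly through the band."""
    t = crossing_time(trajectory, outer_r, inner_r, dt)
    return t is not None and t >= block_threshold

# A swing drifts through the band over tens of ms; a fault jumps in one sample.
swing = [100.0 - 0.5 * i for i in range(300)]   # slow |Z| decay
fault = [100.0, 100.0, 10.0]                    # sudden collapse of |Z|
```

A fault occurring during an already-declared swing is what makes unblocking time-sensitive, which is the effect the paper's tests probe.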

Keywords: power swing, distance relay, power system protection, relay test, transient in power system

Procedia PDF Downloads 387
40084 Analysis of Sweat Evaporation and Heat Transfer on Skin Surface: A Pointwise Numerical Study

Authors: Utsav Swarnkar, Rabi Pathak, Rina Maiti

Abstract:

This study aims to investigate the thermoregulatory role of sweating by comprehensively analyzing the evaporation process and its thermal cooling impact on local skin temperature at various time intervals. Traditional experimental methods struggle to fully capture these intricate phenomena; therefore, numerical simulations play a crucial role in assessing sweat production rates and the associated thermal cooling. This research uses transient computational fluid dynamics (CFD) to enhance our understanding of the evaporative cooling process on human skin. We conducted a simulation employing the k-ω SST turbulence model in a scenario where sweat evaporates over the skin surface; at particular time intervals, temperatures at different locations were observed and their behavior explained. After the start of the simulation, sweat evaporation was monitored on the skin surface, and temperature fluctuations at specific points were recorded over time. It was noted that points situated closer to the periphery of the droplets exhibited higher heat transfer and lower temperatures, whereas points within the droplets displayed the opposite trend.
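The local cooling such a simulation resolves ultimately comes down to latent heat carried away by the evaporating liquid. A back-of-the-envelope sketch of that heat flux follows; the mass-transfer coefficient and vapor-density values are illustrative assumptions, not results of this study.

```python
# Evaporative heat flux = evaporation mass flux x latent heat of vaporization.
H_FG = 2.43e6  # latent heat of vaporization of water, ~J/kg near 30 C

def evaporative_heat_flux(h_m, rho_v_surface, rho_v_ambient):
    """h_m: convective mass-transfer coefficient (m/s);
    rho_v_*: water-vapor densities (kg/m^3) at the droplet surface
    and in the ambient air."""
    m_dot = h_m * (rho_v_surface - rho_v_ambient)  # kg/(m^2 s)
    return m_dot * H_FG                            # W/m^2

# Drier ambient air -> larger vapor-density gradient -> stronger local cooling,
# consistent with the higher heat transfer seen at droplet peripheries where
# the vapor concentration boundary layer is thinnest.
q_dry = evaporative_heat_flux(0.01, 0.030, 0.010)
q_humid = evaporative_heat_flux(0.01, 0.030, 0.025)
```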

Keywords: CFD, sweat, evaporation, multiphase flow, local heat loss

Procedia PDF Downloads 71
40083 Effect of Extrusion Parameters on the Rheological Properties of Ready-To-Eat Extrudates Developed from De-Oiled Rice Bran

Authors: Renu Sharma, D. C. Saxena, Tanuja Srivastava

Abstract:

Mechanical properties of ready-to-eat extrudates are perceived by consumers as one of the quality criteria. The texture of any product has a strong influence on its sensory evaluation as well as on its acceptability. The main texture characteristics influencing product acceptability are crispness, elasticity, hardness, and softness. In the present work, the authors investigated one of the most important textural characteristics of extrudates, namely hardness. A five-level, four-factor central composite rotatable design was employed to investigate the effects of temperature, screw speed, feed moisture content, and feed composition (mainly rice bran content), and their interactions, on the mechanical hardness of the extrudates. Among these, feed moisture was found to be the most prominent factor affecting product hardness. Increases in feed moisture content and rice bran proportion led to an increase in extrudate hardness, whereas an increase in temperature led to a decrease in hardness. Good agreement between the predicted (26.49 N) and actual (28.73 N) values of the response confirms the validity of the response surface methodology (RSM) model.
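The coded design points behind such a five-level, four-factor central composite rotatable design (CCRD) can be sketched as follows. For k factors, rotatability fixes the axial distance at alpha = (2^k)^(1/4); the number of center points (6 below) is a common but study-specific assumption.

```python
from itertools import product

def ccrd(k, n_center=6):
    """Coded design points of a central composite rotatable design:
    2^k factorial corners, 2k axial (star) points at +-alpha, and
    replicated center points."""
    alpha = (2 ** k) ** 0.25      # rotatability condition
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a             # one factor at +-alpha, others at center
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

design = ccrd(4)
levels = sorted({v for pt in design for v in pt})  # the five coded levels
```

With k = 4, alpha = 2, so each factor takes the five coded levels -2, -1, 0, +1, +2, matching the "five-level, four-factor" description.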

Keywords: deoiled rice bran, extrusion, rheological properties, RSM

Procedia PDF Downloads 378
40082 Optimization of Ultrasound-Assisted Extraction of Oil from Spent Coffee Grounds Using a Central Composite Rotatable Design

Authors: Malek Miladi, Miguel Vegara, Maria Perez-Infantes, Khaled Mohamed Ramadan, Antonio Ruiz-Canales, Damaris Nunez-Gomez

Abstract:

Coffee is the second most consumed commodity worldwide, yet it also generates colossal waste. Proper management of coffee waste means converting it into products with higher added value, to achieve sustainability of the economic and ecological footprint and protect the environment. On this basis, studies on the recovery of coffee waste have become more relevant in recent decades. Spent coffee grounds (SCGs), the residue of brewing coffee, represent the major waste stream of the coffee industry. The facts that SCGs have no economic value, are abundant in nature and industry, and do not compete with agriculture, together with their especially high oil content (between 7-15% of total dry matter weight, depending on the coffee variety, Arabica or Robusta), encourage their use as a sustainable feedstock for bio-oil production. Bio-oil extraction is a crucial step towards biodiesel production via the transesterification process. However, the conventional methods used for oil extraction are not recommended due to their high consumption of energy and time and their generation of toxic volatile organic solvents. Thus, finding a sustainable, economical, and efficient extraction technique is crucial to scale up the process and ensure more environment-friendly production. Under this perspective, the aim of this work was a statistical study to identify an efficient strategy for oil extraction with n-hexane using indirect sonication. The coffee waste used in this work was a mixture of Arabica and Robusta. The effects of temperature, sonication time, and solvent-to-solid ratio on the oil yield were statistically investigated using a central composite rotatable design (CCRD) 2³. The results were analyzed using STATISTICA 7 StatSoft software. The CCRD showed the significance of all tested variables (p < 0.05) on the process output.
The validation of the model by analysis of variance (ANOVA) showed a good fit of the results for a 95% confidence interval, and the predicted-versus-experimental values plot confirmed the satisfactory correlation of the model. The optimum experimental conditions were identified from the response surface graphs (2-D and 3-D) and the critical statistical values. Based on the CCRD results, 29 ºC, 56.6 min, and a solvent-to-solid ratio of 16 were the best experimental conditions defined statistically for coffee waste oil extraction using n-hexane as solvent. Under these conditions, the oil yield was >9% in all cases. The results confirmed the efficiency of an ultrasound bath for extracting oil as a more economical, green, and efficient alternative to the Soxhlet method.

Keywords: coffee waste, optimization, oil yield, statistical planning

Procedia PDF Downloads 123
40081 The Role of Organizational Identity in Disaster Response, Recovery and Prevention: A Case Study of an Italian Multi-Utility Company

Authors: Shanshan Zhou, Massimo Battaglia

Abstract:

Identity plays a critical role when an organization faces disasters. Individuals reflect on their working identities and identify themselves with the group and the organization, which facilitates collective sensemaking in crisis situations and enables coordinated actions to respond to and recover from disasters. In addition, an organization's identity links it to its regional community, which fosters the mobilization of resources and contributes to rapid recovery. However, identity is also problematic for disaster prevention because of its persistence: an organization's ego-defense system prohibits the rethinking of its identity, and a rigid identity obstructs disaster prevention. This research aims to tackle this 'problem' of identity through an in-depth case study of an Italian multi-utility that experienced the 2012 Northern Italy earthquakes. Drawing on 11 interviews with top managers and key players in the local community, along with archival materials, we find that the earthquakes triggered a rethinking of the organization's identity, which was reinforced afterward. This research highlights the importance of identity in disaster response and recovery. More importantly, it explores a way of overcoming the ego-defense barrier: transforming the organization into a learning organization that constantly rethinks its identity.

Keywords: community identity, disaster, identity, organizational learning

Procedia PDF Downloads 737
40080 Numerical Investigation on the Effects of Deep Excavation on Adjacent Pile Groups Subjected to Inclined Loading

Authors: Ashkan Shafee, Ahmad Fahimifar

Abstract:

There is a growing demand for the construction of high-rise buildings and infrastructure in large cities, which sometimes requires deep excavations in the vicinity of pile foundations. In this study, a two-dimensional finite element analysis is used to gain insight into the response of pile groups adjacent to deep excavations in sand. The numerical code was verified against available experimental work, and a parametric study was performed on different working load combinations, excavation depths, and supporting systems. The results show that a simple two-dimensional plane strain model can accurately simulate the excavation-induced changes in adjacent pile groups. It was found that excavation below the pile toe level, as well as inclined loading on the adjacent pile group, can severely affect the serviceability of the foundation.

Keywords: deep excavation, inclined loading, lateral deformation, pile group

Procedia PDF Downloads 277
40079 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel

Authors: Tarek Litim, Ouahiba Taamallah

Abstract:

The present paper investigates the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. A statistical study based on regression and Taguchi's design allowed the development of mathematical models to predict the output responses as functions of the studied technological parameters. The response surface methodology (RSM) showed the simultaneous influence of the burnishing parameters and identified the optimal processing parameters. ANOVA validated the prediction models with determination coefficients R = 90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by P = 10 kgf, i = 3 passes, and f = 0.074 mm/rev, which favors minimum roughness and maximum hardness. The result was validated by desirability values D = 0.99 and 0.95 for roughness and hardness, respectively.
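The desirability-based multi-objective step reported above (minimum roughness, maximum hardness, composite D) can be sketched with Derringer-type desirability functions. The response values and bounds below are illustrative assumptions, not the study's fitted models.

```python
def d_minimize(y, target, upper):
    """Desirability for a smaller-is-better response (e.g., roughness):
    1 at or below the target, 0 at or above the upper bound."""
    if y <= target:
        return 1.0
    if y >= upper:
        return 0.0
    return (upper - y) / (upper - target)

def d_maximize(y, lower, target):
    """Desirability for a larger-is-better response (e.g., hardness):
    0 at or below the lower bound, 1 at or above the target."""
    if y >= target:
        return 1.0
    if y <= lower:
        return 0.0
    return (y - lower) / (target - lower)

def composite(d1, d2):
    """Overall desirability: geometric mean of the individual values."""
    return (d1 * d2) ** 0.5

# Illustrative candidate regime: predicted roughness 0.42 (target 0.40,
# worst acceptable 2.0) and hardness 395 (floor 250, target 400).
d_rough = d_minimize(y=0.42, target=0.40, upper=2.0)
d_hard = d_maximize(y=395.0, lower=250.0, target=400.0)
D = composite(d_rough, d_hard)
```

The geometric mean penalizes any single poor response, which is why the optimizer reports per-response desirabilities (here 0.99 and 0.95 in the study) alongside the chosen regime.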

Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA

Procedia PDF Downloads 196
40078 Time's Arrow and Entropy: Violations to the Second Law of Thermodynamics Disrupt Time Perception

Authors: Jason Clarke, Michaela Porubanova, Angela Mazzoli, Gulsah Kut

Abstract:

What accounts for our perception that time inexorably passes in one direction, from the past to the future, the so-called arrow of time, given that the laws of physics permit motion in one temporal direction to also happen in the reverse temporal direction? Modern physics says that the reason for time’s unidirectional physical arrow is the relationship between time and entropy, the degree of disorder in the universe, which is evolving from low entropy (high order; thermal disequilibrium) toward high entropy (high disorder; thermal equilibrium), the second law of thermodynamics. Accordingly, our perception of the direction of time, from past to future, is believed to emanate as a result of the natural evolution of entropy from low to high, with low entropy defining our notion of ‘before’ and high entropy defining our notion of ‘after’. Here we explored this proposed relationship between entropy and the perception of time’s arrow. We predicted that if the brain has some mechanism for detecting entropy, whose output feeds into processes involved in constructing our perception of the direction of time, presentation of violations to the expectation that low entropy defines ‘before’ and high entropy defines ‘after’ would alert this mechanism, leading to measurable behavioral effects, namely a disruption in duration perception. To test this hypothesis, participants were shown briefly-presented (1000 ms or 500 ms) computer-generated visual dynamic events: novel 3D shapes that were seen either to evolve from whole figures into parts (low to high entropy condition) or were seen in the reverse direction: parts that coalesced into whole figures (high to low entropy condition). On each trial, participants were instructed to reproduce the duration of their visual experience of the stimulus by pressing and releasing the space bar. 
To ensure that attention was being deployed to the stimuli, a secondary task was to report the direction of the visual event (forward or reverse motion). Participants completed 60 trials. As predicted, we found that duration reproduction was significantly longer for the high to low entropy condition compared to the low to high entropy condition (p=.03). This preliminary data suggests the presence of a neural mechanism that detects entropy, which is used by other processes to construct our perception of the direction of time or time’s arrow.

Keywords: time perception, entropy, temporal illusions, duration perception

Procedia PDF Downloads 176
40077 The Relationship between Sleep and Selective Attention among Adolescents

Authors: Devante K. Barrett

Abstract:

The objective of this research is to evaluate the association between subjective tiredness and performance on the Stroop task among adolescents. Individuals with a high subjective tiredness score are more likely to show slower reaction times on incongruent trials. Sleep is an often overlooked phenomenon in psychological research, and with adequate testing this neglect may no longer be an issue in the future. Sleep researchers often obtain significant results by way of Stroop testing; the caveat is that the integrity of Stroop testing can be negatively affected by various external factors. The propensity for interference is caused by the automatic process of reading, which is deemed one of the most detrimental issues in understanding the dimensions of sleep, since interference slows the identification of the ink color. Considering that the Stroop task is helpful in evaluating cognitive function in clinical populations, results should be interpreted cautiously due to the multitude of variables that may affect performance. When planning studies and analyzing data related to the Stroop effect, researchers must take individual differences and environmental factors into consideration. A thorough understanding of the Stroop effect can aid the development of initiatives targeted at enhancing attention span and cognitive control across a range of demographics. Age-related changes in sleep patterns and selective attention from childhood to adolescence are key considerations in research focused on age-dependent performance in vigilant attention among young populations; thus, findings from Stroop testing can provide meaningful implications regarding vigilant attention in such populations.
With respect to future research, this can be assessed by way of neuroimaging of brain regions associated with selective attention and circadian cycles.

Keywords: sleep, selective attention, Stroop effect, cognitive function, sleep hygiene, vigilant attention

Procedia PDF Downloads 14
40076 The Variable Sampling Interval Xbar Chart versus the Double Sampling Xbar Chart

Authors: Michael B. C. Khoo, J. L. Khoo, W. C. Yeong, W. L. Teoh

Abstract:

The Shewhart Xbar control chart is a useful process monitoring tool in manufacturing industries to detect the presence of assignable causes. However, it is insensitive in detecting small process shifts. To circumvent this problem, adaptive control charts are suggested. An adaptive chart enables at least one of the chart’s parameters to be adjusted to increase the chart’s sensitivity. Two common adaptive charts that exist in the literature are the double sampling (DS) Xbar and variable sampling interval (VSI) Xbar charts. This paper compares the performances of the DS and VSI Xbar charts, based on the average time to signal (ATS) criterion. The ATS profiles of the DS Xbar and VSI Xbar charts are obtained using the Mathematica and Statistical Analysis System (SAS) programs, respectively. The results show that the VSI Xbar chart is generally superior to the DS Xbar chart.
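The ATS criterion used in this comparison can be sketched for the baseline fixed-interval Shewhart Xbar chart: the average run length (ARL) is the reciprocal of the per-sample signal probability, and ATS is the ARL multiplied by the sampling interval (which a VSI chart makes variable). The sampling interval and shift size below are illustrative.

```python
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def signal_probability(shift, k=3.0):
    """P(subgroup mean falls outside +-k sigma limits), for a mean shift
    expressed in sigma/sqrt(n) units of the subgroup mean."""
    return (1.0 - normal_cdf(k - shift)) + normal_cdf(-k - shift)

def ats(shift, interval_hours=1.0, k=3.0):
    """Average time to signal = ARL x fixed sampling interval."""
    arl = 1.0 / signal_probability(shift, k)
    return arl * interval_hours

ats_in_control = ats(0.0)   # ~370 intervals between false alarms
ats_small_shift = ats(1.0)  # small shifts take a long time to detect
```

The long in-control ATS together with the slow response to small shifts is exactly the insensitivity that motivates the adaptive DS and VSI modifications compared in the paper.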

Keywords: adaptive charts, average time to signal, double sampling charts, variable sampling interval

Procedia PDF Downloads 291
40075 Tourism Economics and Tourism Development in Greece, in the Period of the Economic Adjustment Programmes

Authors: Aimilia Vlami

Abstract:

This paper examines the tourism-led economic development of Greece through an analysis of the main characteristics of its financing and development processes and of the spatial and temporal structure of supply and demand. Taking into consideration the evolution over time of economic planning and tourism development policy in Greece, we also study the composition, changes, and dynamics of the hotel industry over the last 20 years, especially during the period of the economic adjustment programmes, when tourism became a key pillar of development. This paper is clearly written in a specific economic situation, which directs both the emphases and the flow of arguments around a central question: the balance of interventions in tourist space between, on the one hand, the need for planning and a policy of sustainable tourism growth and, on the other, the de facto adoption of fragmentary and urgent interventions that shape and transform tourist space according to the requirements of various institutions and interest groups.

Keywords: development, Greece, hospitality, economic policy, tourism investments

Procedia PDF Downloads 138
40074 Application of Lean Manufacturing in Brake Shoe Manufacturing Plant: A Case Study

Authors: Anees K. Ahamed, Aakash Kumar R. G., Raj M. Mohan

Abstract:

The main objective is to apply lean tools to identify and eliminate waste within and among the work stations so as to improve process speed and quality. Of the seven wastes in the lean concept, we consider the movement of materials, defects, and inventory, since these have the greatest impact on the performance measures. The layout was improved to reduce the movement of materials, and the resulting reduction in movement among the work stations was quantified. Value stream mapping was used to identify waste. A cause-and-effect diagram and 5W analysis were used to identify the reasons for defects and to provide countermeasures. Several cycle-time reduction techniques are also proposed to improve productivity. A lean audit check sheet was used to establish the industry's current position and to identify the gaps that must be closed to make the industry lean.

Keywords: cause and effect diagram, cycle time reduction, defects, lean, waste reduction

Procedia PDF Downloads 390
40073 Evolution of Design through Documentation of Architecture Design Processes

Authors: Maniyarasan Rajendran

Abstract:

Every design has a process, and every architect works in the ways best known to them. The translation of a design from concept to completion varies with designers' philosophies, their tools, the availability of resources, and at times the clients and the context of the design as well. Understanding the design process requires formalisation of the design intents. The design process is characterised by change, with time and technology, and any description of the design flow is indicative, never exhaustive. The knowledge and experience of stakeholders remain limited to the part they played in the project and to what they can remember, often through photographs. These artefacts, when circulated, can hardly tell what the project is; they can never tell the narrative behind it. In due course, the design processes are lost, and the design junctions are lost along the journey. Photographs have acted as major source materials since their importance in architectural revivalism in the 19th century, and history shows that photographs have been the dominant source of evidence. The idea of recording is accompanied by the idea of drawing inspiration from records and documents. The design concept, the architectural firm's philosophy, the materials used, the special needs, the numerous trial-and-error methods, the design methodology, the experience of failures and successes, and the knowledge acquired in every project deserve to be recorded. By documenting the design processes involved, this knowledge can be preserved and passed through generations. This paper explores process documentation as a tool of self-reflection and as the basis of an architectural firm's repository, and examines the implications for the design evolution of the team.

Keywords: architecture, design, documentation, records

Procedia PDF Downloads 371
40072 Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain Computer Interface Methods

Authors: Bayar Shahab

Abstract:

The fast development of technology that has advanced neuroscience and human interaction with computers has enabled solutions to problems and issues of this new era like no other time in history. The brain-computer interface (BCI) has opened the door to several new research areas and has provided solutions to critical and important problems, such as enabling a paralyzed patient to interact with the outside world, controlling a robot arm, playing games in VR with the brain, and driving a wheelchair or even a car, while neurotechnology has enabled rehabilitation of lost memory, etc. This review presents state-of-the-art methods and improvements of canonical correlation analysis (CCA), a feature-extraction method for steady-state visual evoked potential (SSVEP)-based BCIs; that is, a family of methods for extracting the EEG signal features of interest. Each of the methods, from oldest to newest, is discussed, and their advantages and disadvantages are compared. This creates context and helps researchers understand the most advanced methods available in this field, with their pros and cons, along with their mathematical representations and usage. This work makes a vital contribution to the existing field of study. It differs from other similar, recently published works by providing the following: (1) most of the prominent methods used in this field, stated in a hierarchical way; (2) the pros, cons and performance of each method; and (3) the gaps that remain at the end of each method, which can open the understanding and doors to new research and improvements.
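As a rough illustration of the idea behind SSVEP feature extraction, the sketch below scores candidate stimulus frequencies by correlating a single-channel signal with sine and cosine references at each frequency. This is a simplified scalar stand-in for CCA, which instead maximizes correlation over multichannel linear combinations; the sampling rate and candidate frequencies are arbitrary assumptions.

```python
import math

def pearson(x, y):
    # Pearson correlation coefficient between two equal-length sequences
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def detect_ssvep(signal, fs, candidates):
    """Pick the candidate stimulus frequency whose sine/cosine reference
    correlates best with a single-channel signal."""
    t = [i / fs for i in range(len(signal))]
    best_f, best_r = None, -1.0
    for f in candidates:
        sin_ref = [math.sin(2 * math.pi * f * ti) for ti in t]
        cos_ref = [math.cos(2 * math.pi * f * ti) for ti in t]
        # combine both phases so detection is phase-insensitive
        r = math.hypot(pearson(signal, sin_ref), pearson(signal, cos_ref))
        if r > best_r:
            best_f, best_r = f, r
    return best_f

# A noiseless 10 Hz "EEG" trace sampled at 250 Hz, with an arbitrary phase
fs = 250
sig = [math.sin(2 * math.pi * 10 * i / fs + 0.7) for i in range(fs)]
print(detect_ssvep(sig, fs, [8, 10, 12, 15]))  # → 10
```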

Keywords: BCI, CCA, SSVEP, EEG

Procedia PDF Downloads 147
40071 Parallel Evaluation of Sommerfeld Integrals for Multilayer Dyadic Green's Function

Authors: Duygu Kan, Mehmet Cayoren

Abstract:

Sommerfeld integrals (SIs) are commonly encountered in electromagnetics problems involving the analysis of antennas and scatterers embedded in planar multilayered media. Generally speaking, an analytical solution of SIs is unavailable, and it is well known that their numerical evaluation is very time consuming and computationally expensive due to the highly oscillating and slowly decaying nature of the integrands. Fast computation of SIs is therefore of paramount importance. In this paper, a parallel code has been developed to speed up the computation of SIs in the framework of calculating the dyadic Green’s function in multilayered media. The OpenMP shared-memory approach is used to parallelize the SI algorithm, resulting in significant time savings. Moreover, accelerating the computation of the dyadic Green’s function is discussed based on the parallel SI algorithm developed.

Keywords: Sommerfeld-integrals, multilayer dyadic Green’s function, OpenMP, shared memory parallel programming

Procedia PDF Downloads 252
40070 Improving Sample Analysis and Interpretation Using QIAGENs Latest Investigator STR Multiplex PCR Assays with a Novel Quality Sensor

Authors: Daniel Mueller, Melanie Breitbach, Stefan Cornelius, Sarah Pakulla-Dickel, Margaretha Koenig, Anke Prochnow, Mario Scherer

Abstract:

The European STR standard set (ESS) of loci, as well as the new expanded CODIS core loci set recommended by the CODIS Core Loci Working Group, has led to greater standardization and harmonization in STR analysis across borders. Various multiplex PCR assays have since been developed for the analysis of these 17 ESS or 23 CODIS expansion STR markers, all of which meet high technical demands. However, forensic analysts are often faced with difficult STR results and the questions they raise: Why are no peaks visible in the electropherogram? Did the PCR fail? Was the DNA concentration too low? QIAGEN’s newest Investigator STR kits contain a novel Quality Sensor (QS) that acts as an internal performance control and provides useful information for evaluating the amplification efficiency of the PCR. QS indicates whether the reaction has worked in general and furthermore allows discriminating between the presence of inhibitors and DNA degradation as the cause of the typical ski-slope effect observed in STR profiles of such challenging samples. This information can be used to choose the most appropriate rework strategy. Based on the latest PCR chemistry, called FRM 2.0, QIAGEN now provides the next technological generation for STR analysis: the Investigator ESSplex SE QS and Investigator 24plex QS Kits. The new PCR chemistry ensures robust and fast PCR amplification with improved inhibitor resistance and easy handling for manual or automated setup. The short cycling time of 60 min reduces the duration of the total PCR analysis, making a whole-workflow analysis in one day feasible. To facilitate the interpretation of STR results, a smart primer design was applied for the best possible marker distribution, the highest concordance rates and robust gender typing.

Keywords: PCR, QIAGEN, quality sensor, STR

Procedia PDF Downloads 499
40069 A Large Language Model-Driven Method for Automated Building Energy Model Generation

Authors: Yake Zhang, Peng Xu

Abstract:

The development of building energy models (BEM) required for architectural design and analysis is a time-consuming and complex process, demanding a deep understanding and proficient use of simulation software. To streamline the generation of complex building energy models, this study proposes an automated method for generating them using a large language model and a BEM library, aimed at improving the efficiency of model generation. The method leverages a large language model to parse user-specified requirements for target building models, extracting key features such as building location, window-to-wall ratio, and thermal performance of the building envelope. The BEM library is used to retrieve energy models that match the target building’s characteristics; these serve as reference information for the large language model, enhancing the accuracy and relevance of the generated model and allowing the creation of a building energy model that adapts to the user’s modeling requirements. This study enables the automatic creation of building energy models from natural-language inputs, reducing the professional expertise required for model development while significantly decreasing the time and complexity of manual configuration. In summary, this study provides an efficient and intelligent solution for building energy analysis and simulation, demonstrating the potential of large language models in the field of building simulation and performance modeling.
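The parse-then-retrieve pipeline described above can be sketched as follows. The regex "parser" is a stand-in for the LLM step, and the miniature library, its model names, and its feature keys (location, wwr, u_wall) are entirely hypothetical, not taken from the paper.

```python
import re

# Hypothetical miniature BEM library: model name -> key features
BEM_LIBRARY = {
    "office_shanghai": {"location": "shanghai", "wwr": 0.40, "u_wall": 0.8},
    "office_beijing":  {"location": "beijing",  "wwr": 0.30, "u_wall": 0.5},
    "retail_shanghai": {"location": "shanghai", "wwr": 0.60, "u_wall": 1.0},
}

def parse_requirements(text):
    """Stand-in for the LLM parsing step: pull the location and the
    window-to-wall ratio out of a natural-language request."""
    req = {}
    loc = re.search(r"\bin (\w+)", text.lower())
    if loc:
        req["location"] = loc.group(1)
    wwr = re.search(r"window-to-wall ratio of (\d+(?:\.\d+)?)", text.lower())
    if wwr:
        req["wwr"] = float(wwr.group(1))
    return req

def retrieve_reference(req):
    """Return the library model that best matches the parsed features,
    to be handed to the LLM as reference context."""
    def score(features):
        s = 0.0
        if features.get("location") == req.get("location"):
            s += 1.0                                  # reward matching location
        if "wwr" in req:
            s -= abs(features["wwr"] - req["wwr"])    # penalize WWR mismatch
        return s
    return max(BEM_LIBRARY, key=lambda name: score(BEM_LIBRARY[name]))

req = parse_requirements("An office tower in Shanghai with a window-to-wall ratio of 0.45")
print(retrieve_reference(req))  # → office_shanghai
```

In the actual workflow, the retrieved model would then be injected into the LLM prompt as reference context rather than returned directly.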

Keywords: artificial intelligence, building energy modelling, building simulation, large language model

Procedia PDF Downloads 35
40068 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques

Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev

Abstract:

Rapidly evolving modern data analysis technologies play a large role in understanding the operation of the healthcare system and its characteristics. One of the key tasks in urban healthcare today is optimizing resource allocation, and the application of data analysis to such optimization problems in medical institutions determines the significance of this study. The purpose of this research was to establish the dependence between indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients and in-patient average length of stay were selected as the performance indicators and the measure of demand on the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes and lithotripters) and physicians characterized the resource provision of medical institutions in the developed models. The data source for the research was the open database of the statistical service Eurostat. This source was chosen because its databases contain the complete, open information necessary for research tasks in the field of public health; in addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study, with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made through cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables over the time period under consideration, identify groups of similar countries, and construct separate regression models for them.
Therefore, the original time series were used as the objects of clustering, with the k-medoids algorithm. Sampled objects themselves were used as the cluster centers, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the resource provision of the hospital with medical personnel. The research displays strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Data analysis currently has huge potential to significantly improve health services, and medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
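The clustering step above can be sketched with a minimal k-medoids implementation, assuming Euclidean distance between equal-length series; as in the abstract, the cluster centers are sampled objects themselves, which sidesteps defining a centroid for time series. The data and parameters are toy values.

```python
import random

def dist(a, b):
    # Euclidean distance between two equal-length time series
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def k_medoids(series, k, iters=100, seed=0):
    """Minimal k-medoids clustering: alternately assign each series to its
    nearest medoid, then swap each medoid for the cluster member that
    minimizes total within-cluster distance."""
    rng = random.Random(seed)
    medoids = rng.sample(range(len(series)), k)
    for _ in range(iters):
        labels = [min(medoids, key=lambda m: dist(s, series[m])) for s in series]
        new_medoids = []
        for m in medoids:
            members = [i for i, l in enumerate(labels) if l == m]
            best = min(members, key=lambda c: sum(dist(series[c], series[i]) for i in members))
            new_medoids.append(best)
        if set(new_medoids) == set(medoids):
            break                      # converged: medoids stopped changing
        medoids = new_medoids
    return medoids, labels

# Two obvious groups of toy "country" time series
data = [[1, 2, 3], [1.1, 2.1, 3.2], [10, 11, 12], [10.2, 11.1, 12.3]]
medoids, labels = k_medoids(data, k=2)
print(sorted(medoids))
```

In the study's setting, the silhouette coefficient would be computed for several values of k and the best-scoring k retained.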

Keywords: data analysis, demand modeling, healthcare, medical facilities

Procedia PDF Downloads 147
40067 Microwave Assisted Foam-Mat Drying of Guava Pulp

Authors: Ovais S. Qadri, Abhaya K. Srivastava

Abstract:

The present experiments were carried out to study the drying kinetics and quality of microwave foam-mat dried guava powder. Guava pulp was microwave foam-mat dried using 8% egg albumin as the foaming agent, at microwave powers of 480 W, 560 W, 640 W, 720 W and 800 W, foam thicknesses of 3 mm, 5 mm and 7 mm, and inlet air temperatures of 40˚C and 50˚C. Weight loss was used to estimate the change in drying rate with respect to time. The powdered samples were analysed for various physicochemical quality parameters, viz. acidity, pH, TSS, colour change and ascorbic acid content. Statistical analysis using three-way ANOVA revealed that the sample of 5 mm foam thickness dried at 800 W and 50˚C was the best, with 0.3584% total acid, pH 3.98, a 14 min drying time, 8˚Brix TSS, a colour change of 3.263 and an ascorbic acid content of 154.762 mg/100 g.

Keywords: foam mat drying, foam mat guava, guava powder, microwave drying

Procedia PDF Downloads 335
40066 Gaze Patterns of Skilled and Unskilled Sight Readers Focusing on the Cognitive Processes Involved in Reading Key and Time Signatures

Authors: J. F. Viljoen, Catherine Foxcroft

Abstract:

Expert sight readers rely on their ability to recognize patterns in scores, their inner hearing and their prediction skills in order to perform complex sight-reading exercises. They also have the ability to observe deviations from expected patterns in musical scores. This increases the eye-hand span (reading ahead of the point of playing) available to process the elements in the score. The study aims to investigate the gaze patterns of expert and non-expert sight readers, focusing on key and time signatures. Twenty musicians were tasked with playing twelve sight-reading examples composed for one hand and five examples composed for two hands on a piano keyboard. These examples were composed in different keys and time signatures and included accidentals and changes of time signature to test this theory. Results showed that the experts fixated more often, and for longer, on key and time signatures and on deviations in the examples for two hands than the non-expert group did. The inverse was true for the examples for one hand, where expert sight readers showed fewer and shorter fixations on key and time signatures and on deviations. This seems to suggest that experts focus more on key and time signatures and deviations in complex scores to facilitate sight reading. The examples written for one hand appeared to be too easy for the expert sight readers, compromising the gaze-pattern comparison.

Keywords: cognition, eye tracking, musical notation, sight reading

Procedia PDF Downloads 141
40065 Assessment of Time-variant Work Stress for Human Error Prevention

Authors: Hyeon-Kyo Lim, Tong-Il Jang, Yong-Hee Lee

Abstract:

For an operator in a nuclear power plant, human error is one of the most dreaded factors that may result in unexpected accidents. The probability of human error may be low, but the associated risk is enormous. Thus, for accident prevention, it is indispensable to analyze the influence of any factor that may raise the possibility of human error. Over the past decades, many research results have shown that the performance of human operators may vary over time due to a variety of factors. Among them, stress is known to be an indirect factor that may cause human errors and result in mental illness. Many assessment tools have been developed to assess the stress level of human workers. However, it is questionable whether they can be used to anticipate human performance, which is related to human error possibility, because they were mainly developed from the viewpoint of mental health rather than industrial safety. The stress level of a person may go up or down with work time; in that sense, to be applicable in the safety domain, such tools should at least be able to assess the variation resulting from work time. Therefore, this study compared their applicability for safety purposes. More than ten work stress tools were analyzed with reference to assessment items, assessment and analysis methods, and follow-up measures, which are known to be factors closely related to work stress. The results showed that most tools concentrated their weights on some common organizational factors, such as demands, supports, and relationships, in sequence, and their weights were broadly similar. However, they failed to recommend practical solutions; instead, they merely advised setting up overall counterplans within a PDCA cycle or risk management activities, which are far from practical human error prevention.
Thus, it was concluded that applying stress assessment tools developed mainly for mental health is impractical for safety purposes with respect to anticipating human performance, and that development of a new assessment tool is inevitable if one wants to assess stress level from the perspective of human performance variation and accident prevention. As a practical countermeasure, this study proposes a new scheme for assessing the work stress level of a human operator as it varies over work time, which is closely related to the possibility of human errors.

Keywords: human error, human performance, work stress, assessment tool, time-variant, accident prevention

Procedia PDF Downloads 676
40064 Pathological Gambling and Impulsivity: Comparison of the Eight Laboratory Measures of Inhibition Capacities

Authors: Semion Kertzman, Pinhas Dannon

Abstract:

Impulsive behaviour and the underlying brain processes are hypothesized to be central to the development and maintenance of pathological gambling. Inhibition ability can be differentially impaired in pathological gamblers (PGs). Aims: This study aimed to compare the ability of eight widely used inhibition measures to discriminate between PGs and healthy controls (HCs). Methods: PGs (N=51) and demographically matched HCs (N=51) performed cognitive inhibition (Stroop), motor inhibition (Go/NoGo) and reflective inhibition (Matching Familiar Figures Test, MFFT) tasks. Results: An augmented total interference response time in the Stroop task (η²=0.054), a large number of commission errors in the Go/NoGo task (η²=0.053), and the total number of errors in the MFFT (η²=0.05) discriminated PGs from HCs. The other measures were unable to differentiate between PGs and HCs, and no significant correlations were observed between the inhibition measures. Conclusion: The inhibition measures varied in their ability to discriminate PGs from HCs, and most were not relevant to gambling behaviour. PGs did not express rash, impulsive behaviour such as quickly choosing an answer without thinking; rather, their inhibition impairment was related to slow, inaccurate performance.

Keywords: pathological gambling, impulsivity, neurocognition, addiction

Procedia PDF Downloads 304
40063 Rapid, Direct, Real-Time Method for Bacteria Detection on Surfaces

Authors: Evgenia Iakovleva, Juha Koivisto, Pasi Karppinen, J. Inkinen, Mikko Alava

Abstract:

Preventing the spread of infectious diseases worldwide is one of the most important tasks of modern health care. Infectious diseases not only account for one fifth of deaths in the world but also cause many pathological complications for human health. Touch surfaces are an important vector for the spread of infections by various microorganisms, including antimicrobial-resistant organisms; antimicrobial resistance, in turn, is the response of bacteria to the widespread overuse or inappropriate use of antibiotics. The biggest challenges in bacterial detection by existing methods are indirect determination, long analysis times, sample preparation, the use of chemicals and expensive equipment, and the need for qualified specialists. A high-performance, rapid, real-time method is therefore in demand for practical bacterial detection and control of epidemiological hazards. Among the known methods for determining bacteria on surfaces, hyperspectral methods can serve as direct and rapid fluorescence-based approaches for microorganism detection on different kinds of surfaces, without sampling, sample preparation or chemicals. The aim of this study was to assess the relevance of such systems for remote sensing of surfaces for microorganism detection, to prevent the global spread of infectious diseases. Bacillus subtilis and Escherichia coli at different concentrations (from 0 to 10⁸ cells/100 µL) were detected with a hyperspectral camera using different filters, providing visible visualization of bacteria and background spots on a steel plate. A method of internal standards was applied to monitor the correctness of the analysis results. The distances from the sample to the hyperspectral camera and to the light source are 25 cm and 40 cm, respectively. Each sample is optically imaged from the surface by the hyperspectral imaging system, utilizing a JAI CM-140GE-UV camera. The light source is a BeamZ FLATPAR DMX tri-light with 3 W tri-colour LEDs (red, blue and green).
Light colors are changed through a DMX USB Pro interface. The developed system was calibrated following a standard procedure of setting the exposure and focusing for light with λ = 525 nm. The filter is a Thorlabs Kurios hyperspectral filter controller with wavelengths from 420 to 720 nm. All data collection, pre-processing and multivariate analysis were performed using LabVIEW and Python software. Bacterial stains, both visible and invisible to the human eye, clustered apart from the reference steel material in clustering analysis using different light sources and filter wavelengths. Calculation of the random and systematic errors of the analysis results proved the applicability of the method in real conditions. Validation experiments were carried out with photometry and an ATP swab test, with all experimental parameters identical except the light; the lower detection limit of the developed method is several orders of magnitude lower than for both validation methods. The hyperspectral imaging method makes it possible to separate not only bacteria and surfaces but also different types of bacteria, such as Gram-negative Escherichia coli and Gram-positive Bacillus subtilis. Unlike other microbiological methods, the developed method allows the sample preparation and use of chemicals to be skipped, and the analysis time with the novel hyperspectral system is a few seconds, which is innovative in the field of microbiological testing.
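One simple way to separate background and bacterial pixels from their spectra is sketched below, using the spectral angle as a similarity measure. The reference spectra are invented toy values, not measurements from the paper, and the study itself used multivariate clustering rather than this nearest-reference rule; the sketch only illustrates the general idea of discriminating pixel spectra.

```python
import math

def spectral_angle(a, b):
    """Spectral angle between two spectra (smaller = more similar),
    a common hyperspectral similarity measure."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Illustrative mean spectra (arbitrary units, 4 toy bands) for the steel
# background and the two studied taxa -- invented values for the sketch
REFERENCES = {
    "steel": [0.9, 0.9, 0.9, 0.9],
    "e_coli": [0.2, 0.5, 0.7, 0.3],
    "b_subtilis": [0.6, 0.3, 0.2, 0.5],
}

def classify_pixel(spectrum):
    # assign the pixel to the reference with the smallest spectral angle
    return min(REFERENCES, key=lambda k: spectral_angle(spectrum, REFERENCES[k]))

print(classify_pixel([0.25, 0.55, 0.65, 0.35]))  # close to the e_coli reference
```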

Keywords: Escherichia coli, Bacillus subtilis, hyperspectral imaging, microorganisms detection

Procedia PDF Downloads 232
40062 Nanocrystalline Cellulose from Oil Palm Fiber

Authors: Ridzuan Ramli, Zianor Azrina Zianon Abdin, Mohammad Dalour Beg, Rosli M. Yunus

Abstract:

Nanocrystalline cellulose (NCC) was produced from oil palm empty fruit bunch (EFB) pulp by ultrasound-assisted acid hydrolysis with different hydrolysis times, and then analyzed using FESEM and TGA in comparison with EFB fiber and EFB pulp. The FESEM analysis showed that the NCC produced under ultrasound-assisted acid hydrolysis has a rod-like shape. The NCC obtained showed remarkably higher thermal stability than the EFB fiber and EFB pulp; however, as the hydrolysis time increased, the thermal stability of the NCC decreased. In conclusion, NCC can be prepared by ultrasound-assisted acid hydrolysis; the NCC obtained has good thermal stability and great potential as reinforcement in composite materials.

Keywords: Nanocrystalline cellulose, ultrasound assisted acid hydrolysis, thermal stability, morphology, empty fruit bunch (EFB)

Procedia PDF Downloads 484
40061 Performance Evaluation of One and Two Dimensional Prime Codes for Optical Code Division Multiple Access Systems

Authors: Gurjit Kaur, Neena Gupta

Abstract:

In this paper, we have analyzed and compared the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e., time slots, whereas 2D coding techniques are distinguished not only by their time slots but also by their wavelengths. In this research, we have evaluated and compared the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an Optical Code Division Multiple Access (OCDMA) system on a single platform. The analysis shows that 2D prime codes support fewer active users than 1D codes, but they have a larger code family and are the most secure of the codes compared. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10⁻⁹.
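A 1D prime sequence code family can be constructed from the linear congruence s(i, j) = (i * j) mod p: for a prime p, each of the p prime sequences is mapped to a binary codeword of length p² and weight p. The sketch below shows this standard construction (the in-phase cross-correlation check at the end is a property of these codes, not a result from the paper):

```python
def prime_code(p):
    """Construct the 1D prime-sequence code family for a prime p:
    p codewords, each of length p**2 and weight p, with one pulse per
    block of p chips, placed at position (i * j) mod p in block j."""
    codes = []
    for i in range(p):
        word = [0] * (p * p)
        for j in range(p):
            word[j * p + (i * j) % p] = 1
        codes.append(word)
    return codes

def cross_correlation(a, b):
    # in-phase (zero-shift) correlation between two binary codewords
    return sum(x * y for x, y in zip(a, b))

codes = prime_code(5)
print(len(codes), len(codes[0]), sum(codes[0]))  # → 5 25 5
```

For distinct codewords, the pulse positions coincide only in block j = 0, so the in-phase cross-correlation is 1; over all shifts, prime codes keep cross-correlation bounded by 2, which is what limits multiple-access interference and hence the supportable number of active users at a given BER.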

Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa

Procedia PDF Downloads 338