Search results for: event quantification

1558 Geospatial Analysis of Hydrological Response to Forest Fires in Small Mediterranean Catchments

Authors: Bojana Horvat, Barbara Karleusa, Goran Volf, Nevenka Ozanic, Ivica Kisic

Abstract:

Forest fires are a major threat in many regions of Croatia, especially in coastal areas. Although they can be caused by natural processes, the most common cause is the human factor, intentional or unintentional. Forest fires drastically transform landscapes and influence natural processes. The main goal of the presented research is to analyse and quantify the impact of forest fire on hydrological processes and to propose the model that best describes changes in hydrological patterns in the analysed catchments. Keeping in mind the spatial component of these processes, geospatial analysis is performed to gain better insight into the spatial variability of the hydrological response to disastrous events. In that respect, two catchments that experienced severe forest fire were delineated, and various hydrological and meteorological data, both attribute and spatial, were collected. The major drawback is certainly the lack of hydrological data, common in small torrential karstic streams; hence, modelling results should be validated against data collected in a catchment that has similar characteristics and established hydrological monitoring. The event chosen for the modelling is the forest fire that occurred in July 2019 and burned nearly 10% of the analysed area. Surface (land use/land cover) conditions before and after the event were derived from two Sentinel-2 images. The mapping of the burnt area is based on a comparison of the Normalized Burn Ratio (NBR) computed from both images. To estimate and compare hydrological behaviour before and after the event, curve number (CN) values are assigned to the land use/land cover classes derived from the satellite images. Hydrological modelling yielded surface runoff estimates and hence predictions of the catchments' hydrological responses to a forest fire event. The research was supported by the Croatian Science Foundation through the project 'Influence of Open Fires on Water and Soil Quality' (IP-2018-01-1645).
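
As a minimal illustration of the burn-mapping step described above, the sketch below computes the Normalized Burn Ratio from pre- and post-fire near-infrared and shortwave-infrared reflectances and thresholds their difference (dNBR). The synthetic arrays and the 0.27 severity threshold are illustrative assumptions, not values taken from the study.

```python
import numpy as np

def nbr(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir + 1e-9)  # epsilon avoids division by zero

rng = np.random.default_rng(0)
# Stand-ins for Sentinel-2 reflectance bands (B8 = NIR, B12 = SWIR),
# one scene acquired before and one after the fire.
nir_pre = rng.uniform(0.3, 0.5, (100, 100))
swir_pre = rng.uniform(0.1, 0.2, (100, 100))
nir_post, swir_post = nir_pre * 0.6, swir_pre * 1.5  # toy "burn" signal

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
burnt = dnbr > 0.27  # assumed severity threshold for burnt pixels
print(f"burnt fraction: {burnt.mean():.1%}")
```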

Keywords: Croatia, forest fire, geospatial analysis, hydrological response

Procedia PDF Downloads 137
1557 Relationship between ISO 14001 and Market Performance of Firms in China: An Institutional and Market Learning Perspective

Authors: Hammad Riaz, Abubakr Saeed

Abstract:

An Environmental Management System (EMS) such as ISO 14001 helps to build corporate reputation and legitimacy and can also be considered a firm's strategic response to institutional pressure to reduce the impact of business activity on the natural environment. The financial outcomes of certifying to ISO 14001 are still unclear and equivocal. Drawing on institutional and market learning theories, the impact of ISO 14001 on firms' market performance is examined for Chinese firms. Employing a rigorous event study approach, this paper compared ISO 14001 certified firms with non-certified counterpart firms based on different matching criteria that include size, return on assets and industry. The results indicate that ISO 14001 certification has been interpreted as a negative signal by investors both in the short and the long run. This paper suggests implications for policy makers, managers, and nonprofit organizations.
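
A minimal sketch of the event study mechanics referred to above, under a market model: alpha and beta are fitted over an estimation window, abnormal returns are the residuals over the event window, and their sum is the cumulative abnormal return (CAR). The return series here are synthetic; this is not the paper's sample or exact specification.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic daily returns: 250-day estimation window + 11-day event window.
market = rng.normal(0.0004, 0.01, 261)
firm = 0.0002 + 0.9 * market + rng.normal(0, 0.008, 261)

beta, alpha = np.polyfit(market[:250], firm[:250], 1)  # market-model fit

ar = firm[250:] - (alpha + beta * market[250:])  # abnormal returns
car = ar.sum()                                   # cumulative abnormal return
print(f"CAR over the event window: {car:+.4f}")
```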

Keywords: ISO 14001, legitimacy, institutional forces, event study approach, emerging markets

Procedia PDF Downloads 164
1556 Optimizing Stormwater Sampling Design for Estimation of Pollutant Loads

Authors: Raja Umer Sajjad, Chang Hee Lee

Abstract:

Stormwater runoff is the leading contributor to pollution of receiving waters. In response, an efficient stormwater monitoring program is required to quantify and eventually reduce stormwater pollution. The overall goals of stormwater monitoring programs primarily include the identification of high-risk dischargers and the development of total maximum daily loads (TMDLs). The challenge in developing a better monitoring program is to reduce the variability in flux estimates due to sampling errors, since the success of a monitoring program depends mainly on the accuracy of the estimates. Apart from sampling errors, manpower and budgetary constraints also influence the quality of the estimates. This study attempted to develop an optimum stormwater monitoring design considering both the cost and the quality of the estimated pollutant flux. Three years of stormwater monitoring data (2012-2014) from a mixed land use site located within the Geumhak watershed, South Korea, were evaluated. The regional climate is humid, and precipitation is usually well distributed throughout the year. The investigation of a large number of water quality parameters is time-consuming and resource intensive. In order to identify a suite of easy-to-measure parameters to act as surrogates, Principal Component Analysis (PCA) was applied. Means, standard deviations, coefficients of variation (CV) and other simple statistics were computed using the multivariate statistical analysis software SPSS 22.0. The implication of sampling time on monitoring results, the number of samples required during a storm event and the impact of seasonal first flush were also identified. Based on the observations derived from the PCA biplot and the correlation matrix, total suspended solids (TSS) was identified as a potential surrogate for turbidity, total phosphorus and heavy metals like lead, chromium, and copper, whereas Chemical Oxygen Demand (COD) was identified as a surrogate for organic matter. The CVs of the monitored water quality parameters were found to be high (ranging from 3.8 to 15.5). This suggests that using a grab sampling design to estimate mass emission rates in the study area can lead to errors due to the large variability. The TSS discharge load calculation error was only 2% between two different sample size approaches, i.e., 17 samples per storm event versus 6 equally distributed samples per storm event. Both seasonal first flush and event first flush phenomena were observed for most water quality parameters in the study area. Samples taken at the initial stage of a storm event generally overestimate the mass emissions; however, it was found that collecting a grab sample after the initial hour of a storm event more closely approximates the mean concentration of the event. It was concluded that site- and regional-climate-specific interventions can be made to optimize the stormwater monitoring program in order to make it more effective and economical.
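
As a toy illustration of the surrogate-screening logic, the sketch below computes the coefficient of variation of each monitored parameter and its correlation with TSS; the synthetic concentrations and the simple correlation screen stand in for the study's PCA-based selection.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 60  # synthetic storm-event samples
tss = rng.lognormal(4.0, 1.0, n)
df = pd.DataFrame({
    "TSS": tss,
    "turbidity": 0.8 * tss + rng.normal(0, 20, n),
    "TP": 0.002 * tss + rng.normal(0, 0.05, n),
    "COD": rng.lognormal(3.5, 0.8, n),
})

cv = df.std() / df.mean()   # coefficient of variation per parameter
corr = df.corr()["TSS"]     # candidate surrogate relationships
print("CV:", cv.round(2).to_dict())
print("correlation with TSS:", corr.round(2).to_dict())
```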

Keywords: first flush, pollutant load, stormwater monitoring, surrogate parameters

Procedia PDF Downloads 241
1555 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment

Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati

Abstract:

This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution under the existing Time/Utility Function / Utility Accrual (TUF/UA) scheduling domain for a multiprocessor environment. The BR mechanism attempts to return faulty tasks to their initial safe state and then re-executes the affected section of the faulty tasks to enable recovery. Since faults may occur in the components of any system, a fault tolerance mechanism that can nullify their erroneous effects needs to be developed. Current TUF/UA scheduling algorithms use an abortion recovery mechanism, simply aborting the erroneous task as their fault recovery solution. None of the existing algorithms in the TUF/UA multiprocessor scheduling domain has considered transient faults and implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effects and solve the recovery problem in this domain. The developed BR_GPUAS simulator derived the set of parameters, events and performance metrics according to a detailed analysis of the base model. Simulation results revealed that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utility, making it reliable and efficient for real-time applications in a multiprocessor scheduling environment.
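
The sketch below shows the event-calendar core that an event-based DES of this kind is built on, with a stub transition that re-executes a task after a transient fault instead of aborting it; the event names, timings and recovery cost are illustrative, not the BR_GPUAS parameter set.

```python
import heapq

def run(calendar, horizon):
    """Minimal discrete-event loop over (time, event, payload) tuples."""
    clock, log = 0.0, []
    while calendar and clock <= horizon:
        clock, event, payload = heapq.heappop(calendar)
        log.append((clock, event))
        if event == "fault":
            # Backward recovery: roll the task back to its safe state and
            # re-execute the affected section (abortion recovery would drop it).
            heapq.heappush(calendar, (clock + payload, "re-execute", payload))
    return log

calendar = [(1.0, "release", 0.5), (2.0, "fault", 0.3), (4.0, "release", 0.5)]
heapq.heapify(calendar)
for t, event in run(calendar, horizon=10.0):
    print(f"t={t:.1f}  {event}")
```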

Keywords: real-time system (RTS), time utility function/ utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)

Procedia PDF Downloads 306
1554 Algorithm for Quantification of Pulmonary Fibrosis in Chest X-Ray Exams

Authors: Marcela de Oliveira, Guilherme Giacomini, Allan Felipe Fattori Alves, Ana Luiza Menegatti Pavan, Maria Eugenia Dela Rosa, Fernando Antonio Bacchim Neto, Diana Rodrigues de Pina

Abstract:

It is estimated that each year one death every 10 seconds (about 2 million deaths) worldwide is attributed to tuberculosis (TB). Even after effective treatment, TB leaves sequelae such as pulmonary fibrosis, compromising the quality of life of patients. Evaluations of these sequelae are usually performed subjectively by radiology specialists, and subjective evaluation is prone to inter- and intra-observer variation. X-ray examination is the diagnostic imaging method most widely used in the monitoring of patients diagnosed with TB and the one of least cost to the institution. The application of computational algorithms is of utmost importance for a more objective quantification of pulmonary impairment in individuals with tuberculosis. The purpose of this research is to use computer algorithms to quantify the pulmonary impairment pre- and post-treatment of patients with pulmonary TB. The x-ray images of 10 patients with TB diagnosis confirmed by examination of sputum smears were studied. Initially, segmentation of the total lung area was performed (posteroanterior and lateral views), then targeted to the region compromised by the pulmonary sequelae. Through morphological operators and the application of a noise-filtering tool, it was possible to determine the compromised lung volume. The largest difference found pre- and post-treatment was 85.85% and the smallest was 54.08%.
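
A hedged sketch of the quantification step: given a lung mask and a radiograph, the fibrotic fraction is the share of lung pixels above an intensity threshold after morphological cleaning. The threshold, the synthetic image and the post-treatment value are stand-ins, not the study's data or exact pipeline.

```python
import numpy as np
from scipy import ndimage

def compromised_fraction(img: np.ndarray, lung: np.ndarray, thr: float) -> float:
    """Fraction of the segmented lung flagged as fibrotic (above threshold)."""
    fibrosis = (img > thr) & lung
    fibrosis = ndimage.binary_opening(fibrosis, iterations=2)  # remove speckle
    return fibrosis.sum() / lung.sum()

rng = np.random.default_rng(3)
img = rng.normal(0.4, 0.05, (256, 256))  # stand-in chest radiograph
img[60:120, 60:140] += 0.3               # toy fibrotic patch
lung = np.ones_like(img, dtype=bool)     # stand-in lung segmentation

pre = compromised_fraction(img, lung, 0.6)
post = 0.4 * pre                         # toy post-treatment value
print(f"reduction after treatment: {(pre - post) / pre:.1%}")
```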

Keywords: algorithm, radiology, tuberculosis, x-ray exams

Procedia PDF Downloads 420
1553 A Blending Analysis of Metaphors and Metonymies Used to Depict the Deal of the Century by Jordanian Cartoonists

Authors: Aseel Zibin, Abdel Rahman Altakhaineh

Abstract:

This study analyses 30 cartoons depicting THE DEAL OF THE CENTURY as envisaged by two Jordanian cartoonists, namely, Emad Hajjaj and Osama Hajjaj. Conceptual Blending Theory (CBT) and Multimodal Metaphor Theory (MMT) are adopted as a theoretical framework to interpret the metaphors and metonymies used in the target cartoons. The results reveal that the target domain THE DEAL OF THE CENTURY was conceptualized mainly through layered metaphors that have a metonymic basis and through event metaphors/allegories. Specifically, the following groups were identified: an OBJECT or a situation involving OBJECTS, situations involving HUMANS or HYBRIDS of HUMANS and OBJECTS, an ANIMAL or a situation involving an ANIMAL, hybrids of WEAPONS and HUMANS, and event metaphors used to build a story/allegory. The target domain was also depicted via event metaphors used to build a story, some of which are embedded in the Jordanian culture, while others could be perceivable cross-culturally. The results also demonstrate that the most widely used configuration to construe the metaphors was pictorial source-verbal target, in line with Lan and Zuo (2016); the motivation was probably the greater conceptual density and concreteness of visual representation, since the target is better captured verbally because of its abstractness. The use of cross-modal mappings of this type was attributed to the abstractness of the target domain, THE DEAL OF THE CENTURY, which makes it more construable via verbal cues rather than visual ones. In contrast, the source domains used were mainly concrete and thus perceivable pictorially rather than verbally.

Keywords: semiotics, cognitive semantics, metaphor, culture, blending, cartoon

Procedia PDF Downloads 182
1552 Hydrological Characterization of a Watershed for Streamflow Prediction

Authors: Oseni Taiwo Amoo, Bloodless Dzwairo

Abstract:

In this paper, we extend the versatility and usefulness of GIS as a methodology for river basin hydrologic characteristics analysis (HCA). The Gurara River basin, located in North-Central Nigeria, is presented in this study. This is ongoing research using a spatial Digital Elevation Model (DEM) and Arc Hydro tools to take inventory of the basin characteristics in order to quantify the effect of water abstraction on the streamflow regime. One of the main concerns of hydrological modelling is the quantification of runoff from rainstorm events. In practice, the Soil Conservation Service curve number (SCS-CN) method and the conventional procedure called the rational technique are still generally used; these traditional lumped hydrological models convert statistical properties of rainfall in a river basin into observed runoff and hydrographs. However, such models give little or no spatially distributed information on rainfall and on the basin's physical characteristics. Therefore, this paper synthesizes morphometric parameters in generating runoff. The expected results on basin characteristics, such as the size, area, shape and slope of the watershed and the stream distribution network analysis, could be useful in estimating streamflow discharge. Water resources managers and irrigation farmers could utilize the tool for determining net return from available scarce water resources, where past records of land and climate data are sparse.
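
For reference, the SCS curve number relation mentioned above maps a rainfall depth P to direct runoff Q via the retention parameter S = 1000/CN - 10 (inches) and an initial abstraction Ia = 0.2S; a sketch with an illustrative storm and CN value:

```python
def scs_runoff(p_in: float, cn: float) -> float:
    """SCS-CN direct runoff depth (inches) for a rainfall depth p_in (inches)."""
    s = 1000.0 / cn - 10.0  # potential maximum retention
    ia = 0.2 * s            # initial abstraction (standard assumption)
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

# Example: a 3-inch storm on a catchment with CN = 75
print(f"runoff: {scs_runoff(3.0, 75):.2f} in")
```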

Keywords: hydrological characteristic, stream flow, runoff discharge, land and climate

Procedia PDF Downloads 341
1551 Numerical Modeling of Structural Failure of a Ship During the Collision Event

Authors: Adjal Yassine, Semmani Amar

Abstract:

During the last decades, the risk of collision has increased, especially in areas of high maritime traffic. As a consequence, greater demands are placed on safety at sea and environmental protection. For this purpose, predicting the consequences of ship collisions is recommended in order to minimize structural failure. Additionally, at the design stage of the ship, damage generated during a collision event must be taken into consideration. This structural failure, in some cases, can develop into the progressive collapse of other structural elements and generate catastrophic consequences. The present study investigates the progressive collapse of ships damaged by collisions using the non-linear finite element method. Failure criteria are taken into account. The impacted area has a refined mesh in order to obtain more reliable results. Finally, a parametric study was conducted to highlight the effect of the ship's speed, as well as of the different impacted areas of double-bottom ships.

Keywords: collision, structural failure, ship, finite element analysis

Procedia PDF Downloads 100
1550 Quantitative Polymerase Chain Reaction Analysis of Phytoplankton Composition and Abundance to Assess Eutrophication: A Multi-Year Study in Twelve Large Rivers across the United States

Authors: Chiqian Zhang, Kyle D. McIntosh, Nathan Sienkiewicz, Ian Struewing, Erin A. Stelzer, Jennifer L. Graham, Jingrang Lu

Abstract:

Phytoplankton plays an essential role in freshwater aquatic ecosystems and is the primary group synthesizing organic carbon and providing food sources or energy to ecosystems. Therefore, the identification and quantification of phytoplankton are important for estimating and assessing ecosystem productivity (carbon fixation), water quality, and eutrophication. Microscopy is the current gold standard for identifying and quantifying phytoplankton composition and abundance. However, microscopic analysis of phytoplankton is time-consuming, has a low sample throughput, and requires deep knowledge and rich experience in microbial morphology to implement. To improve this situation, quantitative polymerase chain reaction (qPCR) was considered for phytoplankton identification and quantification. Using qPCR to assess phytoplankton composition and abundance, however, has not been comprehensively evaluated. This study focused on: 1) conducting a comprehensive performance comparison of qPCR and microscopy techniques in identifying and quantifying phytoplankton and 2) examining the use of qPCR as a tool for assessing eutrophication. Twelve large rivers located throughout the United States were evaluated using data collected from 2017 to 2019 to understand the relation between qPCR-based phytoplankton abundance and eutrophication. This study revealed that temporal variation of phytoplankton abundance in the twelve rivers was limited within years (from late spring to late fall) and among different years (2017, 2018, and 2019). Midcontinent rivers had moderately greater phytoplankton abundance than eastern and western rivers, presumably because midcontinent rivers were more eutrophic. The study also showed that qPCR- and microscope-determined phytoplankton abundance had a significant positive linear correlation (adjusted R² 0.772, p-value < 0.001). In addition, phytoplankton abundance assessed via qPCR showed promise as an indicator of the eutrophication status of those rivers, with oligotrophic rivers having low phytoplankton abundance and eutrophic rivers having (relatively) high phytoplankton abundance. This study demonstrated that qPCR could serve as an alternative tool to traditional microscopy for phytoplankton quantification and eutrophication assessment in freshwater rivers.

Keywords: phytoplankton, eutrophication, river, qPCR, microscopy, spatiotemporal variation

Procedia PDF Downloads 101
1549 The Mask of Motherhood: A Changing Identity During the Transition to Motherhood

Authors: Geraldine Mc Loughlin, Mary Horgan, Rosaleen Murphy

Abstract:

Childbirth is a life-changing event, a psychological transition for the mother that must be viewed in a social context. Much has been written and documented regarding the preparation for birth and the immediate postnatal period, but the full psychological impact on the mother is not clear. One aspect of the transition process is identity. Depending on a person's worldview, the concept of identity is viewed differently; the nature of reality and how people construct knowledge influence these perspectives. Becoming a mother is not just an event but a process, and time and experience help to shape the woman's understanding of it. The aims of the study were to explore the emotional and psychological aspects of first-time mothers' experience during the transition to new motherhood, and to identify factors affecting women's identities in the period from 36 weeks gestation to 12 weeks postpartum. Interpretative Phenomenological Analysis (IPA) was used; it explores how these women make sense of and give meaning to their experiences. IPA is underpinned by three key principles: phenomenology, hermeneutics and idiography. A purposeful sample of 10 women was recruited for this longitudinal study to enable data to be collected during the transition to motherhood. Individual identity was interpreted and viewed as developing in response to changing contexts, such as the birth event and becoming a parent, enabling one to construct one's own sense of a meaningful life. Women effectively differentiated between their personal and social identities and took responsibility for their actions. Identity is culturally and socially shaped and experienced, though not experienced similarly by all women. The individualized perspective on identity recognizes (a) that social influences are seen as external to the individual and (b) the view that social influences are, in fact, internalized by the individual.

Keywords: motherhood, transition, identity, IPA

Procedia PDF Downloads 62
1548 A Discrete Event Simulation Model to Manage Bed Usage for Non-Elective Admissions in a Geriatric Medicine Speciality

Authors: Muhammed Ordu, Eren Demir, Chris Tofallis

Abstract:

Over the past decade, non-elective admissions in the UK have increased significantly. Given limited resources (i.e., beds), service managers are obliged to manage their resources effectively, as non-elective admissions are mostly admitted to inpatient specialities via A&E departments. Geriatric medicine is one of the specialities with long lengths of stay for non-elective admissions. This study aims to develop a discrete event simulation model to understand how possible increases in non-elective demand over the next 12 months would affect the bed occupancy rate, and to determine the required number of beds in a geriatric medicine speciality in a UK hospital. In our validated simulation model, we take into account observed frequency distributions, derived from a large dataset covering the period April 2009 to January 2013, for non-elective admissions and length of stay. An experimental analysis, consisting of 16 experiments, is carried out to better understand the possible effects of scenarios related to increases in demand and in the number of beds. As a result, the speciality does not achieve the target level in the base model, although the bed occupancy rate decreases from 125.94% to 96.41% when the number of beds is increased by 30%. In addition, the number of beds required to meet demand exceeds the number of beds considered in the scenario analysis. This paper sheds light on bed management for service managers in geriatric medicine specialities.
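
A toy Monte Carlo version of the bed-usage logic described above: daily non-elective arrivals and lognormal lengths of stay drive the occupied-bed count, from which the occupancy rate follows. The arrival rate, length-of-stay parameters and bed count are illustrative assumptions, not the study's fitted distributions.

```python
import numpy as np

rng = np.random.default_rng(4)
days, beds = 365, 120
admissions = rng.poisson(14, days)  # assumed non-elective arrivals per day
occupied = np.zeros(days)
for day, n in enumerate(admissions):
    los = np.ceil(rng.lognormal(1.8, 0.8, n)).astype(int)  # length of stay (days)
    for stay in los:
        occupied[day:day + stay] += 1  # numpy clips the slice at the horizon

print(f"mean bed occupancy: {occupied.mean() / beds:.1%}")
```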

Keywords: bed management, bed occupancy rate, discrete event simulation, geriatric medicine, non-elective admission

Procedia PDF Downloads 224
1547 A Low-Power Two-Stage Seismic Sensor Scheme for Earthquake Early Warning System

Authors: Arvind Srivastav, Tarun Kanti Bhattacharyya

Abstract:

The north-eastern, Himalayan, and Eastern Ghats belts of India comprise earthquake-prone, remote, and hilly terrains. Earthquakes have caused enormous damage in these regions in the past. A wireless sensor network based earthquake early warning system (EEWS) is being developed to mitigate the damage caused by earthquakes. It consists of sensor nodes, distributed over the region, that perform majority voting on the outputs of the seismic sensors in the vicinity and relay a message to a base station to alert the residents when an earthquake is detected. At the heart of the EEWS is a low-power two-stage seismic sensor that continuously tracks seismic events from the incoming three-axis accelerometer signal at the first stage and, in the presence of a seismic event, triggers the second-stage P-wave detector, which detects the onset of the P-wave in an earthquake event. The parameters of the P-wave detector have been optimized for minimizing detection time and maximizing the accuracy of detection. The working of the sensor scheme has been verified with data from seven earthquakes retrieved from IRIS. In all test cases, the scheme detected the onset of the P-wave accurately. It has also been established that the P-wave onset detection time decreases with the sampling rate: with the test data, the detection time for data sampled at 10 Hz was around 2 seconds, which reduced to 0.3 seconds for data sampled at 100 Hz.
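
Since the keywords point to an STA/LTA-style trigger, the sketch below picks a P-wave onset as the first sample where the short-term to long-term average energy ratio exceeds a threshold; the window lengths, threshold and synthetic trace are illustrative, not the optimized detector parameters.

```python
import numpy as np

def sta_lta(x: np.ndarray, fs: float, sta_s: float = 0.5, lta_s: float = 10.0):
    """STA/LTA ratio of the signal energy; large values flag an event onset."""
    energy = x ** 2
    sta = np.convolve(energy, np.ones(int(sta_s * fs)) / int(sta_s * fs), "same")
    lta = np.convolve(energy, np.ones(int(lta_s * fs)) / int(lta_s * fs), "same")
    return sta / (lta + 1e-12)

fs = 100.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(5)
x = rng.normal(0, 0.1, t.size)                 # background noise
arrival = t >= 12.0                            # P-wave arrives at t = 12 s
x[arrival] += np.sin(2 * np.pi * 5 * t[arrival]) * np.exp(-(t[arrival] - 12.0))

ratio = sta_lta(x, fs)
onset = np.argmax(ratio > 4.0) / fs            # first threshold exceedance
print(f"P-wave onset picked at ~{onset:.2f} s")
```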

Keywords: earthquake early warning system, EEWS, STA/LTA, polarization, wavelet, event detector, P-wave detector

Procedia PDF Downloads 177
1546 Timed and Colored Petri Nets for Modeling and Verifying Cloud System Elasticity

Authors: Walid Louhichi, Mouhebeddine Berrima, Narjes Ben Rajed

Abstract:

Elasticity is an essential property of cloud computing. As the name suggests, it is the ability of a cloud system to adjust resource provisioning in relation to a fluctuating workload. There are two types of elasticity operations, vertical and horizontal. In this work, we are interested in horizontal scaling, which is ensured by two mechanisms: scaling in and scaling out. Following the sizing of the system, we can adopt scaling in in the event of over-supply and scaling out in the event of under-supply. In this paper, we propose a formal model, based on colored and timed Petri nets, for the modeling of the duplication and removal of a virtual machine from a server. The proposed models are edited, verified, and simulated with two examples implemented in CPN Tools, a modeling tool for colored and timed Petri nets.

Keywords: cloud computing, elasticity, elasticity controller, petri nets, scaling in, scaling out

Procedia PDF Downloads 154
1545 Design and Modeling of Amphibious Houses for Flood Prone Areas: The Case of Nigeria

Authors: Onyebuchi Mogbo, Abdulsalam Mohammed, Salsabila Wali

Abstract:

This research discusses the design and modeling of an amphibious building: a house designed to float during a flood event. Over the years, houses have been built to resist flood events, some of which have failed. The floating house is designed to work with nature and not against it. In the event of a flood, the house will rise with the increasing water level, protecting it from sinking. For the design and modeling of this house, an estimated cost of N250,000 (approximately $700) will be needed. It is expected that the house will rise when lightweight materials are incorporated in the design and the concrete dock (in the form of a hollow box) carrying the entire house in its hollow space is well designed. When there is flooding, the water will fill up the concrete dock and the house will rise upwards, with vertical guides preventing it from moving side to side or out of its boundary. Architectural and structural designs will be used in this project.
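
The floating condition is plain Archimedes: the dock rises once the weight of displaced water equals the weight of the house, so the draft is mass / (water density x footprint area). The mass and footprint below are hypothetical values for illustration only.

```python
RHO_WATER = 1000.0  # kg/m^3, fresh water

def draft_m(total_mass_kg: float, footprint_m2: float) -> float:
    """Depth to which the buoyant dock sinks: displaced water mass = house mass."""
    return total_mass_kg / (RHO_WATER * footprint_m2)

# Hypothetical lightweight house: 30 t on a 10 m x 8 m hollow concrete dock
d = draft_m(30_000, 10 * 8)
print(f"draft: {d:.2f} m (the dock's freeboard must exceed this to float safely)")
```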

Keywords: amphibious building, flood, housing, design and modelling

Procedia PDF Downloads 182
1544 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature, and different uncertainty representation approaches result in different outputs. Some of the approaches might result in a better estimation of the system response than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties, from the responses (outputs) of the given computational model. We use two different methodologies to approach the problem. In the first methodology, we use sampling-based uncertainty propagation with first order error analysis. In the other approach, we place emphasis on the use of Percentile-Based Optimization (PBO). The NASA Langley MUQC's subproblem A is developed in such a way that both aleatory and epistemic uncertainties need to be managed. The challenge problem classifies each uncertain parameter as belonging to one of the following three types: (i) an aleatory uncertainty modeled as a random variable with a fixed functional form and known coefficients; this uncertainty cannot be reduced. (ii) An epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible. (iii) A parameter that might be aleatory, but for which sufficient data might not be available to adequately model it as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box: the physical parameter carries an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties, each being an unknown element of a known interval. This uncertainty is reducible. From the study, it is observed that, due to practical limitations or computational expense, the sampling in the sampling-based methodology is not exhaustive. That is why the sampling-based methodology has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is necessary. This is achieved in this study by using PBO.
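
A sketch of the double-loop sampling that a distributional p-box of type (iii) calls for: the outer loop draws the epistemically uncertain mean and standard deviation from their intervals, the inner loop propagates the aleatory samples, and the spread of the resulting percentiles bounds the response. The model function and intervals are toy stand-ins, not the challenge problem.

```python
import numpy as np

def model(x1, x2):
    """Toy stand-in for the challenge's black-box response."""
    return x1 ** 2 + np.sin(3 * x2)

rng = np.random.default_rng(6)
p95 = []
for _ in range(200):                    # epistemic outer loop
    mu = rng.uniform(-0.5, 0.5)         # poorly known mean in [-0.5, 0.5]
    sigma = rng.uniform(0.5, 1.5)       # poorly known std in [0.5, 1.5]
    x1 = rng.normal(mu, sigma, 2000)    # aleatory inner loop (p-box variable)
    x2 = rng.uniform(0, 1, 2000)        # purely aleatory variable
    p95.append(np.percentile(model(x1, x2), 95))

print(f"bounds on the 95th-percentile response: [{min(p95):.2f}, {max(p95):.2f}]")
```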

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 241
1543 Managing the Cloud Procurement Process: Findings from a Case Study

Authors: Andreas Jede, Frank Teuteberg

Abstract:

Cloud computing (CC) has already gained broad appreciation in research and practice. While the willingness to integrate cloud services into various IT environments remains unbroken, CC procurement processes have so far mostly run in an unorganized and non-standardized way. In practice, a sufficiently specific yet applicable business process for the important acquisition phase is often lacking, and research has not yet appropriately remedied this deficiency. Therefore, this paper introduces a field-tested approach for CC procurement. Based on an extensive literature review and augmented by expert interviews, we designed a model that is validated and further refined through an in-depth real-life case study. For the detailed process description, we apply the event-driven process chain (EPC) notation. The valuable insights gained from the case study may help CC research shift toward a more socio-technical perspective. For practice, in addition to useful organizational instructions, we provide extended checklists and lessons learned.

Keywords: cloud procurement process, IT-organization, event-driven process chain, in-depth case study

Procedia PDF Downloads 394
1542 The Patterns of Cross-Sentence: An Event-Related Potential Study of Mathematical Word Problems

Authors: Tien-Ching Yao, Ching-Ching Lu

Abstract:

Understanding human language processing is one of the main challenges of current cognitive neuroscience. The aim of the present study was to use a sentence decision task combined with event-related potentials to investigate the psychological reality of "cross-sentence patterns." We therefore took math word problems as the experimental materials and used the P600 ERP component for verification. The experimental material consisted of 200 math word problems in three different conditions (multiplication word problems, division word problems type 1, and division word problems type 2). Eighteen native Mandarin speakers participated in the ERP study (14 of whom were female). The grand average waveforms suggest a late posterior positivity at around 500-900 ms. These findings were tested statistically using repeated measures ANOVAs on the component elicited by the different stimulus types. The results show that the three conditions differ significantly (p < 0.05) in mean amplitude, latency, and peak amplitude, exhibiting the characteristic timing and posterior scalp distribution of a P600 effect. We interpret these characteristic responses as evidence for the psychological reality of "cross-sentence patterns." These results provide insights into sentence processing issues in linguistic theory and psycholinguistic models of language processing and advance our understanding of how people make sense of information during language comprehension.

Keywords: language processing, sentence comprehension, event-related potentials, cross-sentence patterns

Procedia PDF Downloads 150
1541 Navigating Rapids and Collecting Medical Insights: A Data Collection of Athletes Presenting to the Medical Team at the International Canoe Federation Canoe Slalom World Championships 2023

Authors: Grace Scaplehorn, Muhammad Adeel Akhtar, Jane Gibson

Abstract:

Background: Canoe Slalom entails the skilful navigation of a carbon composite canoe or kayak through a series of 18-25 hanging gates, strategically positioned along the course, either upstream or downstream, amidst currents of whitewater rapids in natural and man-made river settings. Athletes race individually in timed trials, competing for the fastest course time, typically around 80 to 120 seconds. In the new discipline of Kayak Cross, descents of the course are initiated by groups of four athletes free-falling simultaneously from a starting platform situated 3 m above the river. Kayak Cross athletes, in contrast to Canoe Slalom, can make physical contact with the suspended gates without incurring time penalties and are required to perform a kayak roll halfway down the course. The Canoe Slalom World Championships were held at Lee Valley Whitewater Centre, London, from 19th to 24th September 2023. The event comprised 299 international athletes competing for 10 World Championship titles in Canoe/Kayak Slalom events (Olympic debut Munich 1972) and the new Kayak Cross discipline (Olympic debut Paris 2024). The inaugural appearance of Kayak Cross at the World Championships occurred in 2017, in Pau, France. There is limited literature on Kayak Cross and the incidence of athlete injuries compared to traditional Canoe Slalom, hence it was felt important to undertake this review to address the perception that the event is dangerous. Aim: The study aimed to quantify and collate data collected from athletes presenting to the event medical centre. Methods: Athletes' details were collected at initial assessments from the start of the practice period (16th-18th September) and throughout the event. Demographics such as age, sex and nationality were recorded, along with presenting complaints, treatment, medication administered and outcome. Injuries were then sub-classified into body regions. The data do not include athletes who sought medical attention from their own governing body's medical team. Results: During the 8-day period, there were 11 individual presentations to the medical centre, 3.7% of the athlete population (n=299). The mean age was 23.9 years (n=7), and 6 were male (n=10). The most common presentation was minor injury (n=9), of which 6 were musculoskeletal and 3 comprised skin damage, followed by insect sting/allergy (n=1) and pain relief requests (n=1). Five presentations were event-related, all being musculoskeletal injuries: 2 shoulder/arm, 1 head/neck, 1 hand/wrist and 1 other (data were not recorded). Of these injuries, the only intervention was 400 mg ibuprofen, given in the 2 shoulder/arm cases. Four of the 11 presentations were pre-existing injuries that had been exacerbated by the increased intensity of practice. Two patients were advised to return for review, with 100% compliance. There were no unplanned re-presentations and no emergency transfers to secondary care. The Kayak Cross and Canoe Slalom competitions each resulted in 1 new event-related athlete presentation. Conclusion: The event resulted in a negligible incidence of presentations at the medical centre, for both Kayak Cross and Canoe Slalom. These data hold significance in informing the risk assessments and medical protocols necessary for the organisation of canoe slalom events.

Keywords: canoe slalom, kayak cross, athlete injuries, event injuries

Procedia PDF Downloads 57
1540 MNH-886 (Bt.): A Cotton Cultivar (G. hirsutum L.) for Cultivation in Virus-Infested Regions of Pakistan, Having High Seed Cotton Yield and Desirable Fibre Characteristics

Authors: Wajad Nazeer, Saghir Ahmad, Khalid Mahmood, Altaf Hussain, Abid Mahmood, Baoliang Zhou

Abstract:

MNH-886 (Bt.) is an upland cotton cultivar (Gossypium hirsutum L.) developed through hybridization of three parents [(FH-207×MNH-770)×Bollgard-1] at the Cotton Research Station, Multan, Pakistan. It is resistant to CLCuVD, with 16.25% disease incidence (60 DAS, March sowing), whereas it is moderately susceptible to CLCuVD when planted in June, with a disease incidence of 34% (60 DAS). This disease reaction was the lowest among 25 cotton advanced lines/varieties tested at hot spots of CLCuVD. Its performance was tested from 2009 to 2012 in various indigenous, provincial, and national varietal trials in comparison with the commercial varieties IR-3701, AA-802 and CIM-496. In PCCT trials during 2009-10 and 2011-12, MNH-886 surpassed all the existing Bt. strains along with the commercial varieties across the Punjab province, with seed cotton yields of 2658 kg ha-1 and 2848 kg ha-1, which were 81.31% and 13% higher than the checks, respectively. In the National Coordinated Bt. Trial, MNH-886 (Bt.) produced 3347 kg ha-1 of seed cotton at CCRI Multan, the hot spot of CLCuVD, in comparison to IR-3701, which gave 2556 kg ha-1. It possesses a higher lint percentage (41.01%) along with the most desirable fibre traits (staple length 28.210 mm, micronaire value 4.95 µg inch-1, fibre strength 99.5 tppsi, and uniformity ratio 82.0%). The quantification of the crystal protein was positive for Cry1Ab/Ac, with a toxicity level of 2.76 µg g-1, and the Mon 531 event was confirmed. Given its tremendous yield potential, good fibre traits, and great tolerance to CLCuVD, this variety can be recommended for cultivation in the CLCuVD hotspots of Pakistan.

Keywords: cotton, cultivar, cotton leaf curl virus, CLCuVD hit districts

Procedia PDF Downloads 319
1539 Two-Sided Information Dissemination in Takeovers: Disclosure and Media

Authors: Eda Orhun

Abstract:

Purpose: This paper analyzes a target firm's decision to voluntarily disclose information during a takeover event and the effect of such disclosures on the outcome of the takeover. Such voluntary disclosures, especially in the form of earnings forecasts made around takeover events, may affect shareholders' assessment of the target firm's value and, in turn, the takeover outcome. This study aims to shed light on this question. Design/methodology/approach: The paper examines the role of voluntary disclosures by target firms during a takeover event in the likelihood of takeover success, both theoretically and empirically. A game-theoretical model is set up to analyze the voluntary disclosure decision of a target firm to inform shareholders about its real worth. The empirical implication of the model is tested by employing binary outcome models, where the disclosure variable is obtained by identifying the target firms in the sample that provide positive news by issuing increasing management earnings forecasts. Findings: The model predicts that a voluntary disclosure of positive information by the target decreases the likelihood that the takeover succeeds. The empirical analysis confirms this prediction by showing that positive earnings forecasts by target firms during takeover events increase the probability of takeover failure. Overall, it is shown that information dissemination through voluntary disclosures by target firms is an important factor affecting takeover outcomes. Originality/value: To the author's knowledge, this is the first study of the impact of voluntary disclosures by the target firm during a takeover event on the likelihood of takeover success. The results contribute to the information economics, corporate finance and M&A literatures.
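
A hedged sketch of the binary outcome estimation: takeover success is regressed on a positive-disclosure dummy (plus a control) with a logit model. The data are simulated with an assumed negative disclosure effect so that the fitted coefficient mirrors the direction of the paper's finding; this is not the paper's dataset or exact specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 500
disclosure = rng.integers(0, 2, n)                # 1 = positive earnings forecast
premium = rng.normal(0.25, 0.1, n)                # toy control: offer premium
logit_p = 0.8 - 1.2 * disclosure + 2.0 * premium  # assumed true effect (negative)
success = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([disclosure, premium]))
fit = sm.Logit(success, X).fit(disp=False)
print(fit.params)  # a negative disclosure coefficient mirrors the finding
```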

Keywords: takeovers, target firm, voluntary disclosures, earnings forecasts, takeover success

Procedia PDF Downloads 320
1538 Loss Quantification of Archaeological Sites in a Watershed Due to Land Use and Occupation

Authors: Elissandro Voigt Beier, Cristiano Poleto

Abstract:

The main objective of the research is to assess losses through the quantification of material culture (archaeological fragments) in rural areas, at sites economically exploited by mechanized farming of seasonal and permanent crops, in a hydrographic subsystem of the Camaquã River in the state of Rio Grande do Sul, Brazil. The study area consists of several micro-basins of different sizes, ranging between 1,000 m² and 10,000 m² (the largest and the smallest, respectively), all with a large number of occurrences and outcrop locations of archaeological material at high density in an intensively farmed environment. The first stage of the research aimed to identify the dispersion of archaeological material through a field survey, plotting points with the Global Positioning System (GPS) within each river basin; a concise bibliography on the topic in the region was used, helping to understand theoretically the former landscape and the occupation preferences of ancient peoples, relating the settlements to the practices observed in the field. The mapping was followed by the development of cartographic products of the land elevation for the region; these products contribute to understanding the distribution of the materials, the definition and extent of their dispersal, and the reworking of in situ material by mechanization as a result of human activities. It was also necessary to prepare density maps of the materials found, linking natural environments favourable to ancient occupation with the current human occupation. The third stage of the project involved the systematic collection of archaeological material without alteration of, or interference with, the subsurface of the indigenous settlements; the material was then prepared and treated in the laboratory to remove excess soil, followed by cleaning, measurement and quantification. Approximately 15,000 archaeological fragments were identified, belonging to different periods of the region's ancient history, all collected outside their environmental and historical context and considerably altered and modified. The material was identified and catalogued considering features such as object weight, size, type of material (lithic, ceramic, bone, historical porcelain) and its association with ancient history; characteristics such as the individual lithology and functionality of each object were disregarded. As preliminary results, we can point to the alteration of materials by heavy mechanization and the consequent soil disturbance processes, which transport archaeological materials. Therefore, as a next step, an estimate of potential losses will be sought through a mathematical model. This process is expected to yield a reliable, highly accurate model that can be applied to lower-density archaeological sites without significant error.

Keywords: degradation of heritage, quantification in archaeology, watershed, use and occupation of land

Procedia PDF Downloads 277
1537 Physiological and Psychological Influence on Office Workers during Demand Response

Authors: Megumi Nishida, Naoya Motegi, Takurou Kikuchi, Tomoko Tokumura

Abstract:

In recent years, the power system has been changing, and flexible power pricing schemes such as demand response have been sought in Japan. The demand response system is simple in the household sector, where the owner, as the decision-maker, can gain the benefits of power saving. On the other hand, the execution of demand response in an office building is more complex than in a household because various people, such as owners, building administrators and occupants, are involved in making decisions. While the owners benefit from the demand saving, the occupants are forced to be exposed to the demand-saved environment without receiving clear benefits. One of the reasons is that building systems are usually centrally controlled: each occupant cannot choose whether or not to participate in a demand response event, and the contribution of each occupant to demand response is too unclear to provide incentives. However, recent developments in IT and building systems enable personalized control of the office environment, where each occupant can control the lighting level or temperature around him or herself. Therefore, it becomes possible to have a system in which each occupant can make a decision on demand response participation in an office building. This study investigates personal behavior upon demand response requests, under the condition that each occupant can adjust the brightness individually in their workspace. Once workers participate in the demand response, their task lights are automatically turned off. The participation rates in the demand response events are compared between four groups, which are divided by different motivations: the presence or absence of incentives and the way of participation. The results show significant differences in participation rates in demand response events between the four groups. The way of participation has a large effect on the participation rate: the 'opt-out' group, where occupants are automatically enrolled in a demand response event unless they express non-participation, had the highest participation rate of the four groups. The incentive also has an effect on the participation rate. This study also reports the impact of a low-illumination office environment on the occupants, such as stress or fatigue. An electrocardiogram and a questionnaire are used to investigate the autonomic nervous activity and subjective symptoms of fatigue of the occupants. No large difference in autonomic nervous activity or fatigue was found between the dim workspace during a demand response event and the bright workspace.

Keywords: demand response, illumination, questionnaire, electrocardiogram

Procedia PDF Downloads 352
1536 Probabilistic Fracture Evaluation of Reactor Pressure Vessel Subjected to Pressurized Thermal Shock

Authors: Jianguo Chen, Fenggang Zang, Yu Yang, Liangang Zheng

Abstract:

The Reactor Pressure Vessel (RPV) is an important security barrier in a nuclear power plant. Crack-like defects may be produced in the RPV during its whole operating lifetime due to harsh operating conditions and irradiation embrittlement. During a severe loss of coolant accident, thermal shock occurs as emergency cooling water is injected into the RPV, which results in re-pressurization of the vessel and very high tensile stress on the vessel wall; this event is called a Pressurized Thermal Shock (PTS). A crack on the vessel wall may propagate and even penetrate the vessel, so the safety of the RPV would face a great challenge. Many assumptions in structural integrity evaluation make the results of deterministic fracture mechanics very conservative, which affects the operating lifetime of the plant. Actually, many parameters in the evaluation process, such as fracture toughness and nil-ductility transition temperature, have statistical distribution characteristics. It is therefore necessary to assess the structural integrity of an RPV subjected to a PTS event by means of Probabilistic Fracture Mechanics (PFM). Structural integrity evaluation methods for an RPV subjected to a PTS event are first summarized; then an evaluation method based on probabilistic fracture mechanics is presented, considering the probabilistic characteristics of material and structural parameters. A comprehensive analysis example is carried out at the end. The results show that the probability of a crack penetrating through the wall increases gradually with the growth of the fast neutron irradiation flux. The results provide guidance for reactor life extension.
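
At its core, a PFM estimate of this kind compares an applied stress intensity factor against a randomly distributed fracture toughness; the sketch below estimates the failure probability by Monte Carlo. Both distributions are illustrative placeholders, not the paper's transient analysis.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 1_000_000
# Toy distributions (MPa*sqrt(m)): stress intensity applied during the PTS
# transient vs. irradiation-degraded fracture toughness K_Ic.
k_applied = rng.lognormal(mean=np.log(60.0), sigma=0.15, size=n)
k_ic = rng.normal(loc=110.0, scale=20.0, size=n)

p_fail = np.mean(k_applied >= k_ic)  # P(K_I >= K_Ic): crack initiation
print(f"estimated failure probability: {p_fail:.2e}")
```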

Keywords: fracture toughness, integrity evaluation, pressurized thermal shock, probabilistic fracture mechanics, reactor pressure vessel

Procedia PDF Downloads 252
1535 Blockchain’s Feasibility in Military Data Networks

Authors: Brenden M. Shutt, Lubjana Beshaj, Paul L. Goethals, Ambrose Kam

Abstract:

Communication security is of particular interest for military data networks. A relatively novel approach to network security is blockchain, a cryptographically secured distributed ledger with a decentralized consensus mechanism for data transaction processing. Recent advances in blockchain technology have proposed new techniques for both data validation and trust management, as well as different frameworks for managing dataflow. The purpose of this work is to test the feasibility of different blockchain architectures as applied to military command and control networks. Various architectures are tested through discrete-event simulation, and feasibility is determined based upon a blockchain design's ability to maintain long-term stable performance at industry standards of throughput, network latency, and security. This work proposes a consortium blockchain architecture with a computationally inexpensive consensus mechanism, one that leverages a Proof-of-Identity (PoI) concept and a reputation management mechanism.

Keywords: blockchain, consensus mechanism, discrete-event simulation, fog computing

Procedia PDF Downloads 139
1534 Petri Net Modeling and Simulation of a Call-Taxi System

Authors: T. Godwin

Abstract:

A call-taxi system is a type of taxi service where a taxi can be requested through a phone call or a mobile app. The schematic functioning of a call-taxi system is modeled using a Petri net, which captures the necessary conditions for a taxi to be assigned by a dispatcher to pick up a customer, as well as the conditions for the taxi to be released by the customer. A Petri net is a graphical modeling tool used to understand sequences, concurrences, and confluences of activities in the working of discrete event systems. It uses tokens on a directed bipartite multigraph to simulate the activities of a system. The Petri net model is translated into a simulation model, and a call-taxi system is simulated. The simulation model helps in evaluating the operation of a call-taxi system based on the fleet size as well as the operating policies for call-taxi assignment and empty call-taxi repositioning. The developed Petri net based simulation model can be used to decide the fleet size as well as the call-taxi assignment policies for a call-taxi system.
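
A minimal untimed rendering of the token mechanics described above: places hold idle taxis, waiting requests and ongoing rides, and a transition fires only when its input places hold enough tokens. Dispatcher policy, timing and repositioning are deliberately omitted.

```python
marking = {"taxi_idle": 3, "waiting": 0, "ride": 0}

# transition -> (tokens consumed from input places, tokens added to output places)
transitions = {
    "request_arrives": ({}, {"waiting": 1}),                        # source transition
    "assign_taxi":     ({"taxi_idle": 1, "waiting": 1}, {"ride": 1}),
    "ride_ends":       ({"ride": 1}, {"taxi_idle": 1}),
}

def fire(name: str) -> None:
    ins, outs = transitions[name]
    assert all(marking[p] >= k for p, k in ins.items()), f"{name} not enabled"
    for p, k in ins.items():
        marking[p] -= k
    for p, k in outs.items():
        marking[p] += k

for event in ["request_arrives", "assign_taxi", "ride_ends"]:
    fire(event)
    print(f"after {event}: {marking}")
```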

Keywords: call-taxi, discrete event system, petri net, simulation modeling

Procedia PDF Downloads 425
1533 Uncertainty Quantification of Fuel Compositions on Premixed Bio-Syngas Combustion at High-Pressure

Authors: Kai Zhang, Xi Jiang

Abstract:

The effect of fuel variability on the premixed combustion of bio-syngas mixtures is of great importance in bio-syngas utilisation. Uncertainties in the concentrations of fuel constituents such as H2, CO and CH4 may lead to unpredictable combustion performance, combustion instabilities and hot spots, which may degrade and damage the combustion hardware. Numerical modelling and simulations can assist in understanding the behaviour of bio-syngas combustion with pre-defined species concentrations, while the evaluation of variability in concentrations is expensive. To be more specific, questions such as 'what is the burning velocity of bio-syngas at a specific equivalence ratio?' have been answered either experimentally or numerically, while questions such as 'what is the likely burning velocity when the precise concentrations of the bio-syngas compositions are unknown, but the concentration ranges are pre-described?' have not yet been answered. Uncertainty quantification (UQ) methods can be used to tackle such questions and assess the effects of fuel compositions. An efficient probabilistic UQ method based on Polynomial Chaos Expansion (PCE) techniques is employed in this study. The method relies on representing random variables (combustion performance metrics) with orthogonal polynomials such as Legendre or Gaussian polynomials. The PCE constructed via Galerkin projection provides easy access to global sensitivities such as main, joint and total Sobol indices. In this study, the impacts of fuel compositions on the combustion (adiabatic flame temperature and laminar flame speed) of bio-syngas fuel mixtures are presented using this PCE technique at several equivalence ratios. High-pressure effects on bio-syngas combustion instability are obtained using a detailed chemical mechanism, the San Diego mechanism. Guidance on reducing combustion instability from the upstream biomass gasification process is provided by quantifying the significant contributions of composition variations to the variance of the physicochemical properties of bio-syngas combustion. It was found that flame speed is very sensitive to hydrogen variability in bio-syngas, and reducing hydrogen uncertainty from upstream biomass gasification processes can greatly reduce bio-syngas combustion instability. Variation of the methane concentration, although thought to be important, has limited impact on laminar flame instabilities, especially for lean combustion. Further studies on the UQ of the percentage concentration of hydrogen in bio-syngas can be conducted to guide the safer use of bio-syngas.
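
To make the coefficient-to-Sobol step concrete, the sketch below fits a degree-2 Legendre PCE to a toy flame-speed surrogate by least squares and reads the main Sobol indices off the squared coefficients; the surrogate function and uniform input scaling are assumptions, and the study's Galerkin projection is replaced here by regression for brevity.

```python
import numpy as np

# Orthonormal Legendre polynomials for inputs scaled to U(-1, 1).
psi = [lambda x: np.ones_like(x),
       lambda x: np.sqrt(3.0) * x,
       lambda x: np.sqrt(5.0) * 0.5 * (3.0 * x ** 2 - 1.0)]

def toy_flame_speed(h2, ch4):
    """Toy response: strongly driven by H2, weakly by CH4."""
    return 0.4 + 0.3 * h2 + 0.1 * h2 ** 2 + 0.03 * ch4

rng = np.random.default_rng(9)
x1, x2 = rng.uniform(-1, 1, 5000), rng.uniform(-1, 1, 5000)  # scaled H2, CH4
y = toy_flame_speed(x1, x2)

# Tensor basis up to total degree 2, fitted by least squares (regression PCE).
index = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]
A = np.column_stack([psi[i](x1) * psi[j](x2) for i, j in index])
c, *_ = np.linalg.lstsq(A, y, rcond=None)

var = np.sum(c[1:] ** 2)               # PCE variance = sum of squared coefficients
s_h2 = (c[1] ** 2 + c[3] ** 2) / var   # main Sobol index of H2
s_ch4 = (c[2] ** 2 + c[5] ** 2) / var  # main Sobol index of CH4
print(f"S_H2 = {s_h2:.2f}, S_CH4 = {s_ch4:.2f}")
```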

Keywords: bio-syngas combustion, clean energy utilisation, fuel variability, PCE, targeted uncertainty reduction, uncertainty quantification

Procedia PDF Downloads 276
1532 The Strategy of Traditional Religious Culture Tourism: Taking Taiwan Minhsiung Infernal Lord Festival for Example

Authors: Ching-Yi Wang

Abstract:

The purpose of this study is to explore strategies for integrating Minhsiung's environment and cultural resources for the Infernal Lord Festival. The Minhsiung Infernal Lord Festival is one of the most famous religious events in Chia-Yi County, Taiwan, and it is inseparable from the life of local residents. The festival carries rich ceremonial meaning and reflects the sentiment and concerns of the local community. This study applies field study, document analysis and interviews to analyze Minhsiung Township's featured attractions and folklore events. The research results reveal the difficulties of, and strategies for, incorporating cultural elements into cultural tourism. This study hopes to provide innovative techniques for sustaining the future development of traditional folk culture.

Keywords: Taiwan folk culture, Minhsiung Infernal Lord Festival, religious tourism, folklore, cultural tourism

Procedia PDF Downloads 340
1531 Rapid, Label-Free, Direct Detection and Quantification of Escherichia coli Bacteria Using Nonlinear Acoustic Aptasensor

Authors: Shilpa Khobragade, Carlos Da Silva Granja, Niklas Sandström, Igor Efimov, Victor P. Ostanin, Wouter van der Wijngaart, David Klenerman, Sourav K. Ghosh

Abstract:

Rapid, label-free and direct detection of pathogenic bacteria is critical for the prevention of disease outbreaks. This paper for the first time attempts to probe the nonlinear acoustic response of a quartz crystal resonator (QCR) functionalized with specific DNA aptamers for the direct detection and quantification of viable E. coli KCTC 2571 bacteria. DNA aptamers were immobilized through biotin-streptavidin conjugation onto the gold surface of the QCR to capture the target bacteria, and detection was accomplished by the shift in amplitude of the peak 3f signal (3 times the drive frequency) upon binding, when driven near the fundamental resonance frequency. The developed nonlinear acoustic aptasensor system demonstrated better reliability than the conventional resonance frequency shift and energy dissipation monitoring that were recorded simultaneously. The sensing system could directly detect 10⁵ cells/mL of target bacteria within 30 min or less and had high specificity towards E. coli KCTC 2571 bacteria compared to the same concentration of S. typhi bacteria. The aptasensor response was observed for bacterial suspensions ranging from 10⁵ to 10⁸ cells/mL. In conclusion, this nonlinear acoustic aptasensor is simple to use, gives real-time output, is cost-effective, and has the potential for rapid, specific, label-free direct detection of bacteria.

Keywords: acoustic, aptasensor, detection, nonlinear

Procedia PDF Downloads 567
1530 Determination of MDA by HPLC in Blood of Levofloxacin-Treated Rats

Authors: D. S. Mohale, A. P. Dewani, A. S.tripathi, A. V. Chandewar

Abstract:

The present work demonstrates the applicability of high-performance liquid chromatography (HPLC) with UV-Vis detection for the quantification of malondialdehyde, as the malondialdehyde-thiobarbituric acid complex (MDA-TBA), in vivo in rats. The HPLC method for MDA-TBA operated in isocratic mode on a reverse-phase C18 column (250 mm × 4.6 mm) at a flow rate of 1.0 mL min-1, followed by detection at 532 nm. The chromatographic conditions were optimized by varying the concentration and pH of the aqueous phase, followed by changes in the percentage of the organic phase. The optimal mobile phase consisted of a mixture of water (0.2% triethylamine, pH adjusted to 2.3 with ortho-phosphoric acid) and acetonitrile in an 80:20 (v/v) ratio. The retention time of the MDA-TBA complex was 3.7 min. The developed method was sensitive: the limits of detection and quantification (LOD and LOQ) for the MDA-TBA complex, calculated from the standard deviation of the response and the slope of the calibration curve, were 110 ng/mL and 363 ng/mL, respectively. Calibration studies were performed by spiking MDA into rat plasma at concentrations ranging from 500 to 1000 ng/mL. The precision of the developed method, measured in terms of relative standard deviations for intra-day and inter-day studies, was 1.6-5.0% and 1.9-3.6%, respectively. The HPLC method was applied to monitor MDA levels in rats subjected to chronic treatment with levofloxacin (LEV) (5 mg/kg/day) for 21 days, and the results were compared with findings in control-group rats. The mean peak areas of both study groups were subjected to an unpaired Student's t-test. The p-value was <0.001, indicating significant results and suggesting increased MDA levels in rats subjected to the 21-day chronic LEV treatment.
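
The quoted LOD and LOQ are consistent with the standard ICH-style relations based on the standard deviation of the response (σ) and the slope of the calibration curve (S), which for completeness read:

$$\mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S}$$

(Indeed, the reported values satisfy 363/110 ≈ 10/3.3.)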

Keywords: malondialdehyde-thiobarbituric acid complex, levofloxacin, HPLC, oxidative stress

Procedia PDF Downloads 334
1529 Nanoparticle-Based Histidine-Rich Protein-2 Assay for the Detection of the Malaria Parasite Plasmodium Falciparum

Authors: Yagahira E. Castro-Sesquen, Chloe Kim, Robert H. Gilman, David J. Sullivan, Peter C. Searson

Abstract:

Diagnosis of severe malaria is particularly important in highly endemic regions, since most patients are positive for parasitemia and treatment differs from that for non-severe malaria. Diagnosis can be challenging due to the prevalence of diseases with similar symptoms. Accurate diagnosis is increasingly important to avoid overprescribing antimalarial drugs, minimize drug resistance, and minimize costs. A nanoparticle-based assay for the detection and quantification of Plasmodium falciparum histidine-rich protein 2 (HRP2) in urine and serum is reported. The assay uses magnetic beads conjugated with anti-HRP2 antibody for protein capture and concentration, and antibody-conjugated quantum dots for optical detection. Western blot analysis demonstrated that the magnetic beads allow a 20-fold concentration of HRP2 protein in urine. The concentration effect was achieved because a large volume of urine can be incubated with the beads, and magnetic separation can be performed easily in minutes to isolate the beads carrying the HRP2 protein. Magnetic beads and Quantum Dot 525 conjugated to anti-HRP2 antibodies allow the detection of low concentrations of HRP2 protein (0.5 ng mL-1) and quantification in the range of 33 to 2,000 ng mL-1, corresponding to the range associated with non-severe to severe malaria. This assay can easily be adapted to a non-invasive point-of-care test for the classification of severe malaria.

Keywords: HRP2 protein, malaria, magnetic beads, Quantum dots

Procedia PDF Downloads 333