Search results for: event study methodology
51830 Competing Risks Modeling Using within Node Homogeneity Classification Tree
Authors: Kazeem Adesina Dauda, Waheed Babatunde Yahya
Abstract:
To design a tree that maximizes within-node homogeneity, a homogeneity measure appropriate for event history data with multiple risks is needed. We consider the use of Deviance and modified Cox-Snell residuals as measures of impurity in Classification and Regression Trees (CART) and compare our results with those of Fiona (2008), in which homogeneity measures were based on the Martingale residual. A data-structure approach was used to validate the performance of our proposed techniques via simulation and real-life data. The results for univariate competing risks revealed that using Deviance and Cox-Snell residuals as the response in a within-node homogeneity classification tree performs better than using other residuals, irrespective of the performance measure. Bone marrow transplant data and a double-blinded randomized clinical trial, conducted in order to compare two treatments for patients with prostate cancer, were used to demonstrate the efficiency of our proposed method vis-à-vis the existing ones. Results from empirical studies of the bone marrow transplant data showed that the proposed model with the Cox-Snell residual (Deviance = 16.6498) performs better than both the Martingale residual (Deviance = 160.3592) and the Deviance residual (Deviance = 556.8822) for both the event of interest and the competing risks. The results from the prostate cancer data likewise reveal the superiority of the proposed model over the existing ones for both causes; interestingly, the Cox-Snell residual (MSE = 0.01783563) outperforms both the Martingale residual (MSE = 0.1853148) and the Deviance residual (MSE = 0.8043366). Moreover, these results validate those obtained from the Monte Carlo studies.
Keywords: within-node homogeneity, Martingale residual, modified Cox-Snell residual, classification and regression tree
Procedia PDF Downloads 272
51829 Impact of Minimalism in Dance Education on the Development of Aesthetic Sensibilities
Authors: Meghamala Nugehally
Abstract:
This paper hypothesises and draws inferences on the impact of minimalism in dance education on the development of artistic and aesthetic sensibilities in individuals aged 5-18 years. The research and its conclusions are situated within the context of Indian classical dance, which is based on Indian theories of aesthetics drawn from the Natyashastra, an ancient treatise on Indian dance and drama. The research employs training methods handed down through a strict one-on-one teacher-student tradition known as the Guru-Shishya Parampara. The aesthetic principles used are defined, and basic theories from the Natyashastra are explained to provide background for the research design. The paper also discusses dance curriculum design and training methodology design within the context of these aesthetic theories. The scope of the research is limited to two genres of Indian classical dance: Bharatanatyam and Odissi. A brief description of these dance forms is given as background, and the dance aesthetics specific to these forms are described. The research design includes individual case studies of the subjects studied, independent predetermined attributes for observation, and a qualitative scoring methodology devised for the purpose of the study. The study describes the training techniques used and contrasts minimal solo training techniques with more elaborate group training techniques. The study groups were divided, and the basis for the division is discussed. Study observations are recorded and presented as evidence. The results inform the conclusion and set the stage for further research in this area.
Keywords: dance aesthetics, dance education, Indian classical dance, minimalism
Procedia PDF Downloads 228
51828 Sweet to Bitter Perception Parageusia: Case of Posterior Inferior Cerebellar Artery Territory Diaschisis
Authors: I. S. Gandhi, D. N. Patel, M. Johnson, A. R. Hirsch
Abstract:
Although distortion of taste perception following a cerebrovascular event may seem a frivolous consequence of a classic stroke presentation, altered taste perception places patients at increased risk of malnutrition, weight loss, and depression, all of which negatively impact quality of life. Impaired taste perception can result from a wide variety of cerebrovascular lesions in various locations, including the pons, the insular cortices, and the ventral posteromedial nucleus of the thalamus. Wallenberg syndrome, also known as lateral medullary syndrome, has been described to impact taste; however, specific sweet-to-bitter dysgeusia from a posterior inferior cerebellar artery territory infarction is an infrequent event; as such, a case is presented. One year prior to presentation, this 64-year-old right-handed woman suffered a right posterior inferior cerebellar artery aneurysm rupture with resultant infarction, culminating in ventriculoperitoneal shunt placement. One and a half months after this event, she noticed a gradual loss of the ability to taste sweet, progressing until all sweet food tasted bitter. Since the onset of her chemosensory problems, the patient has lost 60 pounds. Upon gustatory testing, the patient's taste thresholds showed ageusia to sucrose and hydrochloric acid, and normogeusia to sodium chloride, urea, and phenylthiocarbamide. The gustatory cortex comprises in part the right insular cortex as well as the right anterior operculum, which are primarily involved in the sensory modalities of taste. In this model, sweet is localized in the posterior-most and rostral aspect of the right insular cortex, notably adjacent to the region responsible for bitter taste. The sweet-to-bitter dysgeusia in our patient suggests a lesion in this location.
Although the primary lesion in this patient was located in the right medulla of the brainstem, neurodegeneration in the rostral and posterior-most aspect of the right insular cortex may have occurred due to diaschisis. Diaschisis describes neurophysiological changes that occur in regions remote from a focal brain lesion. Although hydrocephalus and vasospasm due to aneurysmal rupture might explain the distal foci of impairment, the gradual onset of dysgeusia is more indicative of diaschisis. The perception of sweet now tasting bitter suggests that, in the absence of sweet taste reception, the intrinsic bitter taste of food is being stimulated rather than the sweet. In the evaluation and treatment of taste parageusia secondary to cerebrovascular injury, prophylactic neuroprotective measures may be worthwhile. Further investigation is warranted.
Keywords: diaschisis, dysgeusia, stroke, taste
Procedia PDF Downloads 180
51827 Explaining Irregularity in Music by Entropy and Information Content
Authors: Lorena Mihelac, Janez Povh
Abstract:
In 2017, we conducted a research study using data consisting of 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of the harmony, we were interested in unigrams (individual chords in the harmonic progression) and bigrams (connections of two adjacent chords). In that study, it was found that 53 of the 160 musical excerpts were evaluated by participants as very complex, although the entropy of the harmonic progression (unigrams and bigrams) was calculated as low. We explained this by particularities of the chord progression, which affect the listener's feeling of complexity and acceptability. We evaluated the same data again with new participants in 2018 and a third time with the same participants in 2019. These three evaluations showed that the same 53 musical excerpts, found to be difficult and complex in the 2017 study, again elicited a strong feeling of complexity. It was proposed that the content of these musical excerpts, defined as "irregular," does not meet the listener's expectancy or basic perceptual principles, creating a stronger feeling of difficulty and complexity. As the "irregularities" in these 53 musical excerpts seem to be perceived by participants without their being aware of it, affecting pleasantness and the feeling of complexity, they were defined as "subliminal irregularities" and the 53 musical excerpts as "irregular." In our recent study (2019) of the same data, we proposed a new measure of the complexity of harmony, "regularity," based on the irregularities in the harmonic progression and other plausible particularities in the musical structure found in previous studies. In that study we also proposed a list of 10 particularities which we assumed to impact the participants' perception of complexity in harmony.
These ten particularities are tested in this paper by extending the analysis of our 53 irregular musical excerpts from harmony to melody. In examining the melody, we used the computational model Information Dynamics of Music (IDyOM) and two information-theoretic measures: entropy, the uncertainty of the prediction before the next event is heard, and information content, the unexpectedness of an event in a sequence. To describe the features of the melody in these musical examples, we used four different viewpoints: pitch, interval, duration, and scale degree. The results show that the texture of the melody (e.g., multiple voices, homorhythmic structure) and the structure of the melody (e.g., huge interval leaps, syncopated rhythm, implied harmony in compound melodies) in these musical excerpts impact the participants' perception of complexity. High information content values were found in compound melodies in which implied harmonies seem to have suggested additional harmonies, affecting the participants' perception of the chord progression in the harmony by creating a sense of an ambiguous musical structure.
Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM
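The two information-theoretic measures used in the melodic analysis can be illustrated with a small sketch. The code below computes Shannon entropy over chord unigrams and bigrams, and the information content (surprisal) of a symbol from empirical counts; the chord progression is invented for the example, and IDyOM's actual predictions come from much richer variable-order context models.

```python
import math
from collections import Counter

def entropy(sequence):
    """Shannon entropy (bits) of the empirical symbol distribution."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def information_content(symbol, counts, total):
    """Unexpectedness of a symbol: -log2 of its empirical probability."""
    p = counts.get(symbol, 0) / total
    return -math.log2(p) if p > 0 else float("inf")

# Hypothetical chord progression (unigrams) and its adjacent pairs (bigrams)
chords = ["I", "IV", "V", "I", "IV", "V", "I", "vi"]
bigrams = list(zip(chords, chords[1:]))

h_unigram = entropy(chords)  # low entropy: few distinct chords dominate
h_bigram = entropy(bigrams)  # entropy of the two-chord connections
ic_rare = information_content("vi", Counter(chords), len(chords))  # rare chord, high surprisal
```

A low unigram/bigram entropy with a few high-surprisal events is exactly the configuration the study describes: a progression that is predictable on average yet still feels complex.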
Procedia PDF Downloads 131
51826 Patient Safety Culture in Brazilian Hospitals from Nurse's Team Perspective
Authors: Carmen Silvia Gabriel, Dsniele Bernardi da Costa, Andrea Bernardes, Sabrina Elias Mikael, Daniele da Silva Ramos
Abstract:
The goal of this quantitative study is to investigate patient safety culture from the perspective of professionals on the hospital nursing team. It was conducted in two Brazilian hospitals. The sample included 282 nurses. Data collection occurred in 2013, through the Hospital Survey on Patient Safety Culture questionnaire. Based on the assessment of the dimensions, it is stressed that, in the dimension teamwork within hospital units, 69.4% of professionals agree that when a lot of work needs to be done quickly, they work together as a team; regarding the dimension supervisor/manager expectations and actions promoting safety, 70.2% agree that their supervisor overlooks patient safety problems. Related to organizational learning and continuous improvement, 56.5% agree that there is evaluation of the effectiveness of changes after their implementation. On hospital management support for patient safety, 52.8% report that the actions of hospital management show that patient safety is a top priority. On the overall perception of patient safety, 57.2% disagree that patient safety is never compromised due to a higher amount of work to be completed. Regarding feedback and communication about error, 57.7% report that they always or usually receive such information. Relative to communication openness, 42.9% said they never or rarely feel free to question the decisions/actions of their superiors. On the frequency of event reporting, 64.7% said they often or always notify events with no harm to patients. About teamwork across hospital units, a similarity is noted between the percentages of agreement and disagreement on the item "there is good cooperation among hospital units that need to work together," at 41.4% and 40.5%, respectively. Related to staffing adequacy, 77.8% disagree that there is a sufficient number of employees to do the job, and 52.4% agree that shift changes are problematic for patients.
On nonpunitive response to error, 71.7% indicate that when an event is reported, it seems that the focus is on the person. On the patient safety grade of the institution, 41.6% classified it as very good. It is concluded that there are positive points in the safety culture, as well as some weaknesses, such as a punitive culture and patient safety impaired by work overload.
Keywords: quality of health care, health services evaluation, safety culture, patient safety, nursing team
Procedia PDF Downloads 299
51825 Using Scrum in an Online Smart Classroom Environment: A Case Study
Authors: Ye Wei, Sitalakshmi Venkatraman, Fahri Benli, Fiona Wahr
Abstract:
The present digital world poses many challenges to various stakeholders in the education sector. In particular, lecturers in higher education (HE) are faced with the problem of ensuring that students are able to achieve the required learning outcomes despite rapid changes taking place worldwide. Different strategies have recently been adopted to retain student engagement and commitment in classrooms and to address the differences in learning habits, preferences, and styles of the digital generation of students. Further, the onset of the coronavirus disease (COVID-19) pandemic has made online teaching mandatory. These changes have compounded the problems of learning engagement and the short attention span of HE students. New agile methodologies that have been successfully employed to manage projects in different fields are gaining prominence in the education domain. In this paper, we present the application of Scrum as an agile methodology to enhance student learning and engagement in an online smart classroom environment. We demonstrate the use of our proposed approach with a case study teaching key topics in information technology that require students to gain technical and business-related data analytics skills.
Keywords: agile methodology, Scrum, online learning, smart classroom environment, student engagement, active learning
Procedia PDF Downloads 163
51824 A Soft System Approach to Explore Ill-Defined Issues in Distance Education System - A Case of Saudi Arabia
Authors: Sulafah Basahel
Abstract:
Nowadays, Higher Education Institutions (HEIs) around the world are attempting to utilize Information and Communication Technologies (ICTs) to enhance the learning process and the strategies of knowledge delivery for students through Distance Education (DE) systems. Stakeholders in a DE system face a complex situation of different ill-defined and interrelated issues that influence the decision-making process. In this study, systems thinking as a body of knowledge is used to explore the emergent properties produced by the connections between these issues, which can have either positive or negative outcomes for DE development. Checkland's Soft System Methodology (SSM) - Mode 2 is employed in the cultural context of Saudi Arabia for the purpose of knowledge acquisition among multiple stakeholders in DE, rather than problem solving, to achieve an overall development of the DE system. This paper discusses some political and cultural issues, and the connections between them, that impact the effectiveness of stakeholders' activities and relations. This study will significantly contribute to both the systems thinking and education fields by leading decision makers in DE to reconsider future plans, strategies, and the right actions for more successful educational practices.
Keywords: distance education, higher education institutions, ill-defined issues, soft system methodology-Mode 2
Procedia PDF Downloads 270
51823 Optimization of Extraction Conditions and Characteristics of Scale Collagen from Sardine: Sardina pilchardus
Authors: F. Bellali, M. Kharroubi, M. Loutfi, N.Bourhim
Abstract:
In Morocco, the fish processing industry is an important source of income and generates a large amount of byproducts, including skins, bones, heads, guts, and scales. These underutilized resources, particularly scales, contain a large amount of protein and calcium. Scales from Sardina pilchardus resulting from processing operations have the potential to be used as raw material for collagen production. Taking into account this strong expectation of the regional fish industry, the upgrading of sardine scales is well justified. In addition, political and societal demands for sustainability and environment-friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Fish scales used as a source of collagen therefore have a wide range of applications in the food, cosmetic, and biomedical industries. The main aim of this study is to isolate and characterize acid-solubilized collagen from the scales of sardine, Sardina pilchardus. An experimental design methodology was adopted to optimize the collagen extraction process. The first stage of this work investigates the optimal conditions for sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with HCl solution or EDTA. The last part establishes the optimal conditions for the isolation of collagen from fish scales by solvent extraction. The basic principle of RSM is to determine model equations that describe the interrelations between the independent and dependent variables.
Keywords: Sardina pilchardus, scales, valorization, collagen extraction, response surface methodology
Procedia PDF Downloads 415
51822 Calculation of Methane Emissions from Wetlands in Slovakia via IPCC Methodology
Authors: Jozef Mindas, Jana Skvareninova
Abstract:
Wetlands are a major natural source of methane emissions, but they also represent important biodiversity reservoirs in the landscape. There are about 26 thousand hectares of wetlands in Slovakia, identified via the wetlands monitoring program. The resulting database of wetlands in Slovakia allows the analysis of several ecological processes, including the estimation of methane emissions. Based on the information in the database, the first estimate of methane emissions from wetlands in Slovakia has been made. The IPCC methodology (Tier 1 approach) was used, with proposed emission factors for the ice-free period derived from climatic data. The highest methane emissions, nearly 550 Gg, are associated with the category of fens. Almost 11 Gg of methane is emitted from bogs, and emissions from flooded lands represent less than 8 Gg.
Keywords: bogs, methane emissions, Slovakia, wetlands
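A Tier 1 estimate of this kind multiplies an area by a per-area emission factor over the ice-free period. The sketch below shows the shape of that calculation only; the areas, emission factors, and ice-free period are illustrative placeholders, not the values used in the study.

```python
def tier1_ch4(area_ha, ef_kg_ch4_per_ha_day, ice_free_days):
    """IPCC Tier 1-style estimate: emissions = area x emission factor x period.
    Inputs in hectares, kg CH4/ha/day, and days; returns tonnes of CH4."""
    return area_ha * ef_kg_ch4_per_ha_day * ice_free_days / 1000.0

# Illustrative only: hypothetical areas and emission factors per wetland category
categories = {
    "fens": (15000, 1.5),           # (area in ha, EF in kg CH4/ha/day)
    "bogs": (6000, 0.4),
    "flooded_lands": (5000, 0.3),
}
ice_free_days = 240  # ice-free period, here a placeholder derived from climate
totals_t = {name: tier1_ch4(a, ef, ice_free_days) for name, (a, ef) in categories.items()}
```

With real per-category emission factors, summing `totals_t` over categories yields the national wetland CH4 estimate.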
Procedia PDF Downloads 284
51821 Peril's Environment of Energetic Infrastructure Complex System, Modelling by the Crisis Situation Algorithms
Authors: Jiří F. Urbánek, Alena Oulehlová, Hana Malachová, Jiří J. Urbánek Jr.
Abstract:
Crisis situation investigation and modelling are introduced and carried out within the complex system of energetic critical infrastructure operating in a perilous environment. Every crisis situation and peril originates in the occurrence of an emergency or crisis event, and these require the assessment of critical/crisis interfaces. An emergency event can be expected, in which case crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities for coping with it; or it may be unexpected, without a pre-prepared scenario. Both, however, need operational coping by means of crisis management. The operation, forms, characteristics, behaviour, and utilization of crisis management have various qualities, depending on the real perils facing the critical infrastructure organization and on its prevention and training processes. The aim is always better security and continuity of the organization, the successful attainment of which requires finding and investigating critical/crisis zones and functions in models of the critical infrastructure organization operating in its perilous environment. Our DYVELOP (Dynamic Vector Logistics of Processes) method is available for this purpose. Here, it is necessary to derive and create an identification algorithm for critical/crisis interfaces. The locations of critical/crisis interfaces are the flags of a crisis situation in models of the critical infrastructure organization. The model of a crisis situation is then displayed for a real organization of the Czech energetic critical infrastructure in a real perilous environment. These efficient measures are necessary for infrastructure protection. They will be derived for peril mitigation, for coping with crisis situations, and for the environmentally friendly survival, continuity, and advanced possibilities for sustainable development of the organization.
Keywords: algorithms, energetic infrastructure complex system, modelling, peril's environment
Procedia PDF Downloads 402
51820 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be improved to reduce or eliminate defects and to increase the process capability indices Cp and Cpk of the CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that lead to the targeted surface roughness specified by the customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness; the noise factor is the difference between the old cutting tool and the new cutting tool. The confirmation run with the optimal parameters confirmed that the new parameter settings are correct. The new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be used efficiently to phase out defects and improve the process capability index of a CNC milling process.
Keywords: CNC machining, six sigma, surface roughness, Taguchi methodology
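Two quantities recur in studies of this kind: the capability indices Cp and Cpk, and the Taguchi signal-to-noise ratio used to rank factor settings. The sketch below computes both; the roughness readings and specification limits are hypothetical, not the case-study data.

```python
import math
import statistics

def process_capability(samples, lsl, usl):
    """Cp and Cpk from sample data, assuming approximate normality.
    Cp = (USL - LSL) / 6*sigma; Cpk also penalizes an off-center mean."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

def sn_smaller_is_better(responses):
    """Taguchi smaller-the-better S/N ratio (dB): -10 log10(mean(y^2)).
    Used when the response (e.g., roughness) should be minimized."""
    return -10 * math.log10(sum(y * y for y in responses) / len(responses))

# Hypothetical roughness readings (micro-inches) and spec limits
ra = [62, 58, 65, 60, 63, 59, 61, 64]
cp, cpk = process_capability(ra, lsl=40, usl=80)
sn = sn_smaller_is_better(ra)
```

In a Taguchi analysis, the S/N ratio is computed for each L9 run and the factor levels with the highest average S/N are selected for the confirmation run.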
Procedia PDF Downloads 242
51819 Probabilistic and Stochastic Analysis of a Retaining Wall for C-Φ Soil Backfill
Authors: André Luís Brasil Cavalcante, Juan Felix Rodriguez Rebolledo, Lucas Parreira de Faria Borges
Abstract:
A methodology for the probabilistic analysis of the active earth pressure on a retaining wall with c-Φ soil backfill is described in this paper. The Rosenblueth point estimate method is used to estimate the failure probability of a gravity retaining wall. The basic principle of this methodology is to use two point estimates per variable, i.e., the mean value plus and minus the standard deviation, in the safety analysis. The simplicity of this framework ensures its wide applicability. The calculation requires 2ⁿ repetitions of the analysis, since the system is governed by n variables. In this study, a probabilistic model based on the Rosenblueth approach for computing the overturning probability of failure of a retaining wall is presented. The results obtained show the advantages of this kind of model in comparison with the deterministic solution. In a relatively easy way, the uncertainty in the wall and fill parameters is taken into account, and some practical results can be obtained for the design of the retaining structure.
Keywords: retaining wall, active earth pressure, backfill, probabilistic analysis
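The two-point estimate scheme described above can be sketched in a few lines. Assuming uncorrelated, symmetrically distributed variables (so all 2ⁿ points carry equal weight), the performance function is evaluated at every mean ± standard deviation combination and the response moments are recovered from those values; the safety-margin function below is a made-up stand-in for a real overturning analysis.

```python
import itertools

def rosenblueth_moments(g, means, stds):
    """Rosenblueth two-point estimate for uncorrelated variables:
    evaluate g at all 2**n (mean +/- std) combinations with equal
    weights and return the estimated mean and std of g."""
    values = [
        g(*[mu + sign * sd for mu, sign, sd in zip(means, signs, stds)])
        for signs in itertools.product((-1, 1), repeat=len(means))
    ]
    mean_g = sum(values) / len(values)
    var_g = sum((v - mean_g) ** 2 for v in values) / len(values)
    return mean_g, var_g ** 0.5

# Hypothetical safety margin: resisting moment minus overturning moment (kN*m)
def margin(resisting, overturning):
    return resisting - overturning

mu_m, sigma_m = rosenblueth_moments(margin, means=[500.0, 320.0], stds=[60.0, 45.0])
```

With the margin moments in hand, the overturning probability follows from an assumed distribution of the margin (e.g., P(margin < 0) under normality).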
Procedia PDF Downloads 418
51818 We Are the Earth That Defends Itself: An Exploration of Discursive Practices of Les Soulèvements De La Terre
Authors: Sophie Del Fa, Loup Ducol
Abstract:
This presentation focuses on the discursive practices of Les Soulèvements de la Terre (hereafter SdlT), a French environmentalist group mobilized against agribusiness. More specifically, we use as a case study the violently repressed demonstration that took place in Sainte-Soline on March 25, 2023 (see below for details). The SdlT embodies the renewal of anti-capitalist and environmentalist struggles that began with Occupy Wall Street in 2011 and, in France, with the Nuit debout in 2016 and the yellow vests movement from 2019 to 2020. These struggles have three things in common: they are self-organized without official leaders, they rely mainly on occupations to reappropriate public places (squares, roundabouts, natural territories), and they are anti-capitalist. The SdlT was created in 2021 by activists coming from the Zone-to-Defend of Notre-Dame-des-Landes, a victorious ten-year occupation movement (2009 to 2018) against an airport near Nantes, France. The SdlT is labeled neither as a formal association nor as a constituted group, but as an anti-capitalist network of local struggles at the crossroads of ecology and social issues. Indeed, although they target agro-industry, land grabbing, soil artificialization, and ecology without transition, the SdlT considers ecological and social questions to be interdependent. Moreover, they have an encompassing vision of ecology, which they consider a concern for the living as a whole, erasing the division between Nature and Culture. Their radicality is structured around three main elements: federative and decentralized dimensions, the rhetoric of living alliances, and creative militant strategies. The objective of this reflection is to understand how these three dimensions are articulated through the SdlT's discursive practices.
To explore these elements, we take as a case study one specific event: the demonstration against the 'basins' held in Sainte-Soline on March 25, 2023, on the construction site of new water storage infrastructure for agricultural irrigation in western France. This event represents a turning point for the SdlT. Indeed, the protest was violently repressed: 5,000 grenades were fired by the police, hundreds of people were injured, and one person was still in a coma at the time of writing. Moreover, following the events at Sainte-Soline, the Minister of the Interior, Gérald Darmanin, threatened to dissolve the SdlT, adding fuel to the fire in an already tense social climate (with the ongoing strikes against the pension reform). We anchor our reflection in three types of data: 1) our own experiences (inspired by ethnography) of the Sainte-Soline demonstration; 2) a collection of more than 500,000 tweets with the #SainteSoline hashtag; and 3) a press review of texts and articles published after the Sainte-Soline demonstration. The exploration of these data from a turning point in the history of the SdlT allows us to analyze how the three dimensions highlighted earlier (federative and decentralized dimensions, the rhetoric of living alliances, and creative militant strategies) are materialized through the discursive practices surrounding the Sainte-Soline event. This sheds light on how a new contemporary movement implements contemporary environmental struggles.
Keywords: discursive practices, Sainte-Soline, ecology, radical ecology
Procedia PDF Downloads 70
51817 Influence of Antecedent Soil Moisture on Soil Erosion: A Two-Year Field Study
Authors: Yu-Da Chen, Chia-Chun Wu
Abstract:
The relationship between antecedent soil moisture content and soil erosion is a complicated phenomenon. Some studies confirm the effect of antecedent soil moisture content on soil erosion, while others deny it. The objective of this study is therefore to clarify these contradictions through field experiments. This study conducted two years of field observations of soil losses from natural rainfall events on runoff plots with a length of 10 meters, a width of 3 meters, and a uniform slope of 9%. Volumetric soil moisture sensors were used to log the soil moisture changes for each rainfall event. A total of 49 effective events were monitored. The results of this study show that antecedent soil moisture content promotes the generation of surface runoff, especially for rainfall events of short duration or low magnitude. A positive correlation was found between antecedent soil moisture content and soil loss per unit Rainfall-Runoff Erosivity Index, which indicates that soil with high moisture content is more susceptible to detachment. Once the rainfall duration exceeds 10 hours, the impact of rainfall duration on soil erosion dominates, and the effect of antecedent soil moisture is almost negligible.
Keywords: antecedent soil moisture content, soil loss, runoff coefficient, rainfall-runoff erosivity
Procedia PDF Downloads 65
51816 Non-Uniform Filter Banks-Based Minimum Distance to Riemannian Mean Classification in Motor Imagery Brain-Computer Interface
Authors: Ping Tan, Xiaomeng Su, Yi Shen
Abstract:
Motion intention in the motor imagery brain-computer interface is identified by classifying the event-related desynchronization (ERD) and event-related synchronization (ERS) characteristics of the sensorimotor rhythm (SMR) in EEG signals. When the subject imagines different limbs or different body parts moving, the rhythm components and bandwidth change, and this varies from person to person. Finding the effective sensorimotor frequency band of each subject is directly related to the classification accuracy of the brain-computer interface. To solve this problem, this paper proposes a Minimum Distance to Riemannian Mean classification method based on non-uniform filter banks. During the training phase, the EEG signals are first decomposed into multiple signals of different bandwidths using multiple band-pass filters; the spatial covariance characteristics of each frequency band signal are then computed as feature vectors. These feature vectors are classified by the MDRM (Minimum Distance to Riemannian Mean) method, and cross-validation is employed to obtain the effective sensorimotor frequency bands. During the test phase, the test signals are filtered by the band-pass filters of the effective sensorimotor frequency bands, and the extracted spatial covariance feature vectors are classified using the MDRM. Experiments on the BCI Competition IV 2a dataset show that the proposed method is superior to other classification methods.
Keywords: non-uniform filter banks, motor imagery, brain-computer interface, minimum distance to Riemannian mean
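The distance computation at the heart of MDRM can be sketched as follows: the affine-invariant Riemannian distance between two symmetric positive-definite covariance matrices is the norm of the log-eigenvalues of one whitened by the other, and a trial is assigned to the class whose Riemannian mean is nearest. The toy matrices are invented, and the iterative computation of the Riemannian means themselves is omitted here.

```python
import numpy as np

def spd_distance(a, b):
    """Affine-invariant Riemannian distance between SPD matrices:
    sqrt(sum(log(eig(a^{-1/2} b a^{-1/2}))^2))."""
    w, v = np.linalg.eigh(a)                      # eigendecomposition of a
    a_inv_sqrt = v @ np.diag(w ** -0.5) @ v.T     # a^{-1/2}
    m = a_inv_sqrt @ b @ a_inv_sqrt               # whiten b by a
    return float(np.sqrt(np.sum(np.log(np.linalg.eigvalsh(m)) ** 2)))

def mdrm_classify(trial_cov, class_means):
    """Assign a trial covariance matrix to the nearest class Riemannian mean."""
    return min(class_means, key=lambda c: spd_distance(class_means[c], trial_cov))

# Toy example: two well-separated class means and a trial near class "left"
means = {"left": np.eye(2), "right": 9.0 * np.eye(2)}
label = mdrm_classify(1.2 * np.eye(2), means)
```

In the filter-bank setting, one such classification is performed per frequency band, and cross-validation over bands selects the subject's effective sensorimotor range.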
Procedia PDF Downloads 124
51815 Next Generation UK Storm Surge Model for the Insurance Market: The London Case
Authors: Iacopo Carnacina, Mohammad Keshtpoor, Richard Yablonsky
Abstract:
Non-structural protection measures against flooding are becoming increasingly popular flood risk mitigation strategies. In particular, coastal flood insurance impacts not only private citizens but also insurance and reinsurance companies, who may require it to retain solvency and to better understand the risks they face from a catastrophic coastal flood event. In this context, a framework is presented here to assess the risk of coastal flooding across the UK. The area has a long history of catastrophic flood events, including the Great Flood of 1953 and the 2013 Cyclone Xaver storm, both of which led to significant loss of life and property. The current framework leverages a technology based on a hydrodynamic model (Delft3D Flexible Mesh). This flexible mesh technology, coupled with a calibration technique, allows for better utilisation of computational resources, leading to higher resolution and more detailed results. The generation of a stochastic set of extratropical cyclone (ETC) events supports the evaluation of financial losses for the whole area, also accounting for correlations between different locations in different scenarios. Finally, the solution includes a detailed analysis for the Thames River, leveraging the information available on flood barriers and levees. Two realistic disaster scenarios for the Greater London area are simulated: in the first scenario, the storm surge intensity is not high enough to fail London's flood defences, but in the second scenario, London's flood defences fail, highlighting the potential losses from a catastrophic coastal flood event.
Keywords: storm surge, stochastic model, levee failure, Thames River
Procedia PDF Downloads 232
51814 Applying Lean Six Sigma in an Emergency Department, of a Private Hospital
Authors: Sarah Al-Lumai, Fatima Al-Attar, Nour Jamal, Badria Al-Dabbous, Manal Abdulla
Abstract:
Today, many commonly used industrial engineering tools and techniques are being applied in hospitals around the world with the goal of producing a more efficient and effective healthcare system. Lean Six Sigma, a common quality improvement methodology, has been successful in manufacturing industries and, more recently, in healthcare. The objective of our project is to use the Lean Six Sigma methodology to reduce waiting time in the Emergency Department (ED) of a local private hospital. Furthermore, a comprehensive literature review was conducted to evaluate the success of Lean Six Sigma in the ED. According to a study conducted at Ibn Sina Hospital in Morocco, the most common problem that patients complain about is waiting time. To ensure patient satisfaction, hospitals such as North Shore University Hospital were able to reduce waiting time by up to 37% using Lean Six Sigma. Other hospitals, such as the Johns Hopkins medical center, used Lean Six Sigma successfully to enhance overall patient flow, which ultimately decreased waiting time. Furthermore, it was found that capacity constraints, such as staff shortages and a lack of beds, were among the main reasons behind long waiting times. With the use of Lean Six Sigma and bed management, hospitals like Memorial Hermann Southwest Hospital were able to reduce patient delays. Moreover, to implement Lean Six Sigma successfully in our project, two common methodologies were considered: DMAIC and DMADV. After assessing both, DMAIC was found to be the more suitable approach for our project because it is concerned with improving an already existing process. Despite its many successes, Lean Six Sigma has limitations, especially in healthcare, but these can be minimized if the methodology is properly approached.
Keywords: lean six sigma, DMAIC, hospital, methodology
Procedia PDF Downloads 495
51813 Enhancing Small and Medium Enterprises Access to Finance: The Opportunities and Challenges of Using Intellectual Property Rights as Collateral in Sri Lanka
Authors: Nihal Chandratilaka Matara Arachchige, Nishantha Sampath Punichihewa
Abstract:
Intellectual property (IP) assets are the 'crown jewels' of innovation-driven businesses in the knowledge-based economy. In that sense, IP rights such as patents, trademarks and copyrights afford enormous economic opportunities to an enterprise, especially a Small and Medium Enterprise (SME). As can be gleaned from the latest statistics, the domestic industries in Sri Lanka are predominantly represented by SMEs. Undeniably, in terms of economic contribution, the SME sector is considered to be the backbone of the country's 'real economy'. However, the SME sector in Sri Lanka faces a number of challenges. One of the nearly insurmountable hurdles for small businesses is access to credit facilities, due to the lack of collateral. In the eyes of the law, collateral is something pledged as security for repayment in the event of default. Even though intellectual property rights are used as collateral to facilitate obtaining credit for businesses in a number of Asian jurisdictions, financial institutions in Sri Lanka are extremely reluctant to accept IP rights as collateral when granting financial resources to SMEs. Against this backdrop, this research investigates, from a legal perspective, the reasons for not accepting IP rights as collateral when granting loans to SMEs. Drawing on emerging examples from other jurisdictions, it further examines the inadequacies of the existing legal framework in relation to the use of IP rights as collateral. The methodology followed in this paper is qualitative research. Empirical research and analysis concerning the core research question are carried out by conducting in-depth interviews with stakeholders, including leading financial institutions in Sri Lanka.
Keywords: intellectual property assets, SMEs, collaterals, financial facilities, credits
Procedia PDF Downloads 275
51812 The Effect of Gross Vehicle Weight on the Stability of Heavy Vehicle during Cornering
Authors: Nurzaki Ikhsan, Ahmad Saifizul Abdullah, Rahizar Ramli
Abstract:
One of the functions of a commercial heavy vehicle is to transport goods and people safely and efficiently. Due to its size and carrying capacity, it is important to study the vehicle's dynamic stability during cornering. Studies have shown a number of overloaded heavy vehicles, i.e. permissible gross vehicle weight (GVW) violations, recorded in selected areas in Malaysia, classified by vehicle type and category. Thus, the objective of this study is to investigate the correlation and effect of GVW on heavy vehicle stability during a cornering event using simulation. Various selected heavy vehicle types and categories are simulated using IPG TruckMaker® with different GVW and road conditions (coefficient of friction of the road surface), while the speed, driver characteristics, centre of gravity of the load and road geometry are held constant. Based on the analysis, the relationship between GVW and lateral acceleration was established. As expected, for the same coefficient of friction, the maximum lateral acceleration increases as the GVW increases.
Keywords: heavy vehicle, road safety, vehicle stability, lateral acceleration, gross vehicle weight
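As a rough illustration of the quantities involved, steady-state lateral acceleration during cornering is v²/R, and a simple static rollover threshold is t/(2h) for track width t and centre-of-gravity height h; a higher GVW typically raises h and thus lowers the threshold. The parameter values below are illustrative assumptions, not the paper's simulation settings.

```python
# Hedged sketch of cornering lateral acceleration vs. a static rollover
# threshold; all numeric values are assumed for illustration only.
G = 9.81  # gravitational acceleration, m/s^2

def lateral_acceleration(speed_ms, radius_m):
    """Steady-state cornering lateral acceleration: v^2 / R (m/s^2)."""
    return speed_ms ** 2 / radius_m

def static_rollover_threshold(track_width_m, cg_height_m):
    """Static rollover threshold in g: t / (2 h)."""
    return track_width_m / (2 * cg_height_m)

v = 60 / 3.6                               # 60 km/h in m/s
ay = lateral_acceleration(v, 120.0)        # assumed 120 m curve radius
# A higher GVW typically raises the centre of gravity, lowering the threshold:
for cg in (1.2, 1.6, 2.0):                 # assumed CG heights as load grows
    thr = static_rollover_threshold(2.0, cg) * G
    print(f"CG {cg} m: threshold {thr:.2f} m/s^2 vs a_y {ay:.2f} m/s^2")
```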
Procedia PDF Downloads 532
51811 Role of von Willebrand Factor and ADAMTS13 In The Prediction of Thrombotic Complications In Patients With COVID-19
Authors: Nataliya V. Dolgushina, Elena A. Gorodnova, Olga S. Beznoshenco, Andrey Yu Romanov, Irina V. Menzhinskaya, Lyubov V. Krechetova, Gennady T. Suchich
Abstract:
In patients with COVID-19, generalized hypercoagulability can lead to the development of severe coagulopathy, accompanied by a pronounced inflammatory reaction. This observational prospective study included 39 patients with mild COVID-19 and 102 patients with moderate and severe COVID-19. Patients were stratified into groups depending on their risk of venous thromboembolism. The vWF-to-ADAMTS-13 concentration and activity ratios were significantly higher in patients at high risk of venous thromboembolism among those with moderate and severe COVID-19.
Keywords: ADAMTS-13, COVID-19, hypercoagulation, thrombosis, von Willebrand factor
Procedia PDF Downloads 89
51810 The Death Philosophy of Taiwanese Aerial Acrobats
Authors: Tien-Mei Hu
Abstract:
Death is not only a physical event and a fact of life ending but also one of the ultimate issues of philosophy. The dangerous nature of aerial acrobatics and its protective-rope culture keep the concept of death present in this profession. This study aims to interpret Taiwanese aerialists' view of death through the philosophy of death, starting from the archetype of traditional Eastern body practices (aerial acrobatics). Five Taiwanese acrobats (two male and three female) were recruited through snowball sampling and interviewed. After the interviews, ATLAS.ti, a qualitative analysis software, was used to analyze the verbatim transcripts, photographs, and documents. Three conclusions were drawn from this study: every performance by Taiwanese aerial acrobats is a life-threatening performance; Taiwanese aerialists' perception of death changes with different life stages; and Taiwanese aerialists' philosophy of death rests on the heritage of the "acrobatics" profession, which has created the phenomenon, unique to Taiwanese aerialists, of not using safety equipment.
Keywords: acrobatics, body culture, circus, tightrope walker
Procedia PDF Downloads 98
51809 Analyzing Changes in Runoff Patterns Due to Urbanization Using SWAT Models
Authors: Asawari Ajay Avhad
Abstract:
The Soil and Water Assessment Tool (SWAT) is a hydrological model designed to predict the complex interactions within natural and human-altered watersheds. This research applies the SWAT model to the Ulhas River basin, a small watershed undergoing urbanization and characterized by bowl-like topography. Three simulation scenarios (LC17, LC22, and LC27) are investigated, each representing a different land use and land cover (LULC) configuration, to assess the impact of urbanization on runoff. The LULC for the year 2027 is generated using the MOLUSCE plugin of QGIS, incorporating spatial factors such as the DEM, distance from roads, distance from the river, slope, and distance from settlements. Future climate data is simulated within the SWAT model using 30 years of historical data. A runoff susceptibility map for the basin is created, classifying runoff into five levels ranging from very low to very high. Sub-basins corresponding to major urban settlements are identified as highly susceptible to runoff, and under future climate projections a slight increase in runoff is forecast. The methodology was validated against sub-basins with a track record of severe flood events: the susceptibility map successfully pinpointed these sub-basins as highly susceptible to runoff, reinforcing its credibility. This study suggests that the methodology employed could serve as a valuable tool in flood management planning.
Keywords: future land use impact, flood management, runoff prediction, ArcSWAT
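The five-level susceptibility classification described above can be sketched as a simple quantile binning of per-sub-basin runoff. The break points, labels, and sample values here are assumptions for illustration, not the study's actual SWAT outputs.

```python
import numpy as np

# Sketch: bin simulated sub-basin runoff into five susceptibility classes
# (very low .. very high) using quantile breaks. Illustrative values only.
LABELS = ["very low", "low", "moderate", "high", "very high"]

def classify_runoff(runoff_mm):
    runoff_mm = np.asarray(runoff_mm, dtype=float)
    breaks = np.quantile(runoff_mm, [0.2, 0.4, 0.6, 0.8])  # 5 equal bins
    idx = np.searchsorted(breaks, runoff_mm, side="right")
    return [LABELS[i] for i in idx]

runoff = [12.0, 48.5, 95.2, 160.7, 310.4]   # assumed mm per sub-basin
print(classify_runoff(runoff))
```

In a GIS workflow, the resulting class per sub-basin would then be joined back to the sub-basin polygons to draw the susceptibility map.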
Procedia PDF Downloads 46
51808 Flash Flood in Gabes City (Tunisia): Hazard Mapping and Vulnerability Assessment
Authors: Habib Abida, Noura Dahri
Abstract:
Flash floods are among the most serious natural hazards, with disastrous environmental and human impacts. They are associated with exceptional rain events characterized by short durations, very high intensities, rapid flows and small spatial extent. Flash floods happen very suddenly and are difficult to forecast. They generally damage agricultural crops, property and infrastructure, and may even result in the loss of human lives. The city of Gabes (south-eastern Tunisia) has been exposed to numerous damaging floods because of its mild topography, clay soil, high urbanization rate and erratic rainfall distribution. The risks associated with this situation are expected to increase further in the future because of climate change, deemed responsible for the increased frequency and severity of this natural hazard. A major flooding event hit the region on June 2nd, 2014, causing human deaths and major material losses; it resulted in the stagnation of storm water in the numerous low-lying zones of the study area, endangering human health and causing disastrous environmental impacts. The characterization of flood risk in the Gabes watershed is therefore an important step for flood management. The Analytical Hierarchy Process (AHP) method, coupled with Monte Carlo simulation and a geographic information system, was applied to delineate and characterize flood-prone areas. A spatial database was developed from the geological map, a digital elevation model, land use, and rainfall data in order to evaluate the different factors susceptible to affect the flood analysis. The results were validated with remote sensing data for the zones that showed very high flood hazard during the extreme rainfall event of June 2014. Moreover, a survey was conducted in different areas of the city in order to understand and explore the different causes of this disaster, its extent and its consequences.
Keywords: analytical hierarchy process, flash floods, Gabes, remote sensing, Tunisia
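The AHP step of such a workflow can be illustrated with a minimal weight derivation: factor weights are taken from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio (CR) below 0.1 is conventionally considered acceptable. The 3x3 matrix below (e.g. slope, soil, land use) is an invented example, not the paper's actual judgements.

```python
import numpy as np

# Saaty's random consistency indices for matrices of size 1..5
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_weights(A):
    """Return (weights, consistency ratio) from a pairwise comparison matrix."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)               # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                           # normalise weights to sum to 1
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)              # consistency index
    cr = ci / RI[n] if RI[n] > 0 else 0.0     # consistency ratio
    return w, cr

A = [[1, 3, 5],            # illustrative pairwise judgements
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
w, cr = ahp_weights(A)
print(w, cr)  # weights sum to 1; CR < 0.1 indicates acceptable consistency
```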
Procedia PDF Downloads 109
51807 Predicting Financial Distress in South Africa
Authors: Nikki Berrange, Gizelle Willows
Abstract:
Business rescue has become increasingly popular since its inclusion in the Companies Act of South Africa in May 2011. The Alternative Exchange (AltX) of the Johannesburg Stock Exchange has experienced a marked increase in the number of companies entering business rescue. This study sampled twenty companies listed on the AltX to determine whether Altman's Z-score model for emerging markets (ZEM) or Taffler's Z-score model is the more accurate model for predicting financial distress in small to medium-sized companies in South Africa. The study was performed over three different time horizons, one, two and three years prior to the event of financial distress, in order to determine how many companies each model predicted would be unlikely to succeed, as well as the predictive ability and accuracy of the respective models. The study found that Taffler's Z-score model had the greater ability to predict financial distress across all three time horizons.
Keywords: Altman's ZEM-score, Altman's Z-score, AltX, business rescue, Taffler's Z-score
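For reference, the two scoring models compared in the study can be sketched as linear combinations of accounting ratios. The coefficients below are those commonly reported for Altman's emerging-markets Z''-score and Taffler's (1983) UK model; treat the coefficients, ratio definitions, and example inputs as assumptions to be verified against the original papers before any use.

```python
# Hedged sketch of the two distress-scoring models; coefficients as commonly
# reported in the literature, not taken from this study.

def altman_zem(wc_ta, re_ta, ebit_ta, bve_tl):
    """Altman EM Z'' = 3.25 + 6.56*X1 + 3.26*X2 + 6.72*X3 + 1.05*X4,
    with X1=working capital/TA, X2=retained earnings/TA, X3=EBIT/TA,
    X4=book equity/total liabilities."""
    return 3.25 + 6.56 * wc_ta + 3.26 * re_ta + 6.72 * ebit_ta + 1.05 * bve_tl

def taffler_z(pbt_cl, ca_tl, cl_ta, no_credit_interval):
    """Taffler (1983): 3.20 + 12.18*x1 + 2.50*x2 - 10.68*x3 + 0.029*x4,
    with x1=profit before tax/current liabilities, x2=current assets/total
    liabilities, x3=current liabilities/TA, x4=no-credit interval."""
    return (3.20 + 12.18 * pbt_cl + 2.50 * ca_tl
            - 10.68 * cl_ta + 0.029 * no_credit_interval)

# Illustrative ratios for a hypothetical distressed company:
print(altman_zem(-0.05, -0.10, -0.02, 0.4))
print(taffler_z(-0.15, 0.6, 0.5, -10))
```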
Procedia PDF Downloads 372
51806 Supplier Selection in a Scenario Based Stochastic Model with Uncertain Defectiveness and Delivery Lateness Rates
Authors: Abeer Amayri, Akif A. Bulgak
Abstract:
Due to today's globalization and companies' outsourcing practices, supply chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed places, where there is more chance of disruption. One such disruption arises from the quality and delivery uncertainties of outsourcing. These uncertainties could render products unsafe and, as a number of recent examples show, may force companies to recall their products. As a result of these problems, there is a need to develop a methodology for selecting suppliers globally in view of the risks associated with low quality and late delivery. Accordingly, we developed a two-stage stochastic model that captures the risks associated with uncertainty in quality and delivery, as well as a solution procedure for the model. The stochastic model simultaneously optimizes supplier selection and purchase quantities under price discounts over a time horizon. In particular, our target is the study of global organizations with multiple sites and multiple overseas suppliers, where pricing is offered in the suppliers' local currencies. The proposed methodology is applied to a case study of a US automotive company with two assembly plants and four potential global suppliers to illustrate how the model works in practice.
Keywords: global supply chains, quality, stochastic programming, supplier selection
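The scenario-based idea behind such a model can be illustrated in miniature: evaluate each supplier's expected cost over discrete scenarios of defectiveness and lateness rates, and pick the minimiser. The suppliers, penalty rates, and scenario probabilities below are invented for illustration, and the sketch ignores the paper's multi-site, multi-currency, and discount features.

```python
# Toy scenario-based supplier comparison; all data are illustrative assumptions.
scenarios = [  # (probability, {supplier: (defect_rate, late_rate)})
    (0.5, {"A": (0.01, 0.02), "B": (0.03, 0.01)}),
    (0.3, {"A": (0.04, 0.05), "B": (0.03, 0.02)}),
    (0.2, {"A": (0.10, 0.08), "B": (0.04, 0.06)}),
]
unit_price = {"A": 10.0, "B": 10.5}
DEFECT_PENALTY, LATE_PENALTY = 50.0, 20.0   # assumed cost per unit of rate

def expected_cost(supplier):
    """Unit price plus probability-weighted quality/lateness penalties."""
    cost = unit_price[supplier]
    for prob, rates in scenarios:
        d, l = rates[supplier]
        cost += prob * (DEFECT_PENALTY * d + LATE_PENALTY * l)
    return cost

best = min(unit_price, key=expected_cost)
print(best, {s: round(expected_cost(s), 3) for s in unit_price})
```

In the full two-stage model, the first-stage decision (supplier choice and order quantities) would be optimised against second-stage recourse costs rather than enumerated as here.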
Procedia PDF Downloads 458
51805 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Early detection of incipient abnormal operations is quite necessary for plant-wide process management in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data include measurement noise and unwanted variations unrelated to true process behavior, so the elimination of these unnecessary patterns is executed in a data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly, irrespective of the size and location of abnormal events.
Keywords: detection, monitoring, process data, noise
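A common concrete instance of such data-driven monitoring (a generic sketch, not the author's specific nonlinear method) is PCA-based fault detection: the retained principal components filter out noise-dominated directions, and a sample's squared prediction error (SPE) against the model flags abnormal operation. The data, component count, and alarm limit below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Normal" training data: 2 latent factors observed through 6 noisy sensors
latent = rng.standard_normal((500, 2))
mixing = rng.standard_normal((2, 6))
X = latent @ mixing + 0.1 * rng.standard_normal((500, 6))

mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd
_, _, Vt = np.linalg.svd(Xs, full_matrices=False)
P = Vt[:2].T                      # retain 2 principal directions (denoising)

def spe(x):
    """Squared prediction error of one sample against the PCA model."""
    xs = (x - mu) / sd
    resid = xs - xs @ P @ P.T     # part not explained by the model
    return float(resid @ resid)

limit = np.quantile([spe(row) for row in X], 0.99)  # empirical 99% limit
fault = X[0] + np.array([0, 0, 5, 0, 0, 0]) * sd    # sensor 3 shifted 5 sd
frac_ok = np.mean([spe(row) <= limit for row in X])
print(frac_ok, spe(fault) > limit)                  # alarm on the faulty sample
```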
Procedia PDF Downloads 252
51804 The Importance of the Fluctuation in Blood Sugar and Blood Pressure of Insulin-Dependent Diabetic Patients with Chronic Kidney Disease
Authors: Hitoshi Minakuchi, Izumi Takei, Shu Wakino, Koichi Hayashi, Hiroshi Itoh
Abstract:
Objectives: Among type 2 diabetic patients with CKD (chronic kidney disease), insulin resistance, impaired renal gluconeogenesis and reduced degradation of insulin are recognized, and we observed different fluctuation patterns of blood sugar between CKD and non-CKD patients. On the other hand, a non-dipper type blood pressure pattern is a risk factor for organ damage and mortality. We performed a cross-sectional study to elucidate the characteristics of the fluctuation of blood glucose and blood pressure in insulin-treated diabetic patients with chronic kidney disease. Methods: From March 2011 to April 2013, at the Ichikawa General Hospital of Tokyo Dental College, we recruited 20 outpatients, all insulin-treated type 2 diabetics with CKD. We collected serum and urine samples for several hormone measurements and performed CGMS (continuous glucose measurement system), ABPM (ambulatory blood pressure monitoring), brain computed tomography, carotid artery thickness, ankle-brachial index, PWV and CVR-R measurements, and analyzed these data statistically. Results: Among all 20 participants, hypoglycemia, defined as blood glucose below 70 mg/dl on CGMS, was detected in 9 participants (45.0%). Hypoglycemic events were associated with lower eGFR (29.8±6.2 ml/min vs. 41.3±8.5 ml/min, P<0.05), lower HbA1c (6.44±0.57% vs. 7.53±0.49%), higher PWV (1858±97.3 cm/s vs. 1665±109.2 cm/s), higher serum glucagon (194.2±34.8 pg/ml vs. 117.0±37.1 pg/ml), higher urinary free cortisol (53.8±12.8 μg/day vs. 34.8±7.1 μg/day), and higher urinary metanephrine (0.162±0.031 mg/day vs. 0.076±0.029 mg/day). Non-dipper type blood pressure change on ABPM was detected in 8 of the 9 participants with hypoglycemia (88.9%) and in 4 of the 11 participants without hypoglycemia (36.4%). Multiple logistic regression analysis revealed that the event of hypoglycemia is an independent factor of non-dipper type blood pressure change.
Conclusions: Among insulin-treated type 2 diabetic patients with CKD, hypoglycemic events were frequently detected and may be associated with organ derangements mediated by non-dipper type blood pressure change.
Keywords: chronic kidney disease, hypoglycemia, non-dipper type blood pressure change, diabetic patients
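The two classifications used in studies like this one can be sketched directly: hypoglycemia as any CGMS reading below 70 mg/dl, and a non-dipper profile as a nocturnal systolic fall of less than 10% of the daytime mean. The 10% cut-off is the conventional definition rather than one stated in the abstract, and the sample data are invented.

```python
# Hedged sketch of the hypoglycemia and non-dipper classifications;
# sample readings are invented, and the 10% dip cut-off is conventional.

def had_hypoglycemia(cgm_mg_dl, threshold=70):
    """True if any CGM reading falls below the threshold (mg/dl)."""
    return any(g < threshold for g in cgm_mg_dl)

def is_non_dipper(day_sbp, night_sbp, dip_cutoff=0.10):
    """True if nocturnal SBP falls by less than 10% of the daytime mean."""
    day_mean = sum(day_sbp) / len(day_sbp)
    night_mean = sum(night_sbp) / len(night_sbp)
    dip = (day_mean - night_mean) / day_mean
    return dip < dip_cutoff

cgm = [110, 95, 68, 120, 140]      # mg/dl over a day (invented)
day = [135, 140, 138, 142]         # daytime systolic BP, mmHg (invented)
night = [132, 130, 134]            # nocturnal systolic BP, mmHg (invented)
print(had_hypoglycemia(cgm), is_non_dipper(day, night))
```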
Procedia PDF Downloads 414
51803 A Multi-Dimensional Neural Network Using the Fisher Transform to Predict the Price Evolution for Algorithmic Trading in Financial Markets
Authors: Cristian Pauna
Abstract:
Trading the financial markets is a widespread activity today. A large number of investors, companies, and public or private funds buy and sell every day in order to make a profit. Algorithmic trading has become the prevalent method for making trade decisions since the advent of electronic trading: orders are sent almost instantly by computers using mathematical models. This paper presents a price prediction methodology based on a multi-dimensional neural network. Using the Fisher transform, the neural network is trained in a low-latency, auto-adaptive process in order to predict the price evolution for the next period of time. The model is designed especially for algorithmic trading and uses real-time price series. It was found that the characteristics of the Fisher function applied at the node scale level can generate reliable trading signals using the neural network methodology. After real-time tests, it was found that this method can be applied in any timeframe to trade the financial markets. The paper also includes the steps to implement the presented methodology in an automated trading system. Real trading results are displayed and analyzed in order to qualify the model. In conclusion, the compared results reveal that the neural network methodology, applied together with the Fisher transform at the node level, can generate good price predictions and build reliable trading signals for algorithmic trading.
Keywords: algorithmic trading, automated trading systems, financial markets, high-frequency trading, neural network
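For reference, the Fisher transform referred to above is y = 0.5·ln((1+x)/(1−x)) applied to a value rescaled into (−1, 1), typically a price normalised over a rolling window; it makes the distribution closer to Gaussian and sharpens turning points. The sketch below shows that preprocessing step only; the window length and clamping are assumptions, and the neural network itself is not reproduced.

```python
import math

# Sketch of the Fisher transform as a price-series preprocessing step;
# window length and clamping epsilon are illustrative assumptions.
def fisher_series(prices, window=10, eps=0.999):
    out = []
    for i in range(window - 1, len(prices)):
        chunk = prices[i - window + 1 : i + 1]
        lo, hi = min(chunk), max(chunk)
        # rescale the latest price into (-1, 1), clamped away from the poles
        x = 0.0 if hi == lo else 2 * (prices[i] - lo) / (hi - lo) - 1
        x = max(-eps, min(eps, x))
        out.append(0.5 * math.log((1 + x) / (1 - x)))
    return out

prices = [100 + 3 * math.sin(i / 3) for i in range(40)]
print(fisher_series(prices)[:5])
```

In the paper's setting, the transformed series would be the input fed to the network nodes rather than the raw prices.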
Procedia PDF Downloads 160
51802 Implementation of Lean Manufacturing in Some Companies in Colombia: A Case Study
Authors: Natalia Marulanda, Henry González, Gonzalo León, Alejandro Hincapié
Abstract:
Continuous improvement tools are the result of a body of studies that developed theories and methodologies enabling organizations to increase their levels of efficiency, effectiveness, and productivity. From these methodologies emerged the lean manufacturing philosophy, which is based on the optimization of resources, the elimination of waste, and the generation of value in products and services. Lean has been adopted massively worldwide, but Colombian companies have adopted it only incipiently. Therefore, the purpose of this article is to identify the impacts generated by the implementation of lean manufacturing tools in five companies located in Colombia, in the Medellín metropolitan area. It also makes a comparison of the results obtained from the implementation of the lean philosophy and the Theory of Constraints (TOC). The methodology is qualitative and quantitative, based on case-study interviews with the leaders of the processes that used lean tools. The tools most used by the studied companies are 5S (100%) and TPM (80%); the least used is synchronous production (20%). The main reason for implementing lean was supply chain management (83.3%). For the application of lean and TOC, we did not find significant differences in impact in terms of methodology, areas of application, staff initiatives, supply chain management, planning, and training.
Keywords: business strategy, lean manufacturing, theory of constraints, supply chain
Procedia PDF Downloads 354
51801 Multicollinearity and MRA in Sustainability: Application of the Raise Regression
Authors: Claudia García-García, Catalina B. García-García, Román Salmerón-Gómez
Abstract:
Much economic-environmental research includes the analysis of possible interactions by using Moderated Regression Analysis (MRA), a specific application of multiple linear regression analysis. This methodology allows analyzing how the effect of one independent variable is moderated by a second independent variable by adding a cross-product term between them as an additional explanatory variable. Due to the very specification of the methodology, the moderating factor is often highly correlated with the constitutive terms, so great multicollinearity problems arise. The appearance of strong multicollinearity in a model has important consequences: inflated variances of the estimators, a tendency for regressors to appear non-significant when they probably are significant even alongside a very high coefficient of determination, incorrect signs of coefficients, and high sensitivity of the results to small changes in the dataset. Finally, the high correlation among explanatory variables makes it difficult to isolate the individual effect of each one on the model under study. Shifted to the moderated analysis, these consequences may imply that it is not worth including an interaction term that may be distorting the model. Thus, it is important to manage the problem with a methodology that allows obtaining reliable results. After a review of the works that applied MRA in the ten top journals of the field, it is clear that multicollinearity is mostly disregarded: less than 15% of the reviewed works take potential multicollinearity problems into account. To overcome the issue, this work studies the possible application of recent methodologies to MRA. In particular, raise regression is analyzed.
This methodology mitigates collinearity from a geometrical point of view: the collinearity problem arises because the variables under study are very close geometrically, so by separating the two variables, the problem can be mitigated. Raise regression maintains the available information and modifies the problematic variables instead of, for example, deleting variables. Furthermore, the global characteristics of the initial model are also maintained (sum of squared residuals, estimated variance, coefficient of determination, global significance test and prediction). The proposal is applied to data from countries of the European Union for the last year available regarding greenhouse gas emissions, per capita GDP and a dummy variable that represents the topography of the country. The use of a dummy variable as the moderator is a special variant of MRA, sometimes called "subgroup regression analysis." The main conclusion of this work is that applying new techniques to the field can substantially improve the results of the analysis. In particular, the use of raise regression mitigates great multicollinearity problems, so the researcher is able to rely on the interaction term when interpreting the results of a particular study.
Keywords: multicollinearity, MRA, interaction, raise
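The geometric idea can be sketched numerically: raising one collinear regressor by a multiple of its residual from the other, x̃₂ = x₂ + λe₂, pushes the two variables apart, which shows up as a sharp drop in the variance inflation factor (VIF). The data, λ, and the two-variable setting below are illustrative assumptions, not the paper's EU dataset.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x1 = rng.standard_normal(n)
x2 = x1 + 0.05 * rng.standard_normal(n)      # nearly collinear with x1

def vif(a, b):
    """VIF of a given b: 1/(1 - R^2) from regressing a on b (with intercept)."""
    X = np.column_stack([np.ones_like(b), b])
    beta, *_ = np.linalg.lstsq(X, a, rcond=None)
    resid = a - X @ beta
    r2 = 1 - resid @ resid / ((a - a.mean()) @ (a - a.mean()))
    return 1 / (1 - r2)

# residual of x2 on x1 (with intercept): the component orthogonal to x1
X1 = np.column_stack([np.ones(n), x1])
e2 = x2 - X1 @ np.linalg.lstsq(X1, x2, rcond=None)[0]

lam = 5.0                                    # assumed raising factor
x2_raised = x2 + lam * e2
print(round(vif(x2, x1), 1), round(vif(x2_raised, x1), 1))  # VIF drops sharply
```

Because e₂ is orthogonal to x₁, the fitted values and residuals of the full model are unchanged by the raising, which is why the global model characteristics listed above are preserved.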
Procedia PDF Downloads 104