Search results for: event study methodology
52366 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos
Authors: Dhanuja S. Patil, Sanjay B. Waykar
Abstract:
Event detection is one of the most fundamental components of many application domains for video data systems. It has recently attracted considerable interest from practitioners and academics across different areas. While video event detection has been the subject of extensive research effort, considerably less existing work has considered multi-modal data and the related efficiency issues. During soccer matches, many doubtful situations arise that cannot easily be judged by the referee committee. A framework that objectively checks image sequences would prevent incorrect interpretations caused by errors or by the high speed of the events. Bayesian networks provide a structure for dealing with this uncertainty using an intuitive graphical representation together with probability calculus. We propose an efficient framework for the analysis and summarization of soccer videos using object-based features. The proposed work uses the t-cherry junction tree, a very recent development in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable model of users' relationships in a social network. This approach has several advantages: first, the t-cherry tree gives the best approximation within the class of junction trees; second, construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, efficiency, and robustness of the proposed work on a comprehensive data set consisting of several soccer videos captured at different venues.
Keywords: summarization, detection, Bayesian network, t-cherry tree
Procedia PDF Downloads 323
52365 A Qualitative Study of the Psychologically Challenging Aspects of Taking Part in an Ultra-Endurance Atlantic Rowing Event
Authors: John Allbutt, Andrew Murray, Jonathan Ling, Thomas M. Heffernan
Abstract:
Ultra-endurance events place unique physical and psychological pressures on participants. In this study, we examined the psychologically challenging aspects of taking part in a 3000 mile transatlantic rowing race using a qualitative approach. To date, more people have been into space than have rowed an ocean and only one psychological study has been conducted on this experience which had a specific research focus. The current study was a qualitative study using semi-structured interviews. Participants were an opportunity sample of seven competitors from a recent ocean rowing race. Participants were asked about the psychological aspects of the event after it had finished. The data were analysed using thematic analysis. Several themes emerged from the analysis. These related to: 1) preparation; 2) bodily aches/pains, 3) race setbacks; 4) boat conditions; 5) interpersonal factors and communication; 6) strategies for managing stress and interpersonal tensions. While participants were generally very positive about the event, the analysis showed that they experienced significant psychological challenges during their voyage. Competitors paid considerable attention to preparing for the physical challenges of the event. However, not all prospective competitors gave the same time to preparing for psychological factors or were aware how they might play out during their voyage. All Atlantic rowing crews should be aware of the psychological challenges they face, and have strategies in place to help cope with the psychological strain of taking part.
Keywords: confinement experiences, ocean rowing, stress, ultra-endurance sport
Procedia PDF Downloads 332
52364 The Impact of Inpatient New Boarding Policy on Emergency Department Overcrowding: A Discrete Event Simulation Study
Authors: Wheyming Tina Song, Chi-Hao Hong
Abstract:
In this study, we investigate the effect of a new boarding policy, short stay, on overcrowding in the emergency department (ED). The decision variable is the number of short-stay beds for the least-acute ED patients. The performance measures used are the national emergency department overcrowding score (NEDOCS) and the ED retention rate (the percentage of patients who stay in the ED for more than 48 hours in one month). Discrete event simulation (DES) is used as the analysis tool to evaluate the strategy, and the common random numbers (CRN) technique is applied to enhance simulation precision. The DES model was based on a census of six months of patients treated in the ED of the National Taiwan University Hospital Yunlin Branch. Our results show that the new short-stay boarding policy significantly impacts both the NEDOCS and the ED retention rate when the number of short-stay beds is more than three.
Keywords: emergency department (ED), common random number (CRN), national emergency department overcrowding score (NEDOCS), discrete event simulation (DES)
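The abstract above does not include the simulation itself; the sketch below is a minimal, hypothetical discrete event model of the same idea (least-acute patients routed to a configurable number of short-stay beds, with a fixed seed playing the role of common random numbers across scenarios). It assumes Python with the simpy library, and the arrival rate, service times, bed counts and acuity mix are invented values, not the hospital's data.

```python
import random
import simpy

RETENTION_LIMIT = 48.0  # hours in the ED before a patient counts toward the retention rate

def patient(env, ed_beds, short_stay_beds, stays):
    arrival = env.now
    low_acuity = random.random() < 0.3                          # assumed share of least-acute patients
    pool = short_stay_beds if (low_acuity and short_stay_beds) else ed_beds
    with pool.request() as req:
        yield req                                               # wait for a bed
        mean_service = 2.0 if pool is short_stay_beds else 4.0  # assumed treatment times (hours)
        yield env.timeout(random.expovariate(1 / mean_service))
    stays.append(env.now - arrival)

def run(n_short_stay, seed=1):
    random.seed(seed)                                  # same seed per scenario: common random numbers
    env = simpy.Environment()
    ed_beds = simpy.Resource(env, capacity=12)         # assumed ED bed count
    short = simpy.Resource(env, capacity=n_short_stay) if n_short_stay else None
    stays = []

    def arrivals():
        while True:
            yield env.timeout(random.expovariate(4.0))  # roughly 4 arrivals per hour (assumed)
            env.process(patient(env, ed_beds, short, stays))

    env.process(arrivals())
    env.run(until=24 * 30 * 6)                          # six simulated months, in hours
    return sum(s > RETENTION_LIMIT for s in stays) / len(stays)

for beds in range(6):
    print(beds, "short-stay beds -> retention rate", round(run(beds), 4))
```

Running the same seeded model across the candidate bed counts mirrors how a CRN-based comparison isolates the effect of the policy from sampling noise.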
Procedia PDF Downloads 348
52363 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purposes of reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, which contains all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the boundaries of the standard normal distribution. In the study area, the test has not been widely applied, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors apply the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that are more likely to extract the best solution. For freight delivery management, genetic algorithm schemas are used as a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and to select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis that evaluates the collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
Keywords: multi-objective, analysis, data flow, freight delivery, methodology
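As a concrete illustration of the data-validation step mentioned above, the following sketch applies the Grubbs test for a single outlier at a 99% confidence level (alpha = 0.01). It is not the authors' code: the fuel-consumption sample is invented, and the implementation assumes Python with NumPy and SciPy.

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """Test the most extreme value of x for being an outlier (two-sided Grubbs test)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    idx = int(np.argmax(np.abs(x - mean)))
    G = abs(x[idx] - mean) / sd                              # Grubbs statistic
    t_crit = stats.t.ppf(1 - alpha / (2 * n), n - 2)         # critical t value
    G_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t_crit**2 / (n - 2 + t_crit**2))
    return G > G_crit, idx, G, G_crit

fuel_consumption = [31.2, 29.8, 30.5, 30.1, 29.9, 30.4, 41.7, 30.0]  # hypothetical l/100 km values
is_outlier, idx, G, G_crit = grubbs_test(fuel_consumption)
print(f"value {fuel_consumption[idx]} outlier: {is_outlier} (G={G:.2f}, critical={G_crit:.2f})")
```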
Procedia PDF Downloads 180
52362 Applications of Analytical Probabilistic Approach in Urban Stormwater Modeling in New Zealand
Authors: Asaad Y. Shamseldin
Abstract:
The analytical probabilistic approach is an innovative approach to urban stormwater modeling. It can provide information about the long-term performance of a stormwater management facility without being computationally very demanding. This paper explores the application of the analytical probabilistic approach in New Zealand. The paper presents the results of a case study aimed at developing an objective way of identifying what constitutes a rainfall storm event and at estimating the corresponding statistical properties of storms, using two selected automatic rainfall stations located in the Auckland region of New Zealand. Storm identification and the estimation of storm statistical properties are regarded as the first step in the development of analytical probabilistic models. The paper provides a recommendation about the definition of the storm inter-event time to be used in conjunction with the analytical probabilistic approach.
Keywords: hydrology, rainfall storm, storm inter-event time, New Zealand, stormwater management
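For readers unfamiliar with the storm-identification step described above, the sketch below separates an hourly rainfall record into storm events using a fixed inter-event time definition (IETD). It is only an illustration of the general idea: the 6-hour IETD and the rainfall values are assumptions, not the Auckland station data or the paper's recommended definition.

```python
import numpy as np

def separate_storms(rain_mm, ietd_hours=6, timestep_hours=1):
    """Split a rainfall series into storms; a new storm starts after >= ietd_hours of dry time.

    Returns a list of (start_index, duration_hours, volume_mm) tuples.
    """
    storms, start, last_wet = [], None, None
    gap_steps = int(ietd_hours / timestep_hours)
    for i, r in enumerate(rain_mm):
        if r > 0:
            if start is None:
                start = i
            last_wet = i
        elif start is not None and i - last_wet >= gap_steps:
            storms.append((start, (last_wet - start + 1) * timestep_hours,
                           float(np.sum(rain_mm[start:last_wet + 1]))))
            start = None
    if start is not None:                                   # close a storm still open at the end
        storms.append((start, (last_wet - start + 1) * timestep_hours,
                       float(np.sum(rain_mm[start:last_wet + 1]))))
    return storms

hourly_rain = np.array([0, 2, 5, 0, 0, 0, 0, 0, 0, 1, 3, 0, 0], dtype=float)  # hypothetical mm/h
print(separate_storms(hourly_rain))
```

From such a list, storm volume, duration and inter-event time statistics can then be compiled per station.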
Procedia PDF Downloads 344
52361 Processing Mild versus Strong Violations in Music: A Pilot Study Using Event-Related Potentials
Authors: Marie-Eve Joret, Marijn Van Vliet, Flavio Camarrone, Marc M. Van Hulle
Abstract:
Event-related potentials (ERPs) provide evidence that the human brain can process and understand music at a pre-attentive level. Music-specific ERPs include the early right anterior negativity (ERAN) and a late negativity (N5). This study aims to investigate this issue further using two types of syntactic manipulation in music: mild violations, containing no out-of-key tones, and strong violations, containing out-of-key tones. We will examine whether both manipulations elicit the same ERPs.
Keywords: ERAN, ERPs, music, N5 component, P3 component
Procedia PDF Downloads 275
52360 Traffic Safety and Risk Assessment Model by Analysis of Questionnaire Survey: A Case Study of S. G. Highway, Ahmedabad, India
Authors: Abhijitsinh Gohil, Kaushal Wadhvaniya, Kuldipsinh Jadeja
Abstract:
Road safety is a multi-sectoral and multi-dimensional issue. An effective model can assess the risk associated with highway safety. A questionnaire survey is essential for identifying the events or activities that create unsafe conditions for traffic on an urban highway. A questionnaire of standard questions covering vehicular, human, and infrastructure characteristics can be prepared, and responses can be collected in the field from road users across age groups. Each question, or event, holds a specific risk weightage, which contributes to an inappropriate and unsafe flow of traffic. The probability of occurrence of an event can be calculated from the data collected from road users. Finally, the risk score of each event can be calculated from its risk factor and its probability of occurrence, and the sum of the risk scores of the individual events gives the total risk score of a particular road. Standards for the risk score can be established, and the total risk score can be compared against them. Roads can thus be categorized by the associated risk and the level of traffic safety on them. With this model, one can assess the need for traffic safety improvement on a given road, and qualitative data can be analysed.
Keywords: probability of occurrence, questionnaire, risk factor, risk score
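The scoring logic described above (event risk score = risk weightage x probability of occurrence, summed over all surveyed events) can be written out in a few lines. The events, weightages and respondent counts below are hypothetical placeholders, not the S. G. Highway survey data.

```python
survey = {
    # event: (risk weightage, respondents reporting it, total respondents) -- all assumed values
    "jaywalking across the carriageway": (8, 72, 115),
    "two-wheelers riding against traffic": (9, 64, 115),
    "faded or missing lane markings": (5, 40, 115),
    "on-street parking near junctions": (6, 55, 115),
}

def road_risk_score(survey):
    total = 0.0
    for event, (weight, reported, n) in survey.items():
        p = reported / n                       # probability of occurrence of the event
        score = weight * p                     # risk score of the individual event
        total += score
        print(f"{event:38s} p = {p:.2f}  score = {score:.2f}")
    return total                               # total risk score of the road

print("Total road risk score:", round(road_risk_score(survey), 2))
```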
Procedia PDF Downloads 338
52359 Stage-Gate Based Integrated Project Management Methodology for New Product Development
Authors: Mert Kıranç, Ekrem Duman, Murat Özbilen
Abstract:
In order to complete new product development (NPD) activities on time and within budgetary constraints, NPD managers need a well-designed methodology. This study intends to create an integrated project management methodology for those who manage new product development projects. Within the scope of the study, four different management systems are combined: the schedule-oriented stage-gate method, risk management, change management, and earned value management. New product development is common in many different industries, such as the defense industry, construction, health care/dental, higher education, fast-moving consumer goods, white goods, electronic devices, marketing and advertising, and software development. All product manufacturers race against one another to introduce new products to the market. To produce a more competitive product, an optimal project management methodology is chosen and adapted to the company culture. The right methodology helps the company present a perfect product to customers at the right time. The benefits of the proposed methodology are discussed through an application at a company. As a result, the study shows how the integrated methodology improves efficiency and contributes to project success.
Keywords: project, project management, management methodology, new product development, risk management, change management, earned value, stage-gate
Procedia PDF Downloads 312
52358 Amplitude and Latency of P300 Component from Auditory Stimulus in Different Types of Personality: An Event Related Potential Study
Authors: Nasir Yusoff, Ahmad Adamu Adamu, Tahamina Begum, Faruque Reza
Abstract:
The P300 component of the event-related potential (ERP) reflects psycho-physiological processes in the human body. The present study aims to identify differences in the amplitude and latency of the P300 component elicited by auditory stimuli between ambivert and extravert personality types. Ambivert (N=20) and extravert (N=20) participants underwent ERP recording at the Hospital Universiti Sains Malaysia (HUSM) laboratory. Electroencephalogram data were recorded with an oddball paradigm, in which participants counted auditory standard and target tones, from nine electrode sites (Fz, Cz, Pz, T3, T4, T5, T6, P3 and P4) using the 128-channel HydroCel Geodesic Sensor Net. The P300 latencies of the target tones showed no significant group differences at any electrode. Similarly, the P300 latencies of the standard tones were not significantly different except at the Fz and T3 electrodes. Likewise, the P300 amplitudes of the target and standard tones showed no significant differences at any electrode site. Extraverts and ambiverts therefore show similar cognitive processing characteristics in the auditory task.
Keywords: amplitude, event related potential, P300 component, latency
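P300 amplitude and latency, the two measures compared above, are conventionally read off the averaged ERP waveform as the size and timing of the largest positive deflection in a post-stimulus window. The sketch below illustrates that convention on a synthetic waveform; the 250-500 ms window, the 250 Hz sampling rate and the signal itself are assumptions, not the study's recording parameters.

```python
import numpy as np

def p300_measures(erp_uv, fs_hz, window_ms=(250, 500)):
    """Return (amplitude in microvolts, latency in ms) of the largest positive peak in the window."""
    t_ms = np.arange(erp_uv.size) / fs_hz * 1000.0
    mask = (t_ms >= window_ms[0]) & (t_ms <= window_ms[1])
    idx = int(np.argmax(erp_uv[mask]))
    return erp_uv[mask][idx], t_ms[mask][idx]

# Synthetic averaged waveform sampled at 250 Hz with a positive peak near 320 ms
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 0.8, 1 / fs)
erp = 6.0 * np.exp(-((t - 0.32) ** 2) / (2 * 0.03 ** 2)) + rng.normal(0, 0.3, t.size)

amplitude, latency = p300_measures(erp, fs)
print(f"P300 amplitude ~ {amplitude:.1f} uV at ~ {latency:.0f} ms")
```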
Procedia PDF Downloads 377
52357 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network
Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi
Abstract:
Energy, delay, and bandwidth are the prime concerns of a wireless sensor network (WSN). Optimizing energy usage and using bandwidth efficiently are important issues in WSNs, and event-triggered data aggregation facilitates such optimization for the event-affected area of the network. Reliable delivery of critical information to the sink node is another major challenge. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSNs that extends the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) whenever an event is triggered, the triggering node selects the cluster head; (2) the cluster head gathers data from the sensor nodes within the cluster; (3) the cluster head identifies and classifies the events in the collected data using a Bayesian classifier; (4) the data are aggregated using statistical methods; (5) the cluster head discovers paths to the sink node based on residual energy, path distance, and bandwidth; (6) if the aggregated data are critical, the cluster head sends them over multiple paths for reliable communication; (7) otherwise, the aggregated data are transmitted towards the sink node over the single path with the highest bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time, and energy consumed for aggregation.
Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication
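Two of the numbered steps above, statistical aggregation at the cluster head (step 4) and route selection by residual energy, distance and bandwidth (steps 5-7), lend themselves to a small illustration. The sketch below is a loose, hypothetical rendering of those steps only: the scoring weights, node data and readings are invented, and the Bayesian event classification of step 3 is not shown.

```python
import statistics

def aggregate(readings):
    """Step 4: condense the cluster's raw readings into summary statistics."""
    return {
        "mean": statistics.fmean(readings),
        "stdev": statistics.pstdev(readings),
        "min": min(readings),
        "max": max(readings),
    }

def best_path(paths, w_energy=0.5, w_dist=0.3, w_bw=0.2):
    """Steps 5-7: prefer paths with more residual energy and bandwidth and shorter distance.
    A critical packet would instead be duplicated over several top-ranked paths."""
    def score(p):
        return w_energy * p["residual_energy"] - w_dist * p["distance"] + w_bw * p["bandwidth"]
    return max(paths, key=score)

readings = [24.1, 24.3, 23.9, 24.8, 24.0]                  # hypothetical temperature samples
paths = [
    {"id": "A", "residual_energy": 0.8, "distance": 4, "bandwidth": 2.0},
    {"id": "B", "residual_energy": 0.6, "distance": 2, "bandwidth": 3.5},
]
print(aggregate(readings))
print("forward aggregated packet over path", best_path(paths)["id"])
```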
Procedia PDF Downloads 450
52356 Social Movements and the Diffusion of Tactics and Repertoires: Activists' Network in Anti-Globalism Movement
Authors: Kyoko Tominaga
Abstract:
Non-governmental organizations (NGOs), non-profit organizations (NPOs), social enterprises, and other actors play an important role in governments' political decisions at the international level. In particular, the networks of such organizations and activists in civil society are important in influencing global politics. To solve complex social problems in the global era, diverse actors should cooperate with one another. Moreover, networks of protesters also contribute to the diffusion of tactics, information, and other resources of social movements. Based on findings from the study of International Trade Fairs (ITFs), the author analyzes the network of activists in the anti-globalism movement. This research focuses on the transition of the whole network of 54 activists in the protest event against the 2008 G8 summit in Japan. Their network is examined at three periods: before, during, and after the protest event. A mixed-methods design is used: the author derives hypotheses from social network analysis and evaluates them through analysis of interview data. The analysis yields two results. First, the more protesters participate in the various events during the protest, the more they build their network, and active protesters also maintain their network afterwards. The interview data show that active protesters are able to build their network and diffuse information because they communicate with other participants and understand that diverse issues are related. This paper reaches the same conclusion as previous research: protest events activate networks among political activists. However, some participants succeed in building their network while others do not. 'Networked' activists participate in various events over a short period of time and encourage the diffusion of information and of social movement tactics.
Keywords: social movement, global justice movement, tactics, diffusion
Procedia PDF Downloads 382
52355 Shaping the Image of Museum Events in the Digital Media Era: A Quantitative Analysis of the Cat-Themed ‘Night at the Museum’ Event
Authors: Shuyu Zhao
Abstract:
This study uses the cat-themed "Night at the Museum" event of the Shanghai Museum as a case to examine how museum events are portrayed across various digital news platforms. Grounded in communication and cultural creativity theories and employing a three-tier framing approach, this research provides an in-depth analysis of media strategies in cross-platform museum image building. Through a quantitative content analysis, the study investigates how digital media employ specific narrative strategies to shape public perception of museum events. The findings reveal a prevalent use of leadership framing, highlighting the museum's unique role in cultural dissemination. By combining elements of museum culture with a pet-friendly theme, the cat-themed "Night at the Museum" event serves as a distinctive example for exploring museum image construction within digital media. This study sheds light on how museum events, as unique cultural arenas, are positioned in the public mind, offering a fresh perspective on the promotion and image-building of museum activities.
Keywords: cultural communication, digital media, museum, framing theory
Procedia PDF Downloads 18
52354 Formal Implementation of Routing Information Protocol Using Event-B
Authors: Jawid Ahmad Baktash, Tadashi Shiroma, Tomokazu Nagata, Yuji Taniguchi, Morikazu Nakamura
Abstract:
The goal of this paper is to explore the use of formal methods for dynamic routing. The purpose of network communication with dynamic routing is to send a message from one node to others using specific protocols. In dynamic routing, connections are established using distance-vector protocols (Routing Information Protocol, Border Gateway Protocol), link-state protocols (Open Shortest Path First, Intermediate System to Intermediate System), and hybrid protocols (Enhanced Interior Gateway Routing Protocol). Proper verification becomes crucial with dynamic routing. Formal methods can play an essential role in routing, in the development of networks, and in the testing of distributed systems. Event-B is a formal technique that consists of rigorously describing the problem, introducing solutions or details in refinement steps to obtain a more concrete specification, and verifying that the proposed solutions are correct. The system is modeled in terms of an abstract state space, using variables with set-theoretic types and events that modify the state variables. Event-B is a variant of B that was designed for developing distributed systems. In Event-B, events consist of guarded actions that occur spontaneously rather than being invoked. The invariant state properties must be satisfied by the variables and maintained by the activation of the events.
Keywords: dynamic routing, RIP, formal method, Event-B, ProB
Procedia PDF Downloads 401
52353 Optimizing Coal Yard Management Using Discrete Event Simulation
Authors: Iqbal Felani
Abstract:
A coal-fired power plant has integrated facilities for handling coal from three separate coal yards to the bunkers of its eight power plant units. However, these facilities are currently not reliable enough to support the system. Management planned to invest in additional facilities to increase reliability. They also planned to adopt a single specification of coal for all units, called Single Quality Coal (SQC). This simulation compares the system before and after the improvement under two scenarios, First In First Out (FIFO) and Last In First Out (LIFO). Parameters such as stay time, reorder point, and safety stock are determined by the simulation. The discrete event simulation software Flexsim 5.0 is used to support the simulation. Based on the simulation, Single Quality Coal under the FIFO scenario has the shortest stay time, at 8.38 days.
Keywords: coal yard management, discrete event simulation, First In First Out, Last In First Out
Procedia PDF Downloads 671
52352 Methodology of Islamic Economics: Scope and Prospects
Authors: Ahmad Abdulkadir Ibrahim
Abstract:
Examining the methodology laid down for Islamic economics, its methods and instruments of analysis and even some of its basic assumptions, is a matter of paramount importance in the modern world. There is a need to examine the implications of different suggested definitions of Islamic economics, to explore its scope, and to attempt to outline its methodology. This paper attempts to deal with the definition of Islamic economics, its methodology, and its scope. It outlines the main methodological problem by addressing the question of whether Islamic economics calls for a methodology of its own or can be treated as an expanded economics. It also aims at drawing the attention of economists in the modern world to the need to consider the methodology of Islamic economics. The methodology adopted in this research is library research through the consultation of relevant literature, focusing on a thematic study of the subject matter, followed by an analysis and discussion of the contents of the materials used. It is concluded that there is a certain degree of inconsistency in the way assumptions that are perhaps alien to Islamic economics are incorporated. The paper also observes that there is a difference between Islamic economists and other (conventional) economists in the profession. An important conclusion is that Islamic economists need to rethink what economics is all about and whether we really have to create an alternative to economics in the form of Islamic economics or simply adopt an Islamic perspective on the same discipline.
Keywords: methodology, Islamic economics, conventional economics, Muslim economists, framework, knowledge
Procedia PDF Downloads 128
52351 Teaching Creative Thinking and Writing to Simultaneous Bilinguals: A Longitudinal Study of 6-7 Years Old English and Punjabi Language Learners
Authors: Hafiz Muhammad Fazalehaq
Abstract:
This paper documents the results of a longitudinal study of two bilingual children who speak English and Punjabi simultaneously. Their father is a native English speaker who speaks only English, whereas their mother speaks both English and Punjabi. At the age of six, these children had difficulty with creative thinking and, consequently, with creative writing, so the researcher's first task was to entice the children to think creatively. Various methodologies and techniques were used for this purpose, since creative thinking leads to creative writing. The children were first exposed to numerous sources, including videos, photographs, texts, and audio recordings, in order to get a taste of creative genres (stories in this case). They were encouraged to create their own stories, sometimes with photographs and sometimes using their favorite toys. At a second stage, they were asked to write about an event or incident. After that, they were motivated to create new stories and write them down. The length of their creative writing varied from a few sentences to two standard pages. After this six-month study, the researcher was able to develop a ten-step methodology for creating and enhancing the creative thinking and creative writing skills of the subjects under study. This ten-step methodology entices and motivates the learner to think creatively in order to produce a creative piece.
Keywords: bilinguals, creative thinking, creative writing, simultaneous bilingual
Procedia PDF Downloads 352
52350 Competing Risk Analyses in Survival Trials During COVID-19 Pandemic
Authors: Ping Xu, Gregory T. Golm, Guanghan (Frank) Liu
Abstract:
In the presence of competing events, traditional survival analysis may not be appropriate and can result in biased estimates, as it assumes independence between competing events and the event of interest. Instead, competing risk analysis should be considered to correctly estimate the survival probability of the event of interest and the hazard ratio between treatment groups. The COVID-19 pandemic has provided a potential source of competing risks in clinical trials, as participants in trials may experience COVID-related competing events before the occurrence of the event of interest, for instance, death due to COVID-19, which can affect the incidence rate of the event of interest. We have performed simulation studies to compare multiple competing risk analysis models, including the cumulative incidence function, the sub-distribution hazard function, and the cause-specific hazard function, to the traditional survival analysis model under various scenarios. We also provide a general recommendation on conducting competing risk analysis in randomized clinical trials during the era of the COVID-19 pandemic based on the extensive simulation results.
Keywords: competing risk, survival analysis, simulations, randomized clinical trial, COVID-19 pandemic
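The cumulative incidence function mentioned above is the standard quantity for the probability of the event of interest when a competing event (for example, a COVID-related death) can occur first. The sketch below is a hand-rolled nonparametric estimator of that function on made-up data, shown only to illustrate the concept; it is not the authors' simulation code, and in practice an established competing-risks package would normally be used.

```python
import numpy as np

def cumulative_incidence(time, event, cause=1):
    """Nonparametric CIF for `cause`; event codes: 0 = censored, 1 = event of interest,
    2 = competing event (e.g., a COVID-related death)."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk = len(time)
    surv = 1.0                # all-cause Kaplan-Meier survival just before t
    cif, out = 0.0, []
    for t in np.unique(time):
        here = time == t
        d_any = int(np.sum(here & (event > 0)))
        d_cause = int(np.sum(here & (event == cause)))
        cif += surv * d_cause / at_risk
        surv *= 1.0 - d_any / at_risk
        at_risk -= int(np.sum(here))
        out.append((t, cif))
    return out

times = [2, 3, 3, 5, 6, 7, 8, 10, 12, 15]   # hypothetical follow-up times
codes = [1, 2, 1, 0, 1, 2, 0, 1, 0, 1]      # hypothetical event codes
for t, ci in cumulative_incidence(times, codes):
    print(f"t = {t:>4.0f}   CIF = {ci:.3f}")
```

Comparing this curve with one minus the naive Kaplan-Meier estimate (which treats competing events as censoring) shows the bias the abstract refers to.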
Procedia PDF Downloads 188
52349 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Currently, in analyzing large-scale recurrent event data, there are many challenges such as memory limitations, unscalable computing time, etc. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators as the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of this proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.
Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
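The combining step in the approach above is typically an inverse-variance (precision-weighted) average of the subset estimates. The sketch below illustrates that weighting scheme on a toy problem; the "model fit" is deliberately just a sample mean with its variance, standing in for the parametric frailty model fits, and all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
full_data = rng.normal(loc=0.8, scale=2.0, size=1_000_000)   # stand-in for a large dataset

def fit_subset(x):
    """Stand-in for the per-subset maximum likelihood fit: returns (estimate, variance)."""
    return x.mean(), x.var(ddof=1) / x.size

def combine(fits):
    """Inverse-variance weighted combination of the subset estimators."""
    ests = np.array([est for est, _ in fits])
    weights = np.array([1.0 / var for _, var in fits])
    return float(np.sum(weights * ests) / weights.sum()), float(1.0 / weights.sum())

subsets = np.array_split(rng.permutation(full_data), 20)      # random division into 20 subsets
d_and_c_est, d_and_c_var = combine([fit_subset(s) for s in subsets])

print("divide-and-conquer estimate:", round(d_and_c_est, 5))
print("full-data estimate:         ", round(full_data.mean(), 5))
```

The two printed values agree closely, which is the asymptotic-equivalence property the abstract describes, while each subset fit only ever touches a fraction of the data.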
Procedia PDF Downloads 165
52348 Empirical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a process for managing energy consumption with a view to energy savings and energy efficiency. Non-intrusive load monitoring (NILM) is one load monitoring method used for disaggregation purposes: it identifies individual appliances by analysing whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, and appliance features are required for the accurate identification of household devices. In this work, we aim to develop a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on its power demand and then detecting the times at which each selected appliance changes state. To fit the capabilities of existing smart meters, we work with low-sampling-rate data at a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical tool that simulates the behaviour of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison with the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector of the values delimiting the appliance's state transitions. Appliance signatures are then formed from the extracted power, geometrical, and statistical features, and these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). Performance is measured using confusion-matrix-based metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
Keywords: general appliance model, non intrusive load monitoring, event detection, unsupervised techniques
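As a much-simplified picture of the event-detection step described above, the sketch below flags the time steps at which a one-minute power signal enters or leaves the power interval assumed to correspond to one appliance's operation, which is where its state-transition times come from. The power band, the signal and the direct use of the aggregate reading are all simplifying assumptions; the paper's actual pipeline (feature extraction and DTW-based identification) is not reproduced here.

```python
import numpy as np

def detect_state_transitions(power_w, band=(60.0, 90.0)):
    """Return (on_indices, off_indices): samples where the signal enters/leaves the appliance band."""
    inside = (power_w >= band[0]) & (power_w <= band[1])
    change = np.diff(inside.astype(int))
    on_idx = np.where(change == 1)[0] + 1
    off_idx = np.where(change == -1)[0] + 1
    return on_idx, off_idx

# One-minute samples (1/60 Hz) of household power in watts -- hypothetical values
power = np.array([20, 22, 21, 75, 78, 80, 77, 25, 23, 70, 72, 24, 22], dtype=float)
on_idx, off_idx = detect_state_transitions(power)
print("appliance switched on at minutes", on_idx, "and off at minutes", off_idx)
```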
Procedia PDF Downloads 82
52347 Factor Analysis Based on Semantic Differential of the Public Perception of Public Art: A Case Study of the Malaysia National Monument
Authors: Yuhanis Ibrahim, Sung-Pil Lee
Abstract:
This study attempts to identify the factors that contribute to the assessment of public art, specifically memorial monuments. Memorial monuments hold significant and rich messages, whether the intention of the art is to mark and commemorate an important event or to inform the younger generation about the past. A public monument should relate to the public and raise awareness about the significant issue it represents. Investigating the impact of existing public memorial art will therefore hopefully shed some light for the stakeholders of upcoming public art projects and help ensure that a lucid memorial message is delivered directly to the public. The public is the main actor, as the public is the fundamental purpose for which the art is created, and perception is framed as one of the reliable evaluation tools for assessing public art impact factors. The Malaysia National Monument was selected as the case study for the investigation. The public's perceptions were gathered using a questionnaire involving 115 participants (n = 115) to obtain keywords, and the semantic differential methodology (SDM) was then adopted to evaluate perceptions of the memorial monument. These perceptions were tested for reliability and then factorised using principal component analysis (PCA) to acquire a concise set of factors for the monument assessment. The results revealed four factors that influence the public's perception of the monument: aesthetic, audience, topology, and public reception. The study concludes by proposing these factors for the assessment of public memorial art in future public memorial projects, especially in Malaysia.
Keywords: factor analysis, public art, public perception, semantical differential methodology
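The analysis pipeline described above (semantic-differential ratings, a reliability check, then PCA-based factor extraction) can be sketched in a few lines. The bipolar scales, the 115 random ratings and the eigenvalue-greater-than-one retention rule below are illustrative assumptions, assuming Python with NumPy and scikit-learn; the real study retained four named factors from its own questionnaire.

```python
import numpy as np
from sklearn.decomposition import PCA

scales = ["ugly-beautiful", "chaotic-ordered", "cold-welcoming",
          "forgettable-memorable", "closed-accessible", "dated-timeless"]
rng = np.random.default_rng(42)
ratings = rng.integers(1, 8, size=(115, len(scales))).astype(float)   # 115 respondents, 7-point scales

# Standardize, then run PCA on the rating matrix
Z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0, ddof=1)
pca = PCA().fit(Z)

# Retain components with eigenvalue > 1 (Kaiser criterion), one common stopping rule
keep = int(np.sum(pca.explained_variance_ > 1.0))
print("components retained:", keep)
for i in range(keep):
    top = np.argsort(np.abs(pca.components_[i]))[::-1][:2]
    print(f"factor {i + 1}: loads mainly on {[scales[j] for j in top]}, "
          f"{pca.explained_variance_ratio_[i]:.0%} of variance")
```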
Procedia PDF Downloads 501
52346 Urban Resilience and Its Prioritised Components: Analysis of Industrial Township Greater Noida
Authors: N. Mehrotra, V. Ahuja, N. Sridharan
Abstract:
Resilience is an all-hazard and proactive approach that requires multidisciplinary input across the interrelated variables of the city system. This research aims to identify and operationalize indicators for assessment in the domains of institutions, infrastructure, and knowledge, all three operating in task-oriented community networks. This paper gives a brief account of the methodology developed for the assessment of urban resilience and its prioritized components for a target population within a newly planned urban complex integrating Surajpur and Kasna villages as nodes. People's perception of urban resilience has been examined by conducting a questionnaire survey among the target population of Greater Noida. As defined by experts, the urban resilience of a place is considered to be both a product and a process of operation to regain normalcy after a disturbance of a certain level. Based on this methodology, six indicators are identified that contribute to the perception of urban resilience, both as part of the process of evolution and as an outcome. The relative significance of the six R's has also been identified. The dependency among the various resilience indicators is explored in this paper, which helps generate new perspectives for future research in disaster management. Based on the stated factors, this methodology can be applied to assess the urban resilience requirements of a well-planned town, which is not an end in itself but calls for new beginnings.
Keywords: disaster, resilience, system, urban
Procedia PDF Downloads 458
52345 Performants: A Digital Event Manager-Organizer
Authors: Ioannis Andrianakis, Manolis Falelakis, Maria Pavlidou, Konstantinos Papakonstantinou, Ermioni Avramidou, Dimitrios Kalogiannis, Nikolaos Milios, Katerina Bountakidou, Kiriakos Chatzidimitriou, Panagiotis Panagiotopoulos
Abstract:
Artistic events, such as concerts and performances, are challenging to organize because they involve many people with different skill sets. Small and medium venues often struggle to afford the costs and overheads of booking and hosting remote artists, especially if they lack sponsors or subsidies. This limits the opportunities for both venues and artists, especially those outside of big cities. However, more and more research shows that audiences prefer smaller-scale events and concerts, which benefit local economies and communities. To address this challenge, our project “PerformAnts: Digital Event Manager-Organizer” aims to develop a smart digital tool that automates and optimizes the processes and costs of live shows and tours. By using machine learning, applying best practices and training users through workshops, our platform offers a comprehensive solution for a growing market, enhances the mobility of artists and the accessibility of venues and allows professionals to focus on the creative aspects of concert production.
Keywords: event organization, creative industries, event promotion, machine learning
Procedia PDF Downloads 87
52344 The Significance of Oranyan Festival among the Oyo Yoruba
Authors: Emmanuel Bole Akinpelu
Abstract:
A festival is a social event that takes place every year and showcases the culture and other social activities of a community or town. The Oranyan Festival is an annual event organized and celebrated in Oyo town in honour of Oranyan the Great, who is reputed to be the overall head of the kings of the Yoruba. The event is attended by people from all walks of life. The Oyo people are used to celebrating various cultural festivals, such as Ogun, Oya, Sango, Egungun, Obatala, and others; the Oranyan festival, however, is a recent development in honour of Oranyan, who was said to be powerful and an embodiment of a unique cultural tradition. The study examined the significance of the festival to the Oyo Yoruba group. Oyo Yoruba cultural heritage includes Ewi, Ijala, traditional food ('Amala and Gbegiri'), Ekun Iyawo (bridal chants), traditional music, traditional dance, the traditional game 'Ayo Olopon', Eke (traditional wrestling), and others. Data for this work were gathered from archival sources such as journals and relevant publications on Oyo Yoruba traditional art and culture. The study is of the opinion that the festival has influence over the religious, political, economic, and other aspects of modern-day traditions. The study also revealed that the Oranyan Festival gives people a better understanding of their rich cultural heritage and promotes unity among all and sundry, as well as peace among the people. In conclusion, the festival promotes the rich cultural heritage of the Oyo Yoruba both within and outside Nigeria and to the world at large.
Keywords: Yoruba Oyo, arts and culture, Oranyan, festival
Procedia PDF Downloads 302
52343 The Establishment of Probabilistic Risk Assessment Analysis Methodology for Dry Storage Concrete Casks Using SAPHIRE 8
Authors: J. R. Wang, W. Y. Cheng, J. S. Yeh, S. W. Chen, Y. M. Ferng, J. H. Yang, W. S. Hsu, C. Shih
Abstract:
To understand the risk to dry storage concrete casks in the cask loading, transfer, and storage phases, the purpose of this research is to establish a probabilistic risk assessment (PRA) analysis methodology for dry storage concrete casks using the SAPHIRE 8 code. This analysis methodology is used to study the dry storage systems of Taiwan's nuclear power plants (NPPs). The research process has three steps. First, data on the concrete casks and the Taiwan NPPs are collected. Second, the PRA analysis methodology is developed using SAPHIRE 8. Third, the PRA analysis is performed using this methodology. According to the analysis results, the maximum risk arises from the multipurpose canister (MPC) drop case.
Keywords: PRA, dry storage, concrete cask, SAPHIRE
Procedia PDF Downloads 212
52342 Case Study Analysis of 2017 European Railway Traffic Management Incident: The Application of System for Investigation of Railway Interfaces Methodology
Authors: Sanjeev Kumar Appicharla
Abstract:
This paper presents the results of modelling and analysing a European Railway Traffic Management System (ERTMS) safety-critical incident on the Cambrian Railway in the UK, using RAIB report 17/2019 as the primary input, in order to raise awareness of biases in the systems engineering process. The RAIB, the UK's independent accident investigator, published report RAIB 17/2019 giving the details of its investigation of the focal event in the form of the immediate cause, causal factors, and underlying factors, together with recommendations to prevent a repeat of the safety-critical incident on the Cambrian Line. The Systems for Investigation of Railway Interfaces (SIRI) is the methodology used to model and analyze the safety-critical incident. The SIRI methodology uses the Swiss cheese model to model the incident and identify latent failure conditions (potentially less-than-adequate conditions) by means of the management oversight and risk tree technique. The benefits of the SIRI methodology are threefold. First, it incorporates the 'heuristics and biases' approach, advanced by the 2002 Nobel laureate in Economic Sciences, Prof. Daniel Kahneman, into the management oversight and risk tree technique to identify systematic errors. Civil engineering and programme management railway professionals are aware of the role 'optimism bias' plays in programme cost overruns and are aware of bow-tie (fault and event tree) model-based safety risk modelling techniques; however, the role of systematic errors due to heuristics and biases is not yet appreciated, and this approach overcomes the omission of human and organizational factors from accident analysis. Second, the scope of the investigation includes all levels of the socio-technical system, including government, regulators, railway safety bodies, duty holders, signalling firms, transport planners, and front-line staff, so that lessons are learned at the decision-making and implementation levels as well. Third, the author's past accident case studies are supplemented with evidence drawn from practitioners' and academic researchers' publications. This supports a discussion of the role of systems thinking in improving decision-making and risk management processes and practices in the IEC 15288 systems engineering standard and in industrial contexts such as GB railways and artificial intelligence (AI).
Keywords: accident analysis, AI algorithm internal audit, bounded rationality, Byzantine failures, heuristics and biases approach
Procedia PDF Downloads 188
52341 Hedging and Corporate Governance: Lessons from the Financial Crisis
Authors: Rodrigo Zeidan
Abstract:
The paper identifies failures of decision making and corporate governance that allowed non-financial companies around the world to develop hedging strategies leading to hefty losses in the aftermath of the financial crisis. The sample comprises 346 companies from 10 international markets, of which 49 companies (and a subsample of 13 distressed companies) lost a combined US$18.9 billion. An event study shows that most companies that reported losses in derivatives experienced negative abnormal returns, including a number of companies in which the effect persisted after a year. The results of a probit model indicate that the lack of a formal hedging policy, the absence of monitoring of the CFOs, and considerations of hubris and remuneration contribute to the mismanagement of hedging policies.
Keywords: risk management, hedging, derivatives, monitoring, corporate governance structure, event study, hubris
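The event study referred to above follows the standard market-model recipe: estimate each firm's alpha and beta over a pre-event window, compute abnormal returns around the announcement, and cumulate them. The sketch below illustrates that recipe on a single simulated firm; the return series, window lengths and imposed price reaction are invented, not the paper's sample of 346 firms or its announcement dates.

```python
import numpy as np

rng = np.random.default_rng(7)
market = rng.normal(0.0004, 0.01, 300)                     # daily market returns (simulated)
firm = 0.0002 + 1.1 * market + rng.normal(0, 0.008, 300)   # firm returns with true beta = 1.1
firm[260:266] -= 0.012                                     # imposed negative reaction at the "event"

est_win = slice(0, 250)                                    # estimation window
evt_win = slice(258, 269)                                  # event window around day 260

beta, alpha = np.polyfit(market[est_win], firm[est_win], 1)        # market-model fit
abnormal = firm[evt_win] - (alpha + beta * market[evt_win])        # abnormal returns
car = abnormal.sum()                                               # cumulative abnormal return

resid = firm[est_win] - (alpha + beta * market[est_win])
t_stat = car / (np.std(resid, ddof=2) * np.sqrt(abnormal.size))    # simple CAR t-statistic
print(f"CAR over the event window: {car:.4f}  (t = {t_stat:.2f})")
```

In a full study the same calculation is repeated for every firm and event date, and the cross-sectional average CAR is tested; firm-level CARs can then feed explanatory models such as the probit regression mentioned in the abstract.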
Procedia PDF Downloads 442
52340 Optimal Pressure Control and Burst Detection for Sustainable Water Management
Authors: G. K. Viswanadh, B. Rajasekhar, G. Venkata Ramana
Abstract:
Water distribution networks play a vital role in ensuring a reliable supply of clean water to urban areas. However, they face several challenges, including pressure control, pump speed optimization, and burst event detection. This paper combines insights from two studies to address these critical issues in water distribution networks, focusing on the specific context of Kapra Municipality, India. The first part of the research concentrates on optimizing pressure control and pump speed in complex water distribution networks. It utilizes the EPANET-MATLAB Toolkit to integrate EPANET functionalities into the MATLAB environment, offering a comprehensive approach to network analysis. By optimizing pressure reducing valves (PRVs) and variable speed pumps (VSPs), this study achieves remarkable results. In the benchmark water distribution system (WDS), the proposed PRV optimization algorithm reduces average leakage by 20.64%, surpassing the previous achievement of 16.07%. When applied to the South-Central and East zone WDSs of Kapra Municipality, it identifies PRV locations previously missed by existing algorithms, resulting in average leakage reductions of 22.04% and 10.47%. These reductions translate into significant daily water savings, enhancing water supply reliability and reducing energy consumption. The second part of the research addresses the pressing issue of burst event detection and localization within the water distribution system. Burst events are a major contributor to water losses and repair expenses. The study employs wireless sensor technology to monitor pressure and flow rate in real time, enabling the detection of pipeline abnormalities, particularly burst events. The methodology relies on transient analysis of pressure signals, utilizing cumulative sum (CUSUM) and wavelet analysis techniques to robustly identify burst occurrences. To enhance precision, burst event localization is achieved through analysis of the time differentials in the arrival of negative pressure waveforms across distinct pressure-sensing points, aided by nodal matrix analysis. To evaluate the effectiveness of this methodology, a PVC water pipeline test bed is employed, demonstrating the algorithm's success in detecting pipeline burst events at flow rates of 2-3 l/s. Remarkably, the algorithm achieves a localization error of merely 3 meters, outperforming previously established algorithms. This research presents a significant advancement in efficient burst event detection and localization within water pipelines, holding the potential to markedly curtail water losses and the concomitant financial implications. In conclusion, this combined research addresses critical challenges in water distribution networks, offering solutions for optimizing pressure control and pump speed and for detecting and localizing burst events. These findings contribute to the enhancement of water distribution systems, resulting in improved water supply reliability, reduced water losses, and substantial cost savings. The integrated approach presented in this paper holds promise for municipalities and utilities seeking to improve the efficiency and sustainability of their water distribution networks.
Keywords: pressure reducing valve, complex networks, variable speed pump, wavelet transform, burst detection, CUSUM (cumulative sum), water pipeline monitoring
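Two pieces of the burst-monitoring methodology above can be sketched compactly: a one-sided CUSUM detector that flags the sudden pressure drop caused by a burst, and localization from the arrival-time difference of the negative pressure wave at two sensors. The thresholds, the assumed 350 m/s wave speed, the sensor spacing and the simulated signal below are all illustrative, and the wavelet and nodal-matrix parts of the methodology are not shown.

```python
import numpy as np

def cusum_drop_detector(pressure, k=1.0, h=8.0, baseline_samples=20):
    """Return the first index where the cumulative sum of downward deviations exceeds h
    (k and h are expressed in units of the baseline standard deviation)."""
    x = np.asarray(pressure, float)
    mu = x[:baseline_samples].mean()
    sd = x[:baseline_samples].std(ddof=1) or 1.0
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (mu - xi) / sd - k)
        if s > h:
            return i
    return None

def locate_burst(t_a, t_b, sensor_spacing_m, wave_speed_ms=350.0):
    """Distance of the burst from sensor A, from arrival times at sensors A and B."""
    return (sensor_spacing_m + wave_speed_ms * (t_a - t_b)) / 2.0

fs = 100.0                                           # sampling rate in Hz (assumed)
rng = np.random.default_rng(3)
p = np.full(600, 3.0) + rng.normal(0, 0.01, 600)     # ~3 bar steady pressure with noise
p[300:] -= 0.4                                       # pressure drop from a simulated burst

i_a = cusum_drop_detector(p)                          # arrival at sensor A
i_b = i_a + 12                                        # wave assumed to reach sensor B 0.12 s later
print("burst flagged at sample", i_a, "-> about",
      locate_burst(i_a / fs, i_b / fs, sensor_spacing_m=100.0), "m from sensor A")
```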
Procedia PDF Downloads 87
52339 A Case for Q-Methodology: Teachers as Policymakers
Authors: Thiru Vandeyar
Abstract:
The present study set out to determine how Q methodology may be used in an inclusive education policy development process. Utilising Q methodology as a strategy of inquiry, this qualitative instrumental case study explored how teachers, as a crucial but often neglected human resource, may be included in developing policy. A social constructivist lens and the theoretical moorings of Proudford's emancipatory approach to educational change, anchored in teachers' 'writerly' interpretation of policy text, were employed. Findings suggest, first, that Q method is a unique research approach for including teachers' voices in policy development and, second, that teachers' beliefs, attitudes, and professionalism in improving teaching and learning using ICT are integral to policy formulation. The study indicates that teachers have unique beliefs about which statements should constitute a school's information and communication technology (ICT) policy. Teachers' experiences are an extremely valuable resource and should not be ignored in the policy formulation process.
Keywords: teachers, q-methodology, education policy, ICT
Procedia PDF Downloads 85
52338 The Effects of 2016 Rio Olympics as Nation's Soft Power Strategy
Authors: Keunsu Han
Abstract:
Sport has been used as a valuable tool for countries to enhance their brand image and to pursue higher political interests. The Olympic Games are one of the best examples of a mega sport event used to achieve such national purposes. The term 'soft power', coined by Nye, refers to a country's ability to persuade and attract foreign audiences through non-coercive means, such as cultural, diplomatic, and economic instruments. This concept of soft power provides significant answers as to why countries are willing to host a mega sport event such as the Olympics. This paper reviews Nye's concept of soft power as the theoretical framework of the study in order to understand the critical motivations for countries to host the Olympics, and it examines the effects of the 2016 Rio Olympics as the state's soft power strategy. Through analysis of data including media, government, and private-sector documents, this research analyzes both the negative and positive aspects of the nation's image created during the Rio Olympics and discusses the effects of the Games as Brazil's chance to showcase its soft power by highlighting the best the state has to present.
Keywords: country brand, olympics, soft power, sport diplomacy, mega sport event
Procedia PDF Downloads 459
52337 Conflation Methodology Applied to Flood Recovery
Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong
Abstract:
Current flooding risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding and its long-term effects on communities are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance, or than averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by each individual distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, the conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources, severe flooding events and nuisance flooding events.
Keywords: community resilience, conflation, flood risk, nuisance flooding
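The conflation operation defined above, a single distribution formed from the normalized product of the individual probability density functions, can be illustrated numerically. The sketch below conflates two exponential recovery-time distributions with invented means (severe-event recovery versus nuisance-flooding recovery); it is only a demonstration of the operation, not the &FR model itself.

```python
import numpy as np

x = np.linspace(0, 60, 6001)                      # recovery time in days
severe = (1 / 20.0) * np.exp(-x / 20.0)           # assumed mean 20-day recovery after a severe event
nuisance = (1 / 3.0) * np.exp(-x / 3.0)           # assumed mean 3-day recovery after nuisance flooding

product = severe * nuisance
conflated = product / np.trapz(product, x)        # conflation: normalized product of the PDFs

mean_conflated = np.trapz(x * conflated, x)
print("conflated mean recovery time:", round(mean_conflated, 2), "days")
# For exponentials the product is again exponential with the rates added,
# so the mean is close to 1 / (1/20 + 1/3), about 2.61 days.
```

The result sits between the two parent distributions and is pulled toward the one with the smaller variance, which is the behaviour the abstract describes.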
Procedia PDF Downloads 103