Search results for: event quantification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1687

1657 A Study of Mandarin Ba Constructions from the Perspective of Event Structure

Authors: Changyin Zhou

Abstract:

Ba constructions are a special type of construction in Chinese, whose syntactic behavior is closely related to their event-structural properties. Existing studies that treat the semantic function of Ba as causative have difficulty explaining the discrepancy in expressing causativity between Ba constructions and their corresponding constructions without Ba. This paper holds that Ba in Ba constructions is a functional category expressing affectedness, which can be positive or negative; the functional category Ba expressing negative affectedness has the semantic property of being 'expected'. The precondition of a Ba construction is the boundedness of the event concerned. Holding that motion events and change-of-state events are parallel, this paper proposes a syntactic model based on the notions of boundedness and affectedness, discusses the transformations between Ba constructions and the related resultative constructions, and derives the various Ba constructions concerned.

Keywords: affectedness, Ba constructions, boundedness, event structure, resultative constructions

Procedia PDF Downloads 400
1656 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory

Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan

Abstract:

Data fusion technology can be an effective way to extract useful information from multiple sources of data, and it has been widely applied in various applications. This paper presents a multimedia data fusion approach for event detection in Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. Two types of data enter the fusion: features extracted from text with the bag-of-words method, weighted by term frequency-inverse document frequency (TF-IDF), and visual features extracted with the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is then applied to fuse the information from these two sources. Our experiments indicate that, compared to approaches using an individual data source, the proposed data fusion approach increases prediction accuracy for event detection: the proposed method achieved an accuracy of 0.97, compared with 0.93 using text only and 0.86 using images only.
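As a minimal sketch of the fusion step the abstract describes, the following applies Dempster's rule of combination to two mass functions over a two-hypothesis frame {event, no_event}; the mass values standing in for the text (TF-IDF) and image (SIFT) classifier outputs are hypothetical, not the paper's results.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses of every pair of focal elements,
    keep non-empty intersections, and renormalise by the conflict mass."""
    fused, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

E, N = frozenset({"event"}), frozenset({"no_event"})
THETA = E | N  # frame of discernment

# hypothetical masses from the text and image classifiers
m_text = {E: 0.7, N: 0.2, THETA: 0.1}
m_image = {E: 0.6, N: 0.3, THETA: 0.1}
m_fused = dempster_combine(m_text, m_image)
```

With these illustrative masses the fused belief in the event exceeds that of either source alone, mirroring the accuracy gain from fusion that the abstract reports.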

Keywords: data fusion, Dempster-Shafer theory, data mining, event detection

Procedia PDF Downloads 375
1655 Quantification of Lustre in Textile Fibers by Image Analysis

Authors: Neelesh Bharti Shukla, Suvankar Dutta, Esha Sharma, Shrikant Ralebhat, Gurudatt Krishnamurthy

Abstract:

Lustre is a key physical attribute of textile fibers. It is a complex phenomenon arising from the interaction of light with fibers, yarns and fabrics, and is perceived as the contrast between bright areas (specular reflection) and duller backgrounds (diffuse reflection). The lustre of fibers is affected by their surface structure, morphology and cross-section profile, as well as by the presence of any additives. Because of these complexities, objective instruments such as gloss meters do not give reproducible quantification of lustre, and other instruments such as SAMBA hair systems are expensive. Lustre quantification has therefore largely remained subjective, judged visually by experts but prone to error. In this work, a physics-based approach was conceptualized and demonstrated: an image-analysis technique that quantifies visually observed differences in the lustre of fibers. Cellulosic fibers produced by different processes, with visually different levels of lustre, were photographed under controlled optics, and the images were analyzed with configured software. The ratio of the intensity of light from bright (specular reflection) areas to that from dull (diffuse reflection) areas was used to represent lustre numerically. Samples that were not easily distinguishable visually were then evaluated with the technique, establishing that quantification of lustre is feasible.
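A rough sketch of the specular-to-diffuse intensity ratio described above: pixels in the top few percent of brightness are treated as specular and the rest as diffuse background. The intensity profiles below are synthetic stand-ins for fibre photographs, and the 5% specular cut-off is an assumption, not the paper's calibration.

```python
def lustre_ratio(pixels, specular_frac=0.05):
    """Ratio of mean intensity in the brightest (specular) fraction of
    pixels to the mean of the duller (diffuse) remainder; a higher ratio
    indicates higher lustre. `pixels` is a flat list of grey levels."""
    ordered = sorted(pixels)
    cut = int(len(ordered) * (1 - specular_frac))
    diffuse, specular = ordered[:cut], ordered[cut:]
    return (sum(specular) / len(specular)) / (sum(diffuse) / len(diffuse))

# synthetic intensity profiles (hypothetical, standing in for fibre images)
dull = [100 + (i % 40) for i in range(4000)]      # matte background only
glossy = dull[:3600] + [250] * 400                # same background + highlights
```

On these synthetic profiles the glossy sample scores a clearly higher ratio than the dull one, which is the contrast the technique exploits.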

Keywords: lustre, fibre, image analysis, measurement

Procedia PDF Downloads 148
1654 Evolving Knowledge Extraction from Online Resources

Authors: Zhibo Xiao, Tharini Nayanika de Silva, Kezhi Mao

Abstract:

In this paper, we present an evolving knowledge extraction system named AKEOS (Automatic Knowledge Extraction from Online Sources). AKEOS consists of two modules: a one-time learning module and an evolving learning module. The one-time learning module takes in a user query and automatically harvests knowledge from unstructured online resources in an unsupervised way; its output is a structured vector representing the harvested knowledge. The evolving learning module automatically schedules and performs repeated one-time learning to extract the newest information and track the development of an event, and it summarizes the knowledge learned at different time points to produce a final knowledge vector about the event. With evolving learning, we are able to visualize the key information of an event, discover trends, and track its development.

Keywords: evolving learning, knowledge extraction, knowledge graph, text mining

Procedia PDF Downloads 434
1653 Development and Validation of a Liquid Chromatographic Method for the Quantification of Related Substance in Gentamicin Drug Substances

Authors: Sofiqul Islam, V. Murugan, Prema Kumari, Hari

Abstract:

Gentamicin is a broad-spectrum, water-soluble aminoglycoside antibiotic produced by fermentation of the microorganism Micromonospora purpurea. It is widely used for the treatment of infections caused by both gram-positive and gram-negative bacteria. Gentamicin consists of a mixture of aminoglycoside components, namely C1, C1a, C2a, and C2. The molecular structures of Gentamicin and its related substances lack a chromophore, which makes detection of these components critical and challenging. In this study, a simple reversed-phase high performance liquid chromatographic (RP-HPLC) method using an ultraviolet (UV) detector was developed and validated for quantification of the related substances present in Gentamicin drug substances. Separation was achieved on a Thermo Scientific Hypersil Gold analytical column (150 x 4.6 mm, 5 µm particle size) with isocratic elution using methanol: water: glacial acetic acid: sodium hexane sulfonate (70:25:5:3 % v/v/v/w) as the mobile phase, at a flow rate of 0.5 mL/min, a column temperature of 30 °C, and a detection wavelength of 330 nm. The four components of Gentamicin, C1, C1a, C2a, and C2, were well separated along with the related substances. The Limit of Quantification (LOQ) was found to be 0.0075 mg/mL. The accuracy of the method was satisfactory, with recoveries between 95-105% for the related substances. The correlation coefficient (≥ 0.995) shows a linear response against concentration over the range from the LOQ. Precision studies showed % Relative Standard Deviation (RSD) values of less than 5% for the related substances.
The method was validated in accordance with the International Conference on Harmonisation (ICH) guideline for parameters including system suitability, specificity, precision, linearity, accuracy, limit of quantification, and robustness. The proposed method is simple and suitable for the quantification of related substances in routine analysis of Gentamicin formulations.

Keywords: reversed phase-high performance liquid chromatographic (RP-HPLC), high performance liquid chromatography, gentamicin, isocratic, ultraviolet

Procedia PDF Downloads 139
1652 Cloud-Based Dynamic Routing with Feedback in Formal Methods

Authors: Jawid Ahmad Baktash, Mursal Dawodi, Tomokazu Nagata

Abstract:

With the rapid growth of cloud computing, formal methods have become a good choice for the refinement of message specification and verification for dynamic routing in the cloud. Cloud-based dynamic routing is becoming increasingly popular, and the responsibility for proper verification becomes crucial. We propose feedback in formal methods for dynamic routing in cloud computing; the model and topologies show, formally, how to send messages from the node with index zero to all others. Formal methods can play an essential role in routing, in the development of networks, and in the testing of distributed systems. Event-B is a formal technique that consists of describing the problem rigorously and introducing solutions or details in refinement steps. It is a variant of B designed for developing distributed systems and the message passing of dynamic routing. In Event-B and formal methods generally, events consist of guarded actions that occur spontaneously rather than being invoked.

Keywords: cloud, dynamic routing, formal method, Pro-B, event-B

Procedia PDF Downloads 386
1651 A Simulation Tool for Projection Mapping Based on Mapbox and Unity

Authors: Noriko Hanakawa, Masaki Obana

Abstract:

A simulation tool is proposed for large-scale projection mapping events. The tool has four main functions based on Mapbox and Unity: building a 3D model of a real city with Mapbox, projecting movies onto buildings in the city with Unity, sending a movie from a PC to a virtual projector, and fitting the projected movie to the target buildings. The tool was applied to a real projection mapping event held in 2019. That event had a serious problem with the projection onto the target building: extra tents set up in front of the building became obstacles to the projection. The simulation tool can reproduce the problems of the event; had it been available before the 2019 event, the problem of the tents could have been avoided. In addition, we confirmed that the tool is useful for planning future projection mapping events so as to avoid obstacles such as utility poles, trees, and monuments.

Keywords: projection mapping, projector position, real 3D map, avoiding obstacles

Procedia PDF Downloads 174
1650 Sleep Apnea Hypopnea Syndrome Diagnosis Using Advanced ANN Techniques

Authors: Sachin Singh, Thomas Penzel, Dinesh Nandan

Abstract:

Accurate diagnosis of Sleep Apnea Hypopnea Syndrome (SAHS) is a difficult problem for human experts because of variability among persons and unwanted noise. This paper proposes diagnosing SAHS using airflow, ECG, pulse, and SaO2 signals. Features of each of these signals are extracted using statistical methods and ANN learning methods, and the extracted features are used to approximate the patient's Apnea Hypopnea Index (AHI) from sample signals in the model. Advanced signal processing is also applied to the snore sound signal to locate snore events, and the SaO2 signal is used to decide whether a detected snore event is true or noise. Finally, the AHI is calculated from the true snore events detected. Experimental results show that sensitivity can reach 96% and specificity 96% for AHI ≥ 5.
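The two quantities the abstract combines can be sketched in a few lines: confirming candidate snore events against SaO2 desaturations, then computing the AHI as confirmed events per hour of sleep. The event times and the 30-second tolerance window are hypothetical illustrations, not the paper's parameters.

```python
def apnea_hypopnea_index(event_times, total_sleep_hours):
    """AHI = number of respiratory events per hour of sleep."""
    return len(event_times) / total_sleep_hours

def confirm_snore_events(snore_times, desat_times, tolerance=30):
    """Keep a detected snore event only if an SaO2 desaturation occurs
    within `tolerance` seconds of it, rejecting noise (times in seconds)."""
    return [t for t in snore_times
            if any(abs(t - d) <= tolerance for d in desat_times)]

snore = [120, 400, 1000, 5000]     # candidate snore events from the audio signal
desat = [130, 990, 5015]           # SaO2 desaturation times
true_events = confirm_snore_events(snore, desat)
ahi = apnea_hypopnea_index(true_events, total_sleep_hours=6.0)
```

Here the event at 400 s has no supporting desaturation and is discarded as noise; the resulting AHI of 0.5 would fall below the diagnostic threshold of 5 mentioned in the abstract.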

Keywords: neural network, AHI, statistical methods, autoregressive models

Procedia PDF Downloads 97
1649 Quality by Design in the Optimization of a Fast HPLC Method for Quantification of Hydroxychloroquine Sulfate

Authors: Pedro J. Rolim-Neto, Leslie R. M. Ferraz, Fabiana L. A. Santos, Pablo A. Ferreira, Ricardo T. L. Maia-Jr., Magaly A. M. Lyra, Danilo A F. Fonte, Salvana P. M. Costa, Amanda C. Q. M. Vieira, Larissa A. Rolim

Abstract:

Initially developed as an antimalarial agent, hydroxychloroquine (HCQ) sulfate is often used as a slow-acting antirheumatic drug in the treatment of connective tissue disorders. The United States Pharmacopeia (USP) 37 provides a reversed-phase HPLC method for quantification of HCQ, but this method was not reproducible, producing asymmetric peaks over a long analysis time. Peak asymmetry may cause an incorrect calculation of the sample concentration, and the analysis time is unacceptable, especially for the routine of a pharmaceutical industry. The aim of this study was to develop a fast, easy and efficient method for quantification of HCQ sulfate by High Performance Liquid Chromatography (HPLC) based on the Quality by Design (QbD) methodology. The method was optimized for peak symmetry using a response-surface plot as the Design of Experiments (DoE) and the tailing factor (TF) as the indicator for the Design Space (DS). The reference method was that described in USP 37 for quantification of the drug. For the optimized method, a 3³ factorial design was proposed, based on QbD concepts, and the DS was created with the TF in a range between 0.98 and 1.2 to delimit the ideal analytical conditions. Changes were made in the composition of the USP mobile phase (USP-MP), USP-MP: methanol (90:10 v/v, 80:20 v/v and 70:30 v/v), in the flow rate (0.8, 1.0 and 1.2 mL/min) and in the oven temperature (30, 35, and 40 °C). The USP method quantifies the drug only after a long run (40-50 minutes) and uses a high flow rate (1.5 mL/min), which increases the consumption of expensive HPLC-grade solvents. Its main problem, however, was the TF value of 1.8, which would be acceptable only if the drug were not a racemic mixture, since co-elution of the isomers can make peak integration unreliable.
The optimization therefore aimed to reduce the analysis time while improving peak resolution and TF. From the response-surface plot, the ideal analytical condition was confirmed as 45 °C, 0.8 mL/min and 80:20 USP-MP: methanol. The optimized HPLC method enabled the quantification of HCQ sulfate with a high-resolution peak and a TF of 1.17. This promotes clean co-elution of the HCQ isomers, ensuring accurate quantification of the raw material as a racemic mixture. The method also proved to be approximately 18 times faster than the reference method, using a lower flow rate and thereby further reducing solvent consumption and, consequently, the analysis cost. Thus, an analytical method for the quantification of HCQ sulfate was optimized using the QbD methodology. It is faster and more efficient than the USP method with respect to retention time and, especially, peak resolution, which supports its use for quantification of the drug as a racemic mixture without requiring separation of the isomers.
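The 3³ factorial design mentioned in the abstract enumerates every combination of three factors at three levels each, 27 runs in total. A minimal sketch, using the factor levels quoted in the text:

```python
from itertools import product

# the three factors and levels screened in the 3x3x3 (3^3) factorial design
methanol_pct = [10, 20, 30]       # % methanol added to the USP mobile phase
flow_ml_min = [0.8, 1.0, 1.2]     # flow rate, mL/min
oven_temp_c = [30, 35, 40]        # column temperature, deg C

design = list(product(methanol_pct, flow_ml_min, oven_temp_c))  # 27 runs
```

Each tuple in `design` is one chromatographic run whose tailing factor would be measured to build the response surface and delimit the design space.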

Keywords: analytical method, hydroxychloroquine sulfate, quality by design, surface area graphic

Procedia PDF Downloads 610
1648 The Stock Price Effect of Apple Keynotes

Authors: Ethan Petersen

Abstract:

In this paper, we analyze the volatility of Apple's stock from January 3, 2005 to October 9, 2014, focusing on the range from 30 days before each product announcement to 30 days after. Product announcements are filtered: announcements whose 60-day range is devoid of other events are separated, a filtration chosen to isolate and study a potential cross-effect. Apple keynotes have two significant dates, the day invitations to the event are received and the day of the event itself, so the statistical analysis is conducted for both invite-centered and event-centered time frames. A comparison to the VIX is made to determine whether the trend simply follows the market or deviates from it. Regardless of the filtration, we find a clear deviation from the market. Comparing the data sets reveals significantly different trends: isolated events show a steadily decreasing, erratic trend in volatility, whereas clustered events show an increasing, linear trend. According to the Efficient Market Hypothesis, we would expect a change when new information becomes publicly known, and the results of this study support this claim.

Keywords: efficient market hypothesis, event study, volatility, VIX

Procedia PDF Downloads 256
1647 Dow Polyols near Infrared Chemometric Model Reduction Based on Clustering: Reducing Thirty Global Hydroxyl Number (OH) Models to Less Than Five

Authors: Wendy Flory, Kazi Czarnecki, Matthijs Mercy, Mark Joswiak, Mary Beth Seasholtz

Abstract:

Polyurethane materials are present in a wide range of industrial segments such as furniture, building and construction, composites, automotive, electronics, and more. Dow is one of the leading manufacturers of the two main raw materials, isocyanates and polyols, used to produce polyurethane products, and a key player in polyurethane systems/formulations designed for targeted applications. In 1990, the first analytical chemometric models were developed and deployed in the Dow QC labs of the polyols business for the quantification of hydroxyl number (OH), water, cloud point, and viscosity. Over the years many models have been added; there are now over 140 models for quantification and hundreds for product identification, too many to support reasonably. There are 29 global models for the quantification of OH alone, across more than 70 products at many sites. An attempt was made to consolidate these into a single model. While the consolidated model showed good statistics across the entire OH range, several products showed a bias by ASTM E1655 in individual product validation. This project summary presents the strategy for global model updates for OH, reducing the number of quantification models from over 140 to 5 or fewer using chemometric methods. To find the best product groupings, we identify clusters by reducing spectra to a few dimensions via Principal Component Analysis (PCA) and Uniform Manifold Approximation and Projection (UMAP). Results from these cluster analyses and a separate validation set allowed Dow to reduce the number of models for predicting OH from 29 to 3 without loss of accuracy.
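As a toy illustration of the dimension-reduction step behind the clustering above: project samples onto the leading principal component and group them by their scores. This uses power iteration on the covariance matrix as a minimal stand-in for PCA; the four 2-point "spectra" are hypothetical, not NIR data.

```python
def first_component_scores(rows, iters=200):
    """Scores on the leading principal component, found by power iteration
    on the covariance matrix (a minimal stand-in for the PCA step)."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(xi[a] * xi[b] for xi in x) / n for b in range(d)]
           for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return [sum(xi[j] * v[j] for j in range(d)) for xi in x]

# two hypothetical product families with clearly separated spectra
spectra = [[1.0, 0.1], [1.1, 0.0], [5.0, 0.2], [5.2, 0.1]]
scores = first_component_scores(spectra)
clusters = [0 if s < 0 else 1 for s in scores]
```

The sign of each score splits the samples into the two families; in the real workflow each such cluster would get its own OH calibration model.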

Keywords: hydroxyl, global model, model maintenance, near infrared, polyol

Procedia PDF Downloads 108
1646 Memory and Narratives Rereading before and after One Week

Authors: Abigail M. Csik, Gabriel A. Radvansky

Abstract:

As people read through event-based narratives, they construct an event model that captures information about the characters, goals, location, time, and causality. Memory for such narratives is represented at different levels: the surface form, the textbase, and the event model. Rereading has been shown to decrease surface form memory while increasing textbase and event model memories. More generally, distributed practice has consistently shown memory benefits over massed practice for different types of materials, including texts. However, little research has investigated distributed practice of narratives at different inter-study intervals and its effects on these three levels of memory. Recent work in our lab has indicated that there may be dramatic changes in patterns of forgetting around one week, which may affect the three levels of memory. The present experiment aimed to determine the effects of rereading on the three levels of memory as a function of whether the texts were reread before versus after one week. Participants (N = 42) read a set of stories, re-read them either before or after one week (with an inter-study interval of three days, seven days, or fourteen days), and then took a recognition test, from which the three levels of representation were derived. Signal detection results reveal differential patterns at the three levels as a function of whether the narratives were re-read before or after one week. In particular, an ANOVA revealed that surface form memory was lower (p = .08) while textbase (p = .02) and event model memory (p = .04) were greater when narratives were re-read 14 days later than when they were re-read 3 days later. These results have implications for which types of memory benefit from distributed practice at various inter-study intervals.

Keywords: memory, event cognition, distributed practice, consolidation

Procedia PDF Downloads 194
1645 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network

Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi

Abstract:

Energy, delay, and bandwidth are the prime concerns of a wireless sensor network (WSN); energy usage optimization and efficient bandwidth utilization are important issues, and event-triggered data aggregation facilitates such optimization in the event-affected area. Reliable delivery of critical information to the sink node is another major challenge. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSNs that extends network lifetime by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from sensor nodes within the cluster. (3) The cluster head identifies and classifies the events in the collected data using a Bayesian classifier. (4) Data are aggregated using a statistical method. (5) The cluster head discovers paths to the sink node using residual energy, path distance, and bandwidth. (6) If the aggregated data are critical, the cluster head sends them over multiple paths for reliable communication. (7) Otherwise, the aggregated data are transmitted toward the sink node over the single path with the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios in terms of aggregation time, cluster formation time, and energy consumed for aggregation.
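Three of the numbered steps can be sketched directly: head selection by residual energy, statistical aggregation by averaging, and the critical/non-critical path choice. The node energies, readings, and path metrics below are hypothetical; the selection criteria are simplified versions of those named in the abstract.

```python
def select_cluster_head(nodes):
    """Step 1 (simplified): the node with the most residual energy leads."""
    return max(nodes, key=lambda n: n["energy"])

def aggregate(readings):
    """Step 4: report one statistical summary (the mean) instead of raw samples."""
    return sum(readings) / len(readings)

def pick_paths(paths, critical):
    """Steps 6-7: critical data goes over every path (multipath reliability);
    otherwise use the single best path by bandwidth plus residual energy."""
    if critical:
        return paths
    return [max(paths, key=lambda p: p["bandwidth"] + p["energy"])]

nodes = [{"id": 1, "energy": 0.4}, {"id": 2, "energy": 0.9}, {"id": 3, "energy": 0.7}]
head = select_cluster_head(nodes)
value = aggregate([21.0, 22.0, 23.0])
paths = [{"name": "A", "bandwidth": 5, "energy": 2},
         {"name": "B", "bandwidth": 3, "energy": 1}]
routine = pick_paths(paths, critical=False)   # one best path
urgent = pick_paths(paths, critical=True)     # all paths
```

Sending one mean instead of three raw readings is exactly the redundancy reduction that extends network lifetime.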

Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication

Procedia PDF Downloads 415
1644 Formal Implementation of Routing Information Protocol Using Event-B

Authors: Jawid Ahmad Baktash, Tadashi Shiroma, Tomokazu Nagata, Yuji Taniguchi, Morikazu Nakamura

Abstract:

The goal of this paper is to explore the use of formal methods for dynamic routing. The purpose of network communication with dynamic routing is to send a message from one node to others using specific protocols. In dynamic routing, connections are established by distance-vector protocols (Routing Information Protocol, Border Gateway Protocol), link-state protocols (Open Shortest Path First, Intermediate System to Intermediate System), or hybrid protocols (Enhanced Interior Gateway Routing Protocol). The responsibility for proper verification becomes crucial with dynamic routing. Formal methods can play an essential role in routing, in the development of networks, and in the testing of distributed systems. Event-B is a formal technique that consists of describing the problem rigorously, introducing solutions or details in refinement steps to obtain a more concrete specification, and verifying that the proposed solutions are correct. The system is modeled in terms of an abstract state space using variables with set-theoretic types and events that modify state variables. Event-B is a variant of B designed for developing distributed systems. In Event-B, events consist of guarded actions that occur spontaneously rather than being invoked; the invariant state properties must be satisfied by the variables and maintained by the activation of the events.
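The distance-vector behaviour RIP formalises can be sketched as a single table update: a node adopts a neighbour's advertised route when the advertised distance plus the link cost improves on its own entry, and RIP treats any metric above 15 as unreachable. The two-node topology below is hypothetical.

```python
INF = float("inf")

def rip_update(table, advertised, link_cost):
    """One distance-vector (RIP-style) update of a routing table from a
    neighbour's advertisement; metrics above 15 count as unreachable."""
    updated = dict(table)
    for dest, dist in advertised.items():
        candidate = dist + link_cost
        if candidate > 15:                 # RIP's "infinity" metric
            continue
        if candidate < updated.get(dest, INF):
            updated[dest] = candidate
    return updated

table_a = {"A": 0, "B": 1}                      # node A's current table
advert_b = {"A": 1, "B": 0, "C": 2, "D": 20}    # advertisement received from B
table_a = rip_update(table_a, advert_b, link_cost=1)
```

In an Event-B development, this update would be one guarded event, with the guard (a strictly better, reachable metric) and the invariant (all stored distances are shortest known) discharged as proof obligations.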

Keywords: dynamic routing, RIP, formal method, event-B, pro-B

Procedia PDF Downloads 382
1643 Requirements Definitions of Real-Time System Using the Behavioral Patterns Analysis (BPA) Approach: The Healthcare Multi-Agent System

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach using the Healthcare Multi-Agent System. The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.

Keywords: analysis, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases, Healthcare Multi-Agent System

Procedia PDF Downloads 520
1642 Object Negotiation Mechanism for an Intelligent Environment Using Event Agents

Authors: Chiung-Hui Chen

Abstract:

With advancements in science and technology, the concept of the Internet of Things (IoT) has gradually developed, and intelligent environments add intelligence to objects in the living space by using the IoT. In a smart environment where multiple users share the living space, different service requirements from different users can put the context-aware system in conflicting situations when deciding which services to provide. The purpose of establishing a communication and negotiation mechanism among objects in the intelligent environment is therefore to resolve such service conflicts among users. This study proposes a decision-making methodology with "Event Agents" at its core. When the sensor system receives information, it evaluates a user's current events and conditions; analyses object, location, time, and environmental information; calculates the priority of the object; and provides the user services based on the event. When events are not single but overlap, conflicts arise; this study adopts a "Multiple Events Correlation Matrix" to calculate degree values of incidents and support values for each object, uses these values as the basis for inferring system services, and further determines appropriate services when there is a conflict.

Keywords: internet of things, intelligent object, event agents, negotiation mechanism, degree of similarity

Procedia PDF Downloads 265
1641 Optimizing Coal Yard Management Using Discrete Event Simulation

Authors: Iqbal Felani

Abstract:

A coal-fired power plant has integrated facilities to handle coal from three separate coal yards to the bunkers of eight power plant units, but the facilities are no longer reliable enough to support the system. Management planned to invest in facilities to increase reliability, and also to adopt a single specification of coal for all units, called Single Quality Coal (SQC). This simulation compares the system before and after the improvement under two scenarios, First In First Out (FIFO) and Last In First Out (LIFO). Parameters such as stay time, reorder point, and safety stock are determined by the simulation. Flexsim 5.0, a discrete event simulation software package, is used for the study. Based on the simulation, Single Quality Coal under the FIFO scenario has the shortest stay time, at 8.38 days.
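The FIFO/LIFO comparison can be sketched with a toy queue model: batches arrive daily, one batch is burned per day once service starts, and the retrieval policy decides which batch leaves. The arrival schedule is hypothetical and far simpler than a Flexsim model of the real yard.

```python
from collections import deque

def stay_times(arrival_days, service_start, policy):
    """Days each coal batch spends in the yard when one batch is burned per
    day from `service_start` onward, retrieved FIFO or LIFO (toy model)."""
    pending = deque(sorted(arrival_days))
    yard, stays, day = deque(), [], service_start
    while pending or yard:
        while pending and pending[0] <= day:
            yard.append(pending.popleft())   # deliveries up to today
        if yard:
            batch = yard.popleft() if policy == "FIFO" else yard.pop()
            stays.append(day - batch)
        day += 1
    return stays

arrivals = [0, 1, 2, 3]   # hypothetical daily deliveries (day of arrival)
fifo = stay_times(arrivals, service_start=2, policy="FIFO")
lifo = stay_times(arrivals, service_start=2, policy="LIFO")
```

Note that the total stay is the same under both policies when every batch is eventually burned; what FIFO improves is the worst-case stay, which is what matters when coal quality degrades in storage.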

Keywords: coal yard management, discrete event simulation, first in first out, last in first out

Procedia PDF Downloads 639
1640 Intelligent Agent Travel Reservation System Requirements Definitions Using the Behavioral Patterns Analysis (BPA) Approach

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing an Intelligent Agent Reservation System (IARS). The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the development of the Behavioral Pattern Analysis (BPA) modeling methodology and of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.

Keywords: analysis, intelligent agent, reservation system, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases

Procedia PDF Downloads 455
1639 Risk Management of Natural Disasters on Insurance Stock Market

Authors: Tarah Bouaricha

Abstract:

The impact of the worst natural disasters that occurred between 2010 and 2014 is analysed, in terms of insured losses, on the S&P insurance index. Event study analysis is used to test whether natural disasters impact the insurance index stock market price; we find no negative impact on the price around the disaster events. To analyse the reaction of the insurance stock market, normal returns (NR), abnormal returns (AR), cumulative abnormal returns (CAR), cumulative average abnormal returns (CAAR), and a parametric test on AR and on CAR are used.
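The abnormal-return quantities named in the abstract reduce to two small computations: AR subtracts a market-model normal return from the observed return, and CAR accumulates the ARs over the event window. The returns and the market-model coefficients below are hypothetical (in practice alpha and beta are estimated over a pre-event window).

```python
def abnormal_returns(stock, market, alpha=0.0, beta=1.0):
    """AR_t = R_t - NR_t, with normal returns from the market model
    NR_t = alpha + beta * R_m,t (coefficients fixed here for illustration)."""
    return [r - (alpha + beta * m) for r, m in zip(stock, market)]

def cumulate(ar):
    """CAR_t: running sum of abnormal returns over the event window."""
    out, total = [], 0.0
    for a in ar:
        total += a
        out.append(total)
    return out

stock = [0.01, -0.03, 0.005]    # insurer returns around a disaster (hypothetical)
market = [0.01, -0.01, 0.0]     # benchmark returns
ar = abnormal_returns(stock, market)
car = cumulate(ar)
```

Averaging the CARs across many disaster events would give the CAAR that the parametric tests are applied to.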

Keywords: event study, natural disasters, insurance, reinsurance, stock market

Procedia PDF Downloads 364
1638 Impacts of Electronic Dance Music towards Social Harmony: The Malaysian Perspective

Authors: Kok Meng Ng, Sulung Veronica

Abstract:

Electronic Dance Music (EDM), a type of musical event much sought after by youth, is gaining popularity around the world. The emergence of this fashionable event has attracted considerable attention from the media and the public because of its high potential to create social problems and threaten the social harmony of a destination; for instance, two deaths occurred during EDM events in Malaysia, causing public consternation. The arguments over the impacts of such events on society are endless. This paper focuses on the impacts of EDM on social harmony in the Klang Valley area of Malaysia by scrutinizing contradictory statements from several experts and the local community. The study sampled 15-20 people representing different social backgrounds, through face-to-face and online interviews using the snowball sampling method. It helps to understand the social context as a whole through the impacts of EDM events held in Malaysia, and provides valuable information to EDM organizers as well as local authorities for proper event management to minimize the impacts of EDM on society as part of the sustainable growth of the event industry.

Keywords: electronic dance music, social harmony, impacts, Klang Valley

Procedia PDF Downloads 223
1637 Approach to Quantify Groundwater Recharge Using GIS Based Water Balance Model

Authors: S. S. Rwanga, J. M. Ndambuki

Abstract:

Groundwater quantification needs a method that is not only flexible but also reliable in order to accurately capture the spatial and temporal variability of recharge. As groundwater is dynamic and interdisciplinary in nature, an integrated approach of remote sensing (RS) and GIS techniques is very useful in groundwater management studies. Thus, the GIS-based water balance model WetSpass, together with remote sensing, can be used to quantify groundwater recharge. This paper discusses the concept of WetSpass in combination with GIS for the quantification of recharge, with a view to managing water resources in an integrated framework. The paper presents the simulation procedure and the expected output after simulation; preliminary data are presented from the GIS output only.
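At its core, a WetSpass-style model partitions the seasonal water balance of every raster cell, with recharge as the residual: R = P - I - S - ET (precipitation minus interception, surface runoff, and evapotranspiration). A minimal sketch with hypothetical per-cell values in mm:

```python
def seasonal_recharge(precip, interception, runoff, evapotranspiration):
    """Per-cell seasonal water balance (all terms in mm): recharge is the
    residual R = P - I - S - ET, applied cell by cell over the raster."""
    return precip - interception - runoff - evapotranspiration

# hypothetical seasonal values for two grid cells
cells = [
    {"P": 600.0, "I": 30.0, "S": 120.0, "ET": 380.0},   # e.g. a vegetated cell
    {"P": 450.0, "I": 10.0, "S": 200.0, "ET": 230.0},   # e.g. a built-up cell
]
recharge = [seasonal_recharge(c["P"], c["I"], c["S"], c["ET"]) for c in cells]
```

In the actual model each term varies with land use, soil, slope, and groundwater depth per cell, which is what the GIS layers supply.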

Keywords: groundwater, recharge, GIS, WetSpass

Procedia PDF Downloads 426
1636 Competing Risk Analyses in Survival Trials During COVID-19 Pandemic

Authors: Ping Xu, Gregory T. Golm, Guanghan (Frank) Liu

Abstract:

In the presence of competing events, traditional survival analysis may not be appropriate and can result in biased estimates, as it assumes independence between competing events and the event of interest. Instead, competing risk analysis should be considered to correctly estimate the survival probability of the event of interest and the hazard ratio between treatment groups. The COVID-19 pandemic has provided a potential source of competing risks in clinical trials, as participants may experience COVID-related competing events before the occurrence of the event of interest, for instance, death due to COVID-19, which can affect the incidence rate of the event of interest. We have performed simulation studies to compare multiple competing risk analysis models, including the cumulative incidence function, the sub-distribution hazard function, and the cause-specific hazard function, to the traditional survival analysis model under various scenarios. Based on the extensive simulation results, we also provide a general recommendation on conducting competing risk analysis in randomized clinical trials during the era of the COVID-19 pandemic.
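The first quantity compared above, the cumulative incidence function, can be sketched with a generic nonparametric (Aalen-Johansen) estimator; this is not the authors' simulation code, and the toy data at the end are invented for illustration.

```python
import numpy as np

def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of the cumulative incidence function (CIF)
    for one cause in the presence of competing events.
    events: 0 = censored, 1, 2, ... = competing event causes."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events)
    surv = 1.0   # overall Kaplan-Meier survival just before the current time
    cif = 0.0
    event_times, cif_values = [], []
    for t in np.unique(times):
        at_risk = int(np.sum(times >= t))
        d_any = int(np.sum((times == t) & (events != 0)))       # events of any cause at t
        d_cause = int(np.sum((times == t) & (events == cause)))  # events of this cause at t
        cif += surv * d_cause / at_risk   # increment CIF by P(survive to t-) * hazard of cause
        surv *= 1.0 - d_any / at_risk     # update overall survival past t
        event_times.append(float(t))
        cif_values.append(cif)
    return event_times, cif_values

# Toy data: 0 = censored, 1 = event of interest, 2 = competing event (e.g. COVID death)
t1, c1 = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0], cause=1)
print(c1[-1])  # CIF of the event of interest at the last observed time
```

Unlike one minus the Kaplan-Meier estimate, this CIF discounts by the overall survival, so the cause-specific incidences never sum to more than one.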

Keywords: competing risk, survival analysis, simulations, randomized clinical trial, COVID-19 pandemic

Procedia PDF Downloads 162
1635 Comparison Between a Droplet Digital PCR and Real Time PCR Method in Quantification of HBV DNA

Authors: Surangrat Srisurapanon, Chatchawal Wongjitrat, Navin Horthongkham, Ruengpung Sutthent

Abstract:

HBV infection causes a potentially serious public health problem. The ability to quantify HBV DNA concentration is important and has been improved continuously. In quantitative polymerase chain reaction (qPCR), several factors are difficult to standardize: the source of material, the calibration standard curve, and the PCR efficiency. Digital PCR (dPCR) is an alternative PCR-based technique for absolute quantification that uses Poisson statistics and does not require a standard curve. Therefore, the aim of this study is to compare the HBV DNA data sets generated by the dPCR and qPCR methods. All samples were quantified by Abbott's real-time PCR, and 54 samples with 2-6 log10 HBV DNA were selected for comparison with dPCR. Of these 54 samples, two outlier samples were defined as negative by dPCR, whereas 52 samples were positive by both tests. The difference between the two assays was less than 0.25 log IU/mL in 24/52 paired samples (46%), less than 0.5 log IU/mL in 46/52 samples (88%), and less than 1 log in 50/52 samples (96%). The correlation coefficient was r=0.788 with a P-value <0.0001. Compared to qPCR, the data generated by dPCR tended to be overestimated in samples with low HBV DNA concentrations and underestimated in samples with high viral loads. The variation in the dPCR measurements might be due to pre-amplification bias of the template. Moreover, a minor drawback of dPCR is the larger quantity of DNA that has to be used compared to qPCR. Since the technology is relatively new, the limitations of this assay will be improved.
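The Poisson statistics underlying absolute quantification in dPCR can be sketched as follows; the partition counts and the 0.85 nl partition volume are hypothetical values, and this is a generic calculation, not the instrument software used in this study.

```python
import math

def dpcr_copies(positive_partitions, total_partitions, partition_volume_nl=0.85):
    """Absolute quantification from digital PCR partition counts.
    Poisson correction: mean copies per partition = -ln(1 - p),
    where p is the fraction of positive partitions (a positive partition
    may hold more than one template copy, hence the correction).
    The 0.85 nl default partition volume is a hypothetical droplet size."""
    p = positive_partitions / total_partitions
    copies_per_partition = -math.log(1.0 - p)
    copies_per_ul = copies_per_partition / (partition_volume_nl * 1e-3)  # nl -> ul
    return copies_per_partition, copies_per_ul

cpp, cul = dpcr_copies(10000, 20000)
print(cpp, cul)
```

Note that no standard curve enters the calculation: the concentration estimate follows directly from the partition counts, which is why dPCR avoids the calibration factors that qPCR must standardize.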

Keywords: hepatitis B virus, real time PCR, digital PCR, DNA quantification

Procedia PDF Downloads 447
1634 Multi-Agent Railway Control System: Requirements Definitions of Multi-Agent System Using the Behavioral Patterns Analysis (BPA) Approach

Authors: Assem I. El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent Railway Control System (MARCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology, and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.

Keywords: analysis, multi-agent, railway control, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases

Procedia PDF Downloads 513
1633 Development of Colorimetric Based Microfluidic Platform for Quantification of Fluid Contaminants

Authors: Sangeeta Palekar, Mahima Rana, Jayu Kalambe

Abstract:

In this paper, a microfluidic-based platform for the quantification of contaminants in water is proposed. The proposed system uses microfluidic channels with an embedded environment for contaminant detection in water. Microfluidics-based platforms represent a clear avenue for innovation in fluid analysis, with applications that benefit from minimal cost and simplicity of fabrication. A polydimethylsiloxane (PDMS)-based microfluidic channel is fabricated using a soft lithography technique. Vertical and horizontal connections for fluid dispensing through the microfluidic channel are explored. The principle of colorimetry, which incorporates the use of Griess reagent for the detection of nitrite, has been adopted. Nitrite has high water solubility and water retention, due to which it has a greater potential to stay in groundwater, endangering aquatic life along with human health; it is therefore taken as a case study in this work. The developed platform also compares detection methodologies, using photodetectors to measure absorbance and image sensors to measure color change for the quantification of contaminants such as nitrite in water. The utilization of image processing techniques offers the advantage of operational flexibility, as the same system can be used to identify other contaminants present in water by introducing minor software changes.
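The photodetector readout path can be sketched with a Beer-Lambert style calculation; the calibration slope below is a hypothetical value one would fit from nitrite standards after the Griess reaction, not a constant reported in this work.

```python
import math

def absorbance(reference_intensity, transmitted_intensity):
    """Beer-Lambert absorbance A = log10(I0 / I) from photodetector readings."""
    return math.log10(reference_intensity / transmitted_intensity)

def nitrite_mg_per_l(a, slope=0.02, intercept=0.0):
    """Concentration from a linear Griess calibration curve A = slope*C + intercept.
    slope and intercept are hypothetical calibration constants."""
    return (a - intercept) / slope

a = absorbance(100.0, 10.0)   # blank reading 100, sample reading 10
print(a, nitrite_mg_per_l(a))
```

The image-sensor path would replace the intensity readings with mean channel values extracted from a region of interest, which is what makes the same hardware reusable for other colorimetric assays via software changes.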

Keywords: colorimetric, fluid contaminants, nitrite detection, microfluidics

Procedia PDF Downloads 173
1632 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Currently, in analyzing large-scale recurrent event data, there are many challenges such as memory limitations, unscalable computing time, etc. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators as the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of this proposed method. This approach is applied to a large real dataset of repeated heart failure hospitalizations.
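The combination step can be sketched as follows; inverse-variance weighting is one natural choice and is an assumption here, since the abstract does not specify the weights used.

```python
import numpy as np

def combine_estimates(estimates, variances):
    """Combine per-subset maximum likelihood estimates into a single
    divide-and-conquer estimator, weighting each subset by the inverse
    of its estimated variance. Returns the combined estimate and its variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    est = float(np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w))
    var = float(1.0 / np.sum(w))   # variance of the weighted combination
    return est, var

# Two subsets with equal precision: the combined estimate is the plain
# average, and the variance is halved.
est, var = combine_estimates([1.0, 3.0], [1.0, 1.0])
print(est, var)
```

Each subset's frailty model is fit independently, so the expensive likelihood maximizations can run in parallel and only the scalar summaries cross machine boundaries.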

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 135
1631 Quantification of Learned Non-Use of the Upper-Limb After a Stroke

Authors: K. K. A. Bakhti, D. Mottet, J. Froger, I. Laffont

Abstract:

Background: After a cerebrovascular accident (or stroke), many patients use excessive trunk movements to move their paretic hand towards a target (while the elbow is maintained flexed), even though they can use the upper limb when the trunk is restrained. This phenomenon is labelled learned non-use and is known to be detrimental to neuroplasticity and recovery. Objective: The aim of this study is to quantify learned non-use of the paretic upper limb during a hand reaching task using 3D movement analysis. Methods: Thirty-four participants post supratentorial stroke were asked to reach a cone placed in front of them at 80% of their arm length. The reaching movement was repeated 5 times with the paretic hand, and then 5 times with the less-impaired hand. This sequence was first performed with the trunk free, then with the trunk restrained. Learned non-use of the upper limb (LNUUL) was obtained from the difference in the amount of trunk compensation between the free-trunk condition and the restrained-trunk condition. Results: LNUUL was significantly higher for the paretic hand, with individual values ranging from 1% to 43%, and one-half of the patients showed an LNUUL higher than 15%. Conclusions: Quantification of LNUUL can be used to objectively diagnose patients who need trunk rehabilitation. It can also be used for monitoring rehabilitation progress. Quantification of LNUUL may guide upper-limb rehabilitation towards more optimal motor recovery, avoiding maladaptive trunk compensation and its consequences on neuroplasticity.
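The LNUUL computation described above can be sketched as follows; expressing trunk compensation as the share of hand displacement accounted for by forward trunk displacement is an illustrative assumption, and the trial values are invented.

```python
import numpy as np

def trunk_compensation(trunk_disp_mm, hand_disp_mm):
    """Trunk compensation: share of hand displacement accounted for by
    forward trunk displacement, averaged over trials (percent).
    This ratio definition is an assumption for illustration."""
    ratio = np.asarray(trunk_disp_mm, dtype=float) / np.asarray(hand_disp_mm, dtype=float)
    return 100.0 * float(np.mean(ratio))

def lnuul(trunk_free, hand_free, trunk_restrained, hand_restrained):
    """Learned non-use of the upper limb: difference in trunk compensation
    between the free-trunk and restrained-trunk conditions."""
    return (trunk_compensation(trunk_free, hand_free)
            - trunk_compensation(trunk_restrained, hand_restrained))

# Two hypothetical reaching trials per condition (displacements in mm)
score = lnuul([10, 10], [50, 50], [2, 2], [50, 50])
print(score)
```

A large positive score indicates that the patient compensates with the trunk only when free to do so, i.e. the arm capacity exists but is not used.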

Keywords: learned non-use, rehabilitation, stroke, upper limb

Procedia PDF Downloads 214
1630 Performants: A Digital Event Manager-Organizer

Authors: Ioannis Andrianakis, Manolis Falelakis, Maria Pavlidou, Konstantinos Papakonstantinou, Ermioni Avramidou, Dimitrios Kalogiannis, Nikolaos Milios, Katerina Bountakidou, Kiriakos Chatzidimitriou, Panagiotis Panagiotopoulos

Abstract:

Artistic events, such as concerts and performances, are challenging to organize because they involve many people with different skill sets. Small and medium venues often struggle to afford the costs and overheads of booking and hosting remote artists, especially if they lack sponsors or subsidies. This limits the opportunities for both venues and artists, especially those outside of big cities. However, more and more research shows that audiences prefer smaller-scale events and concerts, which benefit local economies and communities. To address this challenge, our project “PerformAnts: Digital Event Manager-Organizer” aims to develop a smart digital tool that automates and optimizes the processes and costs of live shows and tours. By using machine learning, applying best practices and training users through workshops, our platform offers a comprehensive solution for a growing market, enhances the mobility of artists and the accessibility of venues and allows professionals to focus on the creative aspects of concert production.

Keywords: event organization, creative industries, event promotion, machine learning

Procedia PDF Downloads 51
1629 Multi-Agent TeleRobotic Security Control System: Requirements Definitions of Multi-Agent System Using The Behavioral Patterns Analysis (BPA) Approach

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent TeleRobotic Security Control System (MTSCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology, and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.

Keywords: analysis, multi-agent, TeleRobotics control, security, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases

Procedia PDF Downloads 409
1628 HPTLC Based Qualitative and Quantitative Evaluation of Uraria picta Desv: A Dashmool Species

Authors: Hari O. Saxena, Ganesh

Abstract:

In the present investigation, chemical fingerprints of methanolic extracts of roots, stem, and leaves of Uraria picta were developed using the HPTLC technique. These fingerprints will be useful for authentication as well as for differentiating the species from adulterants. They will also serve as a biochemical marker for this valuable species in pharmaceutical industries and plant systematic studies. Roots, stem, and leaves of Uraria picta were further evaluated for quantification of the active ingredient lupeol in order to find alternatives to roots. Results showed a higher content of lupeol in stem (0.048%, dry wt.) as compared to roots (0.017%, dry wt.), suggesting the utilization of stem in place of roots. This will avoid uprooting of this valuable plant, which will ultimately promote its conservation.

Keywords: chemical fingerprints, lupeol, quantification, Uraria picta

Procedia PDF Downloads 225