Search results for: analysis and real time information about liquefaction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 46800

46260 Analysis of Real Time Seismic Signal Dataset Using Machine Learning

Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.

Abstract:

Because seismic signals closely resemble non-seismic signals, detecting earthquakes with conventional methods is difficult. To distinguish seismic events from non-seismic events based on their amplitude, our study processes the data coming from seismic sensors. The authors propose a robust noise suppression technique that makes use of a bandpass filter, an IIR Wiener filter, recursive short-term average/long-term average (STA/LTA), and Carl STA/LTA for event identification. The trigger ratio used in the proposed study to differentiate between seismic and non-seismic activity is then determined. The proposed work focuses on extracting significant features for machine learning-based seismic event detection, which motivated compiling a dataset of all features for the identification and forecasting of seismic signals. Because of the time complexity involved, we emphasize techniques for reducing the dimension of the feature vector. The proposed features were tested experimentally using a machine learning model, and the model performed well on unseen data. Finally, a demonstration on a hybrid dataset (captured by different sensors) shows how the model can also be employed in a real-time setting while lowering false alarm rates. The study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). Wideband seismic signals from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and at the Central University of Karnataka, respectively, make up the experimental dataset.
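
To make the detection step concrete, the following is a minimal sketch of a recursive STA/LTA trigger of the kind named above, assuming numpy; the window lengths, the synthetic trace, and the trigger threshold of 4.0 are illustrative choices, not the authors' values.

```python
import numpy as np

def recursive_sta_lta(signal, n_sta, n_lta):
    """Recursive short-term / long-term average ratio per sample."""
    sta, lta = 0.0, 1e-10
    c_sta, c_lta = 1.0 / n_sta, 1.0 / n_lta
    ratio = np.zeros(len(signal))
    for i, x in enumerate(signal):
        e = x * x                    # instantaneous energy
        sta += c_sta * (e - sta)     # short-term running average
        lta += c_lta * (e - lta)     # long-term running average
        ratio[i] = sta / lta
    return ratio

# Synthetic trace: background noise with an "event" burst in the middle.
rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 6000)
trace[3000:3200] += rng.normal(0, 8, 200)

r = recursive_sta_lta(trace, n_sta=50, n_lta=1000)
print("event detected:", bool((r > 4.0).any()))  # 4.0 = example trigger ratio
```

A Carl STA/LTA variant replaces the energy recursion with averages of the raw and rectified trace, but the trigger-ratio logic stays the same.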

Keywords: Carl STA/LTA, feature extraction, real time, dataset, machine learning, seismic detection

Procedia PDF Downloads 124
46259 VeriFy: A Solution to Implement Autonomy Safely and According to the Rules

Authors: Michael Naderhirn, Marco Pavone

Abstract:

Problem statement, motivation, and aim of work: So far, control algorithms have been developed by control engineers so that the controller fits a specification, verified by testing. When it comes to certifying an autonomous car in highly complex scenarios, the challenge is much greater, since such a controller must mathematically guarantee that it implements the rules of the road while also guaranteeing properties like safety and real-time executability. What if this demanding problem could be solved by combining formal verification and system theory? The aim of this work is to present a workflow that does so. Summary of the presented results / main outcomes: We show the use of an English-like language to transform the rules of the road into a system specification for an autonomous car. The language-based specifications are used to define system functions and interfaces. Based on these, a formal model is developed that correctly formalizes the specifications. On the other side, a mathematical model describing the system's dynamics is used to compute the system's reachable set, which in turn determines the system's input boundaries. A motion planning algorithm is then applied inside the system boundaries to find an optimized trajectory, in combination with the formal specification model, while satisfying the specifications. The result is a control strategy that can be applied in real time, independent of the scenario, with a mathematical guarantee of satisfying a predefined specification. We demonstrate the applicability of the method in simulated driving scenarios and discuss a potential certification. Originality, significance, and benefit: To the authors' best knowledge, this is the first automated workflow that combines a specification in an English-like language and a mathematical model, in a formally verified way, to synthesize a controller for potential real-time applications like autonomous driving.
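
As a toy illustration of the reachability step described above, the sketch below propagates an axis-aligned box of states for a discretized double-integrator vehicle under bounded acceleration and checks it against a speed-limit specification; the model, bounds, and limit are our own illustrative assumptions, not the paper's.

```python
import numpy as np

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, speed]
B = np.array([[0.0], [dt]])             # input: acceleration

def propagate_box(lo, hi, u_lo, u_hi):
    """One-step over-approximation of the reachable box under bounded input."""
    corners = []
    for x0 in (lo[0], hi[0]):
        for x1 in (lo[1], hi[1]):
            for u in (u_lo, u_hi):
                corners.append(A @ np.array([x0, x1]) + B.flatten() * u)
    corners = np.array(corners)
    return corners.min(axis=0), corners.max(axis=0)

lo, hi = np.array([0.0, 10.0]), np.array([2.0, 12.0])   # initial state set
for _ in range(30):                                     # 3-second horizon
    lo, hi = propagate_box(lo, hi, u_lo=-2.0, u_hi=0.5)

speed_limit = 13.9  # 50 km/h, an example "rule of the road"
print("specification satisfied:", bool(hi[1] <= speed_limit))
```

In the paper's workflow, the set computed this way bounds the inputs handed to the motion planner, so any trajectory found inside the bounds inherits the guarantee.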

Keywords: formal system verification, reachability, real time controller, hybrid system

Procedia PDF Downloads 241
46258 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Recently, quick response (QR) code payment systems have been gaining popularity. Many companies introduce new QR code payment services, and the services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in demographic information, usage information, and user value between services. In this study, we analyze real-world data provided by Nomura Research Institute, including demographic data of users and information on users' usage of two services, LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used to analyze and interpret the features of such data; however, the target data suffer from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values so that the features of the given data, represented in matrix form, can be understood. Moreover, for comparing the results of NMF analyses of two matrices, Discriminant NMF (DNMF) shows the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF and analyze the target data. As the interpretation, we show the differences in the features of users between LINE Pay and PayPay.
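
For readers unfamiliar with NMF under missing data, here is a minimal sketch of the masked multiplicative updates that an EM-style completion scheme such as EMNMF builds on, assuming numpy; the user-by-attribute matrix is random toy data, not the Nomura Research Institute dataset, and the discriminant (DNMF) step is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.random((100, 8))          # users x attributes (toy data)
M = rng.random(V.shape) > 0.2     # True where a value is observed
k = 3                             # number of latent factors

W = rng.random((V.shape[0], k))
H = rng.random((k, V.shape[1]))
Vm = np.where(M, V, 0.0)          # unknown cells contribute nothing

for _ in range(200):
    WH = W @ H
    H *= (W.T @ Vm) / (W.T @ (M * WH) + 1e-9)   # update H on observed cells
    WH = W @ H
    W *= (Vm @ H.T) / ((M * WH) @ H.T + 1e-9)   # update W on observed cells

print("masked reconstruction error:",
      round(float(np.linalg.norm(M * (V - W @ H))), 3))
```

After convergence, W @ H also supplies estimates for the missing cells, which is what allows the two completed service matrices to be compared.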

Keywords: data science, non-negative matrix factorization, missing data, quality of services

Procedia PDF Downloads 131
46257 Duplex Real-Time Loop-Mediated Isothermal Amplification Assay for Simultaneous Detection of Beef and Pork

Authors: Mi-Ju Kim, Hae-Yeong Kim

Abstract:

Product mislabeling and adulteration are growing concerns in processed meat products. Relatively inexpensive pork is adulterated into costlier meats such as beef for economic benefit, and these pork-related food fraud incidents raise economic, religious, and health concerns. In this study, a rapid on-site detection method using loop-mediated isothermal amplification (LAMP) was developed for the simultaneous identification of beef and pork. Specific LAMP primers for beef and for pork were designed targeting the mitochondrial D-loop region. The LAMP assay reaction was performed at 65 ℃ for 40 min. The specificity of each primer set was evaluated using DNA extracted from 13 animal species, including cattle and pig. The sensitivity of the duplex LAMP assay was examined by serial dilution of beef and pork DNA and by reference binary mixtures. The assay was applied to processed meat products containing beef and pork for monitoring. Each set of primers amplified only the targeted species, with no cross-reactivity with the other animal species. The limit of detection of the duplex real-time LAMP was 1 pg of DNA for each of beef and pork, and 1% pork in a beef-meat mixture. Commercial meat products that declared the presence of beef and/or pork on the label showed positive results for those species. The method successfully detected beef and pork simultaneously in processed meat products: the optimized duplex LAMP assay can identify both species in less than 40 min, and the portable real-time fluorescence device used in this study is applicable for on-site detection of beef and pork in processed meat products. Thus, the developed assay is considered an efficient tool for monitoring meat products.

Keywords: beef, duplex real-time LAMP, meat identification, pork

Procedia PDF Downloads 224
46256 Visualization-Based Feature Extraction for Classification in Real-Time Interaction

Authors: Ágoston Nagy

Abstract:

This paper introduces a method that uses unsupervised machine learning to visualize the feature space of a dataset in 2D, in order to find the most characteristic segments in the set. After dimension reduction, users can select clusters by manual drawing. Selected clusters are recorded into a data model that is used for later predictions based on real-time data. Predictions are made with supervised learning, using the Gesture Recognition Toolkit. The paper introduces two example applications: a semantic audio organizer for analyzing incoming sounds, and a gesture database organizer in which gestural data (recorded by a Leap Motion) is visualized for further manipulation.
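
A minimal sketch of the projection-and-selection idea, assuming scikit-learn and matplotlib are available; the audio features are random stand-ins for real descriptors, and the "drawn" region is a hard-coded polygon rather than an interactive one.

```python
import numpy as np
from sklearn.decomposition import PCA
from matplotlib.path import Path

rng = np.random.default_rng(2)
features = rng.normal(size=(500, 20))    # e.g. per-sound descriptors

xy = PCA(n_components=2).fit_transform(features)   # dimension reduction

# Stand-in for the cluster a user outlines by manual drawing.
polygon = Path([(-1, -1), (1, -1), (1, 1), (-1, 1)])
selected = polygon.contains_points(xy)

print(f"{int(selected.sum())} items recorded into the cluster model")
```

The selected items and their labels are what would then be handed to a supervised classifier such as one from the Gesture Recognition Toolkit.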

Keywords: gesture recognition, machine learning, real-time interaction, visualization

Procedia PDF Downloads 354
46255 An Automated Business Process Management for Smart Medical Records

Authors: K. Malak, A. Nourah, S. Liyakathunisa

Abstract:

Nowadays, healthcare services face many challenges, since they are becoming more complex and more in demand. Every detail of a patient's interactions with health care providers is maintained in Electronic Health Records (EHR) and healthcare information systems (HIS). However, most existing systems focus on documenting what happens in the manual health care process rather than on providing the highest quality patient care. Healthcare stakeholders can no longer rely on manual processes; to provide better patient care and efficient utilization of resources, healthcare processes must be automated wherever possible. In this research, a detailed survey and analysis of the existing health care systems in Saudi Arabia is performed, and an automated smart medical healthcare business process model is proposed. Business process management methods and rules are followed in discovery, information collection, analysis, redesign, implementation, and performance improvement analysis in terms of time and cost. The simulation results show that the proposed smart medical records system can improve the quality of service by reducing time and cost and increasing efficiency.

Keywords: business process management, electronic health records, efficiency, cost, time

Procedia PDF Downloads 342
46254 IoT-Based Process Model for Heart Monitoring Process

Authors: Dalyah Y. Al-Jamal, Maryam H. Eshtaiwi, Liyakathunisa Syed

Abstract:

Connecting health services with technology is in huge demand, as people's health is worsening day by day. Engaging new technologies such as the Internet of Things (IoT) in medical services can enhance patient care. Specifically, patients suffering from chronic diseases, such as cardiac patients, need special care and monitoring. Some efforts have previously been made to automate and improve patient monitoring systems; however, they have limitations and lack the real-time capability needed for chronic diseases. In this paper, an improved process model for a patient monitoring system specialized for cardiac patients is presented. A survey was distributed and interviews were conducted to gather the requirements for improving the cardiac patient monitoring system, and Business Process Model and Notation (BPMN) was used to model the proposed process. The proposed system uses IoT technology to assist doctors in remotely monitoring and following up with their heart patients in real time. To validate the effectiveness of the proposed solution, simulation analysis was performed using the Bizagi Modeler tool; the results show performance improvements in the heart monitoring process. For future work, the authors suggest extending the proposed system to cover all chronic diseases.
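
A minimal sketch of the kind of real-time rule such a process automates, with simulated smartwatch readings standing in for the IoT data stream; the thresholds are illustrative, not clinical guidance.

```python
import random
import time

LOW, HIGH = 50, 120               # example beats-per-minute limits

def read_heart_rate():
    """Stand-in for a smartwatch sensor reading."""
    return random.gauss(75, 20)

for _ in range(10):
    bpm = read_heart_rate()
    if not LOW <= bpm <= HIGH:
        print(f"ALERT: {bpm:.0f} bpm -> notify the cardiologist in real time")
    else:
        print(f"ok: {bpm:.0f} bpm")
    time.sleep(0.1)               # polling interval for the demo
```

In a BPMN model of the process, a check like this would sit between the "receive reading" and "notify doctor" activities.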

Keywords: IoT, process model, remote patient monitoring system, smart watch

Procedia PDF Downloads 332
46253 Integrating Artificial Neural Network and Taguchi Method on Constructing the Real Estate Appraisal Model

Authors: Mu-Yen Chen, Min-Hsuan Fan, Chia-Chen Chen, Siang-Yu Jhong

Abstract:

In recent years, real estate prediction and valuation have been topics of discussion in many developed countries. Improper hype created by investors leads to fluctuating real estate prices, affecting many consumers who wish to purchase their own homes. Therefore, scholars from various countries have conducted research on real estate valuation and prediction. Using the back-propagation neural network, which has been popular in recent years, together with the orthogonal arrays of the Taguchi method, this study aimed to find the optimal parameter combination among the different levels of the orthogonal array, so that the artificial neural network obtains the most accurate results. The experimental results demonstrate that the presented method outperforms traditional machine learning. They also show that the proposed model has the best predictive performance and can significantly reduce simulation time, since the best predictive results can be found more efficiently with fewer experiments. Users could thus predict a real estate transaction price that is not far from current actual prices.
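
A minimal sketch of the Taguchi screening idea, assuming scikit-learn: a standard L9(3^4) orthogonal array selects nine hyperparameter combinations for a back-propagation network instead of all 27; the factors, levels, and synthetic data are illustrative, not the study's.

```python
from sklearn.datasets import make_regression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=300, n_features=8, noise=10, random_state=0)

# First three columns of the standard L9(3^4) array, zero-indexed levels.
L9 = [(0, 0, 0), (0, 1, 1), (0, 2, 2), (1, 0, 1), (1, 1, 2),
      (1, 2, 0), (2, 0, 2), (2, 1, 0), (2, 2, 1)]
hidden = [(8,), (16,), (32,)]        # factor A: hidden layer size
lr     = [1e-3, 1e-2, 1e-1]          # factor B: learning rate
alpha  = [1e-5, 1e-4, 1e-3]          # factor C: L2 penalty

results = []
for a, b, c in L9:
    net = MLPRegressor(hidden_layer_sizes=hidden[a], learning_rate_init=lr[b],
                       alpha=alpha[c], max_iter=2000, random_state=0)
    results.append((cross_val_score(net, X, y, cv=3).mean(), a, b, c))

score, a, b, c = max(results)
print(f"best L9 run: R^2={score:.3f}, levels=({a}, {b}, {c})")
```

Nine runs instead of a full factorial is exactly where the reported saving in simulation time comes from.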

Keywords: artificial neural network, Taguchi method, real estate valuation model, investors

Procedia PDF Downloads 489
46252 Dams Operation Management Criteria during Floods: Case Study of Dez Dam in Southwest Iran

Authors: Ali Heidari

Abstract:

This paper presents principles for improving flood mitigation operation in multipurpose dams and maximizing reservoir performance during floods, with a focus on the real-time operation of gated spillways. The operation criteria include the safety of the dam during flood management, minimizing downstream flood risk by decreasing the flood hazard, and fulfilling water supply and the dam's other purposes over mid- and long-term horizons. The parameters deemed important include flood inflow, outlet capacity restrictions, downstream flood inundation damages, economic revenue of dam operation, and environmental and sedimentation restrictions. A simulation model was used to determine the real-time releases of the Dez dam, located on the Dez River in southwest Iran, considering the gate regulation curves of the gated spillway. The results of the simulation model show that it is possible to improve the procedures currently used in real-time dam operation, particularly by using gate regulation curves and early flood forecasting system results. The Dez dam operation data show that, in one of the best flood control records, 17% of the total active volume and flood control pool of the reservoir was not used to decrease the downstream flood hazard, despite the availability of a flood forecasting system.
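
A minimal sketch of the mass-balance simulation behind such an analysis; the inflow hydrograph, the volumes, and the gate regulation curve below are illustrative numbers, not Dez dam data.

```python
def gate_release(storage, capacity):
    """Toy gate regulation curve: release grows as the reservoir fills."""
    filling = storage / capacity
    return 800.0 + 4000.0 * max(0.0, filling - 0.7) / 0.3

capacity = 3.3e9                 # m^3 of active volume (example)
storage = 0.75 * capacity
dt = 3600.0                      # one-hour time step

inflow = [1000, 2500, 6000, 9000, 7000, 4000, 2000, 1200]   # m^3/s
for q_in in inflow:
    q_out = min(gate_release(storage, capacity), 3000.0)    # outlet limit
    storage += (q_in - q_out) * dt                          # mass balance
    print(f"in={q_in:5d} m3/s  out={q_out:6.0f} m3/s  "
          f"filling={storage / capacity:.3f}")
```

Replacing the toy curve with the real gate regulation curves, and the fixed hydrograph with forecast inflows, gives the kind of real-time release policy the paper evaluates.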

Keywords: dam operation, flood control criteria, Dez dam, Iran

Procedia PDF Downloads 225
46251 Microwave Security System in Museums: Design and Implementation

Authors: Dalia Elsheakh, Hala Elsadek

Abstract:

The objective of this paper is to propose a competitive microwave security system that can be deployed at a reasonable price in museums in Egypt, considering the priceless items held in the 23 Egyptian museums countrywide and the lack of good, recent security systems even in the largest of them. The system's main goal is to monitor valuable targets and ensure their presence in pre-defined positions in order to protect them from being stolen. The system is based on real-time microwave scanning of the required space volume: RF waves are transmitted at consecutive angles, and the waves back-scattered from the monitored objects are detected to confirm their presence at the pre-specified locations.
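
A heavily simplified sketch of the presence check behind such a scan: the backscatter amplitude at each scan angle is compared against a baseline recorded with all exhibits in place. All readings here are simulated, and the processing chain of a real microwave front end is omitted.

```python
import random

baseline = {angle: 1.0 for angle in range(0, 180, 10)}   # calibrated echoes

def measure(angle, missing_at=90):
    echo = baseline[angle] * random.uniform(0.95, 1.05)  # sensor noise
    return echo * (0.2 if angle == missing_at else 1.0)  # weak echo = gone

for angle, expected in baseline.items():
    if abs(measure(angle) - expected) / expected > 0.3:
        print(f"ALARM: object missing near scan angle {angle} deg")
```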

Keywords: microwave security system, object locating system, real time locating system (RTLS), antenna array, array electronic scanning

Procedia PDF Downloads 349
46250 Spatio-Temporal Pest Risk Analysis with ‘BioClass’

Authors: Vladimir A. Todiras

Abstract:

Spatio-temporal models provide new possibilities for real-time action in pest risk analysis. It should be noted that estimating the possibility and probability of a pest's introduction, and its economic consequences, involves many uncertainties. We present a new mapping technique that assesses pest invasion risk using the online BioClass software. BioClass is a GIS tool designed to solve multiple-criteria classification and optimization problems based on fuzzy logic and level set methods. This research describes a method for predicting the potential establishment and spread of plant pests into new areas using three case studies: corn rootworm (Diabrotica spp.), tomato leaf miner (Tuta absoluta), and plum fruit moth (Grapholita funebrana). Our study demonstrated that in BioClass we can combine fuzzy logic and geographic information systems with knowledge of pest biology and environmental data to derive new information for decision making. Pests are sensitive to a warming climate, as temperature greatly affects their survival and their reproductive rate and capacity; changes have already been observed in the distribution, frequency, and severity of outbreaks of Helicoverpa armigera on tomato. BioClass has proven to be a powerful tool for applying dynamic models and mapping the potential future distribution of a species, enabling resource managers to make decisions about the management and control of dangerous and invasive species.
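
A minimal sketch of the fuzzy-logic step, assuming triangular membership functions; the temperature and humidity optima are illustrative values, not calibrated parameters for the three case-study pests.

```python
def triangular(x, lo, peak, hi):
    """Degree of membership in a triangular fuzzy set, in [0, 1]."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

def establishment_risk(temp_c, humidity_pct):
    t = triangular(temp_c, 10, 25, 35)         # thermal suitability
    h = triangular(humidity_pct, 30, 60, 95)   # moisture suitability
    return min(t, h)                           # fuzzy AND (minimum)

# Risk for one grid cell under two climate scenarios.
print("current climate :", round(establishment_risk(18, 55), 2))
print("warming scenario:", round(establishment_risk(27, 70), 2))
```

Applied cell by cell over GIS layers, a rule like this yields the kind of risk map BioClass produces.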

Keywords: classification, model, pest, risk

Procedia PDF Downloads 282
46249 Contentious Issues Concerning the Methodology of Using the Lexical Approach in Teaching ESP

Authors: Elena Krutskikh, Elena Khvatova

Abstract:

In tertiary settings, expanding students' vocabulary and teaching discursive competence are seen as chief goals of a professional development course. However, such a focus is often detrimental to students' cognitive competences, such as analysis, synthesis, and creative processing of information, and deprives students of the motivation for self-improvement and self-directed development of language skills. The presentation argues that in an ESP course, special attention should be paid to reading and listening, which can promote understanding and using the language as a tool for solving significant real-world problems, including professional ones. It is claimed that the learning process must maintain a balance between the content and the linguistic aspects of the educational process, as language acquisition is inextricably linked with mental activity, and the need to express oneself is a primary stimulus for using a language. A study conducted among undergraduates indicates that they place a premium on quality materials that motivate them and stimulate their further linguistic and professional development. Thus, more demands are placed on study materials, which should contain information new to the students and serve not only as a source of new vocabulary but also as preparation for real tasks related to professional activities.

Keywords: critical reading, English for professional development, English for specific purposes, higher-order thinking skills, lexical approach, vocabulary acquisition

Procedia PDF Downloads 167
46248 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel

Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani

Abstract:

Agent-based systems technology has been hailed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments to solve problems. Nevertheless, it is impractical to rely on a single agent for all the computation in solving complex problems, and an increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems beyond the individual capacities or knowledge of each problem solver. However, a MAS still requires a main system to govern or oversee the operation of the agents so that a unified goal is achieved. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS performs real-time data processing, communicates with the autonomous surface vehicle (ASV) agents, and generates a bathymetric map from the data received from each ASV unit. The first algorithm communicates with the ASVs via radio using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation, and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm uses the data collected by the agents to create a bathymetric map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
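
As an illustration of the first algorithm's task, here is a minimal sketch of reading one standard NMEA sentence from an ASV agent, verifying its checksum, and extracting the position; the sentence is the well-known GGA sample, and the radio link is replaced by a string.

```python
def nmea_checksum_ok(sentence):
    body, checksum = sentence[1:].split("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)            # XOR of all characters between $ and *
    return f"{calc:02X}" == checksum.strip().upper()

def parse_gga(sentence):
    f = sentence.split(",")
    lat = float(f[2][:2]) + float(f[2][2:]) / 60.0    # ddmm.mmmm -> degrees
    lon = float(f[4][:3]) + float(f[4][3:]) / 60.0    # dddmm.mmmm -> degrees
    return (lat if f[3] == "N" else -lat, lon if f[5] == "E" else -lon)

msg = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
if nmea_checksum_ok(msg):
    print("agent position:", parse_gga(msg))
```

Positions gathered this way, together with per-agent depth soundings, are the inputs the bathymetry map generation algorithm would grid in real time.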

Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry

Procedia PDF Downloads 271
46247 Digital Twin for Retail Store Security

Authors: Rishi Agarwal

Abstract:

Digital twins are emerging as a strong technology for imitating and monitoring physical objects digitally, in real time, across sectors. A digital twin does not only deal with the digital space; through processing such as storage, modeling, learning, simulation, and prediction, it also actuates responses in the physical space. This paper explores the application of digital twins to enhancing physical security in retail stores. The retail sector still relies on outdated physical security practices, such as manual monitoring and metal detectors, which are insufficient for modern needs; the lack of real-time data and system integration leads to ineffective emergency response and preventative measures. As retail automation increases, new digital frameworks must manage safety without human intervention. To address this, the paper proposes an intelligent digital twin framework that collects diverse data streams from in-store sensors, surveillance, external sources, and customer devices; advanced analytics and simulations then enable real-time monitoring, incident prediction, automated emergency procedures, and stakeholder coordination. Overall, the digital twin improves physical security through automation, adaptability, and comprehensive data sharing. The paper also analyzes the pros and cons of implementing this technology through an Emerging Technology Analysis Canvas, which examines the technology through both narrow and wide lenses to support decision makers considering its adoption. On a broader scale, this showcases the value of digital twins in transforming legacy systems across sectors, and how data sharing can create a safer world for both retail store customers and owners.

Keywords: digital twin, retail store safety, digital twin in retail, digital twin for physical safety

Procedia PDF Downloads 72
46246 Tourism in the Information Age

Authors: Suleyman Karacor

Abstract:

The main purpose of this study is to investigate tourism marketing in the information age, given its importance and sensitivity. In the twenty-first century, as a result of increasing competition and product diversification in the tourism sector, tourism businesses must take into account exogenous variables such as new technological developments, commercial experience, and consumer demand. In the information age, consumers of tourist products tend to spend their leisure time and money on more active opportunities offering different experiences, instead of reliving the same experience again. The growing number of agents in the tourism sector, travel opportunities offering different experiences, and more intensive use of modern technology help diversify the leisure activities offered to tourists. From the tourists' perspective, travel costs are still important when buying touristic products, but maintaining a high level of tourist satisfaction is of increasing importance as well. Tourists tend to prefer activities that add value; a real tourist product must be able to create value and new priorities for tourists. Therefore, this study reviews recent significant developments in international tourism marketing research and practice. To this end, it reviews articles focused on tourism marketing.

Keywords: information age, tourism marketing, tourism marketing mix, management

Procedia PDF Downloads 426
46245 The Temporal Dimension of Narratives: A Construct of Qualitative Time

Authors: Ani Thomas

Abstract:

Every narrative is a temporal construct, and every narrative creates a qualitative experience of time for the viewer. The paper argues for a concept of qualified time that emerges from the interaction between the narrative and the audience, and it challenges the conventional understanding of narrative time as either story time, real time, or discourse time. Looking at narratives through the medium of cinema, the study examines how narratives create and manipulate duration, or durée, the qualitative experience of time theorized by Henri Bergson. The paper further argues that cinema, and by extension narrative, is nothing but durée, and the filmmaker an artist of durée who shapes the viewer's perception and emotions through the construction and manipulation of durée. The paper draws on cinematic works to demonstrate how filmmakers use, for example, editing, sound, and compositional and production choices to create modes of durée that challenge, amplify, or unsettle the viewer's sense of time. Bringing together the viewer's durée and exploring its interaction with the narrative construct, the paper explores the emergence of a new qualitative time, the narrative durée, that defines the audience's experience.

Keywords: cinema, time, Bergson, durée

Procedia PDF Downloads 148
46244 Visual Aid and Imagery Ramification on Decision Making: An Exploratory Study Applicable in Emergency Situations

Authors: Priyanka Bharti

Abstract:

Decades ago, designs were based on common sense and tradition, but with advances in visualization technology and research, we can now comprehend the cognitive abilities involved in decoding visual information. However, many aspects of visuals still need intense research before the relevant phenomena can be explained efficiently. Visuals are a mode of information representation through images, symbols, and graphics. They play an impactful role in decision making by facilitating quick recognition, comprehension, and analysis of a situation, and they enhance problem-solving capabilities by enabling the processing of more data without overloading the decision maker. Research suggests that visuals offer a learning environment improved by a factor of 400 compared to textual information: visual information engages learners at a cognitive level and triggers the imagination, which enables users to process the information faster (visuals are processed 60,000 times faster in the brain than text). Appropriate information, its visualization, and its presentation are known to aid and intensify the decision-making process. However, most of the literature discusses the role of visual aids in comprehension and decision making during normal conditions alone. Unlike emergencies, in a normal situation (e.g., our day-to-day life) users are neither exposed to stringent time constraints nor face the anxiety of survival, and they have sufficient time to evaluate various alternatives before making any decision. An emergency is an unexpected, possibly fatal real-life situation that may inflict serious harm on both human life and material possessions unless corrective measures are taken instantly. The situation demands that the exposed user negotiate a dynamic and unstable scenario, in the absence or lack of any preparation, and still take swift and appropriate decisions to save lives or possessions. But the resulting stress and anxiety restrict cue sampling, decrease vigilance, reduce the capacity of working memory, cause premature closure in evaluating alternative options, and result in task shedding. Limited time, uncertainty, high stakes, and vague goals negatively affect the cognitive abilities needed to take appropriate decisions. Moreover, the naturalistic decision making of experts has been studied in far more depth than that of ordinary users. Therefore, this study aims to understand the role of visual aids in supporting rapid comprehension for appropriate decisions during an emergency situation.

Keywords: cognition, visual, decision making, graphics, recognition

Procedia PDF Downloads 268
46243 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. Submitting queries for a large number of keywords manually takes a long time and much effort, hence we developed a user interface program that searches automatically, taking multiple keywords at the same time, and left the program to collect the wanted data unattended. The collected raw data are processed using mathematical and statistical methods to eliminate unwanted data and convert the remainder into usable data.
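
A minimal sketch of the collection loop; hit_count is a hypothetical stand-in for whichever search-engine interface the program wraps, since the abstract does not name one, and here it just returns a placeholder number.

```python
import csv
import itertools

def hit_count(query):
    """Hypothetical: submit `query` to a search engine and return its
    reported result count. A placeholder value is used for this demo."""
    return hash(query) % 100000

keywords = ["liquefaction", "seismic", "soil"]
rows = []
for pair in itertools.combinations(keywords, 2):
    query = " ".join(pair)
    rows.append((query, hit_count(query)))      # gather raw counts

with open("raw_counts.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)               # later: statistical cleaning
```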

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 430
46242 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime

Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective speed hump detection process for the daytime. We focus only on round speed humps in the dynamic daytime road environment. The proposed scheme consists mainly of two processes, stereo matching and speed hump detection, and this paper focuses on the latter. The speed hump detection process consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system was tested on an Intel Core CPU at 2.80 GHz with 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and each frame of the grabbed image sequences is 1280 by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed-hump samples. In the tests, the proposed method runs in 13 ms of computation time, so it can be applied in real-time systems, and it achieves an accuracy of 96.1%.
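
A minimal sketch of the detection stage only, assuming numpy and scipy: given a road-surface height profile such as stereo matching would supply, the noise is smoothed away and a round bump is flagged by a height threshold. The profile here is synthetic, and the threshold values are illustrative.

```python
import numpy as np
from scipy.signal import medfilt

x = np.linspace(0, 20, 400)                        # metres ahead of the car
height = np.random.default_rng(3).normal(0, 0.005, x.size)
height += 0.08 * np.exp(-((x - 12) / 0.8) ** 2)    # ~8 cm round hump

smooth = medfilt(height, kernel_size=9)            # noise reduction step
bump = smooth > 0.04                               # height threshold (4 cm)
if bump.any():
    print(f"speed hump detected ~{x[bump].mean():.1f} m ahead")
```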

Keywords: data fusion, round-type speed hump, speed hump detection, surface filter

Procedia PDF Downloads 510
46241 Monitoring Potential Temblor Localities as a Supplemental Risk Control System

Authors: Mikhail Zimin, Svetlana Zimina, Maxim Zimin

Abstract:

Without question, the basic method for preventing human and material losses is to provide adequate strength in constructions. At the same time, seismic load has a stochastic character, so there is always some danger of earthquake forces exceeding the selected design load. This risk is very low, but the consequences of such events may be extremely serious. Also dangerous are occasional mistakes in seismic zoning, soil conditions changing before temblors, and failure to take into account hazardous natural phenomena caused by earthquakes. Besides, it is known that temblors detrimentally affect the environmental situation in the regions where they occur, causing panic and worsening the course of various diseases. This may lead to mistakes by the personnel of hazardous production facilities, such as the production and distribution of gas and oil, which may provoke severe accidents. In addition, gas and oil pipelines often have long mileage and, in contrast with buildings, cross many perilous zones, which increases the risk of heavy accidents. In such cases, complex monitoring of potential earthquake localities would be relevant. Even though the number of successful real-time forecasts of earthquakes is not great, it is well in excess of what would be expected under random guessing. The experimental time-lapse study and analysis performed here consist of searching for seismic, biological, meteorological, and light earthquake precursors; processing such data with the help of fuzzy sets; collecting weather information; utilizing a database of the terrain; and computing the risk of slope processes under a temblor in a given setting. The work was done in a real-time environment, and broadly acceptable results were obtained. Observations from seismic recording systems already in place are used. Furthermore, a look-back study of the precursors of known earthquakes is performed: the situations before the Ashkhabad, Tashkent, and Haicheng seismic events are analyzed, with fair findings. The results of earthquake forecasts can be used to predict dangerous natural phenomena caused by temblors, such as avalanches and mudslides, and may also be utilized for the prophylaxis of some diseases and their complications. The relevant software has been developed as well. It should be emphasized that such monitoring does not require serious financial expense and can be performed by a small group of professionals. Thus, complex monitoring of potential earthquake localities, including short-term earthquake forecasts and analysis of the possible hazardous consequences of temblors, may further the safety of pipeline facilities.

Keywords: risk, earthquake, monitoring, forecast, precursor

Procedia PDF Downloads 24
46240 Classification of EEG Signals Based on Dynamic Connectivity Analysis

Authors: Zoran Šverko, Saša Vlahinić, Nino Stojković, Ivan Markovinović

Abstract:

In this article, the classification of target letters is performed using data from the EEG P300 Speller paradigm. Neural networks trained on the results of a dynamic connectivity analysis between different brain regions are used for classification. The dynamic connectivity analysis is based on an adaptive window size and the imaginary part of the complex Pearson correlation coefficient. Brain dynamics are analysed using the relative intersection of confidence intervals for the imaginary component of the complex Pearson correlation coefficient (RICI-imCPCC) method. The RICI-imCPCC method overcomes the shortcomings of currently used dynamical connectivity analysis methods: the low reliability and low temporal precision for short connectivity intervals encountered in constant-sliding-window analysis with a wide window, and the high susceptibility to noise encountered in constant-sliding-window analysis with a narrow window. It does so by dynamically adjusting the window size using the RICI rule, and it extracts information about brain connections for each time sample. Seventy percent of the extracted brain connectivity information is used for training and thirty percent for validation. Classification of the target word is also performed, based on the same analysis method. As far as we know, this research shows for the first time that dynamic connectivity can be used as a parameter for classifying EEG signals.
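
A minimal sketch of the core statistic, assuming numpy and scipy: the imaginary part of the complex Pearson correlation between the analytic signals of two channels. The adaptive RICI windowing itself is omitted, and the signals are synthetic.

```python
import numpy as np
from scipy.signal import hilbert

def imag_cpcc(x, y):
    """Imaginary part of the complex Pearson correlation coefficient."""
    zx, zy = hilbert(x), hilbert(y)       # analytic signals
    zx = zx - zx.mean()
    zy = zy - zy.mean()
    r = np.sum(zx * np.conj(zy)) / np.sqrt(
        np.sum(np.abs(zx) ** 2) * np.sum(np.abs(zy) ** 2))
    return r.imag

t = np.arange(0, 2, 1 / 250)                     # 2 s at 250 Hz
a = np.sin(2 * np.pi * 10 * t)                   # 10 Hz "source"
b = np.sin(2 * np.pi * 10 * t - np.pi / 4)       # phase-lagged copy
print("imCPCC (phase-lagged):", round(float(imag_cpcc(a, b)), 3))
```

A lagged copy yields a clearly non-zero value (about 0.7 here), while a zero-lag copy yields approximately zero, which is why imaginary-part measures are commonly favoured for suppressing volume-conduction effects.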

Keywords: dynamic connectivity analysis, EEG, neural networks, Pearson correlation coefficients

Procedia PDF Downloads 214
46239 Comparison of the Logistic and the Gompertz Growth Functions Considering a Periodic Perturbation in the Model Parameters

Authors: Avan Al-Saffar, Eun-Jin Kim

Abstract:

Both the logistic growth model and the Gompertz growth model are used to describe growth processes. The two models, driven by perturbations in different cases, are investigated using information theory as a useful measure of sustainability and variability. Specifically, we study the effect of different oscillatory modulations of the system's parameters on the evolution of the system and on its Probability Density Function (PDF). We show that the initial conditions are maintained for a long time. We offer a Fisher information analysis under positive and/or negative feedback and explain its implications for the sustainability of population dynamics. We also exhibit a finite-amplitude solution due to a purely fluctuating growth rate, whereas periodic fluctuations under negative feedback can break down the system's self-regulation, with an exponentially growing solution. In the cases tested, the Gompertz and logistic systems show similar behaviour in terms of information and sustainability, although they develop differently in time.
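
A minimal sketch comparing the two models under the same periodic modulation of the growth rate, using simple Euler integration; the amplitude and frequency are illustrative, and the PDF and Fisher information computations are omitted.

```python
import numpy as np

dt, T, K = 0.01, 50.0, 1.0
t = np.arange(0, T, dt)
r = 1.0 + 0.5 * np.sin(2 * np.pi * 0.2 * t)   # periodically perturbed rate

x_log, x_gom = 0.01, 0.01
for i in range(len(t) - 1):
    x_log += dt * r[i] * x_log * (1 - x_log / K)      # logistic growth
    x_gom += dt * r[i] * x_gom * np.log(K / x_gom)    # Gompertz growth

print(f"final populations: logistic={x_log:.4f}, Gompertz={x_gom:.4f}")
```

Collecting many such trajectories from randomized initial conditions would give the time-dependent PDF from which the Fisher information is estimated.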

Keywords: dynamical systems, Fisher information, probability density function (PDF), sustainability

Procedia PDF Downloads 431
46238 A Low-Cost Foot Plantar Shoe System for Gait Analysis

Authors: Zulkifli Ahmad, Mohd Razlan Azizan, Nasrul Hadi Johari

Abstract:

This paper presents the development and testing of a wearable sensor system for gait analysis measurement. For validation, plantar surface measurement with a force plate was prepared. In general gait analysis, a force plate captures barefoot whole-step data and does not allow analysis of repeated steps in normal walking and running. The measurements usually performed do not represent the whole daily plantar pressure inside the shoe insole and only capture the ground reaction force. Force plate measurement is usually limited to a few steps, it is done indoors, and coupling information from both feet during walking is not easily obtained. Nowadays, pressure can be measured over a large number of steps, and in each part of the insole, by placing sensors within the insole. This provides a method for determining the plantar pressures of a shoe-wearing subject while standing, walking, or running. Inserting pressure sensors in the insole provides location-specific information, so the placement of the sensors determines which critical regions under the insole are captured. In this wearable shoe sensor project, the device consists of left and right shoe insoles, each with ten force-sensitive resistors (FSR). An Arduino Mega was used as the microcontroller to read the analog inputs from the FSRs. The analog readings were transmitted via Bluetooth, so the force data are available in real time on a smartphone. Blueterm, an Android application, was used as the interface to read the FSR values from the shoe-wearing subject. The subjects were two healthy men of different ages and weights, tested while standing, walking (1.5 m/s), jogging (5 m/s), and running (9 m/s) on a treadmill. The data obtained are saved on the Android device for analysis and comparison graphs.
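
A minimal sketch of the receiving side, assuming pyserial and assuming the Arduino Mega streams one comma-separated line of ten FSR readings per sample over the Bluetooth serial link; the port name and the sensor positions are examples.

```python
import serial

with serial.Serial("/dev/rfcomm0", 9600, timeout=1) as link:
    for _ in range(100):                         # log 100 samples
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        forces = [int(v) for v in line.split(",") if v]
        if len(forces) == 10:                    # ten FSRs per insole
            heel, toe = forces[0], forces[9]     # example sensor positions
            print(f"heel={heel:4d}  toe={toe:4d}  raw={forces}")
```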

Keywords: gait analysis, plantar pressure, force plate, wearable sensor

Procedia PDF Downloads 454
46237 Simulation of Utility Accrual Scheduling and Recovery Algorithm in Multiprocessor Environment

Authors: A. Idawaty, O. Mohamed, A. Z. Zuriati

Abstract:

This paper presents the development of an event-based Discrete Event Simulation (DES) for a recovery algorithm known as Backward Recovery Global Preemptive Utility Accrual Scheduling (BR_GPUAS). This algorithm implements the Backward Recovery (BR) mechanism as a fault recovery solution within the existing Time/Utility Function/Utility Accrual (TUF/UA) scheduling domain for multiprocessor environments. The BR mechanism takes a faulty task back to its initial safe state and then re-executes the affected section of the task to enable recovery. Considering that faults may occur in the components of any system, a fault-tolerance mechanism that can nullify the erroneous effects is necessary. Current TUF/UA scheduling algorithms use an abortion recovery mechanism: they simply abort the erroneous task as their fault recovery solution. None of the existing algorithms in the TUF/UA multiprocessor scheduling domain have considered transient faults or implemented the BR mechanism as a fault recovery mechanism to nullify the erroneous effects and solve the recovery problem in this domain. The developed BR_GPUAS simulator derives its set of parameters, events, and performance metrics from a detailed analysis of the base model. Simulation results reveal that the BR_GPUAS algorithm can save almost 20-30% of the accumulated utility, making it reliable and efficient for real-time applications in multiprocessor scheduling environments.
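
A minimal sketch of the event-based DES core on which such a simulator rests: a time-ordered event queue driving release, fault, and recovery events. The actual BR_GPUAS scheduling policy is only stubbed here by print statements.

```python
import heapq

events = []                                   # (time, kind, task_id)
heapq.heappush(events, (0.0, "release", 1))
heapq.heappush(events, (2.5, "fault", 1))     # a transient fault

while events:
    clock, kind, task = heapq.heappop(events)
    if kind == "release":
        print(f"t={clock:4.1f}: task {task} starts executing")
    elif kind == "fault":
        print(f"t={clock:4.1f}: transient fault in task {task}")
        # Backward recovery: return to the safe state, then re-execute
        # the affected section instead of aborting the task.
        heapq.heappush(events, (clock + 0.1, "resume", task))
    elif kind == "resume":
        print(f"t={clock:4.1f}: task {task} resumes from its safe state")
```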

Keywords: real-time system (RTS), time utility function/ utility accrual (TUF/UA) scheduling, backward recovery mechanism, multiprocessor, discrete event simulation (DES)

Procedia PDF Downloads 306
46236 Hybrid Control Mode Based on Multi-Sensor Information by Fuzzy Approach for Navigation Task of Autonomous Mobile Robot

Authors: Jonqlan Lin, C. Y. Tasi, K. H. Lin

Abstract:

This paper addresses the autonomous mobile robot (AMR) navigation task based on hybrid control modes. A novel hybrid control mode, based on multi-sensor information and a fuzzy approach, is presented in this research. The system operates in real time, is robust, enables the robot to operate with imprecise knowledge, and takes into account the physical limitations of the environment in which the robot moves, obtaining satisfactory responses in a large number of different situations. An experiment was simulated and carried out with a Pioneer mobile robot. The experimental results confirm the effectiveness and usefulness of the proposed AMR obstacle avoidance and navigation scheme; they show its feasibility, and the control system improved the navigation accuracy. The implementation of the controller is robust, has a low execution time, and allows easy design and tuning of the fuzzy knowledge base.

Keywords: autonomous mobile robot, obstacle avoidance, MEMS, hybrid control mode, navigation control

Procedia PDF Downloads 466
46235 Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models

Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini

Abstract:

The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely least squares (LS), which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim is to produce a criterion that can effectively select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and the analysis of a real dataset.
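
A minimal sketch of the idea, assuming numpy: SIC is computed once with the least-squares scale and once with a robust scale, where the median absolute deviation (MAD) is used here as a simple stand-in for the paper's MM-estimation scale. The outliers inflate the classical scale but barely move the robust one.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, 3.0]) + rng.normal(size=n)
y[:5] += 30                                   # vertical outliers

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
res = y - X @ beta
p = X.shape[1]

sigma_ls = np.sqrt(np.mean(res ** 2))                         # unbounded scale
sigma_rob = np.median(np.abs(res - np.median(res))) / 0.6745  # robust scale

for name, s in [("classical SIC", sigma_ls), ("robust SIC", sigma_rob)]:
    print(f"{name}: {n * np.log(s ** 2) + p * np.log(n):.1f}")
```

Evaluated per candidate variable subset, the robust version keeps ranking models by fit rather than by how badly the outliers distort the scale estimate.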

Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion

Procedia PDF Downloads 140
46234 A Study of Relational Factors Associated with Online Celebrity Business and Consumer Purchase Intention

Authors: Sixing Chen, Shuai Yang

Abstract:

Online celebrity business, also known as Internet celebrity business (or Wanghong business in Chinese), is an emerging relational C2C business model and an alternative to traditional transactional C2C business models. There are already millions of these consumers, and the number is growing. In this model, consumer purchase decisions are driven by recommendations and endorsements in videos posted online by celebrities. The purpose of this paper is to determine the relational constructs within consumer relationships in the Internet celebrity business model and to investigate the relationships between these constructs and consumer purchase intention. A questionnaire-based study was conducted with consumers who had awareness of, or prior purchase experience with, online celebrities. The results of exploratory factor analysis (EFA) and multiple regression analysis revealed three valid relational constructs: product experience sharing, lifestyle association, and real-time interaction. The study indicates that these constructs have a direct effect on consumer preference and purchase intention. The findings provide insight into a business model in which online shopping is driven by celebrities, and they suggest that online celebrities should pay more attention to product experience sharing, lifestyle association, and real-time interaction when managing their product promotions, as these are the most salient factors among the relational constructs identified in this study.

Keywords: customer relationship, customer to customer, Internet celebrity, online celebrity, online marketing, purchase intention

Procedia PDF Downloads 318
46233 Comparison of the Results of a Parkinson’s Holter Monitor with Patient Diaries, in Real Conditions of Use: A Sub-Analysis of the MoMoPa-EC Clinical Trial

Authors: Alejandro Rodríguez-Molinero, Carlos Pérez-López, Jorge Hernández-Vara, Àngels Bayes-Rusiñol, Juan Carlos Martínez-Castrillo, David A. Pérez-Martínez

Abstract:

Background: Monitoring motor symptoms in Parkinson's patients is often a complex and time-consuming task for clinicians, as Hauser diaries are often poorly completed by patients. Recently, new automatic devices (the Parkinson's holter STAT-ON®) have been developed that are capable of monitoring patients' motor fluctuations. The MoMoPa-EC clinical trial (NCT04176302) investigates which of the two methods produces better clinical results; in this sub-analysis, the concordance between the two methods is analyzed. Methods: The MoMoPa-EC clinical trial will include 164 patients with moderate-to-severe Parkinson's disease and at least two hours of Off time per day. At recruitment, all of them completed a seven-day motor fluctuation diary (Hauser diary) at home while wearing the Parkinson's holter. This sub-analysis included 71 patients with complete data for the purpose of this comparison. The intraclass correlation coefficient between the patient diary entries and the Parkinson's holter data was calculated for time On, time Off, and time with dyskinesias. Results: The intraclass correlation coefficient of the two methods was 0.57 (95% CI: 0.3-0.74) for daily time in Off (%), 0.48 (95% CI: 0.14-0.68) for daily time in On (%), and 0.37 (95% CI: -0.04-0.62) for daily time with dyskinesias (%). Conclusions: The two methods show moderate agreement with each other. We will have to wait for the results of the MoMoPa-EC project to estimate which of them offers the greater clinical benefit. Acknowledgment: This work is supported by AbbVie S.L.U., the Instituto de Salud Carlos III [DTS17/00195], and the European Fund for Regional Development, 'A way to make Europe'.
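
A minimal sketch of the agreement statistic, assuming numpy: a two-way random, single-measure intraclass correlation, ICC(2,1), between paired measurements. The "diary" and "holter" percentages below are synthetic, not trial data.

```python
import numpy as np

rng = np.random.default_rng(5)
true_off = rng.uniform(10, 50, 71)            # 71 patients, % of day in Off
diary = true_off + rng.normal(0, 8, 71)       # rater 1: patient diary
holter = true_off + rng.normal(0, 8, 71)      # rater 2: Parkinson's holter

data = np.column_stack([diary, holter])       # n subjects x k raters
n, k = data.shape
grand = data.mean()
msr = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)   # subjects
msc = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)   # raters
sse = np.sum((data - data.mean(1, keepdims=True)
              - data.mean(0, keepdims=True) + grand) ** 2)
mse = sse / ((n - 1) * (k - 1))

icc = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
print(f"ICC(2,1) = {icc:.2f}")   # moderate agreement for this noise level
```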

Keywords: Parkinson, sensor, motor fluctuations, dyskinesia

Procedia PDF Downloads 232
46232 Designing a Method to Control and Determine the Financial Performance of the Real Cost Sub-System in the Information Management System of Construction Projects

Authors: Alireza Ghaffari, Hassan Saghi

Abstract:

Project management is more complex than managing the day-to-day affairs of an organization. When project dimensions are broad and multiple projects have to be monitored in different locations, integrated management becomes even more complicated. One of the main concerns of project managers is integrated project management, a concern mainly rooted in the lack of accurate, accessible information from the different projects in their various locations. Collecting dispersed information from the various parts of the network, integrating it, and finally reporting it selectively are among the goals of integrated information systems, and they can help resolve the main problem: bridging the information gap between executives and senior managers in the organization. Therefore, the main objective of this study is to design and implement an important subsystem of a project management information system (PMIS) in order to successfully control the cost of construction projects, so that its results can be used to design raw software forms and the proposed relationships between different project units for collecting the necessary information.

Keywords: financial performance, cost subsystem, PMIS, project management

Procedia PDF Downloads 109
46231 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction

Authors: Mohammad Ghahramani, Fahimeh Saei Manesh

Abstract:

Winning a soccer game depends on thorough, deep analysis of the ongoing match; on the other hand, giant gambling companies are in vital need of such analysis to reduce their losses to their customers. In this research work, we perform deep, real-time analysis of soccer matches around the world; our work is distinguished from others by its focus on particular seasons, teams, and partial analytics. Our contributions are presented in the platform called 'Analyst Masters.' First, we introduce various sources of information available for soccer analysis for teams around the world, which helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, before and in-play, are laid out in image format versus time, including the halftime. Local binary patterns (LBP) are then employed to extract features from the image. Our analyses reveal incredibly interesting features and rules once a soccer match has reached sufficient stability. For example, our '8-minute rule' implies that if 'Team A' scores a goal and can maintain the result for at least 8 minutes, then a stable match will end in their favor. We could also make accurate pre-match predictions of whether fewer or more than 2.5 goals would be scored. We benefit from Gradient Boosting Trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide whether the match is stable. A stable match is then passed to a post-processing stage that checks its properties, such as betters' and punters' behavior and its statistical data, before issuing the prediction. The proposed method was trained on 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches. Our database of 240,000 matches shows that one can earn over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market: top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches since 2012 and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
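
A minimal sketch of the LBP feature step, assuming scikit-image: a statistics-versus-time image is summarized by a histogram of uniform local binary patterns, which is the kind of vector that would then feed the GBT and decision-tree stages. The image here is a random stand-in.

```python
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(6)
img = rng.random((90, 120))          # stand-in for an odds/stats-vs-time image

P, R = 8, 1                          # 8 neighbours at radius 1
lbp = local_binary_pattern(img, P, R, method="uniform")
hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)

print("LBP feature vector:", np.round(hist, 3))
```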

Keywords: soccer, analytics, machine learning, database

Procedia PDF Downloads 238