Search results for: multivariate time series data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37821

35991 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics

Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur

Abstract:

Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, thus negating potential AI benefits. A prime example is specialized industrial controllers operated by custom software, which complicates the process of connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously obtain images of the controller human-machine interfaces (HMIs). We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, to recognize the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
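As a hedged illustration of the region-based OCR step described above (not the authors' actual pipeline), the sketch below crops an assumed streaming-data region from an HMI frame and reads it with Tesseract via pytesseract; the file name and region coordinates are hypothetical placeholders.

```python
import cv2
import pytesseract

# Hypothetical example: crop a pre-defined streaming-data region from an HMI
# frame and OCR it. In the paper's approach, region coordinates would come
# from the segmentation step that separates streaming data from meta-data.
frame = cv2.imread("hmi_frame.png")            # camera snapshot of the HMI (placeholder)
x, y, w, h = 120, 80, 200, 40                  # assumed streaming-data region
roi = frame[y:y + h, x:x + w]

# Basic pre-processing: grayscale + Otsu binarization to improve OCR accuracy
gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Restrict Tesseract to digits, since typical HMI streaming fields are numeric
text = pytesseract.image_to_string(
    binary, config="--psm 7 -c tessedit_char_whitelist=0123456789.")
print(text.strip())
```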

Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics

Procedia PDF Downloads 90
35990 Cybersecurity Assessment of Decentralized Autonomous Organizations in Smart Cities

Authors: Claire Biasco, Thaier Hayajneh

Abstract:

A smart city is the integration of digital technologies into urban environments to enhance the quality of life. Smart cities capture real-time information from devices, sensors, and network data to analyze and improve city functions such as traffic analysis, public safety, and environmental impacts. Current smart cities face controversy due to their reliance on real-time data tracking and surveillance. Internet of Things (IoT) devices and blockchain technology are converging to reshape smart city infrastructure away from its centralized model. Connecting IoT data to blockchain applications would create a peer-to-peer, decentralized model. Furthermore, blockchain technology enables ownership and control of IoT device data to shift from centralized entities to individuals or communities through Decentralized Autonomous Organizations (DAOs). In the context of smart cities, DAOs can govern cyber-physical systems to have a greater influence over how urban services are provided. This paper will explore how the core components of a smart city now apply to DAOs. We will also analyze different definitions of DAOs to determine their most important aspects in relation to smart cities. Both categorizations will provide a solid foundation for a cybersecurity assessment of DAOs in smart cities, which will identify the benefits and risks of adopting DAOs as they currently operate. The paper will then provide several mitigation methods to combat the cybersecurity risks of DAO integrations. Finally, we will give several insights into the challenges the DAO and blockchain spaces will face in the coming years before achieving a higher level of maturity.

Keywords: blockchain, IoT, smart city, DAO

Procedia PDF Downloads 90
35989 Personal Characteristics Related to Hasty Behaviour in Korea

Authors: Sun Jin Park, Kyung-Ja Cho

Abstract:

This study focused on personal characteristics related to hasty behaviour. To investigate the relation between personal characteristics and hasty behaviour, data were collected from 601 respondents, who answered measures of 'social avoidance and distress', 'anxiety', 'sensation seeking', 'hope', and 'hasty behaviour'; 591 responses (335 males and 256 females) were used for the analysis. Factor analysis showed that hasty behaviour consists of five factors: time pressure, isolation, uncomfortable situations, boring conditions, and expectation of reward. The results showed that anxiety, sensation seeking, and hope are related to hasty behaviour. Specifically, anxiety was involved in every form of hasty behaviour, meaning that psychological tension and worry are commonly related to hasty behaviour. 'Social avoidance and distress', 'sensation seeking', and 'hope' influenced hasty behaviour under time pressure, in isolation, and in expectation of rewards, respectively. This means that each factor of hasty behaviour has anxiety as its basis, expressed in varied ways.
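To make the factor-extraction step concrete, here is a minimal sketch of exploratory factor analysis with scikit-learn; the 591×20 response matrix is simulated, as the study's actual item data are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(591, 20))          # hypothetical: 591 respondents x 20 items

# Standardize items, then extract five factors with a varimax rotation
X_std = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
scores = fa.fit_transform(X_std)        # respondent scores on the 5 factors

print(fa.components_.shape)             # (5, 20): loading of each item on each factor
```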

Keywords: hasty behaviour, social avoidance and distress, anxiety, sensation seeking, hope

Procedia PDF Downloads 298
35988 Effects of Food Habits on Road Accidents Due to Micro-Sleepiness and Analysis of Attitudes to Develop a Food Product as a Preventive Measure

Authors: Rumesh Liyanage, S. B. Nawaratne, K. K. D. S. Ranaweera, Indira Wickramasinghe, K. G. S. C. Katukurunda

Abstract:

This study attempted to identify the effects of food habits and the public's attitudes on micro-sleepiness, and to explore preventive measures through developing a food product to combat it. Statistical data pertaining to road accidents were collected from the Sri Lanka Police Traffic Division, and a pre-tested questionnaire was used to collect data from 250 respondents. They were selected to represent drivers (especially highway drivers), private and public sector workers (shift-based) and cramming students (university and school). Respondents were asked to complete the questionnaires independently and in person, and the collected data were analyzed statistically. Results revealed that 76.84, 96.39 and 80.93% of total respondents consumed rice for all three meals, which leads to ingesting high-glycemic meals. Taking two hyperglycemic meals before 14.00h was identified as a cause of micro-sleepiness among these respondents. The peak level of road accidents was observed at 14.00-20.00h (38.2%), and the intensity of micro-sleepiness fell within the same time period (37.36%): 14.00 to 16.00h was the peak, 16.00 to 18.00h the lowest, and from 18.00 to 20.00h it reappeared slightly. Even though survey respondents indicated that the peak hours of micro-sleepiness are 14.00-16.00h, according to police reports the peak hours fall between 18.00-20.00h. Of the interviewees, 69.27% strongly wanted to avoid micro-sleepiness and intended to spend LKR 10-20 on a commercial product to combat it. As age-old practices to suppress micro-sleepiness are time-consuming, modern-day respondents (51.64%) would like a quick solution in the form of a drink. Therefore, the food habits of morning and noon may cause micro-sleepiness, while dinner may cause both natural and micro-sleepiness due to the heavy glycemic load of the food. According to the study, micro-sleepiness can be categorized into three zones: a low-risk zone (08.00-10.00h and 18.00-20.00h), a manageable zone (10.00-12.00h), and a high-risk zone (14.00-16.00h).

Keywords: food habits, glycemic load, micro-sleepiness, road accidents

Procedia PDF Downloads 526
35987 Biosorption of Phenol onto Water Hyacinth Activated Carbon: Kinetics and Isotherm Study

Authors: Manoj Kumar Mahapatra, Arvind Kumar

Abstract:

Batch adsorption experiments were carried out for the removal of phenol from aqueous solution using water hyacinth activated carbon (WHAC) as an adsorbent. The sorption kinetics were analysed using pseudo-first-order and pseudo-second-order models, and it was observed that the sorption data fit the pseudo-second-order model very well over the entire sorption time. The experimental data were analyzed by the Langmuir and Freundlich isotherm models. Equilibrium data fitted well to the Freundlich model, with a maximum biosorption capacity of 31.45 mg/g estimated by the Langmuir model. The adsorption intensity of 3.7975 represents a favorable adsorption condition.
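The two models named above can be fitted directly by nonlinear least squares. The sketch below is a minimal illustration with SciPy; the time-course and equilibrium values are hypothetical stand-ins for the WHAC measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical time-course data (min, mg/g); real values come from the batch runs
t = np.array([5, 10, 20, 40, 60, 120], dtype=float)
qt = np.array([8.1, 13.9, 20.5, 26.0, 28.3, 30.9])

# Pseudo-second-order kinetics: qt = (qe^2 * k2 * t) / (1 + qe * k2 * t)
def pso(t, qe, k2):
    return (qe**2 * k2 * t) / (1 + qe * k2 * t)

(qe_fit, k2_fit), _ = curve_fit(pso, t, qt, p0=[30.0, 0.001])
print(f"qe = {qe_fit:.2f} mg/g, k2 = {k2_fit:.4f} g/(mg*min)")

# Freundlich isotherm: qe = KF * Ce^(1/n); n > 1 indicates favorable adsorption
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])      # hypothetical equilibrium conc. (mg/L)
qe_obs = np.array([10.2, 13.5, 16.4, 19.8, 23.7])
(KF, n), _ = curve_fit(lambda c, KF, n: KF * c**(1.0 / n), Ce, qe_obs, p0=[5.0, 2.0])
print(f"KF = {KF:.2f}, n = {n:.2f}")
```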

Keywords: adsorption, isotherm, kinetics, phenol

Procedia PDF Downloads 429
35986 “Octopub”: Geographical Sentiment Analysis Using Named Entity Recognition from Social Networks for Geo-Targeted Billboard Advertising

Authors: Oussama Hafferssas, Hiba Benyahia, Amina Madani, Nassima Zeriri

Abstract:

Although data nowadays takes multiple forms, from text and images to audio and video, text remains the most widely used form at the public level. At the academic and research level, and unlike the other forms, text can be considered the easiest to process. Therefore, a branch of data mining research has always operated under its shadow, called "Text Mining". Its concept is just like data mining's: finding valuable patterns in large collections and tremendous volumes of data, in this case text. Named entity recognition (NER) is one of Text Mining's disciplines; it aims to extract and classify references such as proper names, locations, expressions of time and dates, organizations, and more in a given text. Our approach "Octopub" does not aim to find new ways to improve the named entity recognition process; rather, it is about finding a new and smart way to use NER to extract the sentiments of millions of people, using social networks as a limitless information source and marketing for product promotion as the main domain of application.
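A minimal sketch of the NER step that Octopub builds on, using spaCy's off-the-shelf English model; the post text is invented, and Octopub's own pipeline is not reproduced here.

```python
import spacy

# Requires the small English model: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")
post = "Just tried the new X-Phone at the Algiers mall yesterday, totally worth it!"

doc = nlp(post)
for ent in doc.ents:
    print(ent.text, ent.label_)   # e.g., locations (GPE) and dates found in the post

# Location entities could then be paired with a per-post sentiment score to
# aggregate geo-targeted sentiment, as Octopub proposes for billboard placement.
```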

Keywords: text mining, named entity recognition (NER), sentiment analysis, social media networks (SN, SMN), business intelligence (BI), marketing

Procedia PDF Downloads 563
35985 Modeling and Prediction of Zinc Extraction Efficiency from Concentrate by Operating Condition and Using Artificial Neural Networks

Authors: S. Mousavian, D. Ashouri, F. Mousavian, V. Nikkhah Rashidabad, N. Ghazinia

Abstract:

pH, temperature, extraction time of each stage, agitation speed, and delay time between stages affect the efficiency of zinc extraction from concentrate. In this research, the efficiency of zinc extraction was predicted as a function of the mentioned variables by artificial neural networks (ANNs). ANNs with different layer configurations were employed, and the results show that the network with 8 neurons in the hidden layer has good agreement with experimental data.
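A hedged sketch of the kind of ANN described above, using scikit-learn's MLPRegressor with the paper's 8-neuron hidden layer; the training data here are simulated placeholders for the real operating-condition measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical inputs: pH, temperature, stage extraction time, agitation speed, delay time
X = rng.uniform([1.0, 20.0, 10.0, 100.0, 1.0],
                [3.0, 80.0, 120.0, 600.0, 30.0], size=(100, 5))
y = rng.uniform(60.0, 99.0, size=100)      # hypothetical extraction efficiency (%)

X_std = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(8,),   # 8 hidden neurons, as in the paper
                     max_iter=5000, random_state=0)
model.fit(X_std, y)
print(model.predict(X_std[:3]))                 # predicted efficiencies
```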

Keywords: zinc extraction, efficiency, neural networks, operating condition

Procedia PDF Downloads 522
35984 The Use of Ward Linkage in Cluster Integration with a Path Analysis Approach

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

Path analysis is an analytical technique for studying the causal relationships between independent and dependent variables. In this study, cluster integration by the Ward linkage method was combined with path analysis across various numbers of clusters. The variables used are character (x₁), capacity (x₂), capital (x₃), collateral (x₄), and condition of economy (x₅) on on-time payment (y₂) through the variable willingness to pay (y₁). The purpose of this study was to compare Ward linkage cluster integration with path analysis across various numbers of clusters in classifying willingness to pay (y₁). The data used are primary data from questionnaires filled out by customers of Bank X, selected by purposive sampling and scored with the average score method. Comparing coefficients of determination, the results showed that Ward linkage cluster integration with path analysis on 2 clusters is the best method. Character (x₁), capacity (x₂), capital (x₃), collateral (x₄), and condition of economy (x₅) explain 58.3% of the variation in on-time payment (y₂) through willingness to pay (y₁), while the remaining 41.7% is explained by variables outside the model.
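A minimal sketch of the Ward linkage clustering step with SciPy, cut into the 2 clusters the study found best; the 5C scores are simulated, and the subsequent per-cluster path analysis is only indicated in the comments.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# Hypothetical average scores for the 5C variables: character, capacity,
# capital, collateral, condition of economy
X = rng.uniform(1.0, 5.0, size=(150, 5))

Z = linkage(X, method="ward")                      # Ward linkage on Euclidean distances
labels = fcluster(Z, t=2, criterion="maxclust")    # cut the tree into 2 clusters
print(np.bincount(labels)[1:])                     # cluster sizes

# A path model (x1..x5 -> y1 -> y2) would then be fitted within each cluster
# and the variants compared by their coefficients of determination.
```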

Keywords: cluster integration, linkage, path analysis, compliant paying behavior

Procedia PDF Downloads 158
35983 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria

Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe

Abstract:

Data has gone from just rows and columns to being an infrastructure itself. Traditional data infrastructure has been managed by individuals in different industries and saved on personal work tools such as laptops. This hinders data sharing and Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. However, there has been constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. It employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal, a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to access datasets of interest at any time at no cost. The skeletal infrastructure of this data portal encompasses open-source technology such as the Postgres database, GeoServer, GeoNetwork, and CKAN. These tools make the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation and Infrastructure). As of 6th August 2021, 8192 user accounts had been created, 2262 datasets had been downloaded, and 817 maps had been created on the platform. This paper shows the use of rapidly developed and adopted technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit about new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, it reveals the importance of cross-sectional data infrastructure for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
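Since the portal stack includes CKAN, datasets can be queried programmatically through CKAN's standard Action API. A hedged sketch (the base URL is a placeholder, not the portal's actual address):

```python
import requests

BASE = "https://data.example-portal.org"   # hypothetical CKAN instance

# CKAN's Action API exposes package_search for full-text dataset queries
resp = requests.get(f"{BASE}/api/3/action/package_search",
                    params={"q": "settlements", "rows": 5}, timeout=30)
resp.raise_for_status()
result = resp.json()["result"]

print(result["count"], "datasets matched")
for pkg in result["results"]:
    print("-", pkg["title"])
```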

Keywords: data portal, data infrastructure, open source, sustainability

Procedia PDF Downloads 73
35982 Study and Analysis of the Factors Affecting Road Safety Using Decision Tree Algorithms

Authors: Naina Mahajan, Bikram Pal Kaur

Abstract:

The purpose of traffic accident analysis is to find the possible causes of an accident. Road accidents cannot be totally prevented, but with suitable traffic engineering and management the accident rate can be reduced to a certain extent. This paper discusses the classification techniques C4.5 and ID3 using the WEKA data mining tool, applied to the NH (National Highway) dataset. The C4.5 and ID3 techniques give the best results, with high accuracy, low computation time, and a low error rate.
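A hedged analogue of the classification step: WEKA's J48 implements C4.5, and scikit-learn's decision tree with the entropy criterion uses the same information-gain idea behind ID3/C4.5. The dataset below is a synthetic stand-in for the NH data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the NH accident dataset (the real features would be
# road, weather, and traffic attributes; the label, accident severity).
X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# criterion="entropy" selects splits by information gain, as ID3/C4.5 do
clf = DecisionTreeClassifier(criterion="entropy", max_depth=5, random_state=0)
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```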

Keywords: C4.5, ID3, NH (National Highway), WEKA data mining tool

Procedia PDF Downloads 311
35981 Glycemic Control on Self-Efficacy and Self-Care Behaviors among Omani Adults with Type 2 Diabetes

Authors: Melba Sheila D'Souza, Anandhi Amirtharaj, Shreedevi Balachandran

Abstract:

Background: Type 2 diabetes has a significant impact on individuals’ health and well-being. Glycemic control may influence self-efficacy and self-care behaviors and reduce the risk of complications among adults with type 2 diabetes. Type 2 diabetes carries substantial morbidity and mortality, and 60% of adults practice poor self-care. Glycemic control is associated with reported self-efficacy and self-care behavior, and adults with type 2 diabetes who had less information were less likely to undertake diabetes self-care. Aim: To examine the relationship of glycemic control, demographic factors, and clinical factors with self-efficacy and self-care behaviors among Omani adults with type 2 diabetes. Methods: A correlational, descriptive study was used. Omani adults with type 2 diabetes (n=140) were recruited from a public hospital in Oman, and data on self-efficacy, self-care behaviors and glycemic control were collected during January-March 2015. Ethical approval was given by the research and ethics committee of the College of Nursing, Sultan Qaboos University, and by the Hospital. Bivariate and multivariate analyses were conducted. Results: Most adults had a fasting blood glucose >7.2 mmol/L (90.7%), with the majority demonstrating uncontrolled or poor HbA1c of >8% (65%). Age, duration of diabetes, medication, HbA1c and prevention of activities of living explained 20.6% of the variance in self-care behavior and 31.3% of the variance in self-efficacy. Adults with type 2 diabetes and poor glycemic control were more likely to have poor self-efficacy and poor self-care behaviors. Conclusion: This study confirms that the self-efficacy model predicts self-efficacy and self-care behavior. Higher understanding of diabetes, prevention of normal daily activities, higher ability to fit diabetes into life in a positive manner, and high patient-physician communication were significantly associated with self-efficacy and self-care behaviors. Hence, glycemic control has a strong effect on improving self-efficacy and self-care behaviors such as diet, exercise, medication and foot care among adults with type 2 diabetes. Implications: Based on these findings, individualized self-care management is recommended to improve self-efficacy and self-care behaviors among adults with type 2 diabetes.

Keywords: self-efficacy, self-care behaviors, self-care management, glycemic control, type 2 diabetes, nurse

Procedia PDF Downloads 385
35980 The Economic Limitations of Defining Data Ownership Rights

Authors: Kacper Tomasz Kröber-Mulawa

Abstract:

This paper addresses the topic of data ownership from an economic perspective; examples of the economic limitations of data property rights are provided, identified using the methods and approaches of the economic analysis of law. To build a proper background for the economic focus, a short overview of data and data ownership in the EU’s legal system is first provided, including a short introduction to its political and social importance and the relevant viewpoints. This stresses the importance of a Single Market for data, but also the far-reaching regulation of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of the paper builds on this legal basis as well as the methods and approaches of the economic analysis of law.

Keywords: antitrust, data, data ownership, digital economy, property rights

Procedia PDF Downloads 61
35979 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground-plane projection, with or without terrain as a component, to better view SAR data in an image domain comparable to what a human would view and so ease interpretation. An alternate but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values of each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing now allow this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase-history data size and 3D point cloud size. Backprojection algorithms are embarrassingly parallel, since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for an accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
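A minimal NumPy sketch of the time-domain backprojection loop described above; shapes and units are assumptions, and the per-pulse phase compensation a full matched-filter implementation would apply is omitted for brevity. On a GPU, the outer work would be spread one thread per voxel.

```python
import numpy as np

def backproject(range_compressed, platform_pos, dt, voxels, c=3e8):
    """Hedged sketch of time-domain backprojection: for each 3D voxel, sum the
    range-compressed sample of every pulse at that voxel's two-way delay.
    range_compressed: (n_pulses, n_samples) complex fast-time data
    platform_pos:     (n_pulses, 3) antenna position per pulse (m)
    dt:               fast-time sample spacing (s)
    voxels:           (n_voxels, 3) reference points, e.g. from a DEM or point cloud
    """
    n_pulses, n_samples = range_compressed.shape
    image = np.zeros(len(voxels), dtype=complex)
    for p in range(n_pulses):
        ranges = np.linalg.norm(voxels - platform_pos[p], axis=1)
        delays = 2.0 * ranges / c                    # two-way travel time
        idx = delays / dt                            # fractional sample index
        i0 = np.clip(idx.astype(int), 0, n_samples - 2)
        frac = idx - i0                              # linear interpolation weight
        image += (1 - frac) * range_compressed[p, i0] + frac * range_compressed[p, i0 + 1]
    return np.abs(image)                             # per-voxel reflectivity magnitude
```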

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 52
35978 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies

Authors: Monica Lia

Abstract:

This article presents a customer data analysis model using business intelligence tools for data modelling, transformation, visualization, and dynamic report building. The analysis of an economic organization’s customers is based on information from the organization’s transactional systems. The paper presents how to develop the data model starting from the data that companies hold inside their own operational systems; this owned data can be transformed into useful information about customers using business intelligence tools. In a mature market, knowing the information inside the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.

Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagrams, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes

Procedia PDF Downloads 411
35977 Image-Based (RGB) Technique for Estimating Phosphorus Levels of Different Crops

Authors: M. M. Ali, Ahmed Al-Ani, Derek Eamus, Daniel K. Y. Tan

Abstract:

In this glasshouse study, we developed a new image-based, non-destructive technique for detecting the leaf P status of different crops, namely cotton, tomato and lettuce. Plants were grown on nutrient media containing different P concentrations, i.e., 0%, 50% and 100% of the recommended P concentration (P0 = no P; P1 = 2.5 mL 10 L⁻¹ of P; P2 = 5 mL 10 L⁻¹ of P as NaH2PO4). After 10 weeks of growth, plants were harvested and leaf P contents were measured using the standard destructive laboratory method, while leaf images were collected with a handheld crop image sensor. We calculated the leaf area, leaf perimeter and RGB (red, green and blue) values of these images. These data were then used in linear discriminant analysis (LDA) to estimate leaf P contents, which successfully classified the plants on the basis of leaf P content. The data indicate that P deficiency in crop plants can be predicted using image and morphological data. Our proposed non-destructive imaging method is precise in estimating the P requirements of different crop species.
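A minimal sketch of the LDA classification step with scikit-learn; the feature matrix (mean RGB values plus leaf area and perimeter) is simulated rather than taken from the study's images.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
# Hypothetical features per leaf image: mean R, G, B plus leaf area and perimeter
X = rng.uniform(size=(90, 5))
y = np.repeat(["P0", "P1", "P2"], 30)     # three P-treatment classes, 30 plants each

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print(lda.predict(X[:5]))                 # predicted P-status class for new leaves
```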

Keywords: image-based techniques, leaf area, leaf P contents, linear discriminant analysis

Procedia PDF Downloads 357
35976 Unlocking Academic Success: A Comprehensive Exploration of Shaguf Bites’s Impact on Learning and Retention

Authors: Joud Zagzoog, Amira Aldabbagh, Radiyah Hamidaddin

Abstract:

This research aims to test and observe whether artificial intelligence (AI) software and applications can actually be effective, useful, and time-saving for those who use them. Shaguf Bites, a web application that uses AI technology, claims to help students study and memorize information more effectively in less time. The website uses smart learning, or AI-powered bite-sized repetitive learning, by transforming documents or PDFs into summarized interactive smart flashcards (Bites, n.d.). To properly test the website’s effectiveness, both qualitative and quantitative methods were used in this research. An experiment was conducted in which students were first requested to use Shaguf Bites without any prior knowledge or explanation of how to use it; they were then asked, through a survey, whether it was helpful, efficient, time-saving, and easy to use for studying. After reviewing the collected data, we found that the majority of students found the website straightforward and easy to use: 58% of the respondents agreed that the website accurately formulated the flashcard questions, and 53% reported that they are likely to use the website again in the future and to recommend it to others. Overall, these results indicate that Shaguf Bites proved beneficial, accurate, and time-saving for the majority of the students.

Keywords: artificial intelligence (AI), education, memorization, spaced repetition, flashcards

Procedia PDF Downloads 156
35975 3D Dentofacial Surgery Full Planning Procedures

Authors: Oliveira M., Gonçalves L., Francisco I., Caramelo F., Vale F., Sanz D., Domingues M., Lopes M., Moreia D., Lopes T., Santos T., Cardoso H.

Abstract:

The ARTHUR project consists of a platform that allows maxillofacial surgeries to be performed virtually, offering, in a photorealistic way, the possibility for patients to get an idea of the surgical changes before they are performed on their face. For this, the system brings together several image formats, DICOMs and OBJs, which, after loading, generate the bone volume, soft tissues and hard tissues. The system also incorporates the patient’s stereophotogrammetry, in addition to their data and clinical history. After loading and inserting the data, the clinician can virtually perform the surgical operation and present the final result to the patient, generating a new facial surface that reflects the changes made to the bone and tissues of the maxillary area. This tool serves different situations that require facial reconstruction; however, this project focuses specifically on two types of use cases: congenital bone disfigurement and acquired disfigurement, such as oral cancer with bone involvement. Developed as a cloud-based solution with mobile support, the tool aims to shorten the patient’s decision window. Current simulations are either not realistic or, if realistic, take time because plaster models must be built; patients therefore rely on a long decision window (1-2 months) because they do not identify with the presented surgical outcome. Moreover, such planning has been based on average estimated values for the position of the maxilla and mandible, derived from population averages of facial measurements without accounting for racial variability, so the proposed solutions were not adjusted to individuals’ real physiognomic needs.

Keywords: 3D computing, image processing, image registry, image reconstruction

Procedia PDF Downloads 185
35974 Productivity and Structural Design of Manufacturing Systems

Authors: Ryspek Usubamatov, Tan San Chin, Sarken Kapaeva

Abstract:

The productivity of manufacturing systems depends on the technological processes, the technical data of the machines, and the structure of the system. The technology is represented by the machining mode and data; the technical data comprise reliability parameters and auxiliary times for discrete production processes. The term 'structure of a manufacturing system' covers the number of serial and parallel production machines and the links between them. Structures of manufacturing systems depend on the complexity of the technological processes. Mathematical models of the productivity rate of manufacturing systems are important attributes that make it possible to define the best structure by the criterion of productivity rate. These models are an important tool in evaluating the economic efficiency of production systems.
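As a hedged illustration only (not the paper's actual equations), a simple productivity-rate model in which machining time is split across serial stations and output is discounted by availability:

```python
def productivity_rate(t_machining, t_auxiliary, q_serial, mtbf, mttr):
    """Illustrative model under stated assumptions: a line of q_serial stations
    splits the machining time, and availability (MTBF / (MTBF + MTTR))
    discounts output for failures. Times in minutes; returns parts per minute."""
    cycle = t_machining / q_serial + t_auxiliary    # cycle time per part
    availability = mtbf / (mtbf + mttr)             # reliability correction
    return availability / cycle

# Example: compare candidate structures with 2 vs. 4 serial stations
for q in (2, 4):
    print(q, "stations:",
          round(productivity_rate(8.0, 1.0, q, mtbf=300.0, mttr=20.0), 3))
```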

Keywords: productivity, structure, manufacturing systems, structural design

Procedia PDF Downloads 565
35973 Worldwide GIS Based Earthquake Information System/Alarming System for Microzonation/Liquefaction and It’s Application for Infrastructure Development

Authors: Rajinder Kumar Gupta, Rajni Kant Agrawal, Jaganniwas

Abstract:

One of the most frightening natural phenomena is the occurrence of an earthquake, as it has terrible and disastrous effects. Many earthquakes occur every day worldwide, and there is a need for knowledge of the trends in earthquake occurrence worldwide. The recording and interpretation of data obtained from the establishment of the worldwide system of seismological stations has made this possible. From the analysis of recorded earthquake data, the earthquake parameters and source parameters can be computed and earthquake catalogues can be prepared. These catalogues provide information on the origin, time, epicenter location (in terms of latitude and longitude), focal depth, magnitude and other details of the recorded earthquakes, and they are used for seismic hazard estimation. Manual interpretation and analysis of these data is tedious and time consuming. A geographical information system (GIS) is a computer-based system designed to store, analyze and display geographic information. The implementation of integrated GIS technology provides an approach that permits rapid evaluation of complex inventory databases under a variety of earthquake scenarios and allows the user to view results interactively and almost immediately. GIS technology provides a powerful tool for displaying outputs and permits users to see the graphical distribution of the impacts of different earthquake scenarios and assumptions. In the present study, an endeavor has been made to compile earthquake data for the whole world in Visual Basic on the ArcGIS platform so that it can be used easily for further analysis by earthquake engineers. The basic data on the time of occurrence, location and size of earthquakes has been compiled for querying based on various parameters. A preliminary analysis tool is also provided in the user interface to interpret earthquake recurrence in a region. The user interface also includes the seismic hazard information already worked out under the GSHAP program; the seismic hazard, in terms of probability of exceedance in definite return periods, is provided for the world. The seismic zones of the Indian region are included in the user interface from IS 1893-2002, the code on earthquake-resistant design of buildings. City-wise satellite images have been inserted into the map, and based on actual data the following information can be extracted in real time:
• Analysis of soil parameters and their effects
• Microzonation information
• Seismic hazard and strong ground motion
• Soil liquefaction and its effects on the surrounding area
• Impacts of liquefaction on buildings and infrastructure
• Occurrence of future earthquakes and their effects on existing soil
• Propagation of earth vibration due to the occurrence of an earthquake
The GIS-based earthquake information system has been prepared for the whole world in Visual Basic on the ArcGIS platform and further extended to the micro level based on actual soil parameters. Individual tools have been developed for liquefaction, earthquake frequency, etc. All this information can be used for infrastructure development, i.e., multi-story structures, irrigation dams and their components, hydropower, etc., in real time, for the present and the future.

Keywords: GIS based earthquake information system, microzonation, analysis and real time information about liquefaction, infrastructure development

Procedia PDF Downloads 300
35972 Multiple Empowerments: How Work Team Shapes the Village Governance in China

Authors: Yang Liu

Abstract:

The work team has long been adopted by the CCP for special missions within limited time frames. Since the 18th National Congress of the CCP, the unprecedented practice of the work team has had impacts beyond its original goal of poverty alleviation, yet its functions in village governance have still not been well studied. Focusing on state agents that come from outside the village community, this article argues that the work team is a group in which political, economic, and cultural capital coexist, which effectively empowers the state, the village cadres, and the peasants. For the state, more accurate bottom-up information can be collected by the work team, and policies can be made scientifically and implemented without distortion. For the village cadres, they can learn leadership skills and share more of the resources owned or mobilized by the work team. For the peasants, they have more opportunities to participate in the public affairs of their village and express their claims. These multiple empowerments have greatly improved the relationships among the state, the peasants, and the village cadres, relationships that a series of reforms from the 1980s to the 2000s had alienated.

Keywords: state, village cadre, empowerment, work team, peasants

Procedia PDF Downloads 92
35971 Economized Sensor Data Processing with Vehicle Platooning

Authors: Henry Hexmoor, Kailash Yelasani

Abstract:

We present vehicular platooning as a special case of a crowd-sensing framework in which sensory information is shared among a crowd for its collective benefit. After offering an abstract policy that governs the processes involving a vehicular platoon, we review several common scenarios and components surrounding vehicular platooning. We then present a simulated prototype that illustrates the efficiency of road usage and the vehicle travel time derived from platooning. We argue that one of the paramount benefits of platooning, overlooked elsewhere, is the substantial computational savings (i.e., economizing benefit) in the acquisition and processing of sensory data among vehicles sharing the road: the most capable vehicle can share data gathered from its sensors with nearby vehicles grouped into a platoon.

Keywords: cloud network, collaboration, internet of things, social network

Procedia PDF Downloads 174
35970 Aggregate Supply Response of Some Livestock Commodities in Algeria: Cointegration- Vector Error Correction Model Approach

Authors: Amine M. Benmehaia, Amine Oulmane

Abstract:

The supply response of agricultural commodities to changes in price incentives is an important issue for the success of any policy reform in the agricultural sector. This study aims to quantify the responsiveness of producers of some livestock commodities to price incentives in the Algerian context. Time series analysis is used on annual data for a period of 52 years (1966-2018). Both cointegration and a vector error correction model (VECM) are used within the Nerlove partial adjustment framework. The study attempts to determine the long-run and short-run relationships along with the magnitudes of disequilibria in the selected commodities. Results show that the short-run price elasticities are low in the cow and sheep meat sectors (8.7% and 8%, respectively), with respective long-run elasticities of 16.5% and 10.5%, whereas eggs and milk have very high short-run price elasticities (82% and 90%, respectively) with long-run elasticities of 40% and 46%, respectively. The error correction coefficient, reflecting the speed of adjustment towards the long-run equilibrium, is statistically significant and has the expected negative sign; its estimates are 12.7% for cow meat, 33.5% for sheep meat, 46.7% for eggs and 8.4% for milk. It seems that cow meat and milk producers show weak feedback, correcting only about 12.7% and 8.4%, respectively, of the previous year's disequilibrium from the long-run relationship, whereas sheep meat and egg producers adjust to correct long-run disequilibrium at a high speed (33.5% and 46.7%, respectively). The implication is that much more in-depth research is needed to identify the factors that affect agricultural supply and to describe the effect of factors that shift supply in response to price incentives. This could provide valuable information for the government in choosing appropriate policy measures.
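A minimal sketch of the cointegration/VECM estimation with statsmodels; the two annual series are simulated to be cointegrated by construction and merely stand in for the study's production and price data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(4)
n = 52                                                # 52 annual observations, as in the study
price = np.cumsum(rng.normal(0.02, 0.05, n))          # I(1) log producer price (simulated)
output = 0.5 * price + rng.normal(0.0, 0.03, n)       # cointegrated with price by construction
data = pd.DataFrame({"output": output, "price": price})

# One cointegrating relation, one lagged difference, constant inside the relation
model = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci")
res = model.fit()
print(res.alpha)   # error-correction (speed of adjustment) coefficients
print(res.beta)    # long-run cointegrating vector
```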

Keywords: Algeria, cointegration, livestock, supply response, vector error correction model

Procedia PDF Downloads 114
35969 Generic Model for Timetabling Problems by Integer Linear Programmimg Approach

Authors: Nur Aidya Hanum Aizam, Vikneswary Uvaraja

Abstract:

A schedule showing the times at which certain tasks are to be performed is known as a timetable. Timetabling is widely used in many sectors, such as transportation, education, and production, and difficulties arise in ensuring that all tasks happen at the allocated time and place. Therefore, many researchers have devised various programming models to solve scheduling problems from several fields. However, studies on developing a general integer programming model for many timetabling problems are still scarce. This thesis describes the creation of a general model that solves different types of timetabling problems by considering their basic constraints. Initially, the common basic constraints from five different fields were selected and analyzed. A general basic integer programming model was created and then verified using a medium-sized, randomly generated dataset that closely resembles realistic data. The mathematical software AIMMS, with CPLEX as the solver, was used to solve the model. The resulting model is significant in that it solves many timetabling problems easily, since it can be adapted to all types of scheduling problems that share the same basic constraints.
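A hedged toy formulation of the basic timetabling constraints in PuLP rather than AIMMS/CPLEX; the tasks, slots, and rooms are invented, and a real model would add the field-specific constraints the thesis analyzes.

```python
from pulp import LpProblem, LpVariable, LpBinary, LpMinimize, lpSum, value

tasks = ["T1", "T2", "T3"]
slots = ["S1", "S2"]
rooms = ["R1", "R2"]

prob = LpProblem("timetable", LpMinimize)
# x[t][s][r] = 1 if task t is scheduled in slot s and room r
x = LpVariable.dicts("x", (tasks, slots, rooms), cat=LpBinary)

# Any objective works: this is a pure feasibility problem
prob += lpSum(x[t][s][r] for t in tasks for s in slots for r in rooms)

for t in tasks:                 # each task is scheduled exactly once
    prob += lpSum(x[t][s][r] for s in slots for r in rooms) == 1
for s in slots:                 # a room holds at most one task per slot
    for r in rooms:
        prob += lpSum(x[t][s][r] for t in tasks) <= 1

prob.solve()
for t in tasks:
    for s in slots:
        for r in rooms:
            if value(x[t][s][r]) == 1:
                print(t, "->", s, r)
```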

Keywords: AIMMS mathematical software, integer linear programming, scheduling problems, timetabling

Procedia PDF Downloads 417
35968 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is an imminent demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data is then transferred from the time domain to the frequency domain through the FFT. The distributions of the power spectra of original VoIP streams and stego VoIP streams are compared in turn using a t-test, achieving a p-value of 7.5686E-176, which is below the threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
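A minimal sketch of the FFT-plus-t-test idea with NumPy/SciPy; the "streams" are simulated noise, and the perturbation standing in for embedding is a toy, not the paper's covert channel.

```python
import numpy as np
from scipy.stats import ttest_ind

def power_spectrum(samples, frame_len=512):
    """Per-frame power spectrum over fixed-length windows of a stream."""
    n = len(samples) // frame_len * frame_len
    windows = samples[:n].reshape(-1, frame_len)
    return np.abs(np.fft.rfft(windows, axis=1)) ** 2

rng = np.random.default_rng(5)
cover = rng.normal(size=50_000)                       # stand-in for a clean VoIP stream
stego = cover.copy()
stego[::100] += rng.choice([-0.01, 0.01], size=len(stego[::100]))  # toy embedding perturbation

p_cover = power_spectrum(cover).mean(axis=0)          # mean power per frequency bin
p_stego = power_spectrum(stego).mean(axis=0)

t_stat, p_value = ttest_ind(p_cover, p_stego)
print(f"t = {t_stat:.3f}, p = {p_value:.3g}")         # a small p suggests the spectra differ
```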

Keywords: steganalysis, security, Fast Fourier Transform, streaming media

Procedia PDF Downloads 123
35967 Automated Human Balance Assessment Using Contactless Sensors

Authors: Justin Tang

Abstract:

Balance tests are frequently used to diagnose concussions on the sidelines of sporting events. Manual scoring, however, is labor-intensive and subjective, and many concussions go undetected. This study institutes a novel approach to conducting the Balance Error Scoring System (BESS) more quantitatively using Microsoft’s gaming system Kinect, which uses a contactless sensor and several cameras to receive data and estimate body limb positions. Using a machine learning approach, Visual Gesture Builder, and a deterministic approach, MATLAB, we tested whether the Kinect can differentiate between correct and erroneous stances of the BESS. We created the two separate solutions by recording test videos to teach the Kinect correct stances and by developing code in Java. Twenty-two subjects were asked to perform a series of BESS tests while the Kinect was collecting data. The Kinect recorded the subjects and mapped key joints onto their bodies to obtain angles and measurements that are interpreted by the software. Through VGB and MATLAB, the videos were analyzed to enumerate the number of errors committed during testing. The resulting statistics demonstrate a high correlation between manual scoring and the Kinect approaches, indicating the viability of using remote tracking devices in conducting concussion tests.

Keywords: automated, concussion detection, contactless sensors, microsoft kinect

Procedia PDF Downloads 302
35966 Assessment of the Association between Serum Thrombospondin-1 Levels at the Time of Admission and the Severity of Neurological Deficit in Patients with Ischemic Stroke

Authors: A. Alhusban, M. Alqawasmeh, F. Alfawares

Abstract:

Introduction: Despite improvements in stroke management, stroke remains the leading cause of disability worldwide. It has been suggested that enhancing brain angiogenesis after stroke will improve stroke outcome. Promoting post-stroke angiogenesis requires the upregulation of angiogenic factors with a simultaneous reduction of anti-angiogenic factors. Thrombospondin-1 is the main anti-angiogenic protein in living cells. Counterintuitively, it has been shown that animals with Thrombospondin-1 knockdown have better stroke outcomes. Data about the clinical significance of Thrombospondin-1 levels at the time of admission are still lacking. The objective of this work is to assess the association between serum Thrombospondin-1 levels measured at the time of admission and baseline neurologic severity after stroke. Patients and Methods: Blood samples were collected at the time of admission from patients admitted to the King Abdullah University Hospital (KAUH) with ischemic stroke, and serum Thrombospondin-1 levels were measured using ELISA. Patients’ neurologic severity was evaluated using the National Institutes of Health Stroke Scale (NIHSS). Results: Samples from 50 patients admitted between January 2016 and December 2016 were collected. The median age of participants was 68 years and the median NIHSS was 3. Multinomial regression identified serum Thrombospondin-1 as an independent predictor of stroke outcome (p=0.003). Baseline serum Thrombospondin-1 was negatively associated with NIHSS at the time of admission (Spearman’s rho correlation coefficient = 0.272, p=0.032). Conclusion: Serum Thrombospondin-1 at the time of admission may be a useful marker of stroke severity, predicting more severe neurologic deficit.

Keywords: thrombospondin, stroke, neuroprotection, biomarkers

Procedia PDF Downloads 117
35965 Implementation of a Non-Poissonian Model in a Low-Seismicity Area

Authors: Ludivine Saint-Mard, Masato Nakajima, Gloria Senfaute

Abstract:

In areas of low to moderate seismicity, probabilistic seismic hazard analysis frequently uses a Poisson approach, which assumes independence in time and space of events, to determine the annual probability of earthquake occurrence. Nevertheless, in countries with high seismic rates, such as Japan, non-Poissonian models are frequently used, which assume that the next earthquake occurrence depends on the date of the previous one. The objective of this paper is to apply a non-Poissonian model in a region of low to moderate seismicity to obtain feedback on the following questions: can we overcome the lack of data to determine some key parameters, and can we deal with the uncertainties involved in applying this methodology widely in an industrial context? The Brownian Passage Time model was applied to a fault located in France, and we conclude that even if the lack of data can be overcome with some calculations, the amount of uncertainty and the number of scenarios lead to numerous branches in the PSHA, making this method difficult to apply across large low-to-moderate seismicity areas and in an industrial context.
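For reference, the Brownian Passage Time renewal model is the inverse Gaussian distribution, so the conditional earthquake probability can be sketched with scipy.stats.invgauss; the fault parameters below are hypothetical.

```python
from scipy.stats import invgauss

def bpt_conditional_probability(mu, alpha, t_elapsed, dt):
    """Probability of an event in (t, t+dt] given quiescence up to t, under a
    BPT (inverse Gaussian) renewal model with mean recurrence mu (years) and
    aperiodicity alpha. scipy's invgauss(shape, scale=s) has mean shape*s and
    coefficient of variation sqrt(shape), so shape = alpha**2 and
    scale = mu / alpha**2."""
    dist = invgauss(alpha**2, scale=mu / alpha**2)
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

# Hypothetical fault: 800-year mean recurrence, aperiodicity 0.5,
# 600 years since the last event, 50-year exposure window
print(bpt_conditional_probability(800.0, 0.5, 600.0, 50.0))
```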

Keywords: probabilistic seismic hazard, non-poissonian model, earthquake occurrence, low seismicity

Procedia PDF Downloads 37
35964 Synergism in the Inquiry Lab: An Analysis of Time Targets and Achievement

Authors: John M. Basey, Clinton D. Francis, Maxwell B. Joseph

Abstract:

After gathering data from experimental procedures, inquiry-oriented science labs often allow students the freedom to stay and complete the write-up in class or to leave lab early and complete the write-up later. Teachers must decide whether to allow students the freedom to self-regulate this time. Student interviews have indicated four time-target strategies that may influence how students utilize this time: grade-target-A, grade-target-C, time-limited, and proficiency. The hypothesis tested was that variability in class composition relative to the four time-target strategies has an impact on when students leave class, which in turn may influence their overall learning as exemplified by grades. Students were divided into the four indicated groups with a survey. Class composition and the GTA teaching the class had significant impacts on how long students stayed in class, with class composition having the greater impact. A factor analysis identified two factors: Factor 1 comprised classes with percentages of grade-target students opposite time-limited/proficiency students and explained 43% of the variance; Factor 2 comprised classes with percentages of grade-target-A/proficiency students opposite grade-target-C students and explained 33% of the variance. Students who stayed longer received significantly higher grades (P = 0.008), with no significant relationships between grade and Factor 1 or Factor 2 (P > 0.05). The time students stayed in class was significantly positively related to Factor 1 (P = 0.006) and significantly negatively related to Factor 2 (P = 0.008). These results support the hypothesis and indicate that teachers may want to know the composition of student time-target strategies before deciding how to have students allocate study time at the end of inquiry-oriented labs. According to these results, ideal classes for self-regulation have a high proportion of proficiency and time-limited students and a low proportion of grade-target students, or a high proportion of grade-target-A and proficiency students and a low proportion of grade-target-C students; non-ideal classes for self-regulation comprise the inverse proportions.

Keywords: grades, inquiry lab design, synergism in student motivation, class composition

Procedia PDF Downloads 106
35963 Multidimensional Poverty and Child Cognitive Development

Authors: Bidyadhar Dehury, Sanjay Kumar Mohanty

Abstract:

According to the Right to Education Act of India, education is the fundamental right of all children aged 6-14 years, irrespective of their status. Using unit-level data from the India Human Development Survey (IHDS), we tried to understand the inter-relationship between the level of poverty and the academic performance of children aged 8-11 years. The level of multidimensional poverty is measured with five dimensions and 10 indicators using the Alkire-Foster approach. The weighted deprivation score was obtained by giving equal weight to each dimension and to the indicators within each dimension. The weighted deprivation score varies from 0 to 1 and is grouped into four categories: non-poor, vulnerable, multidimensionally poor, and severely multidimensionally poor. The academic performance index was measured from three variables, reading skills, math skills and writing skills, using PCA. Bivariate and multivariate analyses were used. Since the outcome variable was ordinal, predicted probabilities were calculated using ordinal logistic regression. The predicted probability of a good academic performance index was 0.202 if the child was severely multidimensionally poor, 0.235 if multidimensionally poor, 0.264 if vulnerable, and 0.316 if non-poor. Hence, as the level of poverty among children decreases from severely multidimensionally poor to non-poor, the probability of good academic performance increases.
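A minimal sketch of the Alkire-Foster weighted deprivation score and the four-way categorization; the weights assume two indicators per dimension, and the cutoff values mirror common AF practice rather than the paper's exact choices.

```python
import numpy as np

def af_category(deprivations, weights):
    """Weighted deprivation score (0-1) and its category. Cutoffs here follow
    common Alkire-Foster practice (1/3 for multidimensional poverty, 1/2 for
    severe, 0.2 for vulnerability) and are assumptions, not the paper's values."""
    score = float(np.dot(deprivations, weights))
    if score >= 0.5:
        return score, "severely multidimensionally poor"
    if score >= 1 / 3:
        return score, "multidimensionally poor"
    if score >= 0.2:
        return score, "vulnerable"
    return score, "non-poor"

# 10 binary indicators in 5 dimensions; equal weight per dimension, then per
# indicator within it (assumed 2 indicators per dimension -> weight 0.1 each)
weights = np.array([0.1] * 10)
child = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0])   # hypothetical deprivation profile
print(af_category(child, weights))
```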

Keywords: multidimensional poverty, academic performance index, reading skills, math skills, writing skills, India

Procedia PDF Downloads 570
35962 Discovering Event Outliers for Drug as Commercial Products

Authors: Arunas Burinskas, Aurelija Burinskiene

Abstract:

On average, ten percent of drugs (commercial products) are not available in pharmacies due to shortage. A shortage event unbalances sales and requires a recovery period, which is often too long. A critical issue, therefore, is that pharmacies do not record potential sales transactions during shortage and recovery periods. The authors suggest estimating outliers during shortage and recovery periods and, to shorten the recovery period, using a prediction of average sales per sales day, which helps protect the data from being biased downwards or upwards. The authors use an outlier visualization method across different drugs and apply Grubbs' test to evaluate significance. The researched sample is 100 drugs over a one-month time frame. The authors found that products with high demand variability had outliers. Among the analyzed drugs, which are commercial products: i) high-demand-variability drugs have a one-week shortage period, and the probability of facing a shortage is 69.23%; ii) mid-demand-variability drugs have a three-day shortage period, and the likelihood of falling into deficit is 34.62%. To avoid shortage events and minimize the recovery period, real data must be established. Even though some outlier detection methods exist for drug data cleaning, they have not been used to minimize the recovery period once a shortage has occurred. The authors use Grubbs' test, a real-life data cleaning method, for outlier adjustment; in this paper, the adjustment is applied with a confidence level of 99%. In practice, Grubbs' test has been used to detect outliers for cancer drugs, with positive results reported. Grubbs' test detects outliers that exceed the boundaries of a normal distribution; the result is a probability that indicates the core data of actual sales. The test captures the difference between the sample mean and the most extreme data point relative to the standard deviation, and it detects one outlier at a time, with different probabilities, from a dataset with an assumed normal distribution. Based on approximation data, the authors constructed a framework for scaling potential sales and estimating outliers with Grubbs' test. The suggested framework is applicable during shortage events and recovery periods, has practical value, and could be used to minimize the recovery period required after a shortage event occurs.
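A minimal implementation sketch of the single-outlier Grubbs' test at the paper's 99% confidence level; the daily sales series is invented to show a shortage-period day being flagged.

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """Two-sided Grubbs' test for a single outlier (alpha=0.01 gives the 99%
    confidence level used in the paper). Returns (is_outlier, index, G, G_crit)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean, sd = x.mean(), x.std(ddof=1)
    idx = int(np.argmax(np.abs(x - mean)))
    G = abs(x[idx] - mean) / sd                     # Grubbs statistic
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)     # critical t-value
    G_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    return G > G_crit, idx, G, G_crit

# Hypothetical daily sales with one shortage-period day recorded near zero
sales = [42, 38, 45, 40, 44, 41, 2, 43, 39, 46]
print(grubbs_test(sales))   # the near-zero day is flagged for adjustment
```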

Keywords: drugs, Grubbs' test, outlier, shortage event

Procedia PDF Downloads 117