Search results for: real time data processing
38473 Change Point Detection Using Random Matrix Theory with Application to Frailty in Elderly Individuals
Authors: Malika Kharouf, Aly Chkeir, Khac Tuan Huynh
Abstract:
Detecting change points in time series data is a challenging problem, especially in scenarios where there is limited prior knowledge regarding the data’s distribution and the nature of the transitions. We present a method designed for detecting changes in the covariance structure of high-dimensional time series data, where the number of variables closely matches the data length. Our objective is to achieve unbiased test statistic estimation under the null hypothesis. We use Random Matrix Theory to analyze the behavior of our test statistic in a high-dimensional context. Specifically, we show that our test statistic converges pointwise to a normal distribution under the null hypothesis. To assess the effectiveness of the proposed approach, we conduct evaluations on a simulated dataset. Furthermore, we apply our method to detect changes associated with frailty in elderly individuals.
Keywords: change point detection, hypothesis tests, random matrix theory, frailty in elderly
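The abstract does not give the form of its RMT-based statistic, so the sketch below only illustrates the general idea of scanning for a change in covariance structure: a generic sliding-window statistic (Frobenius distance between sample covariances), with the data, window length and injected change all assumed for illustration.

```python
import numpy as np

def covariance_change_score(X, t, window=50):
    """Generic illustration (not the paper's statistic): compare sample
    covariances before and after a candidate change point t."""
    S1 = np.cov(X[t - window:t], rowvar=False)
    S2 = np.cov(X[t:t + window], rowvar=False)
    return np.linalg.norm(S1 - S2, ord="fro")

# Simulated high-dimensional series with a covariance change injected at t = 200.
rng = np.random.default_rng(0)
p, T, w = 20, 400, 50
X = rng.normal(size=(T, p))
X[200:] *= 1.5

scores = {t: covariance_change_score(X, t, w) for t in range(w, T - w)}
print(max(scores, key=scores.get))   # estimated change point
```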
Procedia PDF Downloads 53
38472 A Fast Community Detection Algorithm
Authors: Chung-Yuan Huang, Yu-Hsiang Fu, Chuen-Tsai Sun
Abstract:
Community detection represents an important data-mining tool for analyzing and understanding real-world complex network structures and functions. We believe that at least four criteria determine the appropriateness of a community detection algorithm: (a) it produces usable normalized mutual information (NMI) and modularity results for social networks, (b) it overcomes resolution limitation problems associated with synthetic networks, (c) it produces good NMI results and performance efficiency for Lancichinetti-Fortunato-Radicchi (LFR) benchmark networks, and (d) it produces good modularity and performance efficiency for large-scale real-world complex networks. To our knowledge, no existing community detection algorithm meets all four criteria. In this paper, we describe a simple hierarchical arc-merging (HAM) algorithm that uses network topologies and rule-based arc-merging strategies to identify community structures that satisfy the criteria. We used five well-studied social network datasets and eight sets of LFR benchmark networks to validate the ground-truth community correctness of HAM, eight large-scale real-world complex networks to measure its performance efficiency, and two synthetic networks to determine its susceptibility to resolution limitation problems. Our results indicate that the proposed HAM algorithm provides satisfactory performance efficiency and that HAM-identified communities were close to ground-truth communities in social and LFR benchmark networks while overcoming resolution limitation problems.
Keywords: complex network, social network, community detection, network hierarchy
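HAM itself is not published as code, but the two quality criteria named in the abstract (NMI against ground truth, and modularity) can be computed with standard libraries. The sketch below scores a stand-in detector (greedy modularity) on a toy graph with a known split; the graph and the detector are assumptions, only the metrics mirror the abstract.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity
from sklearn.metrics import normalized_mutual_info_score

# Toy network with a known ground-truth split: two 8-node cliques joined by an edge.
G = nx.connected_caveman_graph(2, 8)
ground_truth = [n // 8 for n in G.nodes()]

# Any detected partition (HAM's output would slot in here) is scored the same way.
communities = greedy_modularity_communities(G)
labels = {n: i for i, c in enumerate(communities) for n in c}
detected = [labels[n] for n in G.nodes()]

print("NMI:", normalized_mutual_info_score(ground_truth, detected))
print("Modularity:", modularity(G, communities))
```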
Procedia PDF Downloads 228
38471 Local Differential Privacy-Based Data-Sharing Scheme for Smart Utilities
Authors: Veniamin Boiarkin, Bruno Bogaz Zarpelão, Muttukrishnan Rajarajan
Abstract:
The manufacturing sector is a vital component of most economies, which makes it a frequent target of cyberattacks, and disruption to its operations may lead to significant economic consequences. Adversaries aim to disrupt the production processes of manufacturing companies, gain financial advantages, and steal intellectual property by obtaining unauthorised access to sensitive data. Access to sensitive data helps organisations to enhance their production and management processes. However, the majority of existing data-sharing mechanisms are either susceptible to different cyber attacks or heavy in terms of computation overhead. In this paper, a privacy-preserving data-sharing scheme for smart utilities is proposed. First, a customer’s privacy adjustment mechanism is proposed to make sure that end-users have control over their privacy, as required by the latest government regulations, such as the General Data Protection Regulation. Secondly, a local differential privacy-based mechanism is proposed to ensure the privacy of the end-users by hiding real data based on the end-user preferences. The proposed scheme may be applied to different industrial control systems, whereas in this study, it is validated for energy utility use cases consisting of smart, intelligent devices. The results show that the proposed scheme can guarantee the required level of privacy with an expected relative error in utility.
Keywords: data-sharing, local differential privacy, manufacturing, privacy-preserving mechanism, smart utility
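The abstract does not specify which local-DP mechanism is used, so the following is only a minimal sketch of the general idea: each end-user perturbs their own reading with Laplace noise scaled by a per-user privacy setting before sharing it. Reading values, epsilons and sensitivity are illustrative assumptions.

```python
import numpy as np

def ldp_perturb(reading, epsilon, sensitivity=1.0, rng=None):
    """Each end-user adds Laplace noise locally, so the utility never sees the
    raw value; epsilon reflects the privacy preference chosen by that user."""
    rng = rng or np.random.default_rng()
    return reading + rng.laplace(0.0, sensitivity / epsilon)

# Hypothetical smart-meter readings (kWh) and per-user privacy settings.
readings = np.array([2.3, 1.8, 3.1, 2.7, 2.0])
epsilons = np.array([0.5, 1.0, 1.0, 2.0, 0.5])
shared = [ldp_perturb(r, e) for r, e in zip(readings, epsilons)]

# The aggregate carries a bounded relative error versus the true mean,
# which shrinks as the number of contributing users grows.
print(np.mean(shared), readings.mean())
```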
Procedia PDF Downloads 76
38470 Transferring Data from Glucometer to Mobile Device via Bluetooth with Arduino Technology
Authors: Tolga Hayit, Ucman Ergun, Ugur Fidan
Abstract:
Being healthy is undoubtedly an indispensable necessity for human life. With technological improvements, various health monitoring and imaging systems have been developed in the literature to satisfy health needs. In this context, monitoring and recording individual health data via wireless technology has also become part of these studies. Mobile devices, which are found in almost every house, have become indispensable to our lives and have wireless technology infrastructure, play an important role in making health follow-up possible everywhere and at any time, because these devices are used in health monitoring systems. In this study, an Arduino, an open-source microcontroller board, was used, to which a sample sugar measuring device was connected in series. In this way, the glucose data (glucose ratio, time) obtained with the glucometer are transferred over a Bluetooth channel to a mobile device running the Android operating system. A mobile application was developed using the Apache Cordova framework for listing the data, presenting them graphically, and reading data from the Arduino. Apache Cordova, HTML, JavaScript and CSS are used in the coding section. The data received from the glucometer are stored in the local database of the mobile device. The intention is that people can transfer their measurements to their mobile device using wireless technology and access graphical representations of their data. In this context, the aim of the study is to enable health monitoring using the different wireless technologies available in current mobile devices, and thus to contribute to other work done in this area.
Keywords: Arduino, Bluetooth, glucose measurement, mobile health monitoring
Procedia PDF Downloads 323
38469 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD
Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik
Abstract:
The IDR/USD exchange rate can serve as an indicator for analyzing the Indonesian economy. The exchange rate is an important factor because it has a large effect on the Indonesian economy overall, so analysis of exchange rate data is needed. The IDR/USD exchange rate data are decomposed into frequency and time components, which can help the government monitor the Indonesian economy. The proposed method is very effective for this case, gives highly accurate results, and has a simple structure. In this paper, the exchange rate data used are weekly data from December 17, 2010 until November 11, 2014.
Keywords: the exchange rate, fuzzy Mamdani, discrete wavelet transforms, fuzzy wavelet
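The wavelet side of the model can be illustrated with a standard multilevel discrete wavelet decomposition; the series below is synthetic and the wavelet family ("db4") and number of levels are assumptions, since the abstract does not state them. The resulting approximation and detail sub-series are what a fuzzy (Mamdani-type) forecaster would then be built on.

```python
import numpy as np
import pywt

# Hypothetical weekly IDR/USD rates standing in for the 17 Dec 2010 -
# 11 Nov 2014 series used in the paper (~4 years of weekly observations).
rng = np.random.default_rng(0)
rate = 9000 + np.cumsum(rng.normal(5.0, 25.0, size=204))

# Two-level DWT: one approximation (trend) component plus detail
# (high-frequency) components in the time-frequency domain.
approx, detail2, detail1 = pywt.wavedec(rate, "db4", level=2)
print(approx.shape, detail2.shape, detail1.shape)
```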
Procedia PDF Downloads 571
38468 Utilizing AI Green Grader Scope to Promote Environmental Responsibility Among University Students
Authors: Tarek Taha Kandil
Abstract:
In higher education, the use of automated grading systems is on the rise, automating the assessment of students' work and providing practical feedback. Sustainable Grader Scope addresses the environmental impact of these computational tasks. This system uses an AI-powered algorithm and is designed to minimize grading process emissions. It reduces carbon emissions through energy-efficient computing and carbon-conscious scheduling. Students submit their computational workloads to the system, which evaluates submissions using containers and a distributed infrastructure. A carbon-conscious scheduler manages workloads across global campuses, optimizing emissions using real-time carbon intensity data. This ensures the university stays within government-set emission limits while tracking and reducing its carbon footprint.
Keywords: sustainability, green graders, digital sustainable grader scope, environmental responsibility, higher education
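The scheduling logic described (route grading jobs by real-time carbon intensity while respecting an emission cap) can be sketched as a simple selection rule. The region names, intensity values, cap and energy estimate below are all hypothetical; a real deployment would read them from a live carbon-intensity feed.

```python
def pick_greenest_region(carbon_intensity, emission_cap_g, expected_kwh):
    """Route a grading workload to the campus region whose current carbon
    intensity (gCO2/kWh) is lowest, provided projected emissions stay under
    the cap; otherwise defer the job to a cleaner window."""
    region, intensity = min(carbon_intensity.items(), key=lambda kv: kv[1])
    projected_g = intensity * expected_kwh
    if projected_g > emission_cap_g:
        return None, projected_g
    return region, projected_g

feed = {"campus-eu": 120.0, "campus-us": 340.0, "campus-asia": 450.0}
print(pick_greenest_region(feed, emission_cap_g=500.0, expected_kwh=2.5))
```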
Procedia PDF Downloads 4
38467 Impact Location From Instrumented Mouthguard Kinematic Data In Rugby
Authors: Jazim Sohail, Filipe Teixeira-Dias
Abstract:
Mild traumatic brain injury (mTBI) within non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted looking into head kinematics in non-helmeted contact sports utilizing instrumented mouthguards that allow researchers to record accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, and its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler’s and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test with the dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III Dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it will require preprocessing of live data, which is currently being done by cross-referencing data timestamps to video footage. The machine learning technique focuses on eliminating the preprocessing aspect by establishing trends within time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals established from mouthguards are converted to the frequency domain before using a clustering algorithm to cluster together similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins. The same Hybrid III Dummy finite element model is used to create impacts that closely replicate on-field impacts in order to create synthetic time-series datasets consisting of impacts in varying locations. These time-series data sets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish the accurate impact location of impact signals that have already been labeled as true impacts and filtered out of the entire time series. However, the machine learning technique provides a method that can be implemented with long time-series signal data, but it will provide the impact location only within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by sensors, saving additional time for data scientists using instrumented mouthguard kinematic data, as validating true impacts with video footage would not be required.
Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI
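A minimal sketch of the unsupervised branch described above: convert each impact window to a frequency-domain feature vector, then cluster the vectors into a fixed number of location bins. The signal shapes, feature length and number of bins are assumptions, not values from the study.

```python
import numpy as np
from sklearn.cluster import KMeans

def impact_features(signal, n_bins=32):
    """Frequency-domain feature vector for one kinematic impact window,
    normalised so that clustering groups by spectral shape, not magnitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    spectrum /= spectrum.sum() + 1e-12
    return spectrum[:n_bins]

# Hypothetical library of extracted impact windows (rows) from a mouthguard.
rng = np.random.default_rng(1)
impacts = rng.normal(size=(40, 200))

X = np.array([impact_features(s) for s in impacts])
# Cluster into predetermined location bins (e.g., front, side, rear, crown).
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(labels[:10])
```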
Procedia PDF Downloads 217
38466 A Multi-Release Software Reliability Growth Models Incorporating Imperfect Debugging and Change-Point under the Simulated Testing Environment and Software Release Time
Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar
Abstract:
The testing process during software development is a crucial step, as it makes the software more efficient and dependable. To estimate software reliability through the mean value function, many software reliability growth models (SRGMs) were developed under the assumption that operating and testing environments are the same. Practically, this is not true, because when the software works in a natural field environment, its reliability differs. This article discusses an SRGM comprising change-point and imperfect debugging in a simulated testing environment, and later extends it in a multi-release direction. Initially, the software is released to the market with a limited set of features; according to market demand, the software company upgrades the current version by adding new features as time passes. Therefore, we have proposed a generalized multi-release SRGM in which change-point and imperfect debugging concepts are addressed in a simulated testing environment. The failure-increasing rate concept has been adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets, and the results demonstrate that the proposed model fits the datasets better. We have also discussed the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors
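As a point of reference for the mean value function mentioned above, the sketch below fits the basic Goel-Okumoto NHPP form m(t) = a(1 - e^(-bt)) to hypothetical cumulative fault counts; the paper's model extends this with change-point, imperfect debugging and environmental factors, which are omitted here, and the data are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def mean_value_function(t, a, b):
    """Basic Goel-Okumoto NHPP mean value function m(t) = a * (1 - exp(-b t))."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical cumulative fault counts observed at weekly testing intervals.
t = np.arange(1, 21, dtype=float)
faults = np.array([5, 9, 14, 17, 21, 24, 26, 29, 30, 32,
                   33, 35, 36, 37, 37, 38, 39, 39, 40, 40], dtype=float)

(a, b), _ = curve_fit(mean_value_function, t, faults, p0=(50.0, 0.1))
print(f"expected total faults a = {a:.1f}, detection rate b = {b:.3f}")
```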
Procedia PDF Downloads 74
38465 Image Processing on Geosynthetic Reinforced Layers to Evaluate Shear Strength and Variations of the Strain Profiles
Authors: S. K. Khosrowshahi, E. Güler
Abstract:
This study investigates the reinforcement function of geosynthetics on the shear strength and strain profile of sand. Conducting a series of simple shear tests, the shearing behavior of the samples under static and cyclic loads was evaluated. Three different types of geosynthetics, including geotextile and geonets, were used as the reinforcement materials. An image processing analysis based on the optical flow method was performed to measure the lateral displacements and estimate the shear strains. It is shown that, besides improving the shear strength, the geosynthetic reinforcement leads to a remarkable reduction in the shear strains. The reinforced layer reduces the soil layer thickness required to resist shear stresses. Consequently, geosynthetic reinforcement can be considered a proper approach for sustainable designs, especially in projects with a large number of geotechnical applications such as subgrades of pavements, roadways, and railways.
Keywords: image processing, soil reinforcement, geosynthetics, simple shear test, shear strain profile
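A minimal sketch of the optical-flow step described above, using OpenCV's dense Farneback flow as a stand-in for whatever implementation the study used: displacements are estimated between two frames of the sheared sample and differentiated through the height to approximate an engineering shear strain profile. The frame files, pixel scale and flow parameters are assumptions.

```python
import cv2
import numpy as np

def shear_strain_from_frames(img_before, img_after, dy_px=1.0):
    """Estimate lateral (horizontal) displacements with dense optical flow,
    then differentiate them through the sample height to approximate the
    shear strain profile between two grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(img_before, img_after, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    u = flow[..., 0]                        # horizontal displacement (pixels)
    du_dy = np.gradient(u, dy_px, axis=0)   # d(u)/d(y): shear strain estimate
    return du_dy.mean(axis=1)               # strain profile along the height

# Hypothetical usage with frames captured before/after a shear increment:
# before = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
# after  = cv2.imread("frame_010.png", cv2.IMREAD_GRAYSCALE)
# profile = shear_strain_from_frames(before, after)
```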
Procedia PDF Downloads 220
38464 Real Fictions: Converging Landscapes and Imagination in an English Village
Authors: Edoardo Lomi
Abstract:
A problem of central interest in anthropology concerns the ethnographic displacement of modernity’s conceptual sovereignty over that of native collectives worldwide. Part of this critical project has been the association of Western modernity with a dualist, naturalist ontology. Despite its demonstrated value for comparative work, this association often comes at the cost of reproducing ideas that lack an empirical ethnographic basis. This paper proposes a way forward by bringing to bear some of the results produced by an ethnographic study of a village in Wiltshire, South England. Due to its picturesque qualities, this village has served for decades as a ready-made set for fantasy movies and a backdrop to fictional stories. These forms of mediation have in turn generated some apparent paradoxes, such as fictitious characters that affect actual material changes, films that become more real than history, and animated stories that, while requiring material grounds to unfold, inhabit a time and space in other respects distinct from that of material processes. Drawing on ongoing fieldwork and interviews with locals and tourists, this paper considers the ways villagers engage with fiction as part of their everyday lives. The resulting image is one of convergence, in the same landscape, of people and things having different ontological status. This study invites reflection on the implications of this image for diversifying our imagery of Western lifeworlds. To this end, the notion of ‘real fictions’ is put forth, connecting the ethnographic blurring of modernist distinctions (such as sign and signified, mind and matter, materiality and immateriality) with discussions on anthropology’s own reliance on fictions for critical comparative work.
Keywords: England, ethnography, landscape, modernity, mediation, ontology, post-structural theory
Procedia PDF Downloads 121
38463 Calculation of Inflation from Salaries Instead of Consumer Products: A Logical Exercise
Authors: E. Dahlen
Abstract:
Inflation can be calculated from either the prices of consumer products or from salaries. This paper presents a logical exercise showing that it is easier to calculate inflation from salaries than from consumer products. While the prices of consumer products may change due to technological advancement, such as automation, which must be corrected for, salaries do not. If technological advancements are not accounted for within calculations based on consumer product prices, inflation can be confused with real wage changes, since both inflation and real wage changes affect the prices of consumer products. The method employed in this paper is a logical exercise. Logical arguments are presented that suggest the existence of many different feasible ways by which inflation can be determined. Then a short mathematical exercise is presented which shows that one of these methods, using salaries, contains the fewest unknown parameters and is hence the preferred method, since the risk of mistakes is lower. From the results, it can be concluded that salaries, rather than consumer products, should be used to calculate inflation.
Keywords: inflation, logic, math, real wages
Procedia PDF Downloads 330
38462 Spatial Information and Urbanizing Futures
Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini
Abstract:
Today, municipalities are searching for new tools to increase public participation at different levels of urban planning. This approach to urban planning involves the community in the planning process using participatory approaches instead of long-standing, traditional top-down planning methods. These tools can be used to identify the particular problems of urban furniture from the residents’ point of view. One of the tools designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up on their perceptions and spatial knowledge regarding the main problems of the city, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research, we develop a PPGIS using Web 2.0 to collect voluntary geodata and to perform spatial analysis based on Spatial OnLine Analytical Processing (SOLAP) and Spatial Data Mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution and clusters of reported problems. The system is implemented in a case study area in Tehran, Iran, and the challenges of making it applicable and its potential for real urban planning have been evaluated. It helps decision makers to better understand, plan and allocate scarce resources for providing the most requested urban furniture.
Keywords: PPGIS, spatial information, urbanizing futures, urban planning
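The clustering of reported problems mentioned above can be illustrated with a standard density-based clustering of report coordinates; the coordinates, eps value and distance treatment below are assumptions chosen purely for illustration (a metric projection or haversine distance would be used on real data).

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical citizen reports: (latitude, longitude) of urban-furniture
# problems collected through the PPGIS web interface.
reports = np.array([
    [35.7012, 51.3890], [35.7015, 51.3894], [35.7010, 51.3885],   # cluster A
    [35.7420, 51.5010], [35.7423, 51.5014], [35.7418, 51.5009],   # cluster B
    [35.7600, 51.4300],                                           # isolated report
])

labels = DBSCAN(eps=0.001, min_samples=3).fit_predict(reports)
print(labels)   # -1 marks noise; other labels mark reported-problem hot spots
```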
Procedia PDF Downloads 726
38461 The Effect of Irgafos 168 in the Thermostabilization of High Density Polyethylene
Authors: Mahdi Almaky
Abstract:
The thermostabilization of High Density Polyethylene (HDPE) is achieved through the action of primary antioxidants, such as phenolic antioxidants, and secondary antioxidants, such as aryl phosphates. The efficiency of two secondary antioxidants, commercially named Irgafos 168 and Weston 399, was investigated using different physical, mechanical, spectroscopic, and calorimetric methods. The effects of both antioxidants on the processing stability and long-term stability of HDPE produced at the Ras Lanuf Oil and Gas Processing Company were measured and compared. The combination of Irgafos 168 with Irganox 1010, when used at a smaller concentration, results in a synergistic effect against thermo-oxidation and protects better against colour change at processing temperature and during the long-term oxidation process than the combination of Weston 399 with Irganox 1010.
Keywords: thermostabilization, high density polyethylene, primary antioxidant, phenolic antioxidant, Irgafos 168, Irganox 1010, Weston 399
Procedia PDF Downloads 354
38460 Time Series Modelling and Prediction of River Runoff: Case Study of Karkheh River, Iran
Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh
Abstract:
Rainfall and runoff are chaotic and complex natural phenomena that require sophisticated modelling and simulation methods for explanation and use. Time series modelling allows runoff data analysis and can be used as a forecasting tool. In this paper, an attempt is made to model river runoff data and predict the future behavioural pattern of the river based on past annual runoff observations. The runoff analysis and prediction are done using an ARIMA model. For evaluating the efficiency of predictions of hydrological events such as rainfall and runoff, the applicable statistical formulae are used. The good agreement between predicted and observed river runoff, expressed by the coefficient of determination (R2), shows that ARIMA(4,1,1) is a suitable model for predicting Karkheh River runoff in Iran.
Keywords: time series modelling, ARIMA model, river runoff, Karkheh River, CLS method
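Fitting the ARIMA(4,1,1) model named in the abstract is a one-liner with statsmodels; the runoff series below is synthetic and only stands in for the observed Karkheh River record, so the numbers it produces are illustrative.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical annual runoff series (million cubic metres) standing in for
# the observed Karkheh River record used in the paper.
rng = np.random.default_rng(3)
runoff = 5000 + np.cumsum(rng.normal(0.0, 300.0, size=40))

model = ARIMA(runoff, order=(4, 1, 1)).fit()   # ARIMA(4,1,1) as selected in the paper
print(model.aic)                               # goodness-of-fit indicator
print(model.forecast(steps=5))                 # runoff forecast for the next five years
```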
Procedia PDF Downloads 341
38459 The Impact of Recurring Events in Fake News Detection
Authors: Ali Raza, Shafiq Ur Rehman Khan, Raja Sher Afgun Usmani, Asif Raza, Basit Umair
Abstract:
Detection of fake news and missing information is gaining popularity, especially after the advancement of social media and online news platforms. Social media platforms are the main and fastest source of fake news propagation, whereas online news websites contribute to fake news dissemination. In this study, we propose a framework to detect fake news using the temporal features of text and consider user feedback to identify whether the news is fake or not. Recent studies give valuable consideration to the temporal features in text documents and to user feedback from a Natural Language Processing perspective, but only try to classify the textual data as fake or true. This research article examines the impact of recurring and non-recurring events on fake and true news. We use two models, BERT and Bi-LSTM, for the investigation; it is concluded that BERT gives better results and that 70% of true news items are recurring while the remaining 30% are non-recurring.
Keywords: natural language processing, fake news detection, machine learning, Bi-LSTM
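The Bi-LSTM branch compared against BERT above can be sketched with a standard Keras stack; vocabulary size, sequence handling and layer widths are illustrative assumptions, not values from the study. A recurring-event indicator or other temporal features could be concatenated to the recurrent output via the functional API before the final layers.

```python
import tensorflow as tf

VOCAB = 20000   # tokenised news texts are assumed to be padded integer sequences

model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,)),
    tf.keras.layers.Embedding(VOCAB, 128),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # fake (1) vs. true (0)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```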
Procedia PDF Downloads 23
38458 Tensile Force Estimation for Real-Size Pre-Stressed Concrete Girder using Embedded Elasto-Magnetic Sensor
Authors: Junkyeong Kim, Jooyoung Park, Aoqi Zhang, Seunghee Park
Abstract:
The tensile force of a Pre-Stressed Concrete (PSC) girder is the most important factor for evaluating the performance of PSC girder bridges. Several NDT methods have been studied to measure the tensile force of PSC girders. However, conventional NDT methods cannot be applied to real-size PSC girders because the PS tendons cannot be accessed. To measure the tensile force of a real-size PSC girder, this study proposes an embedded EM sensor-based tensile force estimation method. The embedded EM sensor can be installed inside the PSC girder as a sheath joint before concrete casting. After the curing process, the PS tendons were installed, and the tensile force was induced step by step using a hydraulic jacking machine. The B-H loop was measured using the embedded EM sensor at each tensile force step, and to compare with the actual tensile force, a load cell was installed at each end of the girder. The magnetization energy loss, that is, the area enclosed by the B-H loop, decreased with increasing tensile force in a regular pattern. Thus, the tensile force can be estimated by tracking the change in the magnetization energy loss of the PS tendons. Based on the experimental results, the proposed method can be used to estimate the tensile force of an in-situ real-size PSC girder bridge.
Keywords: tensile force estimation, embedded EM sensor, magnetization energy loss, PSC girder
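The quantity tracked above, the area enclosed by the B-H loop, can be computed from sampled loop data with the shoelace formula; the elliptical loop below is a stand-in for a real hysteresis loop recorded by the embedded EM sensor, so the numbers are illustrative only.

```python
import numpy as np

def magnetization_energy_loss(B, H):
    """Area enclosed by a sampled B-H loop (shoelace formula). Per the abstract,
    this area decreases in a regular pattern as tendon tensile force increases,
    which is what allows the force to be estimated."""
    return 0.5 * abs(np.dot(H, np.roll(B, -1)) - np.dot(B, np.roll(H, -1)))

# Hypothetical loop samples at one jacking step (an ellipse with a linear
# tilt stands in for a measured hysteresis loop).
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
H = 100 * np.cos(theta)
B = 1.2 * np.sin(theta) + 0.01 * H

print(magnetization_energy_loss(B, H))   # compare this value across tensile-force steps
```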
Procedia PDF Downloads 338
38457 IIROC's Enforcement Performance: Funnel in, Funnel out, and Funnel away
Authors: Mark Lokanan
Abstract:
The paper analyzes the processing of complaints against investment brokers and dealer members through the Investment Industry Regulatory Organization of Canada (IIROC) from 2008 to 2017. IIROC is the self-regulatory organization (SRO) that is responsible for policing investment dealers and brokerage firms that trade in Canada’s securities market. Data for the study came from IIROC's enforcement annual reports for the years examined. The case processing is evaluated based on the misconduct funnel that was originally designed for street crime and is applied here to the enforcement of investment fraud. The misconduct funnel is used as a framework to examine IIROC’s claim that it brought in more complaints (funnel in) than government regulators and to show how these complaints are funneled out and funneled away as they are processed through IIROC’s enforcement system. The results indicate that IIROC is ineffective in disciplining its members and is unable to handle the more serious quasi-criminal and improper sales practices offenses. It is hard not to see the results of the paper being used by the legislator in Ottawa to show the importance of a federal securities regulatory agency such as the Securities and Exchange Commission (SEC) in the United States.
Keywords: investment fraud, securities regulation, compliance, enforcement
Procedia PDF Downloads 160
38456 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making
Authors: I. Arockiarani
Abstract:
The focus of the paper is to furnish entropy measures for a neutrosophic set and a neutrosophic soft set, which measure the uncertainty that permeates discourse and systems. Various characterizations of the entropy measures are derived. Further, we exemplify this concept by applying entropy to various real-time decision-making problems.
Keywords: entropy measure, Hausdorff distance, neutrosophic set, soft set
Procedia PDF Downloads 257
38455 Thermal and Mechanical Properties of Powder Injection Molded Alumina Nano-Powder
Authors: Mostafa Rezaee Saraji, Ali Keshavarz Panahi
Abstract:
In this work, the processing steps for producing alumina parts using the powder injection molding (PIM) technique and nano-powder were investigated, and the thermal conductivity and flexural strength of the samples were determined as a function of sintering temperature and holding time. In the first step, a feedstock with 58 vol. % of alumina nano-powder with an average particle size of 100 nm was prepared using the Extrumixing method to obtain appropriate homogeneity. This feedstock was injection molded into a two-cavity mold with a rectangular shape. After the injection molding step, thermal and solvent debinding methods were used for debinding of the molded samples, and then these debinded samples were sintered at different sintering temperatures and holding times. From the results, it was found that the flexural strength and thermal conductivity of the samples increased with increasing sintering temperature and holding time; at a sintering temperature of 1600ºC and a holding time of 5 h, the flexural strength and thermal conductivity of the sintered samples reached maximum values of 488 MPa and 40.8 W/mK, respectively.
Keywords: alumina nano-powder, thermal conductivity, flexural strength, powder injection molding
Procedia PDF Downloads 329
38454 Big Data and Health: An Australian Perspective Which Highlights the Importance of Data Linkage to Support Health Research at a National Level
Authors: James Semmens, James Boyd, Anna Ferrante, Katrina Spilsbury, Sean Randall, Adrian Brown
Abstract:
‘Big data’ is a relatively new concept that describes data so large and complex that it exceeds the storage or computing capacity of most systems to perform timely and accurate analyses. Health services generate large amounts of data from a wide variety of sources such as administrative records, electronic health records, health insurance claims, and even smartphone health applications. Health data is viewed in Australia and internationally as highly sensitive. Strict ethical requirements must be met for the use of health data to support health research. These requirements differ markedly from those imposed on data use from industry or other government sectors and may have the impact of reducing the capacity of health data to be incorporated into the real-time demands of the Big Data environment. This ‘big data revolution’ is increasingly supported by national governments, who have invested significant funds into initiatives designed to develop and capitalize on big data and methods for data integration using record linkage. The benefits to health following research using linked administrative data are recognised internationally and by the Australian Government through the National Collaborative Research Infrastructure Strategy Roadmap, which outlined a multi-million dollar investment strategy to develop national record linkage capabilities. This led to the establishment of the Population Health Research Network (PHRN) to coordinate and champion this initiative. The purpose of the PHRN was to establish record linkage units in all Australian states, to support the implementation of secure data delivery and remote access laboratories for researchers, and to develop the Centre for Data Linkage for the linkage of national and cross-jurisdictional data. The Centre for Data Linkage has been established within Curtin University in Western Australia; it provides essential record linkage infrastructure necessary for large-scale, cross-jurisdictional linkage of health-related data in Australia and uses a best-practice ‘separation principle’ to support data privacy and security. Privacy-preserving record linkage technology is also being developed to link records without the use of names, to overcome important legal and privacy constraints. This paper will present the findings of the first ‘Proof of Concept’ project selected to demonstrate the effectiveness of increased record linkage capacity in supporting nationally significant health research. This project explored how cross-jurisdictional linkage can inform the nature and extent of cross-border hospital use and hospital-related deaths. The technical challenges associated with national record linkage, and the extent of cross-border population movements, were explored as part of this pioneering research project. Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations.
Keywords: data integration, data linkage, health planning, health services research
Procedia PDF Downloads 216
38453 Comparison of Developed Statokinesigram and Marker Data Signals by Model Approach
Authors: Boris Barbolyas, Kristina Buckova, Tomas Volensky, Cyril Belavy, Ladislav Dedik
Abstract:
Background: Human balance control is often studied based on the statokinesigram. In this study, the approach to human postural reaction analysis is based on combining the stabilometry output signal with retroreflective marker data signal processing, analysis, and interpretation. The study also presents another original application of the Method of Developed Statokinesigram Trajectory (MDST). Methods: In this study, the participants maintained quiet bipedal standing for 10 s on a stabilometry platform. Subsequently, bilateral vibration stimuli were applied to the Achilles tendons over a 20 s interval. The vibration stimuli caused the human postural system to settle into a new pseudo-steady state. Vibration frequencies were 20, 60 and 80 Hz. The participants' body segments - head, shoulders, hips, knees, ankles and little fingers - were marked by 12 retroreflective markers. Marker positions were scanned by the six-camera BTS SMART DX system. Registration of the postural reaction lasted 60 s, with a sampling frequency of 100 Hz. The Method of Developed Statokinesigram Trajectory was used to process the measured data. Regression analysis of developed statokinesigram trajectory (DST) data and retroreflective marker developed trajectory (DMT) data was used to find out which marker trajectories correlate most with the stabilometry platform output signals. Scaling coefficients (λ) between DST and DMT were also evaluated by linear regression analysis. Results: Scaling coefficients for marker trajectories were identified for all body segments. Head marker trajectories reached the maximal value and ankle marker trajectories the minimal value of the scaling coefficient. Hips, knees and ankles markers were approximately symmetrical in terms of the scaling coefficient. Notable differences in the scaling coefficient were detected in head and shoulder marker trajectories, which were not symmetrical. The model of postural system behavior was identified by MDST. Conclusion: The value of the scaling factor identifies which body segment is predisposed to postural instability. Hypothetically, if the statokinesigram represents the overall human postural system response to vibration stimuli, then the marker data represent particular postural responses. It can be assumed that the cumulative sum of particular marker postural responses is equal to the statokinesigram.
Keywords: center of pressure (CoP), method of developed statokinesigram trajectory (MDST), model of postural system behavior, retroreflective marker data
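The scaling coefficient described above is simply the slope of a linear regression of a marker's developed trajectory on the developed statokinesigram trajectory; the synthetic trajectories below (with a larger slope for the head marker than for the ankle marker, mirroring the reported result) are assumptions used only to show the calculation.

```python
import numpy as np

def scaling_coefficient(dst, dmt):
    """Slope of the linear regression of a marker's developed trajectory (DMT)
    on the developed statokinesigram trajectory (DST)."""
    slope, intercept = np.polyfit(dst, dmt, deg=1)
    return slope

# Hypothetical developed trajectories (cumulative path lengths) sampled at 100 Hz.
rng = np.random.default_rng(7)
dst = np.cumsum(np.abs(rng.normal(0.3, 0.05, 2000)))        # platform CoP
dmt_head = 1.8 * dst + rng.normal(0, 2.0, dst.size)         # head marker
dmt_ankle = 0.4 * dst + rng.normal(0, 2.0, dst.size)        # ankle marker

print(scaling_coefficient(dst, dmt_head), scaling_coefficient(dst, dmt_ankle))
```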
Procedia PDF Downloads 350
38452 Survey of Communication Technologies for IoT Deployments in Developing Regions
Authors: Namugenyi Ephrance Eunice, Julianne Sansa Otim, Marco Zennaro, Stephen D. Wolthusen
Abstract:
The Internet of Things (IoT) is a network of connected data processing devices, mechanical and digital machinery, items, animals, or people that may send data across a network without requiring human-to-human or human-to-computer interaction. Each component has sensors that can pick up on specific phenomena, as well as processing software and other technologies that can link to and communicate with other systems and/or devices over the Internet or other communication networks and exchange data with them. IoT is increasingly being used in fields other than consumer electronics, such as public safety, emergency response, industrial automation, autonomous vehicles, the Internet of Medical Things (IoMT), and general environmental monitoring. Consumer-based IoT applications, like smart home gadgets and wearables, are also becoming more prevalent. This paper presents the main IoT deployment areas for environmental monitoring in developing regions and the backhaul options suitable for them. A detailed review of each of the papers selected for the study is included in Section III of this document. The study includes an overview of existing IoT deployments and the underlying communication architectures, protocols, and technologies that support them. This overview shows that Low Power Wide Area Networks (LPWANs), as summarized in Table 1, are very well suited for environmental monitoring architectures designed for remote locations. LoRa technology, particularly the LoRaWAN protocol, has an advantage over other technologies due to its low power consumption, adaptability, and suitable communication range. The prevailing challenges of the different architectures are discussed and summarized in Table 3 of Section IV, where the main problem is the obstruction of communication paths by buildings, trees, hills, etc.
Keywords: communication technologies, environmental monitoring, Internet of Things, IoT deployment challenges
Procedia PDF Downloads 85
38451 Numerical Modeling of Air Shock Wave Generated by Explosive Detonation and Dynamic Response of Structures
Authors: Michał Lidner, Zbigniew Szcześniak
Abstract:
The ability to estimate blast load overpressure properly plays an important role in the safety design of buildings. The issue of blast loading on structural elements has been explored for many years. However, in many literature reports, shock wave overpressure is estimated with a simplified triangular or exponential distribution in time. This introduces errors when comparing the real and numerical reactions of elements. Nonetheless, it is possible to set an overpressure function versus time that is closer to the real blast load. The paper presents a method for the numerical analysis of the phenomenon of air shock wave propagation. It uses the Finite Volume Method and takes into account energy losses due to heat transfer with respect to an adiabatic process rule. A system of three equations (conservation of mass, momentum and energy) describes the flow of a volume of gaseous medium in the area remote from building compartments, which can inhibit the movement of gas. For validation, three cases of a shock wave flow were analyzed: a free field explosion, an explosion inside a steel insusceptible tube (the 1D case) and an explosion inside an insusceptible cube (the 3D case). The results of the numerical analysis were compared with literature reports. Values of impulse, pressure, and its duration were studied. Finally, an overall good convergence of numerical results with experiments was achieved, and the most important parameters were well reflected. Additionally, analyses of the dynamic response of one of the considered structural elements were made.
Keywords: adiabatic process, air shock wave, explosive, finite volume method
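For reference, the three conservation equations mentioned above are, for a one-dimensional inviscid flow, the compressible Euler equations; a standard textbook form is sketched below (the paper additionally accounts for heat-transfer losses relative to the adiabatic case, which are omitted here).

```latex
\frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} = 0, \qquad
\frac{\partial (\rho u)}{\partial t} + \frac{\partial (\rho u^{2} + p)}{\partial x} = 0, \qquad
\frac{\partial E}{\partial t} + \frac{\partial \big(u\,(E + p)\big)}{\partial x} = 0,
\qquad E = \frac{p}{\gamma - 1} + \tfrac{1}{2}\rho u^{2}.
```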
Procedia PDF Downloads 192
38450 Dynamic Thermomechanical Behavior of Adhesively Bonded Composite Joints
Authors: Sonia Sassi, Mostapha Tarfaoui, Hamza Benyahia
Abstract:
Composite materials are increasingly being used as a substitute for metallic materials in many technological applications such as aeronautics, aerospace, marine and civil engineering. For composite materials, the thermomechanical response evolves with the strain rate. The energy balance equation for anisotropic, elastic materials includes heat source terms that govern the conversion of some of the kinetic work into heat; the remainder contributes to the stored energy driving the damage process in the composite material. In this paper, we investigate the bulk thermomechanical behavior of adhesively-bonded composite assemblies to quantitatively assess the temperature rise which accompanies adiabatic deformations. In particular, adhesively bonded joints in glass/vinylester composite material are subjected to in-plane dynamic loads under a range of strain rates. The dynamic thermomechanical behavior of this material is investigated using compression Split Hopkinson Pressure Bars (SHPB) coupled with a high-speed infrared camera and a high-speed camera to measure in real time the dynamic behavior, the damage kinetics and the temperature variation in the material. The interest of using a high-speed IR camera is to view in real time the evolution of heat dissipation in the material when damage occurs. However, this technique does not produce thermal values in correlation with the stress-strain curves of the composite material because of its long response time in comparison with the dynamic test duration. For this reason, the authors revisit the application of small thermocouples placed on the surface of the material to ensure real thermal measurements under dynamic loading. Experiments with dynamically loaded material show that the thermocouples record temperature values with a short typical rise time as a result of the conversion of kinetic work into heat during the compression test. These results show that small thermocouples can be used to provide an important complement to other non-contact techniques such as the high-speed infrared camera. A significant temperature rise was observed in in-plane compression tests, especially under high strain rates. During the tests, it was noticed that a sudden temperature rise occurs when macroscopic damage occurs. This rise in temperature is linked to the rate of damage: the more severe the damage, the higher the localized temperature detected. This shows the strong relationship between the occurrence of damage and the induced heat dissipation. For the in-plane tests, the damage takes place more abruptly as the strain rate is increased. The difference observed in the thermomechanical response in in-plane compression is explained only by the difference in the damage process active during the compression tests. In this study, we highlighted the dependence of the thermomechanical response of bonded specimens on the strain rate. The effect of heat dissipation in this material therefore cannot be ignored and should be taken into account when defining damage models for impact loading.
Keywords: adhesively-bonded composite joints, damage, dynamic compression tests, energy balance, heat dissipation, SHPB, thermomechanical behavior
Procedia PDF Downloads 213
38449 Performance Evaluation of REST and GraphQL API Models in Microservices Software Development Domain
Authors: Mohamed S. M. Elghazal, Adel Aneiba, Essa Q. Shahra
Abstract:
This study presents a comprehensive comparative analysis of REST and GraphQL API models within the context of microservices development, offering empirical insights into the strengths and limitations of each approach. The research explores the effectiveness and efficiency of GraphQL versus REST, focusing on their impact on critical software quality metrics and user experience. Using a controlled experimental setup, the study evaluates key performance indicators, including response time, data transfer efficiency, and error rates. The findings reveal that REST APIs demonstrate superior memory efficiency and faster response times, particularly under high-load conditions, making them a reliable choice for performance-critical microservices. On the other hand, GraphQL excels in offering greater flexibility for data fetching but exhibits higher response times and increased error rates when handling complex queries. This research provides a nuanced understanding of the trade-offs between REST and GraphQL API interaction models, offering actionable guidance for developers and researchers in selecting the optimal API model for microservice-based applications. The insights are particularly valuable for balancing considerations such as performance, flexibility, and reliability in real-world implementations.
Keywords: REST API, GraphQL API, microservices, software development
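The contrast between the two interaction styles can be made concrete with a small client-side sketch; the endpoints, field names and schema below are hypothetical, since the abstract does not name the benchmark services.

```python
import requests

BASE = "https://api.example.com"   # hypothetical benchmark service

def fetch_order_rest(order_id: int) -> dict:
    """REST style: one resource per endpoint; the server fixes the response
    shape, so related data typically needs a second round trip."""
    order = requests.get(f"{BASE}/orders/{order_id}", timeout=5).json()
    customer = requests.get(f"{BASE}/customers/{order['customerId']}", timeout=5).json()
    return {"order": order, "customer": customer}

def fetch_order_graphql(order_id: int) -> dict:
    """GraphQL style: a single endpoint; the client names exactly the fields it
    needs - the flexibility the study weighs against higher response times."""
    query = """
    query($id: ID!) {
      order(id: $id) { total customer { name email } }
    }"""
    payload = {"query": query, "variables": {"id": order_id}}
    return requests.post(f"{BASE}/graphql", json=payload, timeout=5).json()
```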
Procedia PDF Downloads 3
38448 Effect of Chemistry Museum Artifacts on Students’ Memory Enhancement and Interest in Radioactivity in Calabar Education Zone, Cross River State, Nigeria
Authors: Hope Amba Neji
Abstract:
The study adopted a quasi-experimental design. Two schools were used for the experimental study, while one school was used for the control. The experimental groups were subjected to treatment for four weeks with chemistry museum artifacts, and a visit was made to the museum so that learners would have real-life learning experiences with museum resources, while the control group was taught with the conventional method. The instruments for the study were a 20-item Chemistry Memory Test (CMT) and a 10-item Chemistry Interest Questionnaire (CIQ). Reliability was ascertained using the KR-20 and alpha reliability coefficients, which yielded reliability coefficients of .83 and .81, respectively. The data obtained were analyzed using Analysis of Covariance (ANCOVA) and Analysis of Variance (ANOVA) at the 0.05 level of significance. Findings revealed that museum artifacts have a significant effect on students’ memory enhancement and interest in chemistry. It was recommended that chemistry learning be made engaging, motivating and realistic with museum artifacts, which significantly aid memory enhancement and interest in chemistry.
Keywords: museum artifacts, memory, chemistry, attitude
Procedia PDF Downloads 75
38447 Cepstrum Analysis of Human Walking Signal
Authors: Koichi Kurita
Abstract:
In this study, we propose a real-time data collection technique for the detection of human walking motion from the charge generated on the human body. This technique is based on the detection of a sub-picoampere electrostatic induction current, generated by the motion, flowing through the electrode of a wireless portable sensor attached to the subject. An FFT analysis of the wave-forms of the electrostatic induction currents generated by the walking motions showed that the currents generated under normal and restricted walking conditions were different. Moreover, we carried out a cepstrum analysis to detect any differences in the walking style. Results suggest that a slight difference in motion, either due to the individual’s gait or a splinted leg, is directly reflected in the electrostatic induction current generated by the walking motion. The proposed wireless portable sensor enables the detection of even subtle differences in walking motion.
Keywords: human walking motion, motion measurement, current measurement, electrostatic induction
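The cepstrum analysis mentioned above is, in its basic (real cepstrum) form, the inverse FFT of the log magnitude spectrum; the synthetic current below, with an assumed ~2 Hz gait component and sampling rate, stands in for the measured sub-picoampere signal.

```python
import numpy as np

def real_cepstrum(signal):
    """Real cepstrum: inverse FFT of the log magnitude spectrum. Peaks in the
    quefrency domain reflect the periodic structure of the walking motion."""
    log_mag = np.log(np.abs(np.fft.fft(signal)) + 1e-12)
    return np.real(np.fft.ifft(log_mag))

# Hypothetical electrostatic induction current sampled at 1 kHz: a ~2 Hz gait
# component plus a harmonic and noise.
fs = 1000
t = np.arange(0, 10, 1 / fs)
current = (np.sin(2 * np.pi * 2.0 * t) + 0.3 * np.sin(2 * np.pi * 4.0 * t)
           + 0.1 * np.random.default_rng(2).normal(size=t.size))

ceps = real_cepstrum(current)
quefrency = np.arange(t.size) / fs            # seconds
peak = np.argmax(ceps[100: t.size // 2]) + 100
print(quefrency[peak])                        # estimate of the dominant gait period
```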
Procedia PDF Downloads 344
38446 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm
Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan
Abstract:
This study presents a new parallel approach to clustering GPS data. The evaluation is made by comparing the execution times of various clustering algorithms on GPS data. This paper proposes a neighborhood-based parallel K-means algorithm to make clustering faster. The proposed parallelization approach assumes that each GPS data point represents a vehicle, and vehicles close to each other communicate after the vehicles are clustered. This parallelization approach has been examined on continuously changing GPS datasets of different sizes and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrate that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
Keywords: parallel k-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data
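A minimal data-parallel K-means sketch is shown below: the assignment step (the expensive part when GPS points keep arriving) is split across worker processes each iteration, then the centroids are updated serially. This is a generic illustration, not the authors' neighborhood-based scheme, and the GPS bounding box, point count and worker count are assumptions.

```python
import numpy as np
from multiprocessing import Pool

def assign_chunk(args):
    """Assign one chunk of GPS points (lat, lon) to the nearest centroid."""
    points, centroids = args
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(points, k=5, iters=10, workers=4):
    rng = np.random.default_rng(0)
    centroids = points[rng.choice(len(points), k, replace=False)]
    chunks = np.array_split(points, workers)
    with Pool(workers) as pool:
        for _ in range(iters):
            labels = np.concatenate(
                pool.map(assign_chunk, [(c, centroids) for c in chunks]))
            for j in range(k):
                if np.any(labels == j):
                    centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

if __name__ == "__main__":
    gps = np.random.default_rng(1).uniform([40.9, 28.6], [41.2, 29.3], (10000, 2))
    centroids, labels = parallel_kmeans(gps)
    print(centroids)
```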
Procedia PDF Downloads 222
38445 Intelligent Rescheduling Trains for Air Pollution Management
Authors: Kainat Affrin, P. Reshma, G. Narendra Kumar
Abstract:
Timetable optimization is the need of the day for the rescheduling and routing of trains in real time. Trains are scheduled in parallel with road transport vehicles travelling to the same destination. As the number of trains is restricted due to the single track, customers usually opt to use road transport frequently. Air pollution increases as the density of vehicles on the road increases. Use of an alternative mode of transport such as trains helps in reducing air pollution. This paper mainly aims at attracting passengers to train transport by proper rescheduling of trains using a hybrid of the stop-skip algorithm and an iterative convex programming algorithm. Bi-directional rescheduling of trains is achieved on a single track with dynamic dual time and varying stops. The introduction of more trains attracts customers to use rail transport frequently, thereby decreasing pollution. The results are simulated using Network Simulator (NS-2).
Keywords: air pollution, AODV, re-scheduling, WSNs
Procedia PDF Downloads 361
38444 An Efficient FPGA Realization of FIR Filter Using Distributed Arithmetic
Authors: M. Iruleswari, A. Jeyapaul Murugan
Abstract:
The most fundamental component used in many Digital Signal Processing (DSP) applications is the Finite Impulse Response (FIR) filter, because of its linear phase, stability and regular structure. Designing a high-speed and hardware-efficient FIR filter is a very challenging task, as the complexity increases with the filter order. In most applications higher-order filters are required, but the memory usage of the filter increases exponentially with the order of the filter. Using multipliers occupies a large chip area and requires long computation times. Multiplier-less memory-based techniques have gained popularity over the past two decades due to their high-throughput processing capability and reduced dynamic power consumption. This paper describes the design and implementation of a highly efficient Look-Up Table (LUT) based circuit for the implementation of an FIR filter using the distributed arithmetic algorithm; it is a multiplier-less FIR filter. The LUT can be subdivided into a number of smaller LUTs to reduce the memory usage for higher-order filters. Analysis of the performance of various filter orders with different address lengths is done using the Xilinx 14.5 synthesis tool. The proposed design provides lower latency, lower memory usage and high throughput.
Keywords: finite impulse response, distributed arithmetic, field programmable gate array, look-up table
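A small software model of the distributed-arithmetic idea is sketched below: the LUT stores every partial sum of coefficients, and each output is formed bit-plane by bit-plane with LUT reads and shift-adds instead of multiplications. The 4-tap coefficients and 8-bit unsigned sample format are assumptions for illustration (two's-complement handling, and the LUT partitioning used for higher orders, are omitted).

```python
import numpy as np

def build_da_lut(coeffs):
    """DA LUT: entry `addr` holds the sum of the coefficients selected by the
    set bits of `addr` (one address bit per filter tap)."""
    n = len(coeffs)
    return np.array([sum(c for i, c in enumerate(coeffs) if addr >> i & 1)
                     for addr in range(1 << n)])

def da_fir_output(samples, lut, n_bits=8):
    """One multiplier-less FIR output: process the tap delay line one bit-plane
    at a time, replacing each multiplication with a LUT read and a shift-add."""
    acc = 0
    for b in range(n_bits):
        addr = 0
        for i, x in enumerate(samples):
            addr |= ((int(x) >> b) & 1) << i
        acc += int(lut[addr]) << b
    return acc

coeffs = [3, 1, 4, 2]                       # 4-tap filter -> 16-entry LUT
lut = build_da_lut(coeffs)
samples = [10, 200, 35, 90]                 # current contents of the delay line
print(da_fir_output(samples, lut), np.dot(coeffs, samples))   # both values match
```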
Procedia PDF Downloads 457