Search results for: real anthropometric database
6129 Harmonic Mitigation and Total Harmonic Distortion Reduction in Grid-Connected PV Systems: A Case Study Using Real-Time Data and Filtering Techniques
Authors: Atena Tazikeh Lemeski, Ismail Ozdamar
Abstract:
This study presents a detailed analysis of harmonic distortion in a grid-connected photovoltaic (PV) system using real-time data captured from a solar power plant. Harmonics introduced by inverters in PV systems can degrade power quality and lead to increased Total Harmonic Distortion (THD), which poses challenges such as transformer overheating, increased power losses, and potential grid instability. This research addresses these issues by applying Fast Fourier Transform (FFT) to identify significant harmonic components and employing notch filters to target specific frequencies, particularly the 3rd harmonic (150 Hz), which was identified as the largest contributor to THD. Initial analysis of the unfiltered voltage signal revealed a THD of 21.15%, with prominent harmonic peaks at 150 Hz, 250 Hz and 350 Hz, corresponding to the 3rd, 5th, and 7th harmonics, respectively. After implementing the notch filters, the THD was reduced to 5.72%, demonstrating the effectiveness of this approach in mitigating harmonic distortion without affecting the fundamental frequency. This paper provides practical insights into the application of real-time filtering techniques in PV systems and their role in improving overall grid stability and power quality. The results indicate that targeted harmonic mitigation is crucial for the sustainable integration of renewable energy sources into modern electrical grids.
Keywords: grid-connected photovoltaic systems, fast Fourier transform, harmonic filtering, inverter-induced harmonics
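As a rough illustration of the approach described above (not the authors' implementation), the following Python sketch computes THD from FFT magnitudes and applies cascaded notch filters at the 3rd, 5th and 7th harmonics of a 50 Hz fundamental; the sampling rate, filter Q and the synthetic test signal are assumptions.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 10_000                      # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)    # 1 s of synthetic "plant" voltage
f0 = 50.0                        # fundamental frequency, Hz
# synthetic waveform with 3rd, 5th and 7th harmonics standing in for real data
v = (np.sin(2 * np.pi * f0 * t)
     + 0.18 * np.sin(2 * np.pi * 3 * f0 * t)
     + 0.08 * np.sin(2 * np.pi * 5 * f0 * t)
     + 0.05 * np.sin(2 * np.pi * 7 * f0 * t))

def thd(signal, fs, f0, n_harmonics=20):
    """THD from FFT magnitudes at integer multiples of the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    def mag(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = mag(f0)
    harmonics = np.array([mag(k * f0) for k in range(2, n_harmonics + 1)])
    return np.sqrt(np.sum(harmonics ** 2)) / fundamental

print(f"THD before filtering: {100 * thd(v, fs, f0):.2f} %")

# cascade of notch filters centred on the dominant harmonics (150, 250, 350 Hz)
filtered = v.copy()
for f_notch in (150.0, 250.0, 350.0):
    b, a = iirnotch(w0=f_notch, Q=30.0, fs=fs)
    filtered = filtfilt(b, a, filtered)

print(f"THD after filtering:  {100 * thd(filtered, fs, f0):.2f} %")
```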
Procedia PDF Downloads 41
6128 Investigation of Various Physical and Physiological Properties of Ethiopian Elite Men Distance Runners
Authors: Getaye Fisseha Gelaw
Abstract:
The purpose of this study was to investigate the key physical and physiological characteristics of 16 elite male Ethiopian national team distance runners, who have an average age of 28.1±4.3 years, a height of 175.0±5.6 cm, a weight of 59.1±3.9 kg, a BMI of 19.6±1.5, and a training age of 10.1±5.1 years. The average weekly distance is 196.3±13.8 km, the average 10,000 m time is 27:14±0.5 min:sec, the average half marathon time is 59:30±0.6 min:sec, and the average marathon time is 2 hr 03 min 39 sec±0.02. In addition, the average Cooper test (12-minute run test) result is 4525.4±139.7 meters, and the average VO2max is 90.8±3.1 mL/kg/min. All athletes have a high profile and compete at the international level; according to the World Athletics ranking system in 2021, 56.3% of the 16 participants held platinum label status, while the remaining 43.7% held gold label status. The athletes completed an incremental treadmill test for the assessment of VO2peak, submaximal running, and lactate threshold, during which they ran continuously at 21 km/h. The laboratory-determined VO2peak was 91.4±1.7 mL/kg/min, with an anaerobic threshold of 74.2±1.6 mL/kg/min, i.e., 81% of VO2max. The speed at the anaerobic threshold was 15.9±0.6 km/h at an incline of 4.0%. The respiratory compensation (RC) point was reached at 88.7±1.1 mL/kg/min, 97% of VO2max; at the RC point, the speed was 17.6±0.4 km/h and the incline 5.5%, while at maximum effort the speed was 19.5±1.5 km/h and the incline 6.0%. The data also suggest that top Ethiopian distance athletes have considerably higher VO2max values than those found in earlier research.
Keywords: long-distance running, Ethiopians, VO2 max, world athletics, anthropometric
Procedia PDF Downloads 129
6127 Ebola Virus Glycoprotein Inhibitors from Natural Compounds: Computer-Aided Drug Design
Authors: Driss Cherqaoui, Nouhaila Ait Lahcen, Ismail Hdoufane, Mehdi Oubahmane, Wissal Liman, Christelle Delaite, Mohammed M. Alanazi
Abstract:
The Ebola virus is a highly contagious and deadly pathogen that causes Ebola virus disease. The Ebola virus glycoprotein (EBOV-GP) is a key factor in viral entry into host cells, making it a critical target for therapeutic intervention. Using a combination of computational approaches, this study focuses on the identification of natural compounds that could serve as potent inhibitors of EBOV-GP. The 3D structure of EBOV-GP was selected, with missing residues modeled, and this structure was minimized and equilibrated. Two large natural compound databases, COCONUT and NPASS, were chosen and filtered based on toxicity risks and Lipinski’s Rule of Five to ensure drug-likeness. Following this, a pharmacophore model, built from 22 reported active inhibitors, was employed to refine the selection of compounds with a focus on structural relevance to known Ebola inhibitors. The filtered compounds were subjected to virtual screening via molecular docking, which identified ten promising candidates (five from each database) with strong binding affinities to EBOV-GP. These compounds were then validated through molecular dynamics simulations to evaluate their binding stability and interactions with the target. The top three compounds from each database were further analyzed using ADMET profiling, confirming their favorable pharmacokinetic properties, stability, and safety. These results suggest that the selected compounds have the potential to inhibit EBOV-GP, offering new avenues for antiviral drug development against the Ebola virus.
Keywords: EBOV-GP, Ebola virus glycoprotein, high-throughput drug screening, molecular docking, molecular dynamics, natural compounds, pharmacophore modeling, virtual screening
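The drug-likeness filtering step mentioned above can be sketched with RDKit as below; this is only an illustrative sketch, the SMILES entries are placeholders, and the COCONUT/NPASS loading, toxicity screening, pharmacophore and docking stages are not shown.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def passes_lipinski(smiles: str) -> bool:
    """Apply Lipinski's Rule of Five to a single molecule given as SMILES."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:                      # unparsable entry -> reject
        return False
    return (Descriptors.MolWt(mol) <= 500
            and Descriptors.MolLogP(mol) <= 5
            and Lipinski.NumHDonors(mol) <= 5
            and Lipinski.NumHAcceptors(mol) <= 10)

# placeholder entries standing in for records from the COCONUT / NPASS databases
candidates = {
    "aspirin":  "CC(=O)Oc1ccccc1C(=O)O",
    "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
}
drug_like = {name: smi for name, smi in candidates.items() if passes_lipinski(smi)}
print(sorted(drug_like))
```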
Procedia PDF Downloads 24
6126 New Gas Geothermometers for the Prediction of Subsurface Geothermal Temperatures: An Optimized Application of Artificial Neural Networks and Geochemometric Analysis
Authors: Edgar Santoyo, Daniel Perez-Zarate, Agustin Acevedo, Lorena Diaz-Gonzalez, Mirna Guevara
Abstract:
Four new gas geothermometers have been derived from a multivariate geochemometric analysis of a geothermal fluid chemistry database, two of which use the natural logarithm of CO₂ and H₂S concentrations (mmol/mol), respectively, and the other two use the natural logarithm of the H₂S/H₂ and CO₂/H₂ ratios. As a strict compilation criterion, the database was created with gas-phase compositions of fluids and bottomhole temperatures (BHTM) measured in producing wells. The calibration of the geothermometers was based on the geochemical relationship existing between the gas-phase composition of well discharges and the equilibrium temperatures measured at bottomhole conditions. Multivariate statistical analysis together with the use of artificial neural networks (ANN) was successfully applied for correlating the gas-phase compositions and the BHTM. The predicted or simulated bottomhole temperatures (BHTANN), defined as output neurons or simulation targets, were statistically compared with the measured temperatures (BHTM). The coefficients of the new geothermometers were obtained from an optimized self-adjusting training algorithm applied to approximately 2,080 ANN architectures with 15,000 simulation iterations each. The self-adjusting training algorithm used the well-known Levenberg-Marquardt model, which was used to calculate: (i) the number of neurons of the hidden layer; (ii) the training factor and the training patterns of the ANN; (iii) the linear correlation coefficient, R; (iv) the synaptic weighting coefficients; and (v) the statistical parameter, Root Mean Squared Error (RMSE), to evaluate the prediction performance between the BHTM and the simulated BHTANN. The prediction performance of the new gas geothermometers, together with the predictions inferred from sixteen well-known gas geothermometers (previously developed), was statistically evaluated by using an external database in order to avoid bias. The statistical evaluation was performed through the analysis of the lowest RMSE values computed among the predictions of all the gas geothermometers. The new gas geothermometers developed in this work have been successfully used for predicting subsurface temperatures in high-temperature geothermal systems of Mexico (e.g., Los Azufres, Mich., Los Humeros, Pue., and Cerro Prieto, B.C.) as well as in a blind geothermal system (known as Acoculco, Puebla). The latest results of the gas geothermometers (inferred from gas-phase compositions of soil-gas bubble emissions) compare well with the temperatures measured in two wells of the blind geothermal system of Acoculco, Puebla (Mexico). Details of this new development are outlined in the present research work. Acknowledgements: The authors acknowledge the funding received from the CeMIE-Geo P09 project (SENER-CONACyT).
Keywords: artificial intelligence, gas geochemistry, geochemometrics, geothermal energy
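A minimal sketch of the ANN calibration idea is shown below. It is not the authors' pipeline: scikit-learn does not provide Levenberg-Marquardt training, so an MLPRegressor with the L-BFGS solver is used as a stand-in, and the gas-chemistry data are synthetic placeholders for the compiled database.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# synthetic stand-in for the gas-chemistry database:
# columns = ln(CO2), ln(H2S), ln(H2S/H2), ln(CO2/H2); target = bottomhole temperature
X = rng.uniform(low=[4.0, 0.0, -2.0, 1.0], high=[7.0, 3.0, 2.0, 5.0], size=(500, 4))
BHT = 150 + 20 * X[:, 0] - 10 * X[:, 2] + rng.normal(0, 5, 500)   # fabricated relation

X_tr, X_te, y_tr, y_te = train_test_split(X, BHT, test_size=0.2, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                   max_iter=15_000, random_state=0)
ann.fit(X_tr, y_tr)

rmse = mean_squared_error(y_te, ann.predict(X_te)) ** 0.5
r = np.corrcoef(y_te, ann.predict(X_te))[0, 1]
print(f"RMSE = {rmse:.1f} degC, R = {r:.3f}")
```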
Procedia PDF Downloads 354
6125 Design and Implementation of Active Radio Frequency Identification on Wireless Sensor Network-Based System
Authors: Che Z. Zulkifli, Nursyahida M. Noor, Siti N. Semunab, Shafawati A. Malek
Abstract:
Wireless sensors, also known as wireless sensor nodes, have been making a significant impact on human daily life. Radio Frequency Identification (RFID) and Wireless Sensor Networks (WSN) are two complementary technologies; hence, an integrated implementation of these technologies expands the overall functionality in obtaining long-range and real-time information on the location and properties of objects and people. An approach for integrating ZigBee and RFID networks is proposed in this paper to create an energy-efficient network improved by the benefits of combining ZigBee and RFID architecture. Furthermore, the compatibility and requirements of the ZigBee devices and communication links in a typical RFID system are presented, together with a real-world experiment demonstrating the capabilities of the proposed RFID system.
Keywords: mesh network, RFID, wireless sensor network, zigbee
Procedia PDF Downloads 462
6124 Real-Time Kinetic Analysis of Labor-Intensive Repetitive Tasks Using Depth-Sensing Camera
Authors: Sudip Subedi, Nipesh Pradhananga
Abstract:
Musculoskeletal disorders (MSDs) are common in construction workers. MSDs include lower back injuries, knee injuries, spinal injuries, and joint injuries, among others. Since most construction tasks are still manual, construction workers often need to perform repetitive, labor-intensive tasks, and they need to stay in the same or an awkward posture for an extended time while performing such tasks. This induces significant stress on the joints and spine, increasing the risk of developing MSDs. Manual monitoring of such tasks is virtually impossible with the handful of safety managers on a construction site. This paper proposes a methodology for performing kinetic analysis of working postures during such tasks in real time. The skeletons of different workers will be tracked using a depth-sensing camera while performing the task to create training data for identifying the best posture. For this, the kinetic analysis will be performed using a human musculoskeletal model in an open-source software system (OpenSim) to visualize the stress induced on essential joints. The “safe posture” inducing the lowest stress on essential joints will be computed for different actions involved in the task. The identified “safe posture” will serve as a basis for real-time monitoring and identification of awkward and unsafe postural behaviors of construction workers. In addition, a temporal simulation will be carried out to find the associated long-term effect of repetitive exposure to such observed postures. This will help to create awareness in workers about potential future health hazards and encourage them to work safely. Furthermore, the collected individual data can then be used to provide need-based personalized training to the construction workers.
Keywords: construction workers’ safety, depth sensing camera, human body kinetics, musculoskeletal disorders, real time monitoring, repetitive labor-intensive tasks
Procedia PDF Downloads 132
6123 Q-Efficient Solutions of Vector Optimization via Algebraic Concepts
Authors: Elham Kiyani
Abstract:
In this paper, we first introduce the concept of Q-efficient solutions in a real linear space not necessarily endowed with a topology, where Q is some nonempty (not necessarily convex) set. We also use the scalarization technique, including the Gerstewitz function generated by a nonconvex set, to characterize these Q-efficient solutions. The algebraic concepts of interior and closure are useful for studying optimization problems without topology. Studying nonconvex vector optimization is valuable since the topological interior is equal to the algebraic interior for a convex cone. So, we use the algebraic concepts of interior and closure to define Q-weak efficient solutions and Q-Henig proper efficient solutions of set-valued optimization problems, where Q is not a convex cone. Optimization problems with set-valued maps have a wide range of applications, so they are expected to provide a useful analytical tool in optimization theory. These kinds of optimization problems are closely related to stochastic programming, control theory, and economic theory. The paper focuses on nonconvex problems; the results are obtained under generalized non-convexity assumptions on the data of the problem. In convex problems, the main mathematical tools are convex separation theorems, alternative theorems, and algebraic counterparts of some usual topological concepts, while in nonconvex problems, we need a nonconvex separation function. Thus, we consider the Gerstewitz function generated by a general set in a real linear space and re-examine its properties in this more general setting. A useful approach for solving a vector problem is to reduce it to a scalar problem. In general, scalarization means the replacement of a vector optimization problem by a suitable scalar problem, which tends to be an optimization problem with a real-valued objective function. The Gerstewitz function is well known and widely used in optimization as the basis of scalarization. The essential properties of the Gerstewitz function, which are well known in the topological framework, are studied by using algebraic counterparts rather than the topological concepts of interior and closure. Therefore, properties of the Gerstewitz function, when it takes values just in a real linear space, are studied, and we use it to characterize Q-efficient solutions of vector problems whose image space is not endowed with any particular topology. Therefore, we deal with a constrained vector optimization problem in a real linear space without assuming any topology, and Q-weak efficient and Q-proper efficient solutions in the sense of Henig are also defined. Moreover, by means of the Gerstewitz function, we provide some necessary and sufficient optimality conditions for set-valued vector optimization problems.
Keywords: algebraic interior, Gerstewitz function, vector closure, vector optimization
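For context, one standard form of the Gerstewitz (Tammer) scalarizing functional is recalled below in LaTeX; the purely algebraic treatment described above replaces the topological interior and closure with their algebraic counterparts, and the exact assumptions on the generating set are those of the authors.

```latex
\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% A standard form of the Gerstewitz (Tammer) scalarizing functional generated by a
% set $C$ and a direction $k$; the algebraic variant uses the algebraic interior and
% the vector closure in place of the topological notions.
\[
  \varphi_{C,k}(y) \;=\; \inf\bigl\{\, t \in \mathbb{R} \;:\; y \in t\,k - C \,\bigr\},
  \qquad y \in Y .
\]
% Under suitable assumptions on $C$ and $k$, the sublevel sets satisfy
\[
  \bigl\{\, y \in Y : \varphi_{C,k}(y) \le t \,\bigr\} \;=\; t\,k - C ,
\]
% which is what makes $\varphi_{C,k}$ usable as a nonconvex separation and
% scalarization tool for characterizing $Q$-efficient solutions.
\end{document}
```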
Procedia PDF Downloads 217
6122 Trends, Status, and Future Directions of Artificial Intelligence in Human Resources Disciplines: A Bibliometric Analysis
Authors: Gertrude I. Hewapathirana, Loi A. Nguyen, Mohammed M. Mostafa
Abstract:
Artificial intelligence (AI) technologies and tools are swiftly integrating into many functions of all organizations as a competitive drive to enhance innovation, productivity, efficiency, and faster and more precise decision making to keep up with rapid changes in the global business arena. Despite increasing research on AI technologies in production, manufacturing, and information management, AI in human resource disciplines is still lagging. Though there are a few research studies on HR informatics, recruitment, and HRM in general, how to integrate AI into other HR functional disciplines (e.g., compensation, training, mentoring and coaching, employee motivation) is rarely researched. Many inconsistencies in the research hinder the development of up-to-date knowledge on AI in HR disciplines. Therefore, exploring eight research questions and using bibliometric network analysis combined with a meta-analysis of the published research literature, the authors attempt to generate knowledge on the role of AI in improving the efficiency of HR functional disciplines. To advance this knowledge for the benefit of researchers, academics, policymakers, and practitioners, the study highlights the types of AI innovations and outcomes, trends, gaps, themes and topics, fast-moving disciplines, key players, and future directions. AI in HR informatics in high-tech firms is the dominant theme in many research publications. While there is increasing attention from researchers and practitioners, there are many gaps between the promise, the potential, and real AI applications in HR disciplines. This knowledge gap raises many unanswered questions regarding the legal, ethical, and morale aspects of AI in HR disciplines, as well as the potential contributions of AI in HR disciplines, which may guide future research directions. Though the study provides the most current knowledge, it is limited to peer-reviewed empirical, theoretical, and conceptual research publications stored in the WoS database. The implications for theory, practice, and future research are discussed.
Keywords: artificial intelligence, human resources, bibliometric analysis, research directions
Procedia PDF Downloads 98
6121 Twitter's Impact on Print Media with Respect to Real World Events
Authors: Basit Shahzad, Abdullatif M. Abdullatif
Abstract:
Recent advancements in Information and Communication Technologies (ICT) and easy access to the Internet have made social media the first choice for sharing information related to any important events or news. On Twitter, a trend is a common feature that quantifies the level of popularity of a certain news item or event. In this work, we examine the impact of Twitter trends on real-world events by hypothesizing that Twitter trends have an influence on print media in Pakistan. For this, Twitter is used as a platform and Twitter trends as a baseline. We first collect data from two sources (Twitter trends and print media) for the period May to August 2016. The data obtained from the two sources are analyzed, and it is observed that social media significantly influences print media: the majority of the news items printed in newspapers are posted on Twitter earlier.
Keywords: twitter trends, text mining, effectiveness of trends, print media
Procedia PDF Downloads 260
6120 Decision Support System for the Management of the Shandong Peninsula, China
Authors: Natacha Fery, Guilherme L. Dalledonne, Xiangyang Zheng, Cheng Tang, Roberto Mayerle
Abstract:
A Decision Support System (DSS) for supporting decision makers in the management of the Shandong Peninsula has been developed. Emphasis has been given to coastal protection, coastal cage aquaculture and harbors. The investigations were done in the framework of a joint research project funded by the German Ministry of Education and Research (BMBF) and the Chinese Academy of Sciences (CAS). In this paper, a description of the DSS, the development of its components, and results of its application are presented. The system integrates in-situ measurements, process-based models, and a database management system. Numerical models for the simulation of flow, waves, sediment transport and morphodynamics covering the entire Bohai Sea are set up based on the Delft3D modelling suite (Deltares). Calibration and validation of the models were realized based on the measurements of moored Acoustic Doppler Current Profilers (ADCP) and High Frequency (HF) radars. In order to enable cost-effective and scalable applications, a database management system was developed. It enhances information processing, data evaluation, and supports the generation of data products. Results of the application of the DSS to the management of coastal protection, coastal cage aquaculture and harbors are presented here. Model simulations covering the most severe storms observed during the last decades were carried out leading to an improved understanding of hydrodynamics and morphodynamics. Results helped in the identification of coastal stretches subjected to higher levels of energy and improved support for coastal protection measures.
Keywords: coastal protection, decision support system, in-situ measurements, numerical modelling
Procedia PDF Downloads 195
6119 A Gradient Orientation Based Efficient Linear Interpolation Method
Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar
Abstract:
This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-dimension video/image to a high-dimension video/image. The objective of a good interpolation method is to upscale an image in such a way that it provides better edge preservation at the cost of very low complexity, so that real-time processing of video frames can be made possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging and other artifacts due to errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, and hence they can be considered very far from real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions and edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculation of slopes using the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to interpolated edge regions in order to enhance the interpolated edges.
Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing
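A simplified sketch of the overall flow (Prewitt gradients, orientation-guided averaging on edge pixels on top of a plain linear upscale, and a bilateral post-filter) is given below; the edge threshold, the 2x factor and the exact slope-tracing rules are assumptions and do not reproduce the paper's method.

```python
import numpy as np
import cv2
from scipy import ndimage

def upscale2x(img: np.ndarray, edge_thresh: float = 40.0) -> np.ndarray:
    """2x upscale: plain linear fill everywhere, orientation-guided averaging on edges."""
    img = img.astype(np.float32)
    h, w = img.shape

    # baseline fill (stands in for simple line averaging in uniform regions)
    out = cv2.resize(img, (2 * w, 2 * h), interpolation=cv2.INTER_LINEAR)

    # Prewitt gradients on the low-resolution image
    gx = ndimage.prewitt(img, axis=1)
    gy = ndimage.prewitt(img, axis=0)
    orientation = np.arctan2(gy, gx)          # gradient orientation
    edges = np.hypot(gx, gy) > edge_thresh    # edge / uniform separation

    # on edge pixels, re-estimate the inserted pixel by averaging along the edge direction
    for y, x in zip(*np.nonzero(edges)):
        dx, dy = -np.sin(orientation[y, x]), np.cos(orientation[y, x])  # perpendicular to gradient
        y0, x0 = 2 * y + 1, 2 * x + 1
        ya, xa = int(round(y0 + 2 * dy)), int(round(x0 + 2 * dx))
        yb, xb = int(round(y0 - 2 * dy)), int(round(x0 - 2 * dx))
        if 0 <= ya < 2 * h and 0 <= xa < 2 * w and 0 <= yb < 2 * h and 0 <= xb < 2 * w:
            out[y0, x0] = 0.5 * (out[ya, xa] + out[yb, xb])

    # bilateral filter as the post-processing step on the interpolated result
    return cv2.bilateralFilter(out, d=5, sigmaColor=25, sigmaSpace=5)

test = (np.random.rand(64, 64) * 255).astype(np.uint8)   # stand-in for a real frame
print(upscale2x(test).shape)                              # (128, 128)
```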
Procedia PDF Downloads 261
6118 Making the Right Call for Falls: Evaluating the Efficacy of a Multi-Faceted Trust Wide Approach to Improving Patient Safety Post Falls
Authors: Jawaad Saleem, Hannah Wright, Peter Sommerville, Adrian Hopper
Abstract:
Introduction: Inpatient falls are the most commonly reported patient safety incidents and carry a significant burden in terms of resources, morbidity, and mortality. Ensuring adequate post-falls management of patients by staff is therefore paramount to maintaining patient safety, especially in out-of-hours and resource-stretched settings. Aims: This quality improvement project aims to improve the current practice of falls management at Guy's and St Thomas' Hospital, London, as compared to our 2016 quality improvement project findings. Furthermore, it looks to increase current junior doctors' confidence in managing falls and their use of the new guidance protocols. Methods: The multifaceted interventions implemented included: the development of new trust-wide guidelines detailing management pathways for patients post falls, available on the intranet; the production of 2,000 lanyard cards summarising these guidelines, distributed amongst junior doctors and staff; and a ‘safety signal’ email sent from the Trust chief medical officer to all staff raising awareness of falls and the guidelines. Formal falls teaching was also implemented for new doctors at induction. Using an established incident database, 189 consecutive falls in 2017 were retrospectively analysed electronically and compared against the variables measured in 2016, post intervention. A separate serious incident database was used to analyse 50 falls from May 2015 to March 2018 to ascertain the statistical significance of the impact of our interventions on serious incidents. A questionnaire similar to that used in 2016 was administered to the 2017 cohort of foundation year one (FY1) doctors and the results compared. Results: Questionnaire data demonstrated improved awareness and use of the guidelines, increased confidence, and an increase in training. 97% of FY1 trainees felt that the interventions had increased their awareness of the impact of falls on patients in the trust. Data from the incident database demonstrated that the time to review patients post fall had decreased from an average of 130 to 86 minutes. Improvement was also demonstrated in the reduced time to order and schedule X-ray and CT imaging, 3 and 5 hours respectively. Data from the serious incident database show that ‘the time from fall until harm was detected’ was statistically significantly lower (P = 0.044) post intervention. We also showed that the incidence of significant delays in detecting harm (> 10 hours) was reduced post intervention. Conclusions: Our interventions have helped to significantly reduce the average time to assess, order and schedule appropriate imaging post falls. Delays of over ten hours to detect serious injuries after falls were commonplace; since the intervention, their frequency has markedly reduced. We suggest this will lead to identifying patient harm sooner, fewer clinical incidents relating to falls and thus improved overall patient safety. Our interventions have also helped increase clinical staff confidence, management, and awareness of falls in the trust. Next steps include expanding teaching sessions and improving multidisciplinary team involvement to aid this improvement.
Keywords: patient safety, quality improvement, serious incidents, falls, clinical care
Procedia PDF Downloads 124
6117 Communication Infrastructure Required for a Driver Behaviour Monitoring System, ‘SiaMOTO’ IT Platform
Authors: Dogaru-Ulieru Valentin, Sălișteanu Ioan Corneliu, Ardeleanu Mihăiță Nicolae, Broscăreanu Ștefan, Sălișteanu Bogdan, Mihai Mihail
Abstract:
The SiaMOTO system is a communications and data processing platform for vehicle traffic. The human factor is the most important factor in the generation of these data, as the driver is the one who dictates the trajectory of the vehicle. As with any trajectory, the specific parameters are position, speed and acceleration. Constant knowledge of these parameters allows complex analyses. Roadways allow many vehicles to travel through their confined space, and the overlapping trajectories of several vehicles increase the likelihood of collision events, known as road accidents. Any such event has causes that lead to its occurrence, so the conditions for its occurrence are known. The human factor is predominant in determining the trajectory parameters of the vehicle on the road, so monitoring it, by recording the events reported by the DiaMOTO device over time, will generate a guide for targeting potentially high-risk driving behaviour and rewarding those who control the driving task well. In this paper, we have focused on detailing the communication infrastructure between the DiaMOTO device and the traffic data collection server, the infrastructure through which the database that will be used for complex AI/DLM analysis is built. The central element of this description is the data string in CODEC-8 format sent by the DiaMOTO device to the SiaMOTO collection server database. The data presented are specific to a functional infrastructure implemented at the experimental model stage, by installing DiaMOTO devices with unique codes on 50 vehicles, integrating ADAS and GPS functions, through which vehicle trajectories can be monitored 24 hours a day.
Keywords: DiaMOTO, Codec-8, ADAS, GPS, driver monitoring
Procedia PDF Downloads 79
6116 Incorporating Moving Authority Limits Into Driving Advice
Authors: Peng Zhou, Peter Pudney
Abstract:
Driver advice systems are used by many rail operators to help train drivers improve timekeeping while minimising energy use. These systems typically operate independently of the safeworking system, because information on how far the train is allowed to travel (the ‘limit of authority’) is usually not available as real-time data that can be used when generating driving advice. This is not an issue when there is sufficient separation between trains, but on systems with low headways, driving advice could conflict with safeworking requirements. We describe a method for generating driving advice that takes into account a moving limit of authority communicated to the train in real time. We illustrate the method with four simulated examples using data from the Zhengzhou Metro. The method will allow driver advice systems to be used more effectively on railways with low headways.
Keywords: railway transportation, energy efficient train operation, optimal train control, safe separation
Procedia PDF Downloads 15
6115 Real-Time Big-Data Warehouse: A Next-Generation Enterprise Data Warehouse and Analysis Framework
Authors: Abbas Raza Ali
Abstract:
Big Data technology is gradually becoming a dire need of large enterprises. These enterprises are generating massively large amounts of off-line and streaming data in both structured and unstructured formats on a daily basis. It is a challenging task to effectively extract useful insights from such large-scale datasets, and sometimes it even becomes a technology constraint to manage a transactional data history of more than a few months. This paper presents a framework to efficiently manage massively large and complex datasets. The framework has been tested on a communication service provider producing massively large complex streaming data in binary format. The communication industry is bound by regulators to manage the history of their subscribers’ call records, where every action of a subscriber generates a record. Also, managing and analyzing transactional data allows service providers to better understand their customers’ behavior; for example, deep packet inspection requires transactional internet usage data to explain the internet usage behaviour of the subscribers. However, current relational database systems limit service providers to maintaining history only at a semantic level, aggregated at the subscriber level. The framework addresses these challenges by leveraging Big Data technology, which optimally manages and allows deep analysis of complex datasets. The framework has been applied to offload the existing Intelligent Network Mediation and relational Data Warehouse of the service provider onto Big Data. The service provider has a 50+ million subscriber base with yearly growth of 7-10%. The end-to-end process, which involves binary-to-ASCII decoding of call detail records, stitching of all the interrogations against a call (transformations), and aggregation of all the call records of a subscriber, takes no more than 10 minutes.
Keywords: big data, communication service providers, enterprise data warehouse, stream computing, Telco IN Mediation
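The stitching and aggregation steps described above can be illustrated, on toy rows and outside any Big Data stack, with the pandas sketch below; the column names are assumptions and the binary CODEC decoding stage is not shown.

```python
import pandas as pd

# toy call-detail records: several "interrogations" (partial records) per call
cdrs = pd.DataFrame([
    {"subscriber_id": "A", "call_id": 1, "leg": 1, "duration_s": 40, "bytes": 0},
    {"subscriber_id": "A", "call_id": 1, "leg": 2, "duration_s": 35, "bytes": 0},
    {"subscriber_id": "A", "call_id": 2, "leg": 1, "duration_s": 10, "bytes": 0},
    {"subscriber_id": "B", "call_id": 3, "leg": 1, "duration_s": 0,  "bytes": 2_000_000},
])

# "stitching": collapse all interrogations of a call into one record
calls = (cdrs.groupby(["subscriber_id", "call_id"], as_index=False)
             .agg(duration_s=("duration_s", "sum"),
                  bytes=("bytes", "sum"),
                  legs=("leg", "count")))

# subscriber-level aggregation of the kind kept in the warehouse
per_subscriber = (calls.groupby("subscriber_id")
                       .agg(total_calls=("call_id", "nunique"),
                            total_duration_s=("duration_s", "sum"),
                            total_bytes=("bytes", "sum")))
print(per_subscriber)
```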
Procedia PDF Downloads 176
6114 Indicators for Success of Obesity Reduction Programs in Adolescents; Body Composition and Body Mass Index: Evaluating a School-Based Health Promotion Project in Iran after 12 Weeks of Intervention
Authors: Saeid Doaei
Abstract:
Background: Obesity in adolescence is a primary risk factor for obesity in adulthood. The objective of this study was to assess the effect of a comprehensive lifestyle intervention on different anthropometric indices in 12- to 16-year-old adolescent boys. Methods: 96 adolescent boys from two schools of District 5 of Tehran participated in this study. The schools were randomly assigned as the intervention school (n=53) and the control school (n=43). The height and weight of the students were measured with a calibrated tape line and a digital scale, respectively, and their BMI was calculated. Body fat percentage (BF) and body muscle percentage (BM) were determined by a Bio Impedance Analyzer (BIA), considering the age, gender and height of the students, at baseline and after the intervention. The intervention was implemented in the intervention school according to the Ottawa Charter principles. Results: 12 weeks of intervention decreased body fat percentage in the intervention group in comparison with the control group (a decrease of 1.81% in the intervention group versus an increase of 0.39% in the control group, P < .01). However, weight, BMI and BM did not change significantly. Conclusion: The results of this study showed that the implementation of a comprehensive intervention in obese adolescents may improve body composition, although these changes may not be reflected in BMI. It is possible that BMI is not a good indicator for assessing the success of obesity management interventions.
Keywords: obesity, childhood, BMI, nutrition
Procedia PDF Downloads 271
6113 Developing a Driving Simulator with a Navigation System to Measure Driver Distraction, Workload, Driving Safety and Performance
Authors: Tamer E. Yared
Abstract:
The use of driving simulators has made laboratory testing easier. It has been proven valid for testing driving ability by many researchers. One benefit of using driving simulators is keeping human subjects away from the traffic hazards that drivers usually face in a real driving environment while performing a driving experiment. In this study, a driving simulator was developed with a navigation system using game development software (Unity 3D) and C# code to measure and evaluate driving performance, safety, and workload for different driving tasks. The driving simulator hardware included a gaming steering wheel and pedals as well as a monitor to view the driving tasks. Moreover, driver distraction was evaluated by utilizing an eye-tracking system working in conjunction with the driving simulator. Twenty subjects were recruited to evaluate driver distraction, workload, driving safety, and performance, as well as to provide their feedback about the driving simulator. The subjects’ feedback was obtained through a survey completed after conducting several driving tasks. The main question of the survey asked the subjects to compare driving on the driving simulator with real driving. Furthermore, other aspects of the driving simulator were evaluated by the subjects in the survey. The survey revealed that the recruited subjects gave an average score of 7.5 out of 10 to the driving simulator when compared to real driving, with scores ranging between 6 and 8.5. This study is a preliminary effort that opens the door for further improvements to the driving simulator in terms of hardware and software development, which will contribute significantly to driving ability testing.
Keywords: driver distraction, driving performance, driving safety, driving simulator, driving workload, navigation system
Procedia PDF Downloads 179
6112 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection
Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine
Abstract:
Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation. However, the published results do not show satisfactory classification accuracy. This work aimed at resolving this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (AF Termination Challenge Database, MIT-BIH AF, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Databases) were used for assessment. All time series were segmented into 1-min RR interval windows, and then four specific features were calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find important features for discriminating between AF and normal sinus rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than do existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine
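A simplified stand-in for the described pipeline is sketched below: four common RR-interval features per 1-min window, PCA for feature reduction, and a nearest-centroid classifier in place of LVQ (which scikit-learn does not provide); the features and the synthetic windows are assumptions, not the paper's exact choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import NearestCentroid
from sklearn.model_selection import cross_val_score

def rr_features(rr_window: np.ndarray) -> np.ndarray:
    """Four simple features from a 1-min window of RR intervals (seconds)."""
    diffs = np.diff(rr_window)
    return np.array([
        rr_window.mean(),                  # mean RR
        rr_window.std(),                   # SDNN
        np.sqrt(np.mean(diffs ** 2)),      # RMSSD
        np.mean(np.abs(diffs) > 0.05),     # pNN50-like irregularity measure
    ])

rng = np.random.default_rng(1)
# synthetic stand-in windows: "NSR" = regular RR series, "AF" = irregular RR series
nsr = [rng.normal(0.85, 0.03, 70) for _ in range(100)]
af = [rng.normal(0.70, 0.15, 80) for _ in range(100)]
X = np.array([rr_features(w) for w in nsr + af])
y = np.array([0] * 100 + [1] * 100)

model = make_pipeline(StandardScaler(), PCA(n_components=2), NearestCentroid())
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```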
Procedia PDF Downloads 269
6111 Analysis of the Effect of Increased Self-Awareness on the Amount of Food Thrown Away
Authors: Agnieszka Dubiel, Artur Grabowski, Tomasz Przerywacz, Mateusz Roganowicz, Patrycja Zioty
Abstract:
Food waste is one of the most significant challenges humanity is facing nowadays. Every year, reports from global organizations show the scale of the phenomenon, although society's awareness is still insufficient. One-third of the food produced in the world is wasted at various points in the food supply chain. Waste occurs from delivery, through food preparation and distribution, to the end of sale and consumption. The first step in understanding and resisting the phenomenon is a thorough analysis of everyday behaviors, understood here as finding the correlation between the type of food and the reason for throwing it out and wasting it. These actions were identified as a critical step in starting the work to develop technology to prevent food waste. In this paper, the problem mentioned above was analyzed by focusing on inhabitants of Central Europe, especially Poland, aged 20-30. This paper provides an insight into collecting data through dedicated software and an organized database. The proposed database contains information on the amount, type, and reasons for wasting food in households. A literature review supported the work to answer the research questions, compare the situation in Poland with the problem analyzed in other countries, and find research gaps. The proposed article examines the causes of food waste and its quantity in detail. This review complements previous reviews by emphasizing social and economic innovation in Poland's food waste management. The paper recommends a course of action for future research on food waste management and prevention related to the handling and disposal of food, emphasizing households, i.e., the last link in the supply chain.
Keywords: food waste, food waste reduction, consumer food waste, human-food interaction
Procedia PDF Downloads 119
6110 Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real-world images in the classical image processing and machine vision framework have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has adapted to the statistics of real-world images through the evolutionary process. There are two well-known successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. Also, we investigate the response of feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the selectivity of the feature detectors to phase, in both models, is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
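A compact sketch of the ICA stage is given below: patches are extracted from an image (random noise standing in for natural images), FastICA is fitted, and the correlations between component energies give one view of the dependencies that remain; the ISA model and the grating analysis are not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
image = rng.random((256, 256))          # stand-in for a natural image

# extract 16x16 patches and flatten them into vectors
patch, n_patches = 16, 5000
patches = np.empty((n_patches, patch * patch))
for i in range(n_patches):
    r, c = rng.integers(0, 256 - patch, size=2)
    patches[i] = image[r:r + patch, c:c + patch].ravel()

patches -= patches.mean(axis=0)         # remove the mean / DC component

ica = FastICA(n_components=64, whiten="unit-variance", max_iter=500, random_state=0)
sources = ica.fit_transform(patches)    # independent components per patch
filters = ica.components_               # learned ICA filters (rows)

# dependencies remaining after ICA can then be probed, e.g. via correlations
# between the absolute values (energies) of the recovered components
energy_corr = np.corrcoef(np.abs(sources.T))
print(filters.shape, energy_corr.shape)
```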
Procedia PDF Downloads 340
6109 Hybrid Renewable Energy System Development Towards Autonomous Operation: The Deployment Potential in Greece
Authors: Afroditi Zamanidou, Dionysios Giannakopoulos, Konstantinos Manolitsis
Abstract:
A notable amount of the electrical energy demand in many countries worldwide is used to cover the public demand for lighting roads, squares and other public spaces. Renewable energy can contribute significantly to covering the electrical energy demand for public lighting. This paper focuses on the sizing and design of a hybrid energy system (HES) exploiting the solar-wind energy potential to meet the electrical energy needs of lighting roads, squares and other public spaces. Moreover, the proposed HES provides coverage of the electrical energy demand for a Wi-Fi hotspot and a charging hotspot for end-users. Alongside the sizing of the energy production system of the proposed HES, a storage system is added and sized in order to ensure a reliable supply without interruptions. Multiple scenarios of energy consumption are assumed and applied in order to optimize the sizing of the energy production system and the energy storage system. A database with meteorological prediction data for 51 areas in Greece was developed in order to assess the possible deployment of the proposed HES. Since detailed meteorological prediction data are available for all 51 areas under investigation, the use of these data is evaluated by comparing them to real meteorological data. The meteorological prediction data are exploited to form three hourly production profiles for each area for every month of the year: minimum, average and maximum energy production. The energy production profiles are combined with the energy consumption scenarios, and the sizing results of the energy production system and the energy storage system are extracted and presented for every area. Finally, the economic performance of the proposed HES, in terms of levelized cost of energy, is estimated by calculating and assessing construction, operation and maintenance costs.
Keywords: energy production system sizing, Greece’s deployment potential, meteorological prediction data, wind-solar hybrid energy system, levelized cost of energy
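One simple way to size the storage for the worst (minimum-production) scenario is sketched below: run an hourly energy balance over a representative day and take the deepest cumulative deficit as the required usable capacity; all profiles, the depth-of-discharge limit and the daily-cycle assumption are placeholders, not the study's data.

```python
import numpy as np

hours = np.arange(24)
# placeholder hourly profiles (kWh) for one representative day
load = np.where((hours >= 19) | (hours < 6), 1.2, 0.1)              # lighting + Wi-Fi/charging
pv_min = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None) * 0.8   # "minimum" solar scenario
wind_min = np.full(24, 0.15)                                         # "minimum" wind scenario
production = pv_min + wind_min

# running energy balance; the deepest cumulative deficit sets the usable capacity
balance = np.cumsum(production - load)
required_usable_kwh = max(0.0, -balance.min())

depth_of_discharge = 0.8                                             # assumed battery DoD limit
required_nominal_kwh = required_usable_kwh / depth_of_discharge
print(f"usable: {required_usable_kwh:.1f} kWh, nominal: {required_nominal_kwh:.1f} kWh")
```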
Procedia PDF Downloads 156
6108 Online Multilingual Dictionary Using Hamburg Notation for Avatar-Based Indian Sign Language Generation System
Authors: Sugandhi, Parteek Kumar, Sanmeet Kaur
Abstract:
Sign Language (SL) is used by deaf people and others who cannot speak but can hear, or who have a problem with spoken languages due to some disability. It is a visual gesture language that makes use of either one hand or both hands, arms, face, and body to convey meanings and thoughts. An SL automation system is an effective way of providing an interface to communicate with hearing people using a computer. In this paper, an avatar-based dictionary is proposed for a text-to-Indian Sign Language (ISL) generation system. This research work also presents a literature review of the SL corpora available for various SLs over the years. For an ISL generation system, a written form of SL is required, and there are certain techniques available for writing SL. The system uses the Hamburg Sign Language Notation System (HamNoSys) and Signing Gesture Mark-up Language (SiGML) for ISL generation. It is developed in PHP using Web Graphics Library (WebGL) technology for 3D avatar animation. A multilingual ISL dictionary is developed using HamNoSys for both the English and Hindi languages. This dictionary is used as a database to associate signs with words or phrases of a spoken language. It provides an admin panel interface to manage the dictionary, i.e., modification, addition, or deletion of a word. Through this interface, HamNoSys notations can be developed and stored in a database, and these notations can be converted into their corresponding SiGML files manually. The system takes natural language input sentences in English and Hindi and generates 3D sign animations using an avatar. SL generation systems have potential applications in many domains, such as the healthcare sector, media, educational institutes, commercial sectors, and transportation services. This research work will help researchers understand the various techniques used for writing SL and for the generation of sign language systems.
Keywords: avatar, dictionary, HamNoSys, hearing impaired, Indian sign language (ISL), sign language
Procedia PDF Downloads 232
6107 Light-Weight Network for Real-Time Pose Estimation
Authors: Jianghao Hu, Hongyu Wang
Abstract:
An effective and efficient human pose estimation algorithm is important for real-time human pose estimation on mobile devices. This paper proposes a light-weight human key point detection algorithm, Light-Weight Network for Real-Time Pose Estimation (LWPE). LWPE uses a light-weight backbone network and depthwise separable convolutions to reduce parameters and lower latency. LWPE uses a feature pyramid network (FPN) to fuse the high-resolution, semantically weak features with the low-resolution, semantically strong features. In the meantime, with multi-scale prediction, the result predicted from the low-resolution feature map is stacked onto the adjacent higher-resolution feature map to intermediately supervise the network and continuously refine the results. In the last step, the key point coordinates predicted at the highest resolution are used as the final output of the network. For the key points that are difficult to predict, LWPE adopts an online hard key point mining strategy to focus on them. The proposed algorithm achieves excellent performance on the single-person dataset selected from the AI (artificial intelligence) challenge dataset. The algorithm maintains high-precision performance even though the model contains only 3.9M parameters, and it can run at 225 frames per second (FPS) on a generic graphics processing unit (GPU).
Keywords: depthwise separable convolutions, feature pyramid network, human pose estimation, light-weight backbone
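The two building blocks named above, a depthwise separable convolution and an FPN-style top-down fusion, can be sketched in PyTorch as below; the channel sizes, the smoothing layer and the 17-keypoint head are placeholders rather than LWPE's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepthwiseSeparableConv(nn.Module):
    """3x3 depthwise convolution followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return F.relu(self.bn(self.pointwise(self.depthwise(x))))

class TopDownFusion(nn.Module):
    """FPN-style fusion: upsample the coarse map and add it to the finer one."""
    def __init__(self, coarse_ch: int, fine_ch: int, out_ch: int):
        super().__init__()
        self.lateral = nn.Conv2d(fine_ch, out_ch, 1)
        self.reduce = nn.Conv2d(coarse_ch, out_ch, 1)
        self.smooth = DepthwiseSeparableConv(out_ch, out_ch)

    def forward(self, coarse, fine):
        up = F.interpolate(self.reduce(coarse), size=fine.shape[-2:], mode="nearest")
        return self.smooth(self.lateral(fine) + up)

# toy example: fuse an 8x8 semantically strong map into a 16x16 weaker one,
# then predict per-keypoint heatmaps with a 1x1 convolution head
coarse = torch.randn(1, 128, 8, 8)
fine = torch.randn(1, 64, 16, 16)
fused = TopDownFusion(128, 64, 64)(coarse, fine)
heatmaps = nn.Conv2d(64, 17, 1)(fused)        # e.g. 17 keypoints
print(heatmaps.shape)                          # torch.Size([1, 17, 16, 16])
```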
Procedia PDF Downloads 154
6106 Early Diagnosis of Myocardial Ischemia Based on Support Vector Machine and Gaussian Mixture Model by Using Features of ECG Recordings
Authors: Merve Begum Terzi, Orhan Arikan, Adnan Abaci, Mustafa Candemir
Abstract:
Acute myocardial infarction is a major cause of death in the world. Therefore, its fast and reliable diagnosis is a major clinical need. ECG is the most important diagnostic methodology used to make decisions about the management of cardiovascular diseases. In patients with acute myocardial ischemia, temporary chest pain, together with changes in the ST segment and T wave of the ECG, occurs shortly before the start of myocardial infarction. In this study, a technique which detects changes in the ST/T sections of the ECG is developed for the early diagnosis of acute myocardial ischemia. For this purpose, a database of real ECG recordings is constituted, containing records from 75 patients presenting symptoms of chest pain who underwent elective percutaneous coronary intervention (PCI). 12-lead ECGs of the patients were recorded before and during the PCI procedure. Two ECG epochs are analyzed for each patient: the pre-inflation ECG, acquired before any catheter insertion, and the occlusion ECG, acquired during balloon inflation. By using the pre-inflation and occlusion recordings, ECG features that are critical in the detection of acute myocardial ischemia are identified, and the most discriminative features for the detection of acute myocardial ischemia are extracted. A classification technique is developed based on a support vector machine (SVM) approach operating with linear and radial basis function (RBF) kernels to detect ischemic events by using ST-T derived joint features from the non-ischemic and ischemic states of the patients. The dataset is randomly divided into training and testing sets, and the training set is used to optimize the SVM hyperparameters by using a grid-search method and 10-fold cross-validation. SVMs are designed specifically for each patient by tuning the kernel parameters in order to obtain optimal classification performance. As a result of applying the developed classification technique to real ECG recordings, it is shown that the proposed technique provides highly reliable detection of the anomalies in ECG signals. Furthermore, to develop a detection technique that can be used in the absence of an ECG recording obtained during the healthy state, the detection of acute myocardial ischemia based on ECG recordings of the patients obtained during ischemia is also investigated. For this purpose, a Gaussian mixture model (GMM) is used to represent the joint pdf of the most discriminating ECG features of myocardial ischemia. Then, a Neyman-Pearson type of approach is developed to provide detection of outliers that would correspond to acute myocardial ischemia. The Neyman-Pearson decision strategy is applied by computing the average log-likelihood values of ECG segments and comparing them with a range of different threshold values. For different discrimination threshold values and numbers of ECG segments, probability of detection and probability of false alarm values are computed, and the corresponding ROC curves are obtained. The results indicate that an increasing number of ECG segments provides higher performance for GMM-based classification. Moreover, the comparison between the performances of the SVM- and GMM-based classification showed that the SVM provides higher classification performance over the ECG recordings of a considerable number of patients.
Keywords: ECG classification, Gaussian mixture model, Neyman–Pearson approach, support vector machine
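The two decision stages described above can be sketched as follows (feature extraction from the ST/T segments is not shown): an RBF-kernel SVM tuned by grid search with 10-fold cross-validation, and a GMM fitted on non-ischemic feature vectors whose log-likelihood is thresholded in a Neyman-Pearson fashion; the synthetic feature vectors and the 5% false-alarm target are assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic stand-ins for ST/T-derived feature vectors (4 features per ECG segment)
X_normal = rng.normal(0.0, 1.0, size=(300, 4))
X_ischemic = rng.normal(1.5, 1.2, size=(300, 4))
X = np.vstack([X_normal, X_ischemic])
y = np.array([0] * 300 + [1] * 300)

# stage 1: RBF-kernel SVM with hyperparameters tuned by grid search + 10-fold CV
grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
    cv=10,
)
grid.fit(X, y)
print("best SVM params:", grid.best_params_, "CV accuracy:", grid.best_score_)

# stage 2: GMM fitted on non-ischemic data only; low log-likelihood flags outliers,
# with a Neyman-Pearson-style threshold chosen for a target false-alarm rate
gmm = GaussianMixture(n_components=3, random_state=0).fit(X_normal)
target_false_alarm = 0.05
threshold = np.quantile(gmm.score_samples(X_normal), target_false_alarm)
detections = gmm.score_samples(X_ischemic) < threshold
print("detection rate at 5% false alarm:", detections.mean())
```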
Procedia PDF Downloads 162
6105 Nano Generalized Topology
Authors: M. Y. Bakeir
Abstract:
Rough set theory is a recent approach for reasoning about data. It has achieved a large number of applications in various real-life fields. The main idea of rough sets corresponds to the lower and upper set approximations. These two approximations are exactly the interior and the closure of the set with respect to a certain topology on a collection U of imprecise data acquired from any real-life field. The base of the topology is formed by the equivalence classes of an equivalence relation E defined on U using the available information about the data. The theory of generalized topology was studied by Császár. It is well known that a generalized topology in the sense of Császár is a generalization of the topology on a set. On the other hand, many important collections of sets related to the topology on a set form a generalized topology. The notion of nano topology was introduced by Lellis Thivagar; it is defined in terms of the approximations and boundary region of a subset of a universe using an equivalence relation on it. The purpose of this paper is to introduce a new generalized topology in terms of rough sets, called nano generalized topology.
Keywords: rough sets, topological space, generalized topology, nano topology
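A small sketch of the underlying constructions is given below: the lower and upper approximations of a set X from the equivalence classes of a relation R, and the resulting nano topology {U, ∅, L_R(X), U_R(X), B_R(X)} in the sense of Lellis Thivagar; the universe and partition are toy examples.

```python
# universe, an equivalence relation given by its partition, and a target set X
U = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]     # equivalence classes of R on U
X = {1, 2, 3}

lower = set().union(*[c for c in partition if c <= X])     # L_R(X): classes inside X
upper = set().union(*[c for c in partition if c & X])      # U_R(X): classes meeting X
boundary = upper - lower                                    # B_R(X)

# nano topology on U with respect to X (Lellis Thivagar)
nano_topology = [U, set(), lower, upper, boundary]
print("L_R(X) =", lower)        # {1, 2}
print("U_R(X) =", upper)        # {1, 2, 3, 4}
print("B_R(X) =", boundary)     # {3, 4}
print("tau_R(X) =", nano_topology)
```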
Procedia PDF Downloads 431
6104 The Effects of a Digital Dialogue Game on Higher Education Students’ Argumentation-Based Learning
Authors: Omid Noroozi
Abstract:
Digital dialogue games have opened up opportunities for learning skills by engaging students in complex problem solving that mimics real-world situations, without importing the unwanted constraints and risks of the real world. Digital dialogue games can be motivating and engaging for students in terms of fun, creative thinking, and learning. This study explored how undergraduate students engage with argumentative discourse activities that have been designed to intensify debate. A pre-test, post-test design was used with students who were assigned to groups of four and asked to debate a controversial topic with the aim of exploring various 'pros and cons' of 'Genetically Modified Organisms (GMOs)'. Findings reveal that the digital dialogue game can facilitate argumentation-based learning. The digital dialogue game was also evaluated positively in terms of students’ satisfaction and learning experiences.
Keywords: argumentation, dialogue, digital game, learning, motivation
Procedia PDF Downloads 321
6103 The Implications of the Lacanian Concept of 'Lalangue' for Lacanian Theory and Clinical Practice
Authors: Dries Dulsster
Abstract:
In this research, we want to discuss the implications of the concept of 'lalangue' and illustrate its importance for Lacanian psychoanalysis and its clinical practice. We will look at this concept through an in-depth reading of Lacan's later seminars, his lectures at the North American universities, and his study on James Joyce. We will illustrate the importance of this concept with a case study from clinical practice. We will argue that the introduction of 'lalangue' has several theoretical and clinical implications that will radically change Lacan's teachings. We will illustrate the distinction between language and lalangue. Language serves communication, but this is not the case with lalangue. We will claim that there is jouissance in language and will approach this by introducing the concept of 'lalangue'. We will ask ourselves what the effect of this distinction will be and how we can use it in clinical practice. The concept of 'lalangue' will introduce a new way of thinking about the unconscious. It will force us to no longer view the unconscious as Symbolic, but as Imaginary or Real. Another implication will be the approach to the symptom, no longer approaching it as a formation of the unconscious. It will be renamed the 'sinthome', as a function of the real. Last of all, it will force us to rethink Lacanian interpretation and how we direct the treatment. The implications on a clinical level concern how we think about Lacanian interpretation and the direction of the treatment. We will no longer focus on language and meaning, but on jouissance and the ways in which the subject deals with it. We will illustrate this importance with a clinical case study. To summarize, the concept of lalangue forces us to radically rethink Lacanian psychoanalysis, with major implications on a theoretical and clinical level. It introduces new concepts such as the real unconscious and the sinthome. It will also make us rethink the way we work as Lacanian psychoanalysts.
Keywords: Lacan's later teaching, language, Lalangue, the unconscious
Procedia PDF Downloads 229
6102 Digi-Buddy: A Smart Cane with Artificial Intelligence and Real-Time Assistance
Authors: Amaladhithyan Krishnamoorthy, Ruvaitha Banu
Abstract:
Vision is considered the most important sense in humans, without which leading a normal life can often be difficult. There are many existing smart canes for the visually impaired with obstacle detection using an ultrasonic transducer to help them navigate. Though the basic smart cane increases the safety of the users, it does not help in filling the void of visual loss. This paper introduces the concept of Digi-Buddy, which is an evolved smart cane for the visually impaired. The cane consists of several modules; apart from the basic obstacle detection features, Digi-Buddy assists the user by capturing video/images and streaming them to the server using a wide-angled camera, which then detects the objects using a deep convolutional neural network. In addition to determining what the particular image/object is, the distance of the object is assessed by the ultrasonic transducer. A sound generation application, modelled with the help of natural language processing, is used to convert the processed images/objects into audio. The detected object is signified by its name, which is transmitted to the user with the help of Bluetooth earphones. The object detection is extended to facial recognition, which matches the faces of people the user meets against a database of face images and alerts the user about the person. Another crucial function is an automatic intimation alarm, which is triggered when the user is in an emergency. If the user recovers within a set time, a button provisioned on the cane can stop the alarm; otherwise, an automatic intimation is sent to friends and family about the whereabouts of the user using GPS. In addition to the safety and security provided by existing smart canes, the proposed concept, implemented as a prototype, helps the visually impaired visualize their surroundings through audio in a more amicable way.
Keywords: artificial intelligence, facial recognition, natural language processing, internet of things
Procedia PDF Downloads 355
6101 A Bibliometric Analysis of Ukrainian Research Articles on SARS-COV-2 (COVID-19) in Compliance with the Standards of Current Research Information Systems
Authors: Sabina Auhunas
Abstract:
Open Science is developing rapidly in Ukraine these days for scientists of all branches, providing an opportunity to take a closer look at studies by foreign scientists, as well as to deliver their own scientific data to national and international journals. However, when it comes to the generalization of data on the scientific activities of Ukrainian scientists, these data are often integrated into e-systems that operate on inconsistent and barely related information sources. In order to resolve these issues, developed countries productively use e-systems designed to store and manage research data, such as Current Research Information Systems, which enable the combination of uncompiled data obtained from different sources. An algorithm for selecting SARS-CoV-2 research articles was designed, by means of which we collected the set of papers published by Ukrainian scientists and uploaded by August 1, 2020. The resulting metadata (document type, open access status, citation count, h-index, most cited documents, international research funding, author counts, the bibliographic relationship of journals) were taken from the Scopus and Web of Science databases. The study also considered information from COVID-19/SARS-CoV-2-related documents published from December 2019 to September 2020 by authors with territorial affiliation to Ukraine. These databases provide the information necessary for bibliometric analysis and further details, such as copyright, which may not be available in other databases (e.g., ScienceDirect). Search criteria and results for each online database were considered according to the WHO classification of the virus and the disease caused by this virus, and are represented in Table 1. First, we identified 89 research papers that provided us with the final data set after consolidation and removal of duplication; however, only 56 papers were used for the analysis. The total number of documents retrieved from the WoS database was 21,641 (48 of them affiliated to Ukraine), and from the Scopus database 32,478 (41 of them affiliated to Ukraine). According to the publication activity of Ukrainian scientists, the following areas prevailed: Education, educational research (9 documents, 20.58%); Social Sciences, interdisciplinary (6 documents, 11.76%); and Economics (4 documents, 8.82%). The highest publication activity by institution type was reported for the Ministry of Education and Science of Ukraine (36% of the published papers, or 7 documents), followed by Danylo Halytsky Lviv National Medical University (5 documents, 15%) and the P. L. Shupyk National Medical Academy of Postgraduate Education (4 documents, 12%). Research activities by Ukrainian scientists were funded by five entities: the Belgian Development Cooperation, the National Institutes of Health (NIH, U.S.), the United States Department of Health & Human Services, a grant from the Whitney and Betty MacMillan Center for International and Area Studies at Yale, and a grant from the Yale Women Faculty Forum. Based on the results of the analysis, we obtained a set of published articles and preprints to be assessed on a variety of features in upcoming studies, including citation count, most cited documents, bibliographic relationships of journals, and reference linking. Further research on the development of the national scientific e-database continues using new analytical methods.
Keywords: content analysis, COVID-19, scientometrics, text mining
Procedia PDF Downloads 116
6100 Design and Optimization of a Small Hydraulic Propeller Turbine
Authors: Dario Barsi, Marina Ubaldi, Pietro Zunino, Robert Fink
Abstract:
A design and optimization procedure is proposed and developed to provide the geometry of a high-efficiency compact hydraulic propeller turbine for low heads. For the preliminary design of the machine, classic design criteria, based on the use of statistical correlations for the definition of the fundamental geometric parameters and the blade shapes, are used. These relationships are based on the fundamental design parameters (i.e., specific speed, flow coefficient, work coefficient) in order to provide a simple yet reliable procedure. Particular attention is paid, from the initial steps, to the correct conformation of the meridional channel and to the correct arrangement of the blade rows. The preliminary geometry thus obtained is used as a starting point for the hydrodynamic optimization procedure, carried out using CFD calculation software coupled with a genetic algorithm that generates and updates a large database of turbine geometries. The optimization process is performed using a commercial approach that solves the turbulent Navier-Stokes equations (RANS) by exploiting the axisymmetric geometry of the machine. The geometries generated within the database are therefore calculated in order to determine the corresponding overall performance. In order to speed up the optimization calculation, an artificial neural network (ANN) based on the use of an objective function is employed. The procedure was applied to the specific case of a propeller turbine with an innovative modular design, intended for applications characterized by very low heads. The procedure is tested in order to verify its validity and its ability to automatically obtain the targeted net head and the maximum total-to-total internal efficiency.
Keywords: renewable energy conversion, hydraulic turbines, low head hydraulic energy, optimization design
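The generic optimization loop described above (a genetic algorithm proposing geometries, a database of evaluated designs, and an ANN surrogate used to pre-screen candidates before the expensive evaluation) is sketched below with a cheap analytic function standing in for the RANS/CFD solver; the geometric parameters, bounds and GA settings are placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_eval(geom):
    """Stand-in for a RANS CFD run: returns a fictitious efficiency in [0, 1]."""
    blade_angle, hub_ratio, solidity = geom
    return float(np.exp(-((blade_angle - 22) / 8) ** 2
                        - ((hub_ratio - 0.45) / 0.15) ** 2
                        - ((solidity - 1.1) / 0.4) ** 2))

bounds = np.array([[10, 40], [0.3, 0.6], [0.7, 1.6]])   # placeholder geometric ranges

# initial database of evaluated geometries
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(20, 3))
db_X, db_y = pop.copy(), np.array([expensive_eval(g) for g in pop])

for generation in range(15):
    # train / refresh the ANN surrogate on the whole accumulated database
    surrogate = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                             random_state=0).fit(db_X, db_y)

    # GA step: select the best parents, blend crossover, Gaussian mutation
    fitness = np.array([expensive_eval(g) for g in pop])
    parents = pop[np.argsort(fitness)[-10:]]
    children = []
    while len(children) < 40:
        a, b = parents[rng.integers(0, 10, 2)]
        child = 0.5 * (a + b) + rng.normal(0, 0.05, 3) * (bounds[:, 1] - bounds[:, 0])
        children.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    children = np.array(children)

    # surrogate pre-screening: only the best-predicted children go to the "CFD" evaluation
    promising = children[np.argsort(surrogate.predict(children))[-20:]]
    new_y = np.array([expensive_eval(g) for g in promising])
    db_X, db_y = np.vstack([db_X, promising]), np.concatenate([db_y, new_y])
    pop = promising

print("best efficiency found:", db_y.max(), "for geometry", db_X[db_y.argmax()])
```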
Procedia PDF Downloads 150